I am currently an Assistant Professor at the School of Computer Science and Artificial Intelligence, Zhengzhou University. I received my Ph.D. from the Natural Language Processing Laboratory at Soochow University, supervised by Junhui Li and Min Zhang. During my doctoral studies, I conducted research internships at the Huawei 2012 Lab and at the Advanced Translation Technology (ATT) Research Group of the National Institute of Information and Communications Technology (NICT), Japan. After completing my Ph.D., I joined the Huawei 2012 Lab. My research interests center on machine translation, with a particular focus on enhancing long-text and document-level translation.

🔈 JOIN US!

Inspired by the philosophy that understanding emerges through creation, my research aims to advance the foundations and applications of large language models. I am particularly interested in exploring how foundation models can be designed, adapted, and evaluated to better support complex reasoning and long-context generation. My work seeks to bridge model capabilities with practical generation and evaluation challenges in real-world, document-level scenarios.

If you are interested in any of the following directions, do not hesitate to drop me an email about possible collaboration:

  • Foundation Models: large-scale pretraining, architectural design, and capability analysis
  • Long-Context Generation and Evaluation: document-level and long-text generation, coherence modeling, and automatic evaluation
  • Multimodal Foundation Models: unified modeling across text, speech, and vision, and cross-modal understanding and generation

🔥 News

  • 2025.12: Our paper “A Unified Framework for Document-level Repair and Translation with Global Awareness” has been accepted by TASLP
  • 2025.12: Our paper “Enhancing LLM-based Context-Aware Machine Translation via Multi-phase and Multi-task Prompt Tuning” has been accepted by TNNLS

👨‍🎓 Education

💻 Work Experience

  • 2021.10-2022.10, Research Intern, Advanced Translation Technology (ATT) Research Group at the National Institute of Information and Communications Technology (NICT)
  • 2023.09-2024.09, Research Intern, Translation Service Center (TSC) at Huawei 2012 Lab
  • 2024.10-2025.10, Researcher, Translation Service Center (TSC) at Huawei 2012 Lab
  • 2025.12-Present, Assistant Professor, Zhengzhou University

📝 Selected Publications

  • Encouraging Lexical Translation Consistency for Document-Level Neural Machine Translation. [Paper]
    Xinglin Lyu, Junhui Li, Zhengxian Gong, Min Zhang. 2021. EMNLP.
  • Modeling Consistency Preference via Lexical Chains for Document-level Neural Machine Translation. [Paper]
    Xinglin Lyu, Junhui Li, Shimin Tao, Hao Yang, Min Zhang. 2022. EMNLP.
  • DeMPT: Decoding-enhanced Multi-phase Prompt Tuning for Making LLMs Be Better Context-aware Translators. [Paper][Code]
    Xinglin Lyu, Junhui Li, Yanqing Zhao, Min Zhang, Daimeng Wei, Shimin Tao, Hao Yang, Min Zhang. 2024. EMNLP.
  • Refining History for Future-Aware Neural Machine Translation. [Paper]
    Xinglin Lyu, Junhui Li, Min Zhang, Chenchen Ding, Hideki Tanaka, Masao Utiyama. 2022. TASLP.
  • A Survey of Document-level Neural Machine Translation. [Paper]
    Xinglin Lyu, Junhui Li, Shimin Tao, Hao Yang, Min Zhang. 2024. Journal of Software.
  • DoCIA: An Online Document-Level Context Incorporation Agent for Speech Translation. [Paper][Code]
    Xinglin Lyu, Junhui Li, Daimeng Wei, Min Zhang, Shimin Tao, Hao Yang, Min Zhang. 2025. Findings of ACL.