Qunhong Zeng


Beijing, China

qunhongzeng@gmail.com

I am a final-year graduate student at the School of Computer Science and Technology at Beijing Institute of Technology, advised by Prof. Yuxia Zhang. Prior to this, I received my B.S. degree in Computer Science from the same university in 2023.

My research primarily focuses on AI for Software Engineering (AI4SE). I am currently working at MiniMax AI, where I focus on improving models’ agentic coding capabilities. My recent work centers on SWE tasks (Multi-SWE-bench, SWE-bench multilingual, etc.), and I am particularly interested in improving foundation models’ generalizability across different scaffolds and enabling them to better serve software engineers.

I am also interested in MLSys and enjoy building scalable systems. I am a contributor to the verl project, and recently part of my work has focused on building MiniMax's RL infrastructure and scaling agent scaffolds for agentic RL. I believe RL is a critical pathway toward AGI, and that it is an art of combining research and engineering.

Please feel free to contact me if you’re interested in having a discussion :)

Publications

  1. ToolTrain: Tool-integrated Reinforcement Learning for Repo Deep Search
    Zexiong Ma, Chao Peng, Qunhong Zeng, Pengfei Gao, Yanzhen Zou, and Bing Xie
    In Proceedings of the IEEE/ACM 48th International Conference on Software Engineering (ICSE), 2026
  2. Evaluating Generated Commit Messages with Large Language Models
    Qunhong Zeng, Yuxia Zhang, Zexiong Ma, Bo Jiang, Ningyuan Sun, Klaas-Jan Stol, Xingyu Mou, and Hui Liu
    In Proceedings of the IEEE/ACM 48th International Conference on Software Engineering (ICSE), 2026
  3. A First Look at Conventional Commits Classification
    Qunhong Zeng, Yuxia Zhang, Zhiqing Qiu, and Hui Liu
    In Proceedings of the IEEE/ACM 47th International Conference on Software Engineering (ICSE), 2025
  4. COLARE: Commit Classification via Fine-grained Context-aware Representation of Code Changes
    Qunhong Zeng, Yuxia Zhang, Zeyu Sun, Yujie Guo, and Hui Liu
    In 2024 IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER), 2024

Experiences

  1. MiniMax, Top Talent Intern, 2024.7 - now
    • Core contributor to MiniMax-M2’s multilingual coding capabilities. Benefiting from environment scaling and scaffold scaling, MiniMax-M2 achieves SOTA performance on Multi-SWE-bench and SWE-bench multilingual among all open-source models.

    • Currently working on MiniMax-M2.1 and more; stay tuned ~

  2. Meituan, Beidou Program Intern, 2025.6 - 2025.7
    • Researched multi-turn conversational RL to enhance LLMs’ dialogue capabilities.

  3. ByteDance, Trae CodeAI Intern, 2024.9 - 2025.6
    • Conducted research on automated commit message generation and quality evaluation, with findings accepted for publication in the research track of ICSE 2026.

    • Researched RL approaches to enhance LLMs’ reasoning capabilities, including mathematical reasoning, test generation for competitive programming problems, and tool-interactive reasoning. Collaborated with Zexiong Ma; mentored by Bo Jiang.

  4. OneFlow, Framework Intern, 2024.3 - 2024.9
    • Extended the OneFlow deep learning framework’s compatibility beyond CUDA to diverse hardware accelerators (NPU/XPU), implementing Ascend CANN/AscendC-based operators that enabled end-to-end ResNet, GPT-2, and Llama inference and training on Ascend chips.

    • Maintained framework infrastructure, including Docker containerization, CI/CD, cross-platform compilation, and Manylinux-compliant Python/C++ extension packaging.

  5. Momenta, HD-Map Backend Intern, 2022.9 - 2023.1
    • Engineered high-performance Python/C++ extensions for the HD-map algorithm library, delivering a 20x performance improvement.

  6. ByteDance, AI-Lab Intern, 2021.12 - 2022.6
    • Contributed to ByteDance’s machine learning platform development as part of the engineering team.