Junting Chen

Embodied AI Enthusiast


Ph.D. in Computer Science with President Graduate Fellowship (PGF)

National University of Singapore

Hi, this is Junting Chen (陈俊廷), currently a CS Ph.D. student at the National University of Singapore (NUS), advised by Prof. Lin Shao. Before that, he was a student researcher in Embodied AI and robotic systems at the Shanghai Artificial Intelligence Lab, working with Yao Mu and Ping Luo. He earned his master's degree in Robotics and Artificial Intelligence at ETH Zurich, where Prof. Luc Van Gool served as his master's program tutor and he worked with Dr. Suryansh Kumar and Dr. Guohao Li at CVL. Before that, he earned his BSc in Computer Science at the University of Chinese Academy of Sciences, working with Prof. Ruiping Wang and Prof. Xilin Chen during his undergraduate studies.

My current research interest lies in the gap between perception and action in indoor-scene Embodied AI: 1) how to construct a unified scene representation during active exploration that supports multiple high-level embodied tasks; and 2) how to make long-horizon decisions and predict immediate actions conditioned on different intentions, by leveraging the information in the scene representation together with common-sense knowledge.

My lab and I are actively recruiting intern students, either on-site (Singapore or Shenzhen, China) or remote. Please send me your proposal and resume to apply for an internship. I am also open to any form of discussion or collaboration; please drop me an email.

selected publications

  1. RSS
    How To Not Train Your Dragon: Training-free Embodied Object Goal Navigation with Semantic Frontiers
    Junting Chen, Guohao Li, Suryansh Kumar, and 2 more authors
    In Proceedings of Robotics: Science and Systems (RSS), 2023
  2. arXiv
    RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation
    Junting Chen, Yao Mu, Qiaojun Yu, and 11 more authors
    2024
  3. arXiv
    RoboCodeX: Multimodal Code Generation for Robotic Behavior Synthesis
    Yao Mu, Junting Chen, Qinglong Zhang, and 16 more authors
    2024
  4. ICLR
    EMOS: Embodiment-aware Heterogeneous Multi-robot Operating System with LLM Agents
    Junting Chen, Checheng Yu, Xunzhe Zhou, and 7 more authors
    In The Thirteenth International Conference on Learning Representations (ICLR), 2025
  5. NeurIPS
    OWMM-Agent: Open World Mobile Manipulation With Multi-modal Agentic Data Synthesis
    Junting Chen, Haotian Liang, Lingxiao Du, and 8 more authors
    In Advances in Neural Information Processing Systems (NeurIPS), 2025
  6. LISN: Language-Instructed Social Navigation with VLM-based Controller Modulating
    Junting Chen, Yunchuan Li, Panfeng Jiang, and 5 more authors
    2025