The summer of 2025 marked the beginning of a remarkable journey for us into the frontiers of scientific research in Singapore. Its unique energy and vibrant academic atmosphere provided the perfect backdrop for our project, which tackled a challenging, forward-looking question: how can we enable today's powerful large-scale AI models to run efficiently on everyday mobile devices such as smartphones and drones, thereby realizing true Edge General Intelligence (EGI)?
Soon after arriving, we were captivated by Singapore's multicultural vibrancy. To better immerse ourselves in the local academic pulse, we studied and worked at Nanyang Technological University (NTU), one of the world's leading universities. Walking through its modern campus, we felt a palpable blend of rigorous scientific spirit and an open, international perspective. This experience deeply inspired us and set a positive tone for the work ahead.
Our research process was defined by invaluable collaboration. We had regular in-depth discussions with our supervisor, with each meeting feeling like a brainstorm that not only guided our research but also sparked new ideas on how to deeply integrate cutting-edge technology with the Agentic AI framework. We happily captured one of these moments with a group photo, preserving the memory of this valuable mentorship. Equally important were the informal exchanges with our senior mentor. We often discussed progress and challenges over meals, where conversations flowed from complex academic hurdles to personal insights. This blend of formal guidance and friendly, peer-to-peer support was instrumental to our progress and made us feel at home, even while abroad.

Our work centered on a key technology known as Knowledge Distillation (KD). The concept is elegantly simple: a large, knowledgeable teacher model transfers its core capabilities to a compact student model. This process allows the student model to operate with high efficiency on resource-constrained devices while retaining much of the powerful performance of its larger counterpart. Through our systematic survey, we concluded that KD empowers mobile Agentic AI with four core capabilities: achieving significant model compression while maintaining accuracy, enhancing knowledge transfer for stronger generalization in low-data scenarios, improving robustness for stable performance in dynamic environments, and optimizing interaction strategies for using tools and APIs more effectively. These findings culminated in a comprehensive review article intended to serve as a roadmap for future researchers in the field.
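To make the teacher-to-student transfer concrete, here is a minimal sketch of the classic distillation objective in the style of Hinton et al.: the teacher's logits are softened with a temperature and the student is penalized for diverging from that softened distribution. This is a toy illustration in pure Python, not the specific setup from our survey; the logits and temperature below are made-up values for demonstration.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 as in the standard distillation formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Hypothetical logits for a 3-class problem.
teacher = [4.0, 1.0, -2.0]
aligned_student = [3.9, 1.1, -1.8]   # mimics the teacher closely
misaligned_student = [-1.0, 0.5, 2.0]  # disagrees with the teacher

# A student that matches the teacher's softened outputs incurs a lower loss.
assert distillation_loss(teacher, aligned_student) < \
       distillation_loss(teacher, misaligned_student)
```

In practice this soft-label loss is blended with the ordinary cross-entropy on hard labels, letting a compact student inherit the teacher's inter-class similarity structure rather than just its final predictions.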
Looking back, this summer was about more than just producing a research paper. It was a period of unforgettable personal and academic growth. The things we saw and learned in Singapore, combined with the deep connections we forged with our mentors, have given us a new perspective on scientific inquiry. We left with a strengthened conviction that Knowledge Distillation will play an indispensable role in ushering in an era of accessible and autonomous mobile intelligence. This invaluable journey will continue to inspire us as we move forward on our path of exploring the future of artificial intelligence.
By Wu Yuxuan and Ma Linhan (undergraduate students)