Boosting Scientific Concepts Understanding: Can Analogy from Teacher Models Empower Student Models?

Abstract

Analogical reasoning plays a critical role in human cognition, enabling us to understand new concepts by associating them with familiar ones. Previous research in the AI community has mainly focused on identifying and generating analogies and then evaluating their quality through human judgment, which overlooks how these analogies are actually applied in real-world settings. Inspired by the human education process, in this paper we investigate how analogies created by teacher language models (LMs) can help student LMs understand scientific concepts, thereby aligning more closely with practical scenarios. Our results suggest that free-form analogies can indeed aid LMs in understanding concepts. Additionally, analogies generated by student LMs can improve their own performance on scientific question answering, demonstrating their capability to use analogies for self-learning new knowledge.
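The setup described above can be pictured as a simple two-stage prompting pipeline: a teacher LM produces a free-form analogy for a scientific concept, and a student LM answers a question about that concept with the analogy prepended as context. The sketch below is only a minimal illustration under our own assumptions; the `llm_generate` callables, prompt wording, and `ScienceQuestion` fields are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical interface: any function mapping a prompt string to model output text.
LLMGenerate = Callable[[str], str]

@dataclass
class ScienceQuestion:
    concept: str        # e.g. "electric current"
    question: str       # a scientific QA item about the concept
    options: List[str]  # multiple-choice answer options

def teacher_analogy(teacher: LLMGenerate, concept: str) -> str:
    """Ask the teacher LM for a free-form analogy explaining the concept."""
    prompt = (
        f"Explain the scientific concept '{concept}' to a student "
        f"by drawing an analogy to something familiar from everyday life."
    )
    return teacher(prompt)

def student_answer(student: LLMGenerate, q: ScienceQuestion, analogy: str) -> str:
    """Prepend the teacher's analogy so the student LM can use it as context."""
    options = "\n".join(f"({chr(65 + i)}) {opt}" for i, opt in enumerate(q.options))
    prompt = (
        f"Analogy: {analogy}\n\n"
        f"Using the analogy above, answer the question.\n"
        f"Question: {q.question}\n{options}\nAnswer:"
    )
    return student(prompt)
```

In the self-learning variant mentioned in the abstract, the same model would play both roles, e.g. `student_answer(model, q, teacher_analogy(model, q.concept))`.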

Publication
In The 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Siyu Yuan