Abstract
This study proposes an adaptive learning framework for Big Data courses that dynamically generates personalized learning pathways by integrating multimodal knowledge representation with reinforcement learning. Traditional learning systems often fail to account for individual differences in knowledge states, cognitive traits, and learning preferences, leading to suboptimal educational outcomes. The proposed method addresses this gap by constructing a Multimodal Knowledge Graph (MMKG) that unifies diverse learning resources, including text, code, and visual materials, into a structured ontology. Dynamic learner profiles are built using Item Response Theory and clustering techniques to model knowledge mastery and cognitive styles, while an adaptation engine employs graph neural networks and reinforcement learning to optimize learning paths in real time. The engine minimizes cognitive load and maximizes knowledge gain by dynamically adjusting resource assignments and sequencing based on continuous feedback. Furthermore, multimodal resource scheduling ensures that learners receive content tailored to their preferred modalities, such as visual diagrams for visual learners or interactive code sandboxes for kinesthetic learners. The novelty of this work lies in the synergistic integration of the MMKG with reinforcement learning, enabling fine-grained personalization that adapts to both static and evolving learner needs. Experimental validation demonstrates significant improvements in learning efficiency and engagement compared with conventional methods, highlighting the framework’s potential for scalable and adaptive education in complex domains such as Big Data.

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright (c) 2025 Nanjun Ye (Author)