A Dynamic Evaluation System for Applied Regression Analysis in Graduate Applied Statistics Education
Abstract
Applied Regression Analysis is a core course in Master of Applied Statistics programs. It consolidates students' statistical modeling foundations and develops their ability to analyze real-world data. Yet conventional course evaluation relies on final exams, lab reports, and project scores, emphasizing results over process and technique over problem-solving. Instructors find it difficult to pinpoint where students falter: data governance, model construction, diagnostics, or communication. This design article proposes a dynamic evaluation system that uses a regression modeling competency map and process-oriented assessment to capture evidence from quizzes, code submissions, model outputs, case reports, presentations, and online learning behavior. The system converts this evidence into actionable feedback: problem localization, diagnostic attribution, and modeling prescriptions. Supported by automated code analysis, model diagnostics extraction, text analysis, and AI-assisted feedback, it is designed to evaluate students' modeling competencies across the full regression workflow. The system is intended to improve the timeliness, specificity, and interpretability of evaluation; to support instructors in making evidence-based teaching adjustments; and to help students refine their modeling strategies.