Test-size Reduction via Sparse Factor Analysis

Title: Test-size Reduction via Sparse Factor Analysis
Publication Type: Journal Article
Authors: D. Vats, C. Studer, A. S. Lan, L. Carin, and R. G. Baraniuk
Abstract

In designing educational tests, instructors often have access to a question bank containing a large number of questions that test knowledge of the concepts underlying a given course. In this setup, a natural way to design a test is simply to ask learners to respond to the entire set of available questions. This approach, however, is clearly impractical, since it demands a significant time commitment from both the learner (in taking the test) and the instructor (in grading the test, if it cannot be graded automatically). Hence, in this paper, we consider the problem of designing efficient and accurate tests that minimize the workload of both learners and instructors by substantially reducing the number of questions, or—more colloquially—the test size, while still retrieving accurate concept-knowledge estimates. We refer to this test-design problem as TeSR, short for Test-size Reduction. We propose two novel algorithms for TeSR, a non-adaptive and an adaptive variant, using an extended version of the SParse Factor Analysis (SPARFA) framework for modeling learner responses to questions. Our new TeSR algorithms find fast approximate solutions to a combinatorial optimization problem that minimizes the uncertainty in assessing a learner's understanding of concepts. We demonstrate the efficacy of these algorithms using synthetic and real educational data, and we show significant performance improvements over state-of-the-art methods that build upon the popular Rasch model.
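The abstract describes selecting a small question subset that minimizes uncertainty about a learner's concept knowledge. As a rough illustration only (not the paper's actual algorithm), the sketch below uses a simplified one-concept logistic response model in place of the full SPARFA model: question i has a hypothetical discrimination `w[i]` and difficulty `mu[i]`, and a non-adaptive test is formed greedily by keeping the questions contributing the most Fisher information about the concept value at a reference point `c0`, a crude proxy for minimizing estimation uncertainty.

```python
import math

# Hedged sketch: one-concept logistic model standing in for SPARFA.
# P(correct | concept value c) = sigmoid(w * c - mu), where w (discrimination)
# and mu (difficulty) are hypothetical per-question parameters.

def fisher_information(w, mu, c):
    """Fisher information about c contributed by a single question."""
    p = 1.0 / (1.0 + math.exp(-(w * c - mu)))
    return (w ** 2) * p * (1.0 - p)

def select_questions(ws, mus, test_size, c0=0.0):
    """Greedily keep the `test_size` questions with the largest Fisher
    information at the reference concept value c0 (non-adaptive variant)."""
    scored = [(fisher_information(w, m, c0), i)
              for i, (w, m) in enumerate(zip(ws, mus))]
    scored.sort(reverse=True)
    return sorted(i for _, i in scored[:test_size])
```

An adaptive variant would instead re-estimate the learner's concept value after each response and pick the next most informative question at that updated estimate; the greedy scoring step stays the same.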

Year of Publication: Submitted
Journal: Preprint
