Understanding differential item functioning (DIF) is critical for ensuring fairness in assessments across diverse groups. A recent study by Li et al. introduces a method to enhance the interpretability of DIF items by incorporating response process data. This approach aims to improve equity in measurement by examining how participants engage …
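The excerpt does not detail the authors' procedure, but a common baseline for flagging DIF items is the Mantel-Haenszel common odds ratio computed across ability strata; a minimal sketch (function name and table layout are illustrative, not from the study):

```python
def mh_odds_ratio(tables):
    """Mantel-Haenszel common odds ratio across ability strata.

    Each stratum is a 2x2 table (ref_correct, ref_incorrect,
    focal_correct, focal_incorrect); a value near 1 suggests no DIF.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Identical response patterns in both groups -> odds ratio of 1 (no DIF).
strata = [(30, 10, 30, 10), (20, 20, 20, 20)]
ratio = mh_odds_ratio(strata)
```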
Integrating SDT and IRT Models for Mixed-Format Exams
Lawrence T. DeCarlo’s recent article introduces a psychological framework for mixed-format exams, combining signal detection theory (SDT) for multiple-choice items and item response theory (IRT) for open-ended items. This fusion allows for a unified model that captures the nuances of each item type while providing insights into the underlying cognitive …
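As a hedged sketch of the two ingredients (not DeCarlo's actual unified model), signal detection theory scores a detection task via the sensitivity index d', while a 2PL IRT model scores an item via a logistic function of ability; the parameter values below are illustrative:

```python
from math import exp
from statistics import NormalDist

def sdt_dprime(hit_rate, fa_rate):
    """SDT sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def irt_2pl(theta, a, b):
    """2PL IRT: P(correct) = logistic(a * (theta - b))."""
    return 1.0 / (1.0 + exp(-a * (theta - b)))

d_prime = sdt_dprime(0.84, 0.16)              # symmetric rates -> d' near 2
p_correct = irt_2pl(theta=0.0, a=1.0, b=0.0)  # ability at difficulty -> 0.5
```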
Rotation Local Solutions in Multidimensional Item Response Models
Nguyen and Waller’s (2024) study provides an in-depth analysis of factor-rotation local solutions (LS) within multidimensional, two-parameter logistic (M2PL) item response models. Through an extensive Monte Carlo simulation, the research evaluates how different factors influence the performance of rotation algorithms, contributing to a deeper understanding of multidimensional psychometric models. Background: The study …
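For readers unfamiliar with the model class, a minimal sketch of the M2PL response function (symbols and values are illustrative, not taken from the study):

```python
import numpy as np

def m2pl_prob(theta, a, d):
    """M2PL: P(correct) = logistic(a . theta + d), for trait vector theta,
    discrimination (loading) vector a, and intercept d."""
    return 1.0 / (1.0 + np.exp(-(np.dot(a, theta) + d)))

# Two latent dimensions; factor rotation acts on the loading vectors.
p = m2pl_prob(theta=np.array([0.5, -0.2]), a=np.array([1.2, 0.8]), d=0.1)
```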
Refining Reliability with Attenuation-Corrected Estimators
Jari Metsämuuronen’s (2022) article introduces a significant advancement in how reliability is estimated within psychological assessments. The study critiques traditional methods for their tendency to yield deflated results and proposes new attenuation-corrected estimators to address these limitations. This review examines the article’s contributions and its implications for improving measurement precision. …
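The corrected estimators build on the classical idea behind correction for attenuation; a minimal sketch of Spearman's formula (the general principle, not Metsämuuronen's specific estimators):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction: divide the observed correlation by the
    square root of the product of the two reliabilities."""
    return r_xy / math.sqrt(rel_x * rel_y)

# A deflated observed correlation of 0.42 with reliabilities 0.8 and 0.7.
r_corrected = disattenuate(0.42, 0.8, 0.7)
```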
Assessing Missing Data Handling Methods in Sparse Educational Datasets
The study by Xiao and Bulut (2020) evaluates how different methods for handling missing data perform when estimating ability parameters from sparse datasets. Using two Monte Carlo simulations, the research highlights the strengths and limitations of four approaches, providing valuable insights for researchers and practitioners in educational and psychological measurement. …
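The excerpt does not name the four approaches, but one of the simplest candidates in such comparisons is per-item mean imputation; a hedged sketch on synthetic sparse data (the sparsity level and dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
complete = rng.normal(size=(100, 10))        # 100 examinees, 10 items
missing = rng.random(complete.shape) < 0.3   # ~30% missingness, illustrative
sparse = np.where(missing, np.nan, complete)

# Per-item mean imputation: fill each NaN with its column's observed mean.
item_means = np.nanmean(sparse, axis=0)
imputed = np.where(np.isnan(sparse), item_means, sparse)
```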
Evaluating Factor Retention in Exploratory Factor Analysis
Determining the optimal number of factors to retain in exploratory factor analysis (EFA) has long been a subject of debate in social sciences research. Finch (2020) addresses this challenge by comparing the performance of fit index difference values against parallel analysis, a well-established retention method. The study offers …
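For context, Horn's parallel analysis retains factors whose observed eigenvalues exceed the mean eigenvalues obtained from random data of the same dimensions; a minimal sketch (the simulation settings and synthetic data are illustrative, not from Finch's study):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Count eigenvalues of the sample correlation matrix that exceed the
    mean eigenvalues from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        rand = rng.normal(size=(n, p))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False)))[::-1]
    return int(np.sum(obs > sims.mean(axis=0)))

# One strong common factor plus noise should yield a single retained factor.
gen = np.random.default_rng(1)
factor = gen.normal(size=(200, 1))
data = factor @ np.ones((1, 6)) + 0.5 * gen.normal(size=(200, 6))
n_factors = parallel_analysis(data)
```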