# Dmitry I. Belov

Law School Admission Council · Assessment Sciences

PhD in Computer Science

## About

- Publications: 48
- Reads: 16,176
- Citations: 808

## Additional affiliations

- September 2002 - present: Researcher, LSAC
- March 2000 - July 2002

## Education

- September 1995 - June 1999: Institute of Engineering Cybernetics, Computer Science
- September 1990 - June 1995

## Publications (48)

In educational practice, a test assembly problem is formulated as a system of inequalities induced by test specifications.
Each solution to the system is a test, represented by a 0–1 vector, where each element indicates whether the corresponding
item is included (1) or not (0) in the test. Therefore, the size of a 0–1 vector equals the number of items n in a...
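As a minimal illustration of this formulation (with a hypothetical five-item pool and made-up constraints, not taken from the paper), a test is a 0–1 vector that satisfies inequalities such as a fixed test length and a bound on total difficulty:

```python
from itertools import product

# Hypothetical 5-item pool: difficulty of each item (illustrative numbers).
difficulty = [0.2, 0.5, 0.7, 0.9, 0.4]
n = len(difficulty)

def is_feasible(x):
    """Check the toy test specifications for a 0-1 vector x."""
    length_ok = sum(x) == 3                      # exactly 3 items per test
    diff = sum(d * xi for d, xi in zip(difficulty, x))
    return length_ok and 1.0 <= diff <= 1.6      # bound on total difficulty

# Enumerate all 2^n 0-1 vectors and keep the feasible tests (solutions).
tests = [x for x in product((0, 1), repeat=n) if is_feasible(x)]
print(len(tests))
```

Real item banks are far too large for exhaustive enumeration, which is where the combinatorial optimization methods discussed in these papers come in.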

The Kullback-Leibler divergence (KLD) is a widely used method for measuring the fit of two distributions. In general, the distribution of the KLD is unknown. Under reasonable assumptions, common in psychometrics, the distribution of the KLD is shown to be asymptotically distributed as a scaled (non-central) chi-square with one degree of freedom or...
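For reference, the KLD between two discrete distributions p and q can be computed directly from its definition (a generic sketch, not the psychometric setting of the paper):

```python
import math

def kld(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two illustrative distributions over the same two outcomes.
p = [0.5, 0.5]
q = [0.9, 0.1]
d = kld(p, q)
```

Note that the KLD is not symmetric: `kld(p, q)` generally differs from `kld(q, p)`, and it is zero exactly when the two distributions coincide.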

This article presents a new algorithm for computerized adaptive testing (CAT) when content constraints are present. The algorithm is based on shadow CAT methodology to meet content constraints but applies Monte Carlo methods and provides the following advantages over shadow CAT: (a) lower maximum item exposure rates, (b) higher utilization of the i...

Test collusion (TC) is the sharing of test materials or answers to test questions before or during the test (an important special case of TC is item preknowledge). Because of the potentially large advantages for the examinees involved, TC poses a serious threat to the validity of score interpretations. The proposed approach applies graph theory methodology to res...
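The clique idea can be sketched on toy data (hypothetical similarity scores and threshold, not the paper's actual response similarity index): examinees are vertices, an edge joins two examinees whose similarity exceeds a threshold, and a colluding group shows up as a clique:

```python
from itertools import combinations

# Hypothetical pairwise similarity scores between 5 examinees.
sim = {(0, 1): 0.9, (0, 2): 0.8, (1, 2): 0.85, (2, 3): 0.2, (3, 4): 0.1}
threshold = 0.7
n = 5

# Edge set: pairs whose similarity exceeds the threshold.
edges = {pair for pair, s in sim.items() if s > threshold}

def is_clique(group):
    """True if every pair of vertices in the group is joined by an edge."""
    return all(tuple(sorted(p)) in edges for p in combinations(group, 2))

# Largest clique by brute force (fine for small n).
best = max(
    (g for k in range(1, n + 1) for g in combinations(range(n), k) if is_clique(g)),
    key=len,
)
print(best)  # (0, 1, 2)
```

Maximum-clique search is NP-hard in general, so operational detection at scale relies on the specialized graph algorithms developed in this line of work.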

The general case of item preknowledge (IP) is studied, where groups of examinees had access to compromised subsets of items from an administered test prior to the exam. Nothing is known about these groups and subsets (the terms groups and subsets are chosen to clearly distinguish between subsets of examinees and subsets of items). When only item scores...

A 0-1 program is studied where a linear objective function with uniformly distributed coefficients is maximized under arbitrary linear and/or non-linear constraints. Solving this problem a given number of times results in a sampling from the feasible set. Based on Slepian’s inequality, we prove that when this set has certain combinatorial properties,...
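The sampling mechanism can be demonstrated on a tiny feasible set (an illustrative constraint, not from the paper): drawing uniform random objective coefficients and solving the 0-1 maximization by enumeration repeatedly yields a sample from the feasible set, and for this symmetric toy constraint the sample is in fact uniform:

```python
import random
from itertools import product
from collections import Counter

random.seed(0)

n = 4
# Toy feasible set: all 0-1 vectors with exactly two ones.
feasible = [x for x in product((0, 1), repeat=n) if sum(x) == 2]

def solve(c):
    """Maximize the linear objective c . x over the feasible set."""
    return max(feasible, key=lambda x: sum(ci * xi for ci, xi in zip(c, x)))

# Repeatedly solving with fresh uniform coefficients samples the feasible set;
# by symmetry each of the 6 vectors should win about equally often.
counts = Counter(solve([random.random() for _ in range(n)]) for _ in range(6000))
print(sorted(counts.values()))
```

Here the argmax is simply the pair of the two largest coefficients, and since the coefficients are i.i.d. uniform, every pair is equally likely.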

Existing estimators of parameters of item response theory (IRT) models exploit the likelihood function. In small samples, however, the IRT likelihood oftentimes contains little informative value, potentially resulting in biased and/or unstable parameter estimates and large standard errors. To facilitate small-sample IRT estimation, we introduce a n...

A test of item compromise is presented which combines the test takers' responses and response times (RTs) into a statistic defined as the number of correct responses on the item for test takers with RTs flagged as suspicious. The test has null and alternative distributions belonging to the well-known family of compound binomial distributions, is si...
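The statistic itself is simple to compute once RTs have been flagged; the sketch below uses a hypothetical fixed RT cutoff, whereas the paper's flagging of suspicious RTs is model-based:

```python
# Hypothetical data for one item: each record is (correct, response_time_seconds).
records = [(1, 4.0), (0, 35.0), (1, 6.5), (1, 40.0), (1, 5.0), (0, 7.0)]

def flagged(rt, cutoff=10.0):
    """Flag a response time as suspicious when it is implausibly short.

    An illustrative rule only; operational flagging uses an RT model.
    """
    return rt < cutoff

# The statistic: number of correct responses among flagged test takers.
stat = sum(correct for correct, rt in records if flagged(rt))
print(stat)  # 3
```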

The objective of item difficulty modeling (IDM) is to predict the statistical parameters of an item (e.g., difficulty) based on features extracted directly from the item (e.g., number of words). This paper utilizes neural networks (NNs) to predict a discrete item characteristic curve (ICC). The presented approach exploits one-to-one mapping from mo...

Recently, Belov & Wollack (2021) developed a method for detecting groups of colluding examinees as cliques in a graph. The objective of this article is to study how the performance of their method on real data with item preknowledge (IP) depends on the mechanism of edge formation governed by a response similarity index (RSI). This study resulted in...

Modern detectors of speededness are based on the assumption that speeded examinees perform increasingly worse as the test progresses. However, this assumption may often be violated in practice due to various test-taking behaviors. A new asymptotically optimal detector of speeded examinees is introduced here that does not rely on this assumption.

Test collusion (TC) is the sharing of test materials or answers to test questions (items) before or during a test. Because of the potentially large advantages for the test takers involved, TC poses a serious threat to the validity of score interpretations. The proposed approach applies graph theory methodology to response similarity analyses to ide...

This paper presents the latest developments since the publication of the seminal book by van der Linden (2005) on general types of test assembly (TA) problems, major automated test assembly (ATA) methods, and various practical situations in which a TA problem arises. With the power of modern combinatorial optimization (CO) methods, multiple practic...

In standardized multiple-choice testing, examinees may change their answers for various reasons. The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at many testing organizations. This article exploits a recent approach where the information about a...

This report addresses a general type of cluster aberrancy in which a subgroup of test takers has an unfair advantage on some subset of administered items. Examples of cluster aberrancy include item preknowledge and test collusion. In general, cluster aberrancy is hard to detect due to the multiple unknowns involved: Unknown subgroups of test takers...

The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at testing organizations. However, AC data carry uncertainty caused by technological or human factors. Therefore, existing statistics (e.g., the number of wrong-to-right ACs) used to detect examinees...

Item preknowledge describes a situation in which a group of examinees (called aberrant examinees) has had access to some items (called compromised items) from an administered test prior to the exam. Item preknowledge negatively affects both the corresponding testing program and its users (e.g., universities, companies, government organizations) be...

Item preknowledge occurs when some examinees (called aberrant examinees) have had access to a subset of items (called a compromised subset) from an administered test prior to an exam. As a result, aberrant examinees might perform better on compromised items as compared to uncompromised items. When the number of aberrant examinees is large, the corr...

Text similarity measurement provides a rich source of information and is increasingly being used in the development of new educational and psychological applications. However, due to the high-stakes nature of educational and psychological testing, it is imperative that a text similarity measure be stable (or robust) to avoid uncertainty in the data...

The development of statistical methods for detecting test collusion is a new research direction in the area of test security. Test collusion may be described as large-scale sharing of test materials, including answers to test items. Current methods of detecting test collusion are based on statistics also used in answer-copying detection. Therefore,...

Educational measurement practice (item bank development, form assembly, scoring of constructed response answers, etc.) involves the development and processing of an enormous amount of text. This requires large numbers of people to write, read through, evaluate, classify, edit, score, and analyze the text. Not only is this process time consuming and...

This article presents the Variable Match Index (VM-Index), a new statistic for detecting answer copying. The power of the VM-Index relies on two-dimensional conditioning as well as the structure of the test. The asymptotic distribution of the VM-Index is analyzed by reduction to Poisson trials. A computational study comparing the VM-Index with the...
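The reduction to Poisson trials means the null distribution is Poisson-binomial: each item contributes a match with its own probability. A minimal dynamic-programming sketch of that distribution (with illustrative match probabilities, not the VM-Index conditioning itself):

```python
def poisson_binomial_pmf(probs):
    """PMF of the number of successes in independent trials with probabilities p_i."""
    pmf = [1.0]  # distribution of successes after 0 trials
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)      # this trial fails
            new[k + 1] += mass * p        # this trial succeeds
        pmf = new
    return pmf

# Hypothetical per-item match probabilities for a 3-item example.
pmf = poisson_binomial_pmf([0.1, 0.3, 0.5])
print(pmf)
```

With homogeneous probabilities this collapses to the ordinary binomial; the heterogeneous case is what makes the asymptotic analysis in the paper non-trivial.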

This article presents a new method to detect copying on a standardized multiple-choice exam. The method combines two statistical approaches in successive stages. The first stage uses Kullback-Leibler divergence to identify examinees, called subjects, who have demonstrated inconsistent performance during an exam. For each subject the second stage us...

The recent literature on computerized adaptive testing (CAT) has developed methods for creating CAT item pools from a large master pool. Each CAT pool is designed as a set of nonoverlapping forms reflecting the skill levels of an assumed population of test takers. This article presents a Monte Carlo method to obtain these CAT pools and discusses it...

This article presents an application of Monte Carlo methods for developing and assembling multistage adaptive tests (MSTs). A major advantage of the Monte Carlo assembly over other approaches (e.g., integer programming or enumerative heuristics) is that it provides a uniform sampling from all MSTs (or MST paths) available from a given item pool. Th...

A valuable resource for any testing agency is the item pool which yields the questions for its tests. This paper considers the problem of developing the supply chain that provides these items. In particular, the life cycle of items used in the Law School Admission Test (LSAT) is reviewed. Guidelines for the optimal development of items over a finit...

This paper introduces a novel approach for extracting the maximum number of non-overlapping test forms from a large collection
of overlapping test sections assembled from a given item bank. The approach involves solving maximum set packing problems
(MSPs). A branch-and-bound MSP algorithm is developed along with techniques adapted from constraint p...
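Maximum set packing, i.e. selecting as many pairwise-disjoint sections as possible, can be illustrated on toy data (hypothetical sections; the branch-and-bound and constraint-programming machinery of the paper is needed at realistic sizes):

```python
from itertools import combinations

# Hypothetical test sections, each a set of item IDs from the bank.
sections = [{1, 2, 3}, {3, 4}, {5, 6}, {1, 6}, {4, 7}]

def pairwise_disjoint(chosen):
    """True if no two chosen sections share an item."""
    return all(sections[a].isdisjoint(sections[b])
               for a, b in combinations(chosen, 2))

# Brute-force maximum set packing (fine for a handful of sections).
best = max(
    (c for k in range(len(sections) + 1)
     for c in combinations(range(len(sections)), k)
     if pairwise_disjoint(c)),
    key=len,
)
print(best)
```

Here the optimum picks three sections that share no items; adding any fourth section would reuse an item, so it is a maximum packing.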

Statistical analysis of patients previously operated on may improve our methods of performing subsequent surgical procedures. In this article, we introduce a method for studying the functional properties of cerebral structures from electrophysiological and neuroimaging data by using the probabilistic functional atlas (PFA). The PFA provides a spati...

Determination of distorted brain anatomy surrounding a tumor causing a mass effect is much more difficult than interpretation of normal brain scans, particularly because this distortion is not easily predictable: a tumor may be located in any place and vary substantially in size, shape, and radiological appearance. The objective of our work is to pr...

A new test assembly algorithm based on a Monte Carlo random search is presented in this article. A major advantage of the Monte Carlo test assembly over other approaches (integer programming or enumerative heuristics) is that it performs a uniform sampling from the item pool, which provides every feasible item combination (test) with an equal chanc...

A 0-1 program is studied where a linear objective function with uniformly distributed coefficients is maximized under arbitrary linear and/or non-linear constraints. Solving this problem a given number of times results in a sampling from the feasible set. Based on Slepian’s inequality, we prove that when this set has certain combinatorial properties,...

Standardized tests are useful for assessing an individual's potential to succeed in various endeavors. In addition, institutions use them to measure student achievement and to measure the efficacy of pedagogical approaches. Operations research tools can help those developing rigorous standardized tests. Our mixed-integer program (MIP) provides a mo...

We have previously introduced a concept of a probabilistic functional atlas (PFA) to overcome limitations of the current electronic stereotactic brain atlases: anatomical nature, spatial sparseness, inconsistency and lack of population information. The PFA for the STN has already been developed. This work addresses construction of the PFA for the v...

This paper introduces a method for generation and validation of a probabilistic functional brain atlas of subcortical structures from electrophysiological and neuroimaging data. The method contains three major steps: (1) acquisition of pre, intra, and postoperative multimodal data; (2) selection of an accurate data set for atlas generation; and (3)...

The article introduces an atlas-assisted method and a tool called the Cerefy Neuroradiology Atlas (CNA), available over the Internet for neuroradiology and human brain mapping. The CNA contains an enhanced, extended, and fully segmented and labeled electronic version of the Talairach-Tournoux brain atlas, including parcelated gyri and Brodmann's ar...

The paper introduces an optimal algorithm for rapid calculation of a probabilistic functional atlas (PFA) of subcortical structures from data collected during functional neurosurgery procedures. The PFA is calculated based on combined intraoperative electrophysiology, pre- and intraoperative neuroimaging, and postoperative neurological verification...

The problem of reconstructing the pattern of heart excitation from body surface potentials is simulated. This problem is well known as the inverse problem of electrocardiography, and in the general case it has a non-unique solution. The relationship of the problem with the inverse problem of potential theory is shown. From this relationship a...

This paper describes an Internet portal for stereotactic and functional neurosurgery with a probabilistic functional atlas (PFA) calculated from electrophysiological and neuroimaging data. This portal enables (1) data sharing among neurosurgeons; (2) the calculation of probabilistic functional maps of stereotactic target structures from a neurosurg...

The mathematical aspects of the problem of reconstructing the domain of heart excitation from body surface potentials are investigated. We propose modeling algorithms that reduce the infinite set of solutions of the problem. These algorithms take into account local and global time restrictions of the function whose support is equal to...

The problem of determining the carrier of electrostatic field sources from potentials given on different sets is solved numerically. The results of experiments with computer models are shown. On the basis of the obtained results, a new approach for solving the inverse problem of electrocardiography is suggested.

In the present article, the problem of determining the support of sources of the electrostatic field is approximately solved on the basis of potentials specified on different sets. The results of experiments with computer models are displayed. On the basis of the results obtained, a new approach is proposed to the statement and solution of the inve...