Methodology

Measures for Justice has developed a detailed methodology to standardize criminal justice data across jurisdictions in the United States. The development of the Measures is an iterative process that includes ongoing data collection and management, data quality control, Measure calculation and refinement, and engagement with data providers and county stakeholders. Our full methodology is available for download at the link below. It is updated periodically as our Measures expand and additional data become available.

Methodology Summary

The development of the Measures is an iterative process that involves six general steps: (1) conceptual development; (2) data collection and storage; (3) data management; (4) data quality control; (5) measure calculation; and (6) measure visualization. We seek input from stakeholders at every stage of the process, which sometimes leads to changes in how the Measures are calculated. This document represents the most recent version of our methodology at the time of publication.

Source Data

Measures for Justice (MFJ) works with data extracted from administrative case management systems (CMS). These data are originally collected by the source agencies for the purpose of tracking the processing of individual cases and usually involve manual data entry into the CMS. As such, they may be subject to errors at any stage of the collection and recording process. MFJ excludes unreliable values (e.g., a filing date in the future, such as 04/30/2025) and unreliable data elements (e.g., an initial appearance date that is missing in 80% of cases) from all analyses.
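
As an illustration of how such exclusions might be applied, here is a minimal sketch in Python. The column names, the 80% threshold, and the use of pandas are assumptions made for this example and do not describe MFJ's internal tooling.

    # Minimal sketch: drop unreliable values and data elements before analysis.
    # Column names and thresholds are illustrative assumptions.
    import pandas as pd

    def clean_cases(raw: pd.DataFrame, extraction_date: str,
                    max_missing: float = 0.80) -> pd.DataFrame:
        df = raw.copy()
        df["filing_date"] = pd.to_datetime(df["filing_date"], errors="coerce")

        # Unreliable value: a filing date later than the extraction date.
        df.loc[df["filing_date"] > pd.Timestamp(extraction_date), "filing_date"] = pd.NaT

        # Unreliable data element: a column missing in most cases.
        too_sparse = [c for c in df.columns if df[c].isna().mean() > max_missing]
        return df.drop(columns=too_sparse)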

Standardizing Data Across Jurisdictions

Statutory laws, agency practices, terminology, and case management systems vary across and within states. MFJ has developed a Standard Operating Procedure (SOP) to match all data to a uniform coding schema that allows for apples-to-apples comparisons. This includes, but is not limited to:

  • CASE is defined as all charges associated with the same individual defendant that were filed in court (or referred for prosecution, in the case of declinations) on the same date. MFJ assumes that when a prosecutor files multiple charges together, even when they stem from separate incidents, they intend to resolve these charges simultaneously. This may differ from how each agency defines a case. (A schematic grouping sketch appears after this list.)
  • CASE SERIOUSNESS is defined by the most serious charge, according to the state’s offense severity classification, that was present at each stage of charging: referral, filing, and conviction.
  • CHARGE DESCRIPTIONS are standardized using a crosswalk that ensures that statutory definitions across states match a uniform code.
  • PRETRIAL RELEASE DECISION represents the court’s initial ruling regarding whether to release the defendant pending case disposition, and whether the release should be subject to monetary or nonmonetary conditions.
  • CASE DISPOSITION indicates the type of action that removed the case from the prosecutor’s or the court’s docket, excluding any actions stemming from appeals or violations of probation. Case disposition categories are defined as follows:
    • Prosecution declined: The prosecutor declined to file all the referred charges.
    • No or unknown disposition: The case was still pending at the time of data extraction or, if it had already been closed, no disposition was recorded in the raw data.
    • Dismissed: All charges that were filed in court were dismissed or withdrawn.
    • Deferred or diverted: The defendant entered a pretrial diversion or deferred prosecution program for at least one of the charges.
    • Not guilty at trial: The defendant was found not guilty of all charges in a jury or bench trial.
    • Guilty at trial: The defendant was found guilty of at least one charge in a jury or bench trial.
    • Guilty plea: The defendant pleaded guilty to at least one charge.
    • Guilty – unknown method: The defendant was found guilty of at least one charge, but the raw data did not indicate the method of conviction (i.e., trial or plea).
    • Transferred: The case was transferred to another jurisdiction. This includes extraditions and changes of venue.
    • Other: Includes other dispositions such as bond estreature and bond forfeiture.
  • TIME TO DISPOSITION is calculated in two ways: (1) the number of days between arraignment and case disposition/sentencing, and (2) the number of days between filing and case disposition/sentencing. For declinations, it is calculated as the number of days between case referral and the prosecutor’s decision not to file. For diversions, it is calculated in both ways as well: the number of days from case filing, and from arraignment, to the date the defendant entered into a pretrial diversion agreement.
  • ATTORNEY TYPE reports the last attorney of record and includes the following categories: self-represented, private attorney, public defender, court-appointed private attorney, and other.
  • TOP SENTENCE identifies the type of punishment imposed by the court that was the most restrictive of personal liberties according to the following hierarchy:
    • Death penalty
    • Life in prison
    • State prison
    • Jail or county detention facility
    • Lifetime supervision
    • Extended supervision/split sentence with confinement portion in prison
    • Extended supervision/split sentence with confinement portion in jail
    • Extended supervision/split sentence with confinement type unknown
    • Other confinement (e.g., mental health institution, home confinement)
    • Probation
    • Fine
    • Restitution
    • Other (e.g., community service)
    • Time served sentence with no additional confinement time, supervision, or fines.
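
To make the CASE and CASE SERIOUSNESS definitions concrete, the sketch below groups charge records into cases by defendant and filing date and keeps the most serious charge. The column names and the severity ranking are illustrative assumptions, not MFJ's coding schema.

    # Minimal sketch: group charges into cases and record case seriousness.
    # Column names and the severity ranking are illustrative assumptions.
    import pandas as pd

    # Higher rank = more serious, per a state's offense severity classification.
    SEVERITY_RANK = {"infraction": 1, "misdemeanor": 2, "felony": 3}

    charges = pd.DataFrame({
        "defendant_id":  [101, 101, 102],
        "filing_date":   ["2021-03-01", "2021-03-01", "2021-03-05"],
        "offense_class": ["misdemeanor", "felony", "misdemeanor"],
    })

    # CASE: all charges for the same defendant filed on the same date.
    charges["severity"] = charges["offense_class"].map(SEVERITY_RANK)
    cases = (
        charges.groupby(["defendant_id", "filing_date"], as_index=False)
               .agg(n_charges=("offense_class", "size"),
                    case_seriousness=("severity", "max"))
    )
    print(cases)  # two cases: one with 2 charges (felony-level), one with 1 charge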

Data Quality Control

MFJ goes to great lengths to ensure that the data published are as accurate as possible and that the data management process does not become a source of error. MFJ’s data quality control process involves five general stages: (1) assessing the quality and completeness of the raw data delivered by the sources; (2) cleaning the data to remove invalid values and unreliable data elements; (3) conducting several rounds of internal audits of the cleaned case-level data; (4) sending the county-level data out to an independent external auditor to assess the data for face validity; and (5) validating the county-level data with state and local stakeholders.

Measure Calculation

All Measures are calculated at the county level because that is where charging, disposition, and sentencing decisions are made. They are estimated using multiple years of data (five years for most Measures, and two years for those that require controlling for prior convictions) to: (1) increase the number of cases included in the analysis and avoid suppressing smaller jurisdictions that may have few criminal cases on an annual basis; (2) protect the privacy of defendants in small jurisdictions; and (3) reduce the potential effect of temporal instability. The operational definitions, case exclusions, calculations, and sources are provided in all publications of the data.
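
As a simple illustration of pooling multiple years of case-level data at the county level, here is a minimal sketch. The column names and the example measure (a dismissal rate) are assumptions made for this example.

    # Minimal sketch: pool five filing years and compute a county-level rate.
    # Column names and the example measure are illustrative assumptions.
    import pandas as pd

    def county_rate(cases: pd.DataFrame, years: range) -> pd.DataFrame:
        pooled = cases[cases["filing_year"].isin(years)]
        return (
            pooled.groupby("county", as_index=False)
                  .agg(n_cases=("case_id", "size"),
                       dismissal_rate=("dismissed", "mean"))  # dismissed is a 0/1 flag
        )

    # Example: pool filing years 2017 through 2021 (a five-year window).
    # rates = county_rate(cases, range(2017, 2022))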

Data Publication and Suppression Rules

MFJ publishes county-level results on a performance measure only when they conform to the following rules:

  • STATE AVERAGES The counties with available data must represent 50 percent or more of the state’s population for the state averages to be published.
  • NUMBER OF CASES At least 30 cases are needed to generate any performance measure. Performance measures for counties with fewer than 30 cases in the denominator or in the pool to calculate the median are suppressed from publication. Once measures have been filtered by groups (e.g., across race categories), the results are suppressed if the cell contains fewer than 5 cases.
  • MISSINGNESS The maximum permissible percentage of cases with missing values for any given measure is 10 percent. Performance measures for counties with more than 10 percent of cases missing values in the numerator or in the pool to calculate the median are suppressed from publication. In addition, performance measures for counties with more than 5 percent and up to 10 percent of cases with missing values display a “high missing rate” warning.
  • MISSINGNESS BIAS MFJ uses statistical simulations to estimate the amount of bias that may result from missing data. The bias depends on both the percentage of missing data and the actual value of the measure being estimated. For example, in a county where the pretrial diversion rate is low (e.g., 3%) and a considerable proportion of cases are missing data (e.g., 7%), the estimate of the pretrial diversion rate could be inaccurate. Bias is estimated as a function of the sample mean and the percentage of missing data. Whenever the sample mean and the percentage of missing data suggest a level of bias greater than 5 percent, MFJ suppresses the data from publication. A schematic sketch of these publication checks appears after this list.
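
The sketch below expresses the publication rules above as a single check. The thresholds are taken from the text; the function shape and variable names are illustrative assumptions, and the bias estimate is assumed to come from a separate simulation step that is not shown.

    # Minimal sketch of the suppression rules described above.
    # Thresholds come from the text; names and structure are illustrative assumptions.
    from typing import Optional

    def publication_status(n_cases: int, pct_missing: float,
                           estimated_bias: float,
                           cell_size: Optional[int] = None) -> str:
        """Return 'suppress', 'publish_with_warning', or 'publish'."""
        if n_cases < 30:                             # fewer than 30 cases in the denominator/pool
            return "suppress"
        if cell_size is not None and cell_size < 5:  # filtered subgroup cell with fewer than 5 cases
            return "suppress"
        if pct_missing > 0.10:                       # more than 10 percent of cases missing values
            return "suppress"
        if estimated_bias > 0.05:                    # simulated missingness bias above 5 percent
            return "suppress"
        if pct_missing > 0.05:                       # between 5 and 10 percent missing
            return "publish_with_warning"
        return "publish"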

Disparities

MFJ uses a Relative Rate Index (RRI) to assess disparities in case processing outcomes between white defendants and defendants of color, males and females, and indigent and non-indigent defendants. The RRI compares how two groups fare on the same outcome by dividing the results of one group by those of the other. An RRI equal to 1 indicates that there is no disparity in outcomes between the two groups. Disparities are not calculated when there are fewer than four cases in the numerator or denominator of the rate for either group. We also test the statistical and substantive significance of disparities; disparities that are neither statistically nor substantively significant are suppressed from publication. (A schematic RRI calculation appears after the list below.)

  • STATISTICAL SIGNIFICANCE MFJ estimates confidence intervals to test whether the disparity in outcomes for the two groups is beyond what could be expected by random chance. In this sense, statistical significance provides information about the precision and certainty of the measurement. Statistically significant disparities are noted with an asterisk (*).
  • SUBSTANTIVE SIGNIFICANCE Because statistical significance is affected by sample size, MFJ also evaluates whether the size of the disparity merits attention irrespective of statistical significance. Disparities equal to or greater than 1.05 are considered substantively significant and attempts should be made to understand and address them.
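
Here is a minimal sketch of the RRI calculation and the checks described above. The function shape and example numbers are assumptions for illustration; the suppression and substantive-significance thresholds come from the text, and the confidence-interval test for statistical significance is not shown.

    # Minimal sketch: Relative Rate Index (RRI) for two groups.
    # Function shape and example inputs are illustrative assumptions.
    from typing import Optional

    def relative_rate_index(events_a: int, total_a: int,
                            events_b: int, total_b: int) -> Optional[float]:
        """Rate for group A divided by rate for group B; None means suppressed."""
        # Not calculated when either group has fewer than four cases
        # in the numerator or denominator of its rate.
        if min(events_a, total_a, events_b, total_b) < 4:
            return None
        return (events_a / total_a) / (events_b / total_b)

    rri = relative_rate_index(events_a=40, total_a=200, events_b=30, total_b=200)
    if rri is not None:
        # Substantive significance threshold from the text (RRIs below 1 are not addressed here).
        print(f"RRI = {rri:.2f}, substantively significant: {rri >= 1.05}")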

Outliers

MFJ uses a standard approach for calculating outliers. A county is flagged as an outlier when its value for a Measure is a discernibly large distance from the values of all other counties in the state. Outliers are classified into minor and major based on the magnitude of this distance. The magnitude is calculated using the interquartile range (IQR), which is the difference between the 75th (Q3) and 25th (Q1) percentiles.

  • MINOR OUTLIERS Minor outliers are values that fall below Q1 or above Q3 by more than 1.5 times the IQR.

     value < Q1 - 1.5 × IQR   or   value > Q3 + 1.5 × IQR
    
  • MAJOR OUTLIERS Major outliers are values that fall below Q1 or above Q3 by more than 3 times the IQR.

     value < Q1 - 3 × IQR   or   value > Q3 + 3 × IQR
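
A minimal sketch of these outlier checks, assuming a plain list of county-level values and quartiles computed with Python's statistics module, is shown below.

    # Minimal sketch of the IQR-based outlier flags described above.
    # The quartile method and input format are illustrative assumptions.
    import statistics

    def flag_outlier(value: float, county_values: list) -> str:
        q1, _, q3 = statistics.quantiles(county_values, n=4)  # 25th, 50th, 75th percentiles
        iqr = q3 - q1
        if value < q1 - 3 * iqr or value > q3 + 3 * iqr:
            return "major outlier"
        if value < q1 - 1.5 * iqr or value > q3 + 1.5 * iqr:
            return "minor outlier"
        return "not an outlier"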
    


Our Measures and methodology have been vetted by two councils of experts: the Methods and Measurement Council and the Benchmarking Council. If you have further questions about our methodology, please contact MFJ Research.

User Notes

When viewing the Measures, users are asked to keep the following in mind:

  • A Starting Point Our Measures are meant to be a starting point for a conversation about the criminal justice system that addresses what’s working well and what needs further attention.
  • Adult Criminal Cases Our system measures only the performance of counties on the processing of adult criminal cases. Therefore, we do not measure how juvenile, family, civil, and other cases may fare. Nonetheless, our Measures can be filtered by the age group of the defendant, including those under 18 (juvenile defendants who were waived to adult court).
  • Causation MFJ’s research is descriptive and does not, by definition, tell us why things happen. As such, we do not test hypotheses about the reasons for the patterns the data reveal. When our Measures show differences between states, counties, or groups (e.g., in medians, percentages, or rates), we make no claim about the reasons for these differences.
  • Context Each Measure sheds light on a corner of a local criminal justice system, but to evaluate the health of that system in a more comprehensive way, all available Measures should be assessed together and interpreted with county context in mind.
  • County We measure criminal justice performance at the county level because it is usually at this level that charging, disposition, and sentencing decisions are made.
  • More Data MFJ continues to seek out more data, especially law enforcement data, as part of our effort to measure all corners of the criminal justice system.
  • Timeline If you’ve given us data and don’t see them represented in the Portal yet, it’s because we are still working on them to ensure accuracy. Thank you for your participation and patience.
  • Portal Updates We provide a complete history of portal updates so you can track when data change, when new data are released to the portal, or when new versions of the portal become available.