Panel 2 Summary – Advancing the Evidence and Innovation Agenda: The Power of IDS for Experimental and Comparative Effectiveness Research

Jon Baron, President, The Coalition for Evidence-Based Policy

Jon Baron, President of the Coalition for Evidence-Based Policy, advocates increasing the number of social programs that are rigorously evaluated based on high standards of evidence. Baron argues that integrated data systems can play a vital role in this process. In many instances, such systems can dramatically reduce the cost of data collection, enabling rigorous evaluations – including large randomized controlled trials – to be conducted at low or modest cost.

The Coalition for Evidence-Based Policy argues that in the medical field, policy based on rigorous scientific research has produced extraordinary advances in health over the past 50 years. By contrast, in most areas of social policy – such as education, poverty reduction, and crime prevention – government programs are often implemented with little regard to evidence, costing billions of dollars yet failing to address critical social problems. However, rigorous studies have identified a few highly effective program models and strategies (“interventions”), suggesting that a concerted government effort to grow the number of these proven interventions, and spur their widespread use, could bring rapid progress to social policy similar to that which transformed medicine.[1]

Integrated Data Systems Optimize Spending

Thomas J. Watson’s famous axiom states: “If you want to increase your success rate, double your failure rate.” Baron reasons that it would be advantageous to greatly increase the number of programs that are rigorously tested, so as to more rapidly grow the subset that are shown to work. Randomized trials can cost millions of dollars; with integrated data systems (IDS), such studies can sometimes be done at much lower cost – saving research as well as program funds.

[1] The Coalition for Evidence-Based Policy. (n.d.). Our Mission. Retrieved January 13, 2014, from http://coalition4evidence.org/

Dave Patterson, Chief, Health and Demographics, South Carolina Budget and Control Board Division of Research and Statistics

AISP NETWORK EXEMPLAR: Integrated Data Systems, Program Delivery, and Quasi-experimental Program Evaluation

Dave Patterson is excited about how integrated administrative data helps South Carolina policymakers make “leaps of understanding” far beyond the knowledge that can be gleaned from single data sets. South Carolina, an AISP Network Site, uses integrated data to better understand the complex problems faced by individuals in need and to evaluate the impact of services.

As illustrated in the diagram below, which South Carolina has termed the “Circle of Love,” the state’s IDS incorporates data from all human service areas.

South Carolina leverages its IDS to improve quality of life for citizens who access state and community services, including patients who need emergency mental healthcare. Most emergency rooms, particularly those in rural parts of the state, are inadequately staffed to handle such patients. Visits are costly and hard on the patients, who often wait hours to be assigned a bed when a simple adjustment to their medication would suffice.

South Carolina’s IDS facilitates the operation and evaluation of the state’s telepsychiatry initiative, in which behavioral health practitioners located in 18 major hospitals consult with patients in remote locations via video conferencing. The Data Warehouse provides Medicaid and Department of Mental Health (DMH) data, integrates DMH records with the state’s Health Information Exchange, and links program-specific data to the IDS. Early results, validated by the University of South Carolina and Emory University, show that the telepsychiatry program increases admissions, reduces length of stay, improves follow-up, and yields cost savings of up to $2,000 per consultation. The program logged its 15,000th consult in April 2013.

Evaluation strategies include propensity scoring with optimal matching of patients treated at intervention and non-intervention sites, and comparison of the two groups on utilization and cost outcomes using standard econometric techniques. There are limitations, such as unmeasured differences in the severity of patient illness and in hospital expertise, and these will be examined in future studies. The key takeaway, says Patterson, is that “integrated data systems can close the loop between practitioners, applied analysts, and basic researchers. They help create and sustain public, private, and nonprofit partnerships around important social issues, and produce economies of scale by repurposing existing administrative data.”
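
As a rough illustration of the matching approach described above, the sketch below estimates propensity scores with a logistic regression and pairs intervention and comparison patients on those scores before comparing outcomes. It uses synthetic data and a simple greedy nearest-neighbor match rather than the optimal matching used in the actual South Carolina evaluation; all variable names and values are hypothetical.

```python
# Illustrative sketch only: a simplified propensity-score matching workflow.
# The actual evaluation used optimal matching; this sketch substitutes a
# greedy 1:1 nearest-neighbor match on the estimated propensity score.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Hypothetical patient-level covariates drawn from an integrated data system.
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "prior_ed_visits": rng.poisson(1.5, n),
    "rural": rng.integers(0, 2, n),
})
df["telepsych"] = rng.integers(0, 2, n)      # treated at a telepsychiatry site
df["los_hours"] = rng.gamma(2.0, 10.0, n)    # synthetic outcome: length of stay

# 1. Estimate propensity scores from observed covariates.
X = df[["age", "prior_ed_visits", "rural"]]
ps_model = LogisticRegression(max_iter=1000).fit(X, df["telepsych"])
df["pscore"] = ps_model.predict_proba(X)[:, 1]

# 2. Greedy 1:1 match of treated to untreated patients, without replacement.
treated = df[df["telepsych"] == 1]
control = df[df["telepsych"] == 0].copy()
matches = []
for t_idx, row in treated.iterrows():
    c_idx = (control["pscore"] - row["pscore"]).abs().idxmin()
    matches.append((t_idx, c_idx))
    control = control.drop(c_idx)

# 3. Compare outcomes across the matched pairs.
t_ids, c_ids = zip(*matches)
diff = df.loc[list(t_ids), "los_hours"].mean() - df.loc[list(c_ids), "los_hours"].mean()
print(f"Matched-pair difference in mean length of stay: {diff:.2f} hours")
```

In practice, the matched comparison would be followed by the kind of standard econometric modeling of utilization and cost outcomes noted above, along with diagnostics for covariate balance between the matched groups.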


John Laub, Distinguished University Professor of Criminology and Criminal Justice, University of Maryland

Advancing Research on Crime and Justice

John Laub uses integrated data systems to understand the connections between juvenile delinquency, adult crime, and other problem behaviors in adulthood. His work builds on the pioneering Glueck Project, a longitudinal study (using what was essentially a paper-based IDS) that found connections between juvenile delinquency, adult crime, alcohol and drug use, and job and marital stability later in life.

In his previous role as director of the National Institute of Justice in the Department of Justice, Laub recognized a disconnect between the research and statistics divisions. Inspired by the Glueck Project and his own research, he set about linking administrative and operational data from criminal justice agencies at all levels, and making these data suitable for statistical and research purposes. Among other steps, this involves the “nontrivial” matter of standardizing and automating criminal “rap” sheets. Perhaps most important is creating institutional arrangements that will generalize to other information exchanges and other places.

Laub, now a Distinguished University Professor in the Department of Criminology and Criminal Justice at the University of Maryland, argues that crime predicts poverty as much as poverty predicts crime.  Thus, crime should be seen as both an independent and dependent variable. Such a paradigm-shifting discovery could only be made using integrated administrative data, and he hopes to use IDS to go much deeper. He ended his presentation by stating, “The greatest potential for IDS-based research is in moving beyond knowing what works toward understanding why.”


Betsey Stevenson, Member, White House Council of Economic Advisers

Budget-Enforced Innovation

Betsey Stevenson, a member of the White House Council of Economic Advisers, describes evidence-based research as key to Administration efforts to design smarter, more innovative, and more accountable government. The Office of Management and Budget (OMB) is “the enforcer,” says Stevenson, since it is through the budgeting process that OMB promotes interagency collaboration and prioritizes programs with a demonstrated commitment to evidence-based evaluation.

“We expand the approaches that work best, fine-tune the ones that show mixed results, and shut down those that are failing,” says Stevenson. “Rigorous program evaluation is uncomfortable, but it’s what we have to do.”

Typically, program evaluations are based on soft data such as intentions, anecdotes, and at best, outputs. However, in today’s tight fiscal climate, agencies must make smaller, more targeted investments, and soft evaluations must give way to hard data. This is a challenge for agency planners, in part because evaluation – and hence evaluation budgets – must be built into programs from the beginning.

OMB strives to promote agencies’ capacity for using evidence through organization-wide evaluations, common evidence guidelines, cross-agency learning networks, and “what works” clearinghouses. Tools include outcome-focused grant design, Pay for Success programs, and tiered-evidence evaluations, with different levels of funding for each level of evaluation. Stevenson offers a few examples of what can be done:

  • The College Scorecard, a set of key indicators about the cost and value of colleges, demonstrates the benefits – and the limitations – of leveraging administrative data. Congressional curbs on gathering data about college graduates foreclose the inclusion of value-added indicators.
  • High-quality, low-cost evaluations and iterative experimentation point to the effectiveness of behavioral prompts modeled on the towel-reuse messages used by big hotels. Telling people that “90 percent of people with delinquent tax bills like yours pay them promptly” encourages them to pay up.
  • Hawaii’s Opportunity Probation with Enforcement (HOPE) program, which measures outcomes using repurposed administrative data, finds that HOPE group members are 55 percent less likely than those in a control group to be re-arrested after one year.

“We need a body of work to properly assess whether programs and interventions are working,” says Stevenson. She knows this is not easy. However, OMB is already hard at work on the 2015 budget, devising new ways to harness administrative data for high-quality, low-cost evaluations.
