11.2 Cautions and Limitations

Despite the utility of performance management and outcome data, the data is subject to several limitations. The following caveats must be understood:

Recognize the limitations of the data.

The data does not tell us why the findings might be bad (or good), because determining why something is or is not working requires complex research and evaluation studies. Even large evaluation studies are limited in what they can learn about what causes particular findings or outcomes, and are likely to be costly to undertake.

One way to overcome this limitation is to work with the grant writers in your office to identify opportunities to secure funding (e.g., from the National Institute of Justice or private funders) for evaluations of the effectiveness of practices related to the criminal justice response. Even absent externally funded evaluations, data can be analyzed in more nuanced ways (e.g., by subgroup or by case complexity), by asking staff and partners about their perceptions of why something might be the way it is, or by conducting case reviews and corresponding roundtables as described in suggestion 3(d) above.
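
For offices with basic analytic capacity, the subgroup breakdowns mentioned above can be produced with a short script. Below is a minimal sketch in Python, assuming a hypothetical case-level extract, cases.csv, with made-up columns case_type and charged; it illustrates the approach, not a prescribed tool or data layout.

    # Minimal sketch of a subgroup analysis. Assumes a hypothetical
    # case-level extract "cases.csv" with made-up columns:
    #   case_type - e.g., "stranger" or "non-stranger" (hypothetical)
    #   charged   - 1 if charges were filed, 0 otherwise (hypothetical)
    import pandas as pd

    cases = pd.read_csv("cases.csv")

    # Charging rate by subgroup, with counts reported alongside so that
    # very small subgroups are not over-interpreted.
    summary = (
        cases.groupby("case_type")["charged"]
             .agg(cases="count", charging_rate="mean")
    )
    print(summary.to_string())

Breaking the same measure out by another dimension (for example, a case-complexity field) requires only changing the grouping column.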

Know your limits.

A prosecutor’s office may have difficulty finding the resources needed to conduct the basic data analyses recommended in this report. Automating data entry, processing, and analysis may be necessary to avoid over-burdening prosecution staff. As prosecutors’ offices adopt case management systems and administrative tasks inevitably become more automated, building in the collection of data relevant to the performance and outcome measures identified in this resource will help ensure that the data offices collect and analyze provides meaningful insight into effectiveness.

Know that the data is limited to your jurisdiction alone.

External benchmarks are not available for most of the outcome measures discussed in this report. Few prosecutors’ offices are regularly collecting and reporting such information; therefore, relative comparisons cannot yet be made. As this process is adopted by more offices, external benchmarks may become available to enable similarly situated jurisdictions to compare practices, moving the field closer to identifying and sustaining effective practices in sexual violence cases.

Avoid using the data as a “gotcha” device.

Using the data to point fingers or assign blame can create discord among sexual violence response partners, who need to rely on and trust one another to adequately address these crimes. In addition to undermining trust and cooperation, assigning blame provides an incentive to manipulate data, undermining the accuracy necessary for the data to be useful. The focus of the performance and outcome measurement process is to improve services and the effectiveness of the sexual violence response.

Exhibit 11-1
Actions for Using Outcome Data to Improve Programs110


  1. Compare performance measures and outcomes across categories and sub-categories, using the findings to target training and resource needs.
  2. Provide regular, frequent, and timely performance management and outcome reports (perhaps quarterly or semiannually) that are made available to office and partner staff.
  3. Put reports into a user-friendly format that engages the reader.
  4. Attach to each report brief “report highlight” statements that call out particular findings.
  5. Compare the latest data to previous time periods to identify patterns of change (a brief illustration follows this exhibit).
  6. Hold “How Are We Doing” meetings with staff and partners as reports become available.
  7. Examine explanatory information provided by staff and partners, considering victim input based on responses to the victim survey.
  8. Use reports to encourage staff to suggest improvements.
  9. Use the findings to test new or different procedures.
  10. Provide the public with regular reports to enhance citizen knowledge and support. Ensure the information released does not compromise victim safety/privacy and that it complies with professional obligations.
  11. Use the outcome findings in developing, and subsequently justifying, budget requests.


110 Adapted from “Performance Indicators: Getting Started,” National Academy of Public Administration, Washington, DC, n.d.
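
To illustrate item 5 of Exhibit 11-1, the sketch below compares the latest reporting period with earlier ones. The file and column names (quarterly_measures.csv, period, charging_rate) are hypothetical placeholders for whatever an office’s case management system can export.

    # Minimal sketch of a period-over-period comparison (Exhibit 11-1,
    # item 5). Assumes a hypothetical quarterly extract
    # "quarterly_measures.csv" with made-up columns:
    #   period        - e.g., "2023Q1" (hypothetical)
    #   charging_rate - proportion of reviewed cases charged that quarter
    import pandas as pd

    trend = pd.read_csv("quarterly_measures.csv").sort_values("period")

    # Change from the prior period, to flag patterns of change worth
    # raising at a "How Are We Doing" meeting.
    trend["change_vs_prior"] = trend["charging_rate"].diff()
    print(trend.to_string(index=False))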