Recommendations for Improving Data Collection and Data Definitions

Purpose

This resource is intended to provide recommendations to health systems on making progress in the IMPROVE element of Zero Suicide. The information that follows was compiled through targeted discussions with an advisory group of health and behavioral health care (HBH) organizations convened by the Suicide Prevention Resource Center (SPRC). Discussion topics included organizational factors that support improvements in data collection and the importance of applying thoughtful, specific definitions to metrics that are commonly used to track the impact of Zero Suicide implementation.

It is also important to recognize that the data collected as part of Zero Suicide implementation represent people. These are individuals who may be at risk of suicide, who are receiving care for suicide risk, who made a suicide attempt, or who died by suicide. Pursuing continuous quality improvement is challenging and necessary for Zero Suicide implementation, but more importantly it is essential for improving the care delivered to people at risk of suicide. For simplicity, the term “client” is used below to refer to individuals receiving care.

Key Questions and Recommendations

The key questions and recommendations that follow address challenges related to key organizational and systems factors identified by the advisory group convened by SPRC. These key questions and recommendations should be discussed by Zero Suicide implementation teams, which should include senior organizational leadership; clinical, quality, evaluation, and operations staff; and information technology (IT) staff involved in data collection.

Question: How does our organization define the clients we serve and count them in our metrics, and for what time intervals?

Recommendations:

  • When considering any kind of client care data over time, determining who counts as part of the client population is vital.
  • Establishing which individuals count toward certain metrics is key to determining the impact, or lack thereof, of changes that are meant to be care improvements.
  • Example:
    • Depending on your organization, you might utilize these qualifiers for counting clients:
      • “Pre-care” (appointment scheduled, but not yet seen by a provider)
      • “In-care” (seen in the past 30, 60, 90, or 180 days)
      • “Post-care” (not seen in the past calendar, fiscal, or reporting year)
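These qualifiers can be made concrete in code. The sketch below is a minimal illustration only: the 90-day “in-care” window, the field names, and the qualifier labels are assumptions, and your organization’s definitions may differ.

```python
from datetime import date
from typing import Optional

# Hypothetical qualifier rule; the 90-day window is illustrative, not prescribed.
IN_CARE_WINDOW_DAYS = 90

def classify_client(last_visit: Optional[date], has_scheduled_appt: bool,
                    today: date) -> str:
    """Assign a counting qualifier to a client record."""
    if last_visit is None:
        # "Pre-care": appointment scheduled, but not yet seen by a provider.
        return "pre-care" if has_scheduled_appt else "unknown"
    if (today - last_visit).days <= IN_CARE_WINDOW_DAYS:
        # "In-care": seen within the chosen window (30, 60, 90, or 180 days).
        return "in-care"
    # "Post-care": not seen within the window (a calendar, fiscal, or
    # reporting year could be used here instead).
    return "post-care"

today = date(2024, 6, 30)
print(classify_client(date(2024, 5, 15), False, today))  # in-care
print(classify_client(None, True, today))                # pre-care
print(classify_client(date(2023, 1, 10), False, today))  # post-care
```

Writing the rule down this explicitly forces the team to choose one window and apply it consistently across every metric that counts clients.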

 

Question: How does our organization define the time intervals for reporting on different data?

Recommendations:

  • Time intervals should be deliberate, balancing enough time to show impact against the need to catch harmful or ineffective practices in the near term so that improvements and changes can be made.
  • For a metric like suicide incidence rate, a longer time interval is necessary; for a metric like screening rate, a shorter time interval that allows you to improve or change practices may be appropriate.
  • Example:
    • Based on the services your organization provides and the metric you are defining, your time interval could be:
      • Annual (calendar or fiscal year)
      • Quarterly (calendar or fiscal year)
      • Monthly
    • When calculating rates, your numerator and denominator should represent fully overlapping time intervals.
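The overlap requirement can be enforced by passing numerator and denominator events through a single shared period filter. The sketch below is a minimal illustration with hypothetical records; the field layout and quarterly period are assumptions.

```python
from datetime import date

def in_period(d: date, start: date, end: date) -> bool:
    """True if d falls within the reporting period (inclusive)."""
    return start <= d <= end

# Illustrative (client_id, event_date, met_expectation) records -- hypothetical data.
events = [
    ("A", date(2024, 1, 15), True),
    ("B", date(2024, 2, 3), False),
    ("C", date(2023, 12, 20), True),  # before the period: excluded from BOTH counts
]

# Quarterly reporting period (calendar Q1 here; a fiscal quarter works the same way).
start, end = date(2024, 1, 1), date(2024, 3, 31)

# One shared filter guarantees the numerator's and denominator's
# time intervals fully overlap.
denominator = [e for e in events if in_period(e[1], start, end)]
numerator = [e for e in denominator if e[2]]
print(len(numerator), "/", len(denominator))  # 1 / 2
```

Because the numerator is drawn from the already-filtered denominator, an event outside the reporting period can never inflate one count without affecting the other.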

 

Question: Are our organization’s data definitions in alignment with our operational definitions for clinical care expectations?

Recommendations:

  • Identify metrics that represent the organization’s Zero Suicide implementation focus and scope.
  • Establish specific operational definitions for the expected clinical care.
  • Example:
    • If the expectation is that you will screen all clients:
      • Does screening happen at every visit for every client, only with clients who are “at risk”, or only at certain interactions (e.g. admission, discharge)? And what time interval is appropriate for this metric?
    • This operational definition has significant implications for the data definition you apply (how you count clients, and the time interval) to track fidelity to expected care.
    • A common error in calculating screening rates is counting the number of clients in your numerator and the number of visits in your denominator, or vice versa. Whether you count clients or visits, the numerator and denominator should use the same unit (and the time intervals should fully overlap, as noted above).
    • Sample definition:
      • Denominator: Total number of visits where screening should have occurred based on organizational protocol during reporting period.
      • Numerator: Number of visits where screening occurred as expected based on organizational protocol during reporting period.
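The sample definition above can be sketched as a visit-based calculation in which both counts use visits and share one reporting period. The record fields and the protocol flag below are assumptions for illustration, not a standard schema.

```python
from datetime import date
from dataclasses import dataclass

# Hypothetical visit record; field names are illustrative only.
@dataclass
class Visit:
    client_id: str
    visit_date: date
    screening_required: bool  # per organizational protocol
    screening_done: bool

def screening_rate(visits, start: date, end: date) -> float:
    """Visit-based rate: numerator and denominator both count visits
    and both use the same reporting period."""
    in_period = [v for v in visits if start <= v.visit_date <= end]
    denominator = [v for v in in_period if v.screening_required]
    numerator = [v for v in denominator if v.screening_done]
    return len(numerator) / len(denominator) if denominator else 0.0

visits = [
    Visit("A", date(2024, 1, 5), True, True),
    Visit("A", date(2024, 2, 9), True, False),   # same client, counted per visit
    Visit("B", date(2024, 1, 20), True, True),
    Visit("C", date(2024, 1, 22), False, False), # screening not required; excluded
]
rate = screening_rate(visits, date(2024, 1, 1), date(2024, 3, 31))
print(f"{rate:.0%}")  # 2 of 3 required-screening visits -> 67%
```

Note that client “A” contributes two visits to the denominator; a client-based version of this metric would need a client-based numerator to match.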

 

Question: For data that are challenging to collect or that you are currently unable to obtain at all, are there other ways that your organization could obtain information that indicates quality care is being delivered?

Recommendations:

  • In some cases it may be challenging to obtain data that are important to your Zero Suicide implementation. This could be related to the scope of the services offered by your organization or to data access challenges, but if you cannot collect data systematically, it is still important to learn whether care is being delivered as expected.
  • Consider what partnerships and working relationships need to grow or be developed to fill gaps in your data. These may include external partners such as coroners’ and medical examiners’ offices, as well as the network of other health systems where individuals in your care might also receive services.
    • For example: If your organization relies on claims data for information on suicide attempts recorded by an emergency department (ED), is there a way to confirm follow-up care was delivered, or to support follow-up efforts, by establishing an agreement with that ED?

 

Question: What are our organization’s data collection priorities, and do they match our Zero Suicide implementation needs?

Recommendations:

  • Data collection may be closely aligned with other strategic priorities (e.g. accreditation, insurance reimbursements) or guided by existing reporting requirements established at the state or national level (e.g. HEDIS follow-up measures, suicide incidence rate), and as a result be done more consistently and with designated staff and resources.
  • Data definitions for metrics that inform clinical practice improvements may be more variable, and not directly tied to external reporting requirements, so data collection may be done less consistently and with fewer staff and resources.
  • Due to this potential misalignment, it is critical to assess whether you need more organizational buy-in to help navigate barriers to collecting data on clinical processes and fidelity to protocols.

 

Question: Does our organization have the necessary staffing and resources to achieve progress in the IMPROVE element of Zero Suicide?

Recommendations:

  • Establishing the data collection infrastructure necessary to make significant progress in IMPROVE may require dedicated staff.
  • Include data collection, research and evaluation, or quality improvement staff on your Zero Suicide implementation teams.
    • These staff should be familiar with existing electronic health record (EHR) usage and reporting, and internal and external reporting expectations.
    • These staff may also facilitate obtaining data from external sources such as coroners, medical examiners, and other care settings (e.g. if an organization relies on claims data for information on suicide attempts recorded by an emergency department).
  • Engage senior leadership to establish data collection priorities to ensure staff time and resources are appropriately allocated to meet Zero Suicide implementation needs and external reporting requirements.

 

Acknowledgements:

SPRC thanks the following organizations for their contributions to this resource:

  • Aspire Indiana Health
  • AtlantiCare
  • Chickasaw Nation Departments of Health and Family Services
  • Children’s Hospital of Philadelphia
  • Compass Health Network
  • Henry Ford Health System
  • Intermountain Healthcare
  • Kaiser Permanente
  • Lawton Indian Hospital
  • Mental Health Center of Greater Manchester
  • Mercy
  • Suicide Prevention Center of New York
  • Sweetser
  • University of Massachusetts Medical School

 

SPRC and the National Action Alliance for Suicide Prevention are able to make this web site available thanks to support from Universal Health Services (UHS), the Zero Suicide Institute at EDC, and the Substance Abuse and Mental Health Services Administration (SAMHSA), U.S. Department of Health and Human Services (DHHS) (grant 1 U79 SM0559945).

No official endorsement by SAMHSA, DHHS, or UHS for the information on this web site is intended or should be inferred.