Data Quality

Having and Using Quality Outcomes Data

Type: 
Conference Session
Author/Presenter: 

Linda Goodman (CT Part C) and Nancy Skorheim (ND 619)

Year: 
2010
Abstract: 

Presented at the 2010 Measuring Child and Family Outcomes Conference.

Two states provided information focused on promoting consistent, quality outcome data and the use of outcomes data in the monitoring process. North Dakota Section 619 shared the State's COSF Quality Assurance Checklist and other outcome resources that have been provided to North Dakota special education units. The Connecticut Part C program shared their Results Based Accountability process, which includes child and family outcomes data as selection criteria for focused monitoring.

Got Data? A Workshop on Early Childhood Outcomes

Type: 
Conference Session
Author/Presenter: 

Kathy Hebbeler, ECO at SRI International; Robin Rooney, ECO at FPG/UNC; and Christina Kasprzak, ECO at FPG/UNC

Year: 
2008
Abstract: 

Held in Edwardsville, IL and Chicago, IL. This two-day training provided a review of child outcomes measurement and data collection using the COSF. The training also introduced strategies for assuring the quality of COSF data and ways to analyze and use outcome data for program improvement.

From Print Shop to Print: Moving from Building a Longitudinal Data System to Reporting Your Data

Type: 
Conference Session
Author/Presenter: 

Elizabeth Laird and Nick Ortiz

Year: 
2009
Abstract: 

Presented at the 2009 Measuring Child and Family Outcomes Conference.

This session provided an overview of the Data Quality Campaign's experiences with longitudinal data and their implications for shaping work in early childhood. One state illustrated a powerful approach to reporting and presenting longitudinal child outcomes data. Embedded in the presentations were strategies for communicating and promoting the use of longitudinal data for effective decision making.

Do My Data Count? Questions and Methods for Monitoring and Improving Our Accountability Systems

Type: 
Conference Session
Author/Presenter: 

Sara Gould (ECO at Univ of Kansas), Charles Greenwood (ECO at Univ of Kansas), Margy Hornback (KS Birth-5), Dale Walker (ECO at Univ of Kansas), Marybeth Wells (ID 619), and Tina Yang (ECO at Univ of Kansas)

Year: 
2008
Abstract: 

Presented at the 2008 Measuring Child and Family Outcomes Conference.

The purpose of this session was to explore the range of questions states can ask to establish the validity of their child outcomes accountability systems. Presenters discussed ways to gather, interpret, and use evidence to improve accountability. Examples from two states were shared.

Data Drill Down: Supporting Local Programs in Realizing the Possibilities for Using Data

Type: 
Conference Session
Author/Presenter: 

Christina Kasprzak, David Lindeman, Chelie Nelson, and Phoebe Rinkel 

Year: 
2012
Abstract: 

Presented at the 2012 Measuring and Improving Child and Family Outcomes Conference. How can local programs implement a process for drilling down into their data to ensure data quality and program quality? In this session, Kansas shared their Data Drilldown Guide and training developed to support local programs in looking at child outcomes data and planning for improvement. ECO shared a national resource with suggested drill-down questions developed for child outcomes (C3/B7) and family outcomes (C4).

Child Outcomes Data Pre-Meeting Workshop

Type: 
Conference Session
Author/Presenter: 

Kathy Hebbeler and Donna Spiker, ECO at SRI

Year: 
2007
Abstract: 

Presented at the 2007 OSEP National Early Childhood Conference. The objective of these presentations and handouts was to help participants gain an understanding of how to examine the validity of state outcome data, how to interpret and use outcome data, and how to talk about early and future data with the media.

California's Approach to Supporting High Quality Data

Type: 
Conference Session
Author/Presenter: 

Larry Edelman, Anne Kuschner, Steve Lohrer, Mary McLean, Patty Salcedo, and Cornelia Taylor Bruckner

Year: 
2009
Abstract: 

Presented at the 2009 Measuring Child and Family Outcomes Conference.

This session described and illustrated activities put in place by California's Desired Results system to support practitioners and administrators in maintaining the highest possible quality of data for the DRDP Assessment System for preschool special education. The session addressed a number of dimensions of quality, including fidelity, agreement, completeness, reliability, validity, and utility.

Building State Systems to Produce Quality Data on Child Outcomes

Type: 
Conference Session
Author/Presenter: 

Kathleen Hebbeler, Jim Lesko, Kim Carlson, and Lynne Kahn

Year: 
2010
Abstract: 

This session, held at the Division for Early Childhood (DEC) Conference, described the overall status of state reporting on child outcomes and introduced a self-assessment tool that examines the components of a quality measurement system. Two states described their experiences in measuring outcomes.

Assuring the Quality of Data from Family Surveys

Type: 
Conference Session
Author/Presenter: 

June DeLeon and Elaine Eclavea (GU CEEDERS), Suzanne Lizama (CNMI Part C), and Maureen Sullivan (VT Part C)

Year: 
2007
Abstract: 

Presented at the 2007 Measuring Child and Family Outcomes Conference.

States shared ways they are planning to validate the results of their family outcomes surveys as part of their monitoring process and facilitated a discussion about considerations, challenges, and other possible strategies for confirming the soundness of survey results.

Assuring the Quality of Child Assessment Data

Type: 
Conference Session
Author/Presenter: 

Cornelia Bruckner, Meredith Cathcart, and Patricia Salcedo (CA 619), and Susan Smith (CO 619)

Year: 
2007
Abstract: 

Presented at the 2007 Measuring Child and Family Outcomes Conference.

Presenters discussed strategies and systems for assuring quality assessment data through training and TA, as well as through data validation. Training and TA strategies include, for example, delivering a consistent message to personnel through well-developed training modules and setting standards for trainers who provide professional development. Data validation strategies include mechanisms in the data system to identify missing data and contradictory responses.
