Data Quality

Strategies for Improving Response Rates for Family Surveys

Type: 
Conference Session
Author/Presenter: 

Lisa Backer (MN 619) and Jim Henson, David Mills and Wendi Wilson-John (VA Part)

Year: 
2007
Abstract: 

Presented at the 2007 Measuring Child and Family Outcomes Conference.

Want to improve your survey response rate? Two states shared why they selected certain procedures for the distribution and collection of family outcome surveys and led a discussion about what factors may influence response rates.

Results Not Demonstrated

Type: 
Conference Session
Author/Presenter: 

Lynne Kahn (NECTAC and ECO at UNC/FPG), Christina Kasprzak (NECTAC and ECO at UNC/FPG), Judy Swett (PACER Center), and Jennifer Tschantz (OSEP)

Year: 
2008
Abstract: 

Plenary session presented at the 2008 Measuring Child and Family Outcomes Conference.

Provider Perceptions of the COS Process

Type: 
Conference Session
Author/Presenter: 

Lauren Barton and Cornelia Taylor (ECO at SRI)

Year: 
2012
Abstract: 

Presented at the 2012 Measuring and Improving Child and Family Outcomes Conference. ECO staff presented findings from ENHANCE, a project studying how well the Child Outcomes Summary (COS) process produces meaningful data. Results from a survey of providers (N=850) were described, including questions about training experiences, provider knowledge, COS process approaches, perceived accuracy, and impact of the COS process on practice.

Patterns in Child Outcomes Summary Data: Analytic Approaches and Early Findings from the ENHANCE Project

Type: 
Conference Session
Author/Presenter: 

Lauren Barton, Donna Spiker, and Cornelia Taylor

Year: 
2011
Abstract: 

Presented at the 2011 Measuring Child and Family Outcomes Conference. This session presented a brief update on the status of ENHANCE, a research project investigating the validity of data from the Child Outcomes Summary (COS) process and identifying factors related to quality data. Presenters shared preliminary findings from the analysis of state data being conducted as part of ENHANCE. Content focused on techniques being used in the study for interpreting patterns to understand the validity of the data. Materials were provided to support states in analyzing the quality, consistency, and meaning of their own COS data.

Overview of ENHANCE: Research Underway on the Validity of the Child Outcomes Summary Form (COSF)

Type: 
Conference Session
Author/Presenter: 

Lauren Barton and Donna Spiker (ECO at SRI)

Year: 
2010
Abstract: 

Presented at the 2010 Measuring Child and Family Outcomes Conference.

Are COSF data valid and reliable? This session provided an overview of a research project underway to investigate the validity of the COSF. Plans for using information learned to provide better guidance about the COSF and implementing the COSF process were shared. The group discussed the kinds of validity information needed by States and specific content that might be important to investigate further in the studies.

OSEP Project Directors' Meeting

Type: 
Conference Session
Author/Presenter: 

Kathy Hebbeler and Lynne Kahn

Year: 
2011
Abstract: 

This presentation provided an overview of the reporting requirements for the child outcomes data, and presented results from the 2011 APR analysis. The methods for analysis, data quality, and meaning of the data were discussed. The ECO Center's child and family framework and self-assessment tools were also shared. 

Levels of Representativeness: How to Examine and Use Family Survey Data to Plan for Program Improvement

Type: 
Conference Session
Author/Presenter: 

Siobhan Colgan (NECTAC and ECO at UNC/FPG), Batya Elbaum (DAC FL), and Melissa Raspa (ECO at RTI)

Year: 
2010
Abstract: 

Presented at the 2010 Measuring Child and Family Outcomes Conference.

Participants were provided an opportunity to examine sample data and discuss different aspects of the issue of representativeness. These included:
1) Response rates: did everyone who was supposed to respond to the survey actually respond?
2) Proportional representation: how closely do the response rate percentages match the comparison data for different variables of interest?
3) Within subgroups, are respondents "representative" of their group?
4) How do we use this information to target program improvement issues?

Is Your Family Survey Data Representative?

Type: 
Conference Session
Author/Presenter: 

Lynne Kahn and Anne Lucas (The ECO Center)

Year: 
2009
Abstract: 

Presented at the 2009 Measuring Child and Family Outcomes Conference.

In this session, participants gained an understanding of the variation in the populations states are surveying, what criteria states are using to determine representativeness, and what data they are using for the analysis. Participants also had an opportunity to use sample data to explore whether or not family survey response data are representative of the population depending on various survey distribution methods. Participants identified issues related to representativeness through analyses of the data and discussed potential improvement activities.

Identifying and Remedying Missing Data Issues

Type: 
Conference Session
Author/Presenter: 

Ruth Littlefield (NH 619), Vanya Mabey (UT Part C), and Maureen Sullivan (VT Part C)

Year: 
2010
Abstract: 

Presented at the 2010 Measuring Child and Family Outcomes Conference.

A key step in ensuring reliable and valid data is to make certain all data are reported. In a roundtable format, participants discussed issues related to how to identify missing child outcomes data as well as strategies for reducing problems of missing data.

Helping Local Administrators and Providers Analyze and Use Their Outcome Data

Type: 
Conference Session
Author/Presenter: 

Jane Atuk, Lisa Backer, Fauna Hubble, and Lisa Balivet

Year: 
2009
Abstract: 

Presented at the 2009 Measuring Child and Family Outcomes Conference.

In this session, two states shared how they are preparing for local programs to use their child outcomes data to improve local Part C and 619 services and results for children and families. Alaska shared how their data system is designed to assist local EI programs in reporting, understanding, and eventually using their data for program improvements. Minnesota shared their activities to support local leadership in conducting simple analyses to check patterns in the quality of ECSE data.
