Archived Resources

The Resource Library houses tools and products developed by IDC, developed with its collaborators, or submitted by IDC stakeholders. Search and filtering tools are available to help users navigate the library.

Archived Resources 29 - 35 of 222

    An IDC Resource

    Format: Presentations

    Using the Success Gaps Toolkit to Support Improvement Activities

    LEAs in all states have many improvement initiatives underway at any one time. This workshop described how state and local staff can use the Success Gaps Toolkit to align various needs assessments and improvement strategies and use the data generated to support improved results for students with diverse learning needs. One state shared how it uses the Success Gaps materials with LEAs and some of the lessons it has learned.

    An IDC Resource

    Format: Presentations

    Addressing Success Gaps to Support Program Improvement

    This session described success gaps and a process for identifying their root causes in achievement, graduation rates, or other results. This process can help teams dig deeply into a variety of data sources to identify gaps and develop a plan to address them when not all students are experiencing equitable opportunities for learning and progressing.

    Format: Presentations

    A Critical Look at SSIP Phase III Reporting

    This workshop provided an opportunity for participants to critique and discuss sample SSIP report excerpts. Guided table discussions focused on identifying strengths and ways to improve the presentation of data for a robust and succinct SSIP. States had an opportunity to develop action steps for completing their SSIP Phase III Year 2 report.

    Format: Presentations

    It’s All About the Data: Keys to Creating a Robust & Succinct SSIP Report

    This session focused on OSEP’s guidance for the 2018 SSIP Phase III, Year 2 reporting. The presenters addressed the importance of data quality for assessing progress in implementing the SSIP and achieving the SiMR. The presenters reviewed the updated measurement language for Indicator B17 in the state’s APR and highlighted resources available to states as they organize the SSIP submission. This presentation emphasized the importance of having, using, and understanding high-quality data to inform decisionmaking at all levels of the education system. Such informed decisionmaking is necessary to support ongoing implementation of the SSIP; justify changes to strategies, timelines, and outcomes; and help communicate the state’s SSIP activities in a robust yet succinct report.

    An IDC Resource

    Format: Presentations

    Assessing and Improving Your SSIP Data Quality to Support Your Results

    This workshop engaged participants in identifying data quality issues that interfere with assessing progress in SSIP implementation and improvement toward the SiMR. A facilitated discussion focused on strategies for improving data quality to support decisionmaking and achievement of results.

    Format: Presentations

    Demystifying the IDC Part B Data System Framework

    This workshop provided participants with firsthand experience working with the IDEA Data Center Part B Data System Framework to explore factors affecting data quality and ways to expand capacity in the SEA. Because the Framework is based on a whole-systems approach, its use can encourage discussion and provide direction for tackling everyday problems and issues. Through hands-on exercises, participants explored different approaches to how states might apply the Framework to improvement efforts, planning, or organizational activities.

    Format: Presentations

    Step by Step: Overcoming Challenges by Documenting MOE Reduction and CEIS Processes

    This workshop explored the benefits of documenting MOE reduction and CEIS processes for collecting and submitting data to OSEP. Presenters described some of the common challenges that SEAs face with these data. States shared the steps in their processes, who is responsible for them, what methodologies they use to collect and review data, and how they consistently validate and report the data.