
Interactive Institutes 2021

The Interactive Institutes 2021 – Building and Sustaining a Culture of High-Quality Data provided opportunities for participants to take a deep dive into data quality topics to learn about data culture change. Participants gained fresh ideas and insights into best practices to improve data collection, reporting, analysis, and use. Through innovative and interactive virtual experiences, they engaged with their state team, peers, and TA providers across the country about trending data quality topics. Each session presented actionable ideas and strategies to improve work processes and sustain high-quality data work.



Selected Topic Presentations


  • Session: A1 Sampling for Part B Indicators 8 and 14: Requirements, Strategies, and Lessons Learned

    Planning for the FFY 2020–2025 State Performance Plan/Annual Performance Report provides an opportunity for states to update, refine, or even reconsider how they collect data. Sampling can provide an effective means for conserving resources and improving data quality for Part B Indicators 8 and 14. During this session, presenters discussed the requirements for designing and implementing an effective sampling strategy for these indicators and highlighted both the benefits and challenges of sampling. Participants had an opportunity to discuss common practical and methodological issues and strategies to inform their data collection plans.
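
    Sampling designs like these trade precision for burden; the sketch below makes the idea concrete. It is a minimal illustration only, not an OSEP-approved sampling plan, and the LEA identifiers, size strata, and sampling fractions are all hypothetical.

    ```python
    import random

    # Hypothetical LEA records; IDs and size strata are illustrative only.
    leas = [(f"LEA-{i:03d}", random.choice(["small", "medium", "large"]))
            for i in range(300)]

    def stratified_sample(records, strata_fractions, seed=42):
        """Sample each size stratum at its own rate so small strata
        are not swamped by large ones in the final sample."""
        rng = random.Random(seed)
        sample = []
        for stratum, fraction in strata_fractions.items():
            members = [r for r in records if r[1] == stratum]
            if not members:
                continue
            k = max(1, round(len(members) * fraction))
            sample.extend(rng.sample(members, k))
        return sample

    # Oversample small LEAs, whose indicator rates are often the most volatile.
    selected = stratified_sample(leas, {"small": 0.50, "medium": 0.25, "large": 0.10})
    print(f"Sampled {len(selected)} of {len(leas)} LEAs")
    ```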

  • Session: N2 LEA Determinations in the Time of COVID-19

    With the impact of COVID-19 on State Performance Plan/Annual Performance Report (SPP/APR) data, are states considering changes to their LEA determinations process? States may be thinking about a variety of adjustments to their determination systems to mitigate the impact of COVID-19 on LEAs. School closures have affected all aspects of data quality—timeliness, accuracy, and completeness—although the disruptions have affected states differently across the country. This session addressed what changes states may be considering for their determination systems by providing different state perspectives and considerations as states navigate their own approaches to LEA determinations. Participants heard directly from one of their state colleagues about that state’s deliberations for LEA determinations in a COVID-19 world.

  • Session: C1 Documenting Your Data Processes to Support High-Quality Data

    This presentation provided a brief overview of the IDEA Part B 618 SEA Data Processes Toolkit, followed by a facilitated panel discussion among states that have documented their data processes using the toolkit protocols. Several states shared their experiences with creating data processes for the State Performance Plan/Annual Performance Report (SPP/APR) indicators and building the capacity of data stewards through development of data processes. Panelists described how documenting data processes can help state teams instill a culture of high-quality data. IDC staff facilitated a discussion as panelists answered questions participants identified as their top concerns. Listening to the experiences of other state staff helped participants understand the value of documenting data processes, and interactive polls afforded them an opportunity to reflect on the needs of their own states.

  • Session: D1 Mind the Virtual Gap! Make the Most of Your Stakeholder Collaborations

    Collaboration is hard. Virtual collaboration with data is even harder! This session brought that challenge to the forefront and helped states build capacity to engage stakeholders in data conversations using technology and soft skills. Presenters addressed characteristics of quality collaboration and answered the question, “Why collaborate?” They focused on specific challenges of collaboration in a virtual environment and specific ways to overcome the obstacles. The need for stakeholder engagement in the State Performance Plan/Annual Performance Report (SPP/APR) is growing in importance, while limitations from COVID-19 conditions continue to restrict in-person engagements. Participation in this session helped to grow the capacity of state teams to close the virtual gap!

  • Session: E1 Indicator 8 Survey Design: Practical Suggestions to Increase Response Rates and Representativeness

    With the implementation of the new Part B FFY 2020–2025 State Performance Plan/Annual Performance Report, school closures and reopenings, and the move in many districts to virtual learning environments, more and more states are interested in redesigning their parent involvement surveys for Indicator 8. This session was a practical “how-to” that highlighted best practices that can increase survey response rates and data representativeness. Presenters discussed common challenges states encounter related to survey design and described strategies for collecting meaningful and useful data from families. A colleague from a state that has redesigned its survey described the steps the state took, lessons learned, and how the changes affected data quality. Participants had the opportunity to discuss and share ideas about survey changes they are considering and the process involved in making those changes. This presentation focused on Indicator 8, but the content also was highly relevant for Indicator 14.

  • Session: F1 Supporting Districts Identified With Significant Disproportionality: States Tell Their Stories

    IDC staff provided an overview of various ways states can support districts identified with significant disproportionality—e.g., helping districts implement a process for determining the factors contributing to significant disproportionality, implementing comprehensive coordinated early intervening services (CCEIS), and collecting and reporting CCEIS fiscal and student data. Then, state panelists described how they support their districts identified with significant disproportionality and how districts are benefiting from the support and implementing their CCEIS. Panelists invited participants to engage in a discussion about the challenges they have experienced and solutions for providing technical assistance to districts identified with significant disproportionality in the current environment.

  • Session: G1 Uncovering the Story Behind the Data: Supporting Effective Data Analysis and Use

    Creating a data analysis and use process is critical to promoting a data culture because having a process helps develop the common practices and language that allow for more in-depth and consistent data analysis. Presenters detailed how a shared data analysis and use process supports the capacity of staff to use data more effectively to make program and policy decisions. Participants heard how IDC and states can work together to use IDC tools to create a customized data use process and generate individualized solutions for specific state needs. Hawaii Department of Education staff shared their story of embarking on this process with IDC staff and how they are using their customized tools within the state.

  • Session: H1 Unearthing Root Causes: How Data Analysis Processes Can Target State and LEA Improvement Activities

    Are you struggling with data-based decisionmaking? Do you feel like you’re spinning your wheels when working with LEAs to improve outcomes for children and youth with disabilities? Understanding how to use data for unearthing root causes that contribute to poor student outcomes will help states and LEAs identify strategies that address systemic challenges. During this session, presenters shared a systems planning approach to guide data analysis and use, as well as information on a process state and LEA staff can use to delve deeply into data and generate questions and hypotheses about the information. Participants learned how to inquire about data, create and test hypotheses, and use high-quality data to target improvement activities and achieve desired educational outcomes for children and youth with disabilities.

  • Session: Plenary2 OSEP Updates

    The OSEP Updates plenary presented information on how COVID-19 has affected nearly all aspects of OSEP’s work, including the State Performance Plan/Annual Performance Report (SPP/APR) and Differentiated Monitoring and Support (DMS) 2.0. The presentation detailed OSEP’s response to the pandemic, such as issuing a series of Question and Answer documents and providing support through OSEP-funded TA centers; the President’s Executive Order on Supporting the Reopening and Continuing Operation of Schools and Early Childhood Education Providers; and the government’s American Rescue Plan. Updates on the SPP/APR included an explanation of instructions to states to provide a narrative for any SPP/APR indicators that COVID-19 specifically affected. Details on the FFY 2020–2025 SPP/APR included information on the expanded stakeholder engagement requirements and assessment flexibilities for the FFY 2020 SPP/APR reporting. The presentation concluded with OSEP responding to several state-generated questions.

  • Session: W3 Newbie Too (B2): Changes in the Dropout Rate Under the New Measurement Table

    The next two years will bring significant changes to the measurement of the dropout rate for children and youth with disabilities (Part B Indicator 2). States have had two options for calculating dropout rates: Option 1 is an exiter, or leaver, rate based on IDEA Section 618 Exiting data, and Option 2 is an annual event rate based on the National Center for Education Statistics (NCES) Common Core of Data. Beginning in FFY 2021 (the 2023 Annual Performance Report [APR] submission), all states must use Option 1. While the new calculation will yield more uniform and comparable data across states, it also will likely increase dropout rates in many of the 40 states currently using Option 2, necessitating new baselines and targets under the new formula. In this session, presenters discussed the transition to using 618 Exiting data for calculating dropout rates, provided examples of how states’ dropout rates may change, and explored strategies for reviewing baselines and setting targets for the years ahead. Participants heard from a state colleague who shared the challenges of moving to the new methodology, including working with stakeholders in setting targets based on the new formula.
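
    The likely increase is largely arithmetic: a leaver rate divides dropouts by students who exited that year, while an event rate divides the same dropouts by all enrolled students. The sketch below illustrates the difference using simplified, hypothetical figures; it is not the official Indicator 2 measurement language.

    ```python
    # Simplified, hypothetical comparison of the two dropout-rate options;
    # not the official SPP/APR measurement formulas.
    dropouts = 120        # students with IEPs who dropped out during the year
    all_leavers = 1_000   # students with IEPs who exited school that year
                          # (graduated, dropped out, reached maximum age, etc.)
    enrollment = 12_000   # all students with IEPs enrolled in the grade span

    option1_leaver_rate = dropouts / all_leavers  # denominator: exiters only
    option2_event_rate = dropouts / enrollment    # denominator: all enrolled

    print(f"Option 1 (leaver rate): {option1_leaver_rate:.1%}")  # 12.0%
    print(f"Option 2 (event rate):  {option2_event_rate:.1%}")   # 1.0%
    ```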

  • Session: J2 SEA and LEA Partnerships Improve Data Quality

    This interactive session explored how state data managers can connect with and help build the capacity of LEAs to affect IDEA data quality. LEA personnel oversee LEA data collections and submissions; therefore, it is important for state data managers to develop methods of communication that enable them to disseminate information about the various elements of IDEA data that will affect all aspects of LEA data collection and submission. For example, the quality of data entry is often a result of past practices, and state data managers can share knowledge of tools and practices that improve data entry at the source. State data managers also can share knowledge, provide resources, and help develop data competencies for LEA staff through intentional communication and professional development to improve data quality and confidence in the data LEA staff collect and report. Presenters highlighted three resources states can use to help staff connect with LEAs and build LEA capacity for high-quality data: IDC’s 618 Data Collection Calendar, Data Meeting Toolkit, and IDEA Data Quality: Outlier Analyses Tools. Presenters described how states can use these resources to plan for their 618 data collection reports and effectively communicate with LEA personnel about upcoming data submissions. States also can use the resources to guide conversations with LEAs about using data to make decisions. Session participants had the opportunity to explore these resources and hear from a panel of state data managers about the tools, resources, and practices they have used and found successful in improving communication with LEAs and increasing data quality.

  • Session: K2 Exploring IDEA 618 Data and Beyond

    Staff from the Office of Special Education Programs (OSEP) and IDC teamed up to help states take a closer look at their OSEP Data Quality Reports and how states can use the information to manage data collection challenges. OSEP staff presented an overview of the Data Quality Review Process and items included in the Data Quality Reports, including whether the data are timely, accurate, and complete; year-to-year changes; and other items. They also reviewed the trends in the Data Quality Reports. IDC staff focused on the year-to-year changes in the reports. Presenters answered questions including (1) Why is it important to review the year-to-year changes? (2) What can the IDEA 618 and other state data tell states? (3) How can state staff share the data with LEAs and other SEA staff to facilitate understanding and use? Presenters explored ways SEA staff can use their IDEA 618 data and other data sources to analyze the year-to-year changes. They featured use of IDC’s IDEA Data Quality: Outlier Analyses Tools as one way to review year-to-year changes. Participants engaged in a discussion reflecting on scenarios that demonstrated the challenges of collecting the IDEA 618 data during the COVID-19 pandemic and solutions they implemented to alleviate the problems those challenges presented.
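
    As a simple illustration of this kind of year-to-year review, the sketch below flags LEAs whose child counts changed by more than a set percentage. The data, threshold, and method are hypothetical; this is not how IDC’s IDEA Data Quality: Outlier Analyses Tools work.

    ```python
    # Hypothetical child counts for three LEAs across two collection years;
    # flag large year-to-year changes for a follow-up conversation with the LEA.
    prior_year = {"LEA-001": 450, "LEA-002": 80, "LEA-003": 1200}
    current_year = {"LEA-001": 455, "LEA-002": 52, "LEA-003": 1215}

    THRESHOLD = 0.10  # flag changes greater than 10 percent (illustrative)

    for lea, prior in prior_year.items():
        current = current_year[lea]
        change = (current - prior) / prior
        if abs(change) > THRESHOLD:
            print(f"{lea}: {prior} -> {current} ({change:+.1%}); review with LEA")
    ```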

  • Session: L2 One State’s Journey to Improve Preschool Environments Data

    LEAs are asked to provide preschool special education services in the least restrictive environment (LRE) as directed by federal policy. However, nearly one-fourth of children who participate in preschool special education are served in separate classes, while only one-third are in inclusive early care and education classrooms at least 10 hours a week. Collecting, reporting, and analyzing preschool environments data can lead to increased numbers of children who receive services in the LRE and support sound decisions by program leaders and policymakers. As a state struggling to improve its performance on Part B Indicator 6, Preschool Environments, Kansas, along with IDC, embarked on a journey to take a “deep dive” into preschool environments data with LEAs to uncover the successes and challenges LEAs have encountered. During this session, presenters highlighted the use of IDC’s Educational Environments 3-5 Data Template: Calculating Local Data Worksheet to identify the Kansas LEAs serving children in the most and least inclusive educational environments and to deepen the state’s understanding of its own data. Then, presenters provided an overview of conversations with the Kansas LEAs to determine the accuracy of the data, learn how LEAs serving children in more inclusive settings have been successful, and identify barriers LEAs faced in improving their preschool environments data.

  • Session: M2 Is It Time to Revisit Your State-identified Measurable Result (SiMR)?

    Is your state considering changing its State-identified Measurable Result (SiMR) statement for the FFY 2020–2025 SPP/APR? Participants who joined in this session discovered the key considerations for changing or refining their SiMRs. Presenters discussed the requirements related to SiMR revisions and shared tips and strategies for making changes to the SiMR. Participants also had an opportunity to learn from their peers’ experiences and discussed the pros and cons of revising the SiMR.

  • Session: B1 Communication Is Key to Improving Data Quality: It Starts With State Directors of Special Education

    High-quality state data depend on high-quality data collection at the local level. Often LEAs do not have accurate or up-to-date information regarding data collection requirements, practices, and data use. Intentional communication that starts with the state director of special education and flows to the local director of special education and data staff was the focus of this session. Presenters highlighted effective strategies for communicating data requirements and elements of high-quality data to LEAs through the use of a communication plan that centers on LEA stakeholders and the information they should receive. Participants had the opportunity to share experiences and successes for effectively engaging with LEAs to improve the quality of their data and, in turn, outcomes for children and youth with disabilities.

  • Session: O2 Navigating Uncharted Waters: Engaging Stakeholders in Indicator 3 Baseline and Target Setting

    The Part B FFY 2020–2025 State Performance Plan/Annual Performance Report (SPP/APR) package includes important changes to the way states will report Indicator 3 data. States must now report (1) statewide assessment performance separately for the general assessment and the alternate assessment, (2) the gap in proficiency rates between children and youth with individualized education programs (IEPs) and all children and youth on the regular assessment, and (3) all Indicator 3 data at grades 4 and 8 and high school. These changes to state-level reporting will occur at the LEA level as well. This constellation of changes represents a significant shift in the way states report assessment data for children and youth with IEPs and also requires states to consider setting new baselines and targets. Furthermore, states must engage stakeholders in the process of establishing baseline and target data for all components of Indicator 3, an endeavor complicated both by the lack of statewide assessment data from spring 2020 and by the social distancing COVID-19 requires, which affects how states can engage stakeholders. Presenters provided an overview of the upcoming reporting changes to Indicator 3 and offered strategies and suggestions for authentically and meaningfully engaging stakeholders, likely virtually, in the baseline and target setting process. They asked participants to share successful and meaningful stakeholder engagement strategies they have found helpful in the past. All session participants had an opportunity to explore the implications of the lack of spring 2020 statewide assessment data and how this may affect their baseline and target setting process. Finally, there was an opportunity for states with an assessment-focused State-identified Measurable Result (SiMR) to share whether the changes to Indicator 3 will affect their SiMR.
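
    For readers new to the gap reporting described in item (2) above, the sketch below shows one simplified reading of the calculation using hypothetical counts; it is not the official SPP/APR measurement language.

    ```python
    # Hypothetical proficiency counts for one grade on the regular assessment.
    iep_proficient, iep_tested = 1_150, 5_000
    all_proficient, all_tested = 27_000, 60_000  # "all" includes students with IEPs

    iep_rate = iep_proficient / iep_tested   # 23.0%
    all_rate = all_proficient / all_tested   # 45.0%
    gap = (all_rate - iep_rate) * 100        # gap in percentage points

    print(f"All students: {all_rate:.1%}; students with IEPs: {iep_rate:.1%}; "
          f"gap: {gap:.1f} percentage points")
    ```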

  • Session: P2 Uncovering the Mysteries of Significant Disproportionality Through Data Visualization

    Understanding significant disproportionality, the underlying data, and how significant disproportionality affects students is critical to identifying strategies for addressing the disproportionality. How a state displays and communicates its significant disproportionality data can help LEAs and other stakeholders increase their understanding of the problem and how it is affecting students and families. Presenters explained foundational principles of data visualization and applied them to displays about significant disproportionality. Participants heard from two states that shared examples of how they visually present significant disproportionality data to state and local staff and stakeholders to engage them in meaningful discussions about the data and the issues the data present.
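
    Displays of significant disproportionality data generally center on risk ratios, which compare one racial/ethnic group’s risk of an outcome (for example, identification for special education) with the risk for all other children. The sketch below shows that calculation with hypothetical counts and a hypothetical threshold; states set their own reasonable risk ratio thresholds, and this is not any particular state’s method.

    ```python
    # Hypothetical counts for one LEA, one racial/ethnic group, one outcome.
    group_identified, group_enrolled = 60, 400
    others_identified, others_enrolled = 200, 4_600  # all other children

    group_risk = group_identified / group_enrolled         # 15.0%
    comparison_risk = others_identified / others_enrolled  # ~4.3%
    risk_ratio = group_risk / comparison_risk              # ~3.45

    STATE_THRESHOLD = 3.0  # illustrative; states set their own thresholds
    status = "exceeds threshold" if risk_ratio > STATE_THRESHOLD else "within threshold"
    print(f"Risk ratio: {risk_ratio:.2f} ({status})")
    ```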

  • Session: Plenary 3 State Tool Showcase

    Looking for new practices, tools, or resources to improve data quality? Participants visiting the State Tool Showcase discovered state-developed solutions to improve data collection, analysis, and reporting as well as use of data for decisionmaking. Peers from other states highlighted a variety of tried-and-true tools and other resources they are using to improve data quality. Participants heard a brief overview of each tool or resource and engaged in dialogue with the state presenter about the data quality challenge the tool or resource addresses. Participants learned about an array of multipurpose tools and resources and had the opportunity to check out multiple state presentations. They were inspired when they saw and heard what other states are doing to improve IDEA data quality!

  • Session: Q3 Where Did Those Data Come From? Using the IDC LEA Data Processes Toolkit

    Have you ever heard an LEA ask the SEA where it got its data? This was an interactive session where IDC staff identified some of the challenges related to collecting high-quality LEA data and explored one solution found in the IDC LEA Data Processes Toolkit. Presenters demonstrated how the toolkit resources support improved LEA data quality through use of the toolkit’s LEA data collection protocols. In this session, participants learned from a state that has used the protocols to improve data collection processes, including validating the data and ensuring timely data submissions to the SEA. Presenters discussed the value of clearly defining and documenting data processes that capture institutional knowledge of how to produce better data. Participants heard how this work also builds staff understanding of the federally required data collections. Participants learned more about how consistently applying written data processes will lead to increased confidence in the data and, ultimately, improved data-informed decisionmaking at the local and state levels.

  • Session: R3 It’s 2021! What Do States Need to Know About IDEA Exiting Data?

    Have you wondered if you are reporting IDEA Exiting data correctly? What rules apply to multiple exits, catchment areas, and reaching maximum age in your state? How do the changes to the definition of alternate diploma affect students in your state? Are you prepared for the new requirement to use Exiting data for Indicators 1 and 2 in the Part B State Performance Plan/Annual Performance Report (SPP/APR) for FFY 2020? Participants joined a guided conversation that unpacked the data elements states report in File Specification (FS) 009. There was discussion about how states use the alternate diploma in the calculation of Indicator 1 (graduation rate) and how this calculation differs from other types of graduation rates calculated in K-12 education. Presenters also explored some of the intricacies of Exiting data to help participants have more meaningful conversations with stakeholders as states move into the new SPP/APR cycle.
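
    As a rough orientation to how a leaver-based rate differs from the cohort-based rates used elsewhere in K-12 education, the sketch below computes a graduation rate from hypothetical Exiting category counts. The categories are simplified, and this is not the official Indicator 1 measurement language.

    ```python
    # Hypothetical 618 Exiting counts for one year; a leaver-based rate divides
    # regular-diploma graduates by all students who left school that year,
    # whereas cohort rates track a fixed entering class over four or more years.
    exits = {
        "regular_diploma": 700,
        "alternate_diploma": 40,  # state-defined alternate diploma, where offered
        "certificate": 60,
        "dropped_out": 150,
        "reached_max_age": 50,
    }
    leavers = sum(exits.values())
    grad_rate = exits["regular_diploma"] / leavers
    print(f"Leaver-based graduation rate: {grad_rate:.1%}")  # 70.0%
    ```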

  • Session: S3 Building Blocks of a Data-Driven Culture

    Recent events, including the COVID-19 pandemic, may have illuminated for some SEAs the lack of a data-driven culture in their state. Having a data-driven culture is key to unlocking the success that comes with using data to drive decisions. SEAs build and enhance their data systems to collect an enormous amount of data for federal and state reporting. However, SEAs often find that they don’t use the data beyond reporting, thus depriving the state of opportunities to improve state initiatives and outcomes by using quality data to drive decisions. Participants engaged with presenters and peers during this session to explore the true meaning of a data-driven culture, address data literacy issues and the data literacy gap, and describe the building blocks needed to establish an environment that embraces data-driven decisionmaking.

  • Session: T3 Representativeness in Indicators B8 and B14 Data: Why You Need It and Tools to Get It

    New requirements in the Part B State Performance Plan/Annual Performance Report (SPP/APR) mean states must report more precisely than before on the representativeness of their Indicator 8 and 14 data. However, gathering representative data for these indicators presents a perennial challenge for many states. Presenters from IDC and the National Technical Assistance Center on Transition (NTACT:C) explained the importance of collecting representative data and introduced states to resources they can use to support their efforts to make sure the data they collect are representative. Presenters also shared a tool states can use to measure the representativeness of the data they gather. Presenters highlighted the benefits of collecting and using representative data for reporting and for making program and policy decisions. Recognizing how difficult it can be to gather representative data, presenters shared strategies for reporting data that are not representative, including acknowledging limitations to the conclusions that can be drawn from such data.
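
    One common heuristic for checking representativeness compares each group’s share of survey respondents with its share of the target population and flags differences beyond a tolerance. The sketch below uses hypothetical shares and an illustrative tolerance of 3 percentage points; it is not the specific tool presenters shared.

    ```python
    # Hypothetical shares of the survey population and of actual respondents.
    population = {"Asian": 0.05, "Black": 0.18, "Hispanic": 0.27, "White": 0.50}
    respondents = {"Asian": 0.04, "Black": 0.11, "Hispanic": 0.25, "White": 0.60}

    TOLERANCE = 0.03  # 3 percentage points (illustrative rule of thumb)

    for group, pop_share in population.items():
        diff = respondents[group] - pop_share
        flag = "out of range" if abs(diff) > TOLERANCE else "ok"
        print(f"{group}: population {pop_share:.0%}, respondents "
              f"{respondents[group]:.0%} ({diff:+.0%}) {flag}")
    ```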

  • Session: U3 Using Gamification to Build an Engaging IDEA Data Training

    Helping your staff understand and report IDEA data can be difficult. How can you make it easier? Incorporate fun and interactive games into your training! This session helped participants better understand the concept of “gamification” and how it can be a powerful training tool. Presenters shared examples of how to train staff on IDEA reporting concepts using games and other engaging techniques.

  • Session: V3 Do You Need Help Preparing for the FFY 2020–2021 SPP/APR?

    As SEAs prepare for the State Performance Plan/Annual Performance Report (SPP/APR) submission in February 2022, they may be looking for supports to meet the new and varied requirements. Participants learned how two states are using IDC tools to inform their staff and their work related to tasks, timelines, target setting, stakeholder engagement, changes in data collection, and specific reporting requirements. Colleagues from North Carolina and Vermont shared how they have prepared for stakeholder input for target setting, data analysis, developing improvement strategies, and evaluating progress, using IDC tools and resources, including FFY 2020–2025 Part B SPP/APR Changes at a Glance; For FFY 2020 SPP/APR, What Data Will States Report?; and SPP/APR Tasks and Timelines. In this interactive session, participants explored and practiced using additional IDC resources, including the Organizer Template for Part B SPP/APR Target Setting and the Indicator Organizer for Stakeholder Engagement and Target Setting, that allow teams to identify the data year, data source, data availability, and any impacts of COVID-19 on data. These tools help users identify the tasks of stakeholder meeting preparation, develop visualizations to support stakeholder data analysis, and plan for stakeholder input on proposed changes to targets. Participants left the session knowing about resources that can help them meet requirements of the new SPP/APR and prepare for their submission next February.

  • Session: I2 Integrating Qualitative Data on the Fly: Qualitative Data as a Substitute for Missing Data

    As the past year has shown, forces far outside of state control may interrupt state plans for data collection. When expected quantitative data are unavailable, how can states know what progress they are making on state initiatives and outcomes for students? Qualitative data can help bridge the missing data gaps as a “stand-in” for missing quantitative data or to enhance or support any available quantitative data. Presenters discussed how to collect and present qualitative information and strategies for integrating qualitative data into existing initiatives, even if qualitative data collection was not originally included in the plan. Presenters shared examples of including qualitative data to guide participants in considering this information for their own data collection efforts.

  • Session: X3 The Equity Requirements of IDEA: To Align or Not to Align?

    Are you debating whether to align your State Performance Plan/Annual Performance Report (SPP/APR) indicators regarding equity (Part B Indicators 4A, 4B, 9, and 10) with significant disproportionality? States should consider where the indicators and significant disproportionality overlap and how they differ to communicate effectively with stakeholders. States engage stakeholders in making recommendations about various measurement approaches and strategies for addressing both equity indicators and significant disproportionality. In addition, LEAs must understand the differences among the requirements of the indicators and significant disproportionality in order to implement the requirements and appropriately address equity issues. Because of similarities among the indicators and significant disproportionality and the overall focus on equity, states may consider aligning some or most requirements for clarity and cohesion. This session reviewed areas to consider, such as thresholds, cell sizes, n sizes, timing of determinations, and processes. Additionally, all of these areas require policy, procedure, and practice reviews and responses to noncompliance. Presenters talked with two states that have worked through alignment decisions differently and discussed lessons learned.