
Interactive Institutes 2022

The occasion: IDC’s 2022 Interactive Institutes. The question: Are you a data-quality influencer? Participants from across the country came together both in person and virtually to ponder that very question, connecting (or reconnecting) with their peers and learning more about their power to influence data quality. The takeaway: States participated in innovative, engaging, and highly interactive learning formats and left with greater knowledge about proven practices to improve data collection, reporting, analysis, and use.

 


Selected Topic Presentations

Select a topic to see the related sessions, presenters, and downloads.

 
  • Session: Helping Them Help You: Building LEA Capacity for High-Quality Data

    Is LEA data quality in your state the result of a systemic organizational approach, or is it hero-driven? Are your LEAs reliant on outdated data collection practices? In this topical burst, we explored the benefits of facilitated documentation of LEA data processes and how it can help states improve the quality of both LEA and SEA data. State data managers can share their knowledge, provide resources, and offer assistance to LEAs in improving their data. Ultimately, however, it is LEA staff who must oversee data collections and develop local data processes. During this session, we presented IDC’s LEA Data Processes Toolkit, designed to give SEA staff a resource for assisting LEAs in documenting their own data processes. Participants learned how using this toolkit can be the next step in ensuring that LEA data processes in their state are the result of systemic organizational standards, not past practices or the work of hero data clerks.

  • Session: Painting a Picture of LEA Outcomes Utilizing MOE Reduction and CEIS Data

    Do you struggle to find a practical use for the Maintenance of Effort (MOE) Reduction and Coordinated Early Intervening Services (CEIS) submission? Are you looking for the value-add of this report? Look no further: this rich data collection contains fiscal, program, and data elements all in one package. Having these three types of information in one place provides data managers with a palette for painting a picture of outcomes within and across LEAs. In this session, presenters walked participants through a process of observation and interpretation to reveal how the MOE Reduction and CEIS submission data are meaningful. Participants engaged in a discussion about how to understand and use the data and reflected on other data sources available to help complete the picture. By the end of the session, participants had a picture in mind of how they could use the MOE Reduction and CEIS data palette to influence state systems, understand LEAs, and improve student outcomes.

  • Session: Six Data Quality Issues That Could Undermine Your SSIP Work

    If you are not getting the results you expected from your SSIP, overlooked data quality issues may be to blame. Data quality issues not only affect your reporting; they also could be preventing you from obtaining the results you want or from seeing where problems exist in your state’s implementation. This session functioned as a guided tour of the new IDC resource Checklist to Identify and Address SSIP Data Quality Issues. During this session, participants learned about six key data quality issues to look for, how those issues might specifically manifest themselves in SSIPs, and ways to combat their negative effects both immediately and in the future.

  • Session: Effective Strategies for Engaging Families in the SPP/APR Process: What Works?

    Are you wondering how other states engaged parents and families in the development of their SPP/APR? How did states ensure representation from diverse families across the state? How did they engage families meaningfully in analyzing data, developing improvement strategies, and evaluating progress? And how do states plan to involve diverse stakeholders meaningfully and continually in the work of the SPP/APR moving forward? The FFY 2020 SPP/APR increased expectations for stakeholder engagement, specifically calling out the need for data about how states engaged parents and requesting information on strategies states used to involve families, including parents. In this presentation, an IDC staff member shared strategies for effective and ongoing engagement. One state shared its story—including specific strategies to identify and engage parents—and talked about successes, lessons learned, and how the state plans to improve parent engagement.

  • Session: Data Provide Answers, but People Drive Change

    “Culture eats strategy for breakfast.”—Peter Drucker. Even the best data quality strategies are doomed without a data-informed culture to support efforts to enact them. Before states can use data-informed decisionmaking to improve student outcomes, they must create a culture in which people are willing to ask the hard questions and routinely use data to inform decisions at all levels. The COVID-19 pandemic has dramatically affected how people work, and there has been little time to reflect on how these changes have affected data culture. In this session, IDC staff discussed attitudes and actions that are part of the “new normal” and strategies to promote a data-informed culture. Participants also shared some of the challenges and solutions that have affected their data culture.

  • Session: “How-To” Session on Leveraging Initiatives to Increase Inclusive Opportunities for Preschoolers

    Preschool is where the path to graduation and post-secondary opportunities begins. However, have you ever wondered how we actually promote high-quality inclusive environments for preschoolers with disabilities? Understanding children’s early childhood educational settings helps begin the story of success related to how state and local initiatives can effect positive change for preschoolers with disabilities. The individualized education program (IEP) provides information about preschool placements and environments, where children receive services, and how children receive services. State and local initiatives often influence the IEP process and how preschoolers develop the skills they need as they enter school settings. In this “how-to” session, a state leader shared how their state tackled issues related to Indicator 6 data quality, data collection, and analyses to monitor initiatives to increase inclusive opportunities for the state’s preschoolers.

  • Session: What Would Bones and Booth Do? Examining Levels of Data to Find Out “Whatdunnit”

    Are you sleuthing to determine why your current systems are not producing effective outcomes? As in the TV series Bones, where a multidisciplinary team collects and examines different types of data from a crime scene to find out “whodunnit,” in this session, SEAs engaged in an investigative process to try to find an answer. Using different levels of data, including satellite data, map data, and street data, teams explored the systemic factors that often contribute to poor achievement for children with disabilities—the “whatdunnit.” Instead of approaching data with a deficit mindset, believing the data tell us what is wrong with students and communities, this session proposed that participants “flip the script” and amplify the voices, experiences, and strengths of the most important stakeholders—students and families. SEAs had an opportunity to work together to determine how to use the satellite, map, and street data to identify the “whatdunnit” and create real and lasting systems change.

  • Session: Identifying the “Why” and Improving the “How” of Collecting High-Quality Parent Involvement Data

    Actively engaging stakeholders and getting meaningful feedback can be difficult, especially when education jargon and data are involved. Too often, it’s easy to get caught up in the science behind the data and forget what draws others in—the reasons “why” to get involved and provide feedback. In this session, IDC presenters shared strategies to obtain stakeholder feedback related to two commonly challenging areas of Indicator 8—parent survey development and use of the data for improvement. Participants heard from state staff about their work with IDC as they sought to uncover “why” it’s important to collect parent involvement data. Participants listened as state staff described how they collaborated with partners to engage stakeholders in a survey redesign process and explored ways to better share and use their data to ensure they meet the needs of students with disabilities and their families. Presenters also highlighted resources states could use to improve the quality of their parent involvement data in these areas.

  • Session: On Your Mark, Get Ready, Get Set, Go: Documenting State Data Processes in Preparation for DMS 2.0

    Data are at the heart of general supervision systems and a key component of OSEP’s Differentiated Monitoring and Support (DMS 2.0) process. So, the big question is: How is the preparation for DMS 2.0 going? Having state-level data collection, validation, reporting, and analysis processes clearly and comprehensively documented will be an invaluable support in planning for the DMS monitoring experience. In this session, participants learned the value of documenting state data processes in the context of DMS preparation as well as how documentation can contribute to other state goals, including deepening staff knowledge of IDEA Sections 616 and 618 data requirements and creating a culture of high-quality data. The session featured IDC’s SEA Data Processes Toolkit and explored how states could leverage this resource to support their DMS planning. Presenters provided an overview of the toolkit, and participants had an opportunity to practice “preparing” for this DMS work.

  • Session: Breaking Out of the “Here We Are Again” Syndrome in Significant Disproportionality

    Does the significant disproportionality and comprehensive coordinated early intervening services (CCEIS) cycle resemble a hamster wheel for some districts in your state? This session invited participants to learn how IDC’s updated Success Gaps Toolkit can help turn repeated root cause analysis and ineffective action planning into effective plan-do-study-act cycles that can bring about real change for children. Participants heard from those who have worked with the toolkit and from states with multi-year significant disproportionality experience. This session was for state directors and others responsible for state-level significant disproportionality planning.

  • Session: Mastering the Juggling Act of Writing a High-Quality SPP/APR

    The SPP/APR isn’t your typical federal report—preparing it is a juggling act like no other! It requires data analysis, narrative development, stakeholder engagement, and knowledge of the nuanced components for 17 indicators. When writing the SPP/APR, states must strike a balance between checking necessary boxes and producing an accessible report that reflects the important work states do to support and educate children and youth with disabilities. Designed for all the different state staff who work with the SPP/APR, this session offered tips and tricks for mastering this juggling act, focusing on challenging areas like documenting correction of noncompliance; developing clear and concise responses to prompts; and analyzing data and communicating that analysis. Participants in this session came away with strategies to create a comprehensive SPP/APR that both meets OSEP requirements and communicates the state’s progress toward the targets for all 17 indicators.

  • Session: The 7-Year Itch: Discussing Strategies for Aligning SSIP Components to Monitor Progress

    Over the last 7 years, through all the phases of the State Systemic Improvement Plan (SSIP), states have worked diligently toward meeting SSIP targets. Are you feeling a 7-year itch? It might be time to update, realign, or shift the focus of your SSIP. While change for the sake of change is not necessary, now—after so many years and at the beginning of a new SPP/APR package—may be a great time to revise your SSIP and make sure all your implementation and evaluation efforts are in alignment. In this session, a panel of state colleagues engaged in a guided discussion about the strategies they have used to ensure alignment of key SSIP components. During this interactive discussion, state representatives shared lessons learned about how continuously monitoring and realigning SSIP components and making changes when needed helped their states progress toward their long-term goal—the State-identified Measurable Result (SiMR).

  • Session: Connecting the Cubicles: Making Post-School Outcomes Connections

    Are you “siloed” in your cubicle, limited in your ability to make connections between the SPP/APR indicators and, specifically, how the data for Indicators 1, 2, 13, and 14 actually relate to each other? Efforts to break down silos and build connections require intentional communication between the SEA staff or contractors who collect and report the data and the SEA staff responsible for professional development and data-related procedures. These efforts should enable staff to consider relationships among SPP/APR indicators, which can result in improved longer-term outcomes. Data managers, SPP/APR coordinators, indicator leads, and directors attended this session to consider ways to strengthen partnerships between data and program staff so they could make the connections between performance on Indicators 1, 2, 13, and 14 and programming. They also heard from a state whose collaborations around the data have changed conversations and practice to improve outcomes for students with disabilities.

  • Session: Keep the Wheels Turning: Stakeholder Engagement Moving Forward

    What will stakeholder engagement look like in the FFY 2021 SPP/APR—the same, a little different, or a lot different? The FFY 2021 SPP/APR will include the same stakeholder engagement components that states need to address, but it will require states to provide new or updated information about how they are supporting year-round efforts to engage stakeholders in all aspects of the SPP/APR. How are states moving forward to address the stakeholder engagement requirements? Are they trying new approaches? In this session, IDC staff shared ideas about ongoing stakeholder engagement and a tool that helps states build their stakeholder engagement plans. State panelists also shared how they are addressing the requirements for engaging stakeholders in analyzing data, implementing improvement strategies, and evaluating their SPP/APR work. Participants heard ideas from colleagues that they could use to enhance their state’s strategies for addressing meaningful stakeholder engagement in the SPP/APR.

  • Session: Where Did the Money Go? Building LEA Capacity to Use High-Quality Data

    Are you making the most of your work with LEAs? This interactive session focused on building state staff capacity for assisting LEAs with ensuring their data are high quality. IDC staff engaged with session participants to discuss avenues for building LEA capacity for collecting, analyzing, and using high-quality data. Presenters highlighted IDC tools, including the LEA Data Processes Toolkit, the newly revised Part B Indicator Data Display Wizard, and the Data Meeting Toolkit, that states can use when providing technical assistance (TA) and professional development to build LEA capacity. Using these IDC tools, session participants learned how to help LEAs document data collection processes, lead meetings with LEAs, and provide TA on analyzing data to make decisions about the use of funds and other resources to improve local outcome data.

  • Session: De-escalating: How to Calm Down and Get to Work on Rising Discipline Trends

    Don’t let your emotions about increasing suspension rates immobilize you! In this session, participants heard from state panelists as they shared their strategies for addressing concerning discipline data trends. State staff responded to questions from the audience in addition to discussing questions they themselves had and what they found when they took a close look at their data. Panelists also discussed what they have identified as important supports for districts and how they are building support for their work with their districts.

  • Session: The Assessment Ghosts of Past, Present, and Future: What’s Next for Indicator 3?

    The classic Charles Dickens tale A Christmas Carol illustrates the power of considering the past, present, and future to better understand the choices available in any given moment. The lesson can apply to Indicator 3 as well! The assessment landscape has been fraught with difficulties due to the lack of longitudinal data, given the interruption in statewide assessment in school year 2019–20 and the continuing impact of COVID-19. Against this challenging backdrop, states faced the difficult task of determining baseline data and setting future targets for Indicator 3—an indicator that also underwent a complete overhaul in the FFY 2020 SPP/APR package. During this roundtable discussion, participants learned how three states navigated these challenges and partnered with stakeholders to prepare for the FFY 2020 SPP/APR submission. States also discussed their future plans for this indicator. Adapting to this current reality will require looking back as well as forward—this session helped participants prepare to do both.

  • Session: Exhibiting Data So Your Audience Gets Your Message

    We all know that keeping your stakeholders’ interest when presenting them with important program data is challenging. And these days, written context for data is more important than ever, due to the text-only format of the SPP/APR and State Systemic Improvement Plan (SSIP) as well as the increased use of data visualizations. Missing context, unclear text, and incomplete labels can cause people to overlook key elements in a data presentation, misunderstand your meaning, and miss out on vital information. Clearly written text about data can promote shared understanding, add depth to discussions, and support planning. This session included a gallery walk with examples and exercises designed to highlight best practices for labeling data visualizations and writing narratives about state special education data. Participants came away with new strategies and a checklist they could use to review and refine their data presentations to sustain stakeholder engagement and support effective messaging.

  • Session: No Response Is a Response: How Nonresponders Can Influence Your Data

    Are you a nonresponder when asked to complete a survey? Have you considered that by not responding you may be making a difference, but not in a good way? This presentation explored nonresponse bias and how differences between survey responders and nonresponders can influence survey results. You may recall that states have a new ask from OSEP for Indicators 8 and 14 of the SPP/APR. States now must analyze their Indicator 8 and Indicator 14 response rates and nonresponse bias. They also must describe the steps they will take to reduce any nonresponse bias, making their data more meaningful and useful. During this session, participants worked through examples to better understand nonresponse bias and shared ideas about improving their response rates, including responses from a broad cross section of parents of children with disabilities for Indicator 8 and former students for Indicator 14.

  • Session: Lost Your Way in Data Land? Creating a Map for Efficient and Effective Data Collection

    Overwhelmed with data and still not sure how to show you are making progress toward your destination? This session helped participants identify the data they could use as milestones to map their journey and track their progress by tying each data collection to a specific project outcome. Washington’s SSIP staff described how they surveyed their data, determined important milestones, identified landmarks, and mapped the route that would lead them to their destination. They shared the state’s customized data dashboard, which started as a storage mechanism and evolved along the way into a centralized tool that streamlined data collection. These efforts are making data relevant and accessible to all users, increasing data literacy and data quality, increasing investment of resources in data collection, and encouraging data-informed decisions.

  • Session: I Am a Data Quality Influencer

    You have more power to influence your state’s data quality than you may realize. No matter your title, job duties, or experience, you have a critical role in ensuring your state’s IDEA data are of high quality so everyone who needs to can use the data for program and policy decisions affecting students with disabilities. In this session, participants found out what it means to be a data quality influencer, how every person can contribute to influencing IDEA data quality, and what steps they could take to start positively influencing data quality in their state.

  • Session: Practice Makes Perfect: Strategies for Reviewing LEA Data to Improve Practices

    Inconsistent and inappropriate practices can lead to inequitable outcomes for children with disabilities. When data identify an LEA for one of the equity requirements under IDEA (Indicators 4, 9, and 10, and significant disproportionality), the SEA must ensure there is a review of the LEA’s practices in addition to its policies and procedures. How do SEAs ensure each LEA implements practices consistently and appropriately? How can SEAs develop a process that effectively reviews LEA practices, and how can they be confident that the practices align with policies and procedures? What data can inform the implementation of consistent and appropriate practices? In this session, participants got answers to these questions and discovered approaches for reviewing quantitative and qualitative data to identify LEA practices that may be contributing to inequitable outcomes for children with disabilities.

  • Session: Power to the People: Tools and Processes for Collaborative Data Use

    States are stepping up their game by engaging stakeholders in target and baseline setting, evaluating their SPP/APR, analyzing data, and developing and implementing improvement strategies. Doing this virtually, during a time when all normalcy has gone out the window, raises the difficulty to level 11! Collaborative data use for decisionmaking can be really challenging! It can feel awkward and leave some feeling that their voices aren’t being heard. Why is that? In this session, presenters examined the nature of the data states have and the common dynamics that influence the power to make decisions. They discussed a process for building stakeholder capacity for collaborative data use that involves tools for reflection and analysis, effective communication, and steps for a more natural discussion that allows all voices to be heard. Participants took away a new understanding of collaborative data use and a process for improving their stakeholder engagement practices.

  • Session: Connecting the Dots: Examining the Intersectionality Among Indicators to Inform System Improvement

    Has your system had its yearly check-up? It might be time to schedule an appointment! Health checks often start with screeners, like weight, blood pressure, and temperature checks. You won’t stand on a scale, but it is still important to check in on your data! Collecting high-quality data for each SPP/APR indicator provides the foundation for assessing the interplay between and among indicators and using the data to inform next steps in improvement efforts. Understanding how the indicators affect one another can influence state performance planning and lead to greater cohesiveness across the system. In this session, presenters highlighted the Tree of Influence and discussed the interplay between high-quality data and indicators, specifically the State Systemic Improvement Plan (SSIP). Data managers, special education directors, and SSIP coordinators reviewed the Tree of Influence as a tool for completing a systems check that can increase the types and quality of data used to measure implementation goals and progress.