Step 1. Determine the criteria for schools to participate in universal behavior screening.
(Winter before the first screening year)
When a district decides to adopt a universal behavior screening initiative, it is recommended that the district pilot the screening process with a subset of its schools in the first year. A pilot allows the district to learn the screening and data-based decision-making process, generate buy-in for screening, and understand its students’ trends in behavioral and emotional risk before scaling up district-wide (SRI International, 2021). The size of the pilot will vary based on district resource capacity: some districts may choose to pilot screening in one grade in one school, while others may pilot screening in a third of their schools. Smaller is better when determining the number of classrooms or schools that will participate in a pilot. Although each district’s criteria for identifying pilot schools may be unique, several variables could be considered by all districts.
- Administrator Buy-In. Universal screening can be time- and resource-intensive, especially for the pilot schools implementing this process as part of a new district initiative. Different school staff, including teachers, school leaders, and district personnel, will be involved in implementing screening. However, due to the potentially time-intensive nature of being the first schools to implement screening in a district, districts should select pilot schools that have full support from the building-level administrators (Humphrey & Wigelsworth, 2016; Siceloff et al., 2017). This will ensure that personnel time is protected for training, completing the screening, and data-based decision-making.
- Strong Literacy in Data-Based Decision Making. Another criterion for selecting pilot schools is the existence of a data-literate, data-based decision-making team. Schools without strong data literacy may not know how to use the screening data to inform student support. Data-based decision-making is often a task of an MTSS team, Positive Behavioral Interventions and Support (PBIS) team, attendance team, or another Tier 1 team that analyzes schoolwide data to inform services and supports (Lane et al., 2018). Behavior screening should be integrated into the data review activities of these existing Tier 1 data teams.
- At Least Two Social, Emotional, and Behavioral Supports for Students at Each Tier. Ethical concerns exist with collecting and not using screening data to inform student support. Therefore, prior to screening, pilot schools should have social, emotional, and behavioral resources available to students at every tier. This criterion will be discussed further in Steps 2 (Resource Mapping) and 4 (Confirm Supports).
Step 2. Complete Resource Map with Each School.
(Spring before the first screening year)
School-based screening data should be linked to appropriate resources for follow-up (Blasé & Fixsen, 2013; Levitt et al., 2007). Without this linkage to available supports, the data are unlikely to be used for data-based decision-making within the tiered system framework. To avoid this ethical dilemma, schools should complete a resource map to create an inventory of their available resources and their capacity for addressing students’ emotional and behavioral concerns (see Table 1). Resource maps can include prompts for school teams such as: What resources are currently in place at our school? How do students access the resources? How many students have been served by those resources (to understand capacity)? The multidisciplinary team will categorize all the social, emotional, and behavioral programs present at each school, identifying each program’s type, appropriate tier(s), and core components. Table 2 provides an example of Happy High School’s resource map.
Step 3. Complete a Gap Analysis
(Spring before the first screening year)
After completing a school-level resource map, teams should meet to review the map for gaps in interventions across the tiers. The school team can then identify programs to serve their students’ emotional and behavioral needs (Davis-Ajami et al., 2014; Golden et al., 2017). It can be beneficial to identify previous programs implemented in the school because revitalizing a program may be more feasible than beginning a new one. For example, a school may have had a male mentoring program implemented in the past. That school may have discipline data suggesting that 9th-grade males have a higher number of referrals than other groups. The school could explore restarting its discontinued male mentoring program if existing staff still have institutional knowledge and community contacts needed to support this program.
Discussions about gaps are best when they are data-informed. For example, a school may review their student data to identify high rates of absenteeism and suspension for violent offenses. Cross referencing those trends in student data with the inventory of resources on the map may illuminate a gap in attendance-focused interventions or preventative programming for violent behavior. Table 3 includes Happy High School’s gap analysis. The prompts on the gap analysis form are as follows:
- Describe gaps in Tiers 1, 2, and 3 social, emotional, and behavioral interventions.
- How will those gaps be filled?
- Review your “contact person” column on your resource map. Are more than 75% of the interventions coordinated by the same staff member? If so, how can you increase the personnel resources available to implement and/or coordinate the interventions?
- Resource Mapping Gap Analysis Document
Step 4. Confirm that Each School has Adequate Social, Emotional, and Behavioral Supports.
(Spring before the first screening year)
Step 5. Select a Screening Tool.
(Spring before the first screening year)
There are many considerations when selecting a screening tool for a school district. Screeners vary in cost, targeted construct, informant, timing/frequency, and technical adequacy (Glover & Albers, 2007). See Table 3 for a list of select screening measures.
Cost. The cost of universal screeners is often per student/per screen and can vary considerably across measures due to licensing costs (e.g., Center for PBIS, n.d.) or personnel time dedicated to the screening process (Kuo et al., 2009). While there are high-quality free screening tools (e.g., Lane et al., 2012), it is important to note that free screening measures sometimes require more personnel resources to score and report the screening data in a way that is usable to school teams. Districts may build a custom data dashboard that allows for automated scoring of the free screening measure and integration of the screening data with other student administrative data. These custom-built data dashboards can be more economical than paying for a for-purchase screening measure. Another option for districts is to pilot screening with a free measure and shift to a for-purchase measure to sustain the process if they are not interested in creating a custom-built data dashboard or do not have the personnel resources to score and generate reports. Over time, behavior screening saves schools money because it facilitates the prevention and early identification of behavioral concerns that would otherwise become more costly to address (Dever et al., 2012).
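As a rough illustration of the cost trade-off described above, the annual cost of a screening option can be sketched as licensing fees plus personnel time. The function and all figures below are hypothetical and are not drawn from any published pricing:

```python
def annual_screening_cost(n_students, screens_per_year,
                          per_student_fee=0.0, personnel_hours=0.0,
                          hourly_rate=0.0):
    """Rough annual cost of one screening option (all inputs hypothetical).

    A for-purchase measure typically carries a per-student/per-screen
    licensing fee; a free measure shifts the cost to personnel time
    spent scoring and reporting.
    """
    licensing = per_student_fee * n_students * screens_per_year
    personnel = personnel_hours * hourly_rate * screens_per_year
    return licensing + personnel

# Hypothetical comparison for a 1,000-student school, two screenings a year:
paid = annual_screening_cost(1000, 2, per_student_fee=2.50)
free = annual_screening_cost(1000, 2, personnel_hours=40, hourly_rate=35.0)
```

Under these made-up numbers the free measure is cheaper, but the comparison flips as scoring and reporting time grows, which is one reason some districts move to a for-purchase measure to sustain the process.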
Targeted constructs. A screener’s targeted construct refers to the general risk type it measures, such as social-emotional learning skills or early signs of behavioral and emotional risk (Kamphaus, 2012). A district should align the screening construct with its intervention programming. For example, if a district has adopted a tiered intervention program informed by CASEL’s SEL competencies (CASEL, 2020), it should select a measure that screens for students’ SEL competencies (e.g., Social Skills Improvement System SEL; Elliott & Gresham, 2017) so that the screening data measure the same construct taught in the district’s SEL curriculum.
Informants or Respondents. Screening forms may be completed by teachers, parents (Dowdy & Kim, 2012), or students (Jones et al., 2020). The informant will be determined by the age and developmental level of the student and the screening measure selected. The available literature suggests that children younger than 7 to 8 years generally do not have the necessary skills to complete screening tools (Sharp et al., 2006); however, this can vary depending on the type of scale used (Conjin et al., 2020). Although it is important to incorporate student perspectives, it is recommended to rely on adult informants in earlier grades when students may not accurately report their risk. Students tend to report their internalizing concerns more accurately than teachers or parents (Aebi et al., 2017; Margherio et al., 2019; Atzaba-Poria et al., 2004). In a sample of adolescents, Kuhn et al. (2017) found that while parent reports alone were more predictive of mental health status than student reports alone, combined parent and youth reports provided the most predictive validity. When student self-reports are compared to parent reports, children tend to identify more difficulties but perceive their impact as less severe than their parents do (Van Roy et al., 2010). Teacher informants tend to estimate externalizing concerns more accurately than internalizing concerns but have the added value of observing children’s behavior in the school context alongside their peers (Dowdy et al., 2010; Jones et al., 2020; Humphrey & Wigelsworth, 2016). Based on guidance from existing research, schools may decide to administer a teacher-completed screener at the elementary level and shift to a student-completed screener at the middle and high school levels.
Timing/Frequency. Timing and frequency of screening can depend on the specific school context and the construct being measured; one option is to align screening administration with the schedule used to collect academic screening or benchmark data. The following is one example of the timing for screening: Fall (four to six weeks after the start of the school year), Winter (two to three weeks after the winter break), and Spring (six to eight weeks before the end of the school year). Schools may find three screening time points a year too frequent for personnel to fully utilize the data to inform decision-making (Lurie & Swaminathan, 2009; Yu et al., 2022). To allow more time to use the screening data between screening time points, districts may opt to conduct one or two screenings per year (Dowdy et al., 2014). When determining the frequency of screening across the year, there are a few questions the multidisciplinary team could consider: (1) Is the school’s risk base rate at or above 20%? (2) Does the school have more than 5% of its students scoring in the “High Risk” range? (3) Has the number of students who score in the “High Risk” range been higher at Time 2 than at Time 1 in past years? And (4) Does the school want mid-year screening data for any other purpose?
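The four guiding questions above can be treated as a simple decision rule. A minimal sketch, assuming risk rates are expressed as proportions and using the thresholds stated in the questions (the function name and return shape are illustrative only):

```python
def recommend_midyear_screening(base_rate, high_risk_rate,
                                t2_exceeded_t1_historically, other_uses):
    """Apply the four guiding questions to a school's screening data.

    base_rate / high_risk_rate are proportions (e.g., 0.22 for 22%).
    Returns (recommendation, reasons): recommendation is True when any
    question argues for keeping a mid-year screening time point.
    """
    reasons = []
    if base_rate >= 0.20:
        reasons.append("risk base rate at or above 20%")
    if high_risk_rate > 0.05:
        reasons.append("more than 5% of students in the High Risk range")
    if t2_exceeded_t1_historically:
        reasons.append("Time 2 high-risk counts exceeded Time 1 in past years")
    if other_uses:
        reasons.append("mid-year data wanted for another purpose")
    return bool(reasons), reasons
```

A school with a 25% base rate would keep the mid-year screening; a school answering no to all four questions might drop to two (or one) time points per year.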
Technical Adequacy. Technical adequacy refers to the reliability, validity, sensitivity, and specificity of a measure (Ma et al., 2017). Districts should also understand the population that was used to validate a screening tool to ensure alignment with the targeted student population. This is particularly critical for school districts with a large population of English Learners, individuals with disabilities, and students from communities historically underrepresented in measurement research (Glover & Albers, 2007; Moore et al., 2023; SAMHSA, 2019). School district leaders may consult with university partners with measurement expertise to learn about the populations on which the tool has been validated. It is best to avoid developing an internal screening measure; if circumstances necessitate that choice, districts should consult a measurement expert at a partner university to support survey development.
Step 6. Consult with the District Legal Consultant about using Active or Passive Consent.
(Spring before the first screening year)
Your district’s screening activity may fall under the Protection of Pupil Rights Amendment (PPRA; Protection of Pupil Rights Amendment, 1974), also referred to as the Hatch Amendment. PPRA was drafted to protect the rights and privacy of pupils and their families. It states that students are not required to complete Department of Education-funded surveys that inquire about sensitive subjects unless their guardians are first allowed to view the survey and opt their child out of participation (passive consent). Some districts may choose to use an active consent process before screening.
Active vs. Passive Consent. Active parental consent requires parents to sign a permission form and return it to the school if they allow their child to participate. In contrast, passive parental consent involves the parents signing and returning the form only if they refuse to allow their child to participate (Range et al., 2001). Passive consent may be considered ethically questionable due to the possibility that parents may not see the consent form and, therefore, remain unaware of the screening activity. However, passive consent increases participation rates because no parental action is required if they agree with their child’s participation (Range et al., 2001). A study comparing conditions of parental consent found that when active consent was required, participation dropped from 85 to 66 percent, and this lower consent rate could impact the universal nature of the data collection (Chartier et al., 2008).
Student Assent. In addition to securing parental consent, if a student is at a developmental stage to make informed decisions, they should assent or agree to participate in the screening. The next step describes more information about student assent.
Step 7. Draft Parent Consent Form. Distribute consent forms at the beginning of the academic year.
After deciding between active and passive consent, the consent form should be drafted. While each form will contain details unique to the district and screening goals, certain components should be included to align with PPRA (Protection of Pupil Rights Amendment, 1974). (1) Screening Objective. This section should include a brief overview of the screening initiative’s purpose and goals, as well as descriptions of the screening tool and the constructs measured. (2) Screening Process. This section will explain how screening will be conducted in the school. Pertinent information could include the estimated length of the screening form, date and time of administration, and respondent. (3) Data Analysis and Use. This section will explain how the information collected from the screening survey will be used to improve school initiatives. This section should also detail the student privacy and data security protocol. (4) Contact Information. If a parent or caregiver has additional questions or concerns, contact information for the screening lead or team member moderating this process should be provided. (5) Return Date. Regardless of the parental consent method, a date highlighting when the form should be returned to permit or opt out of participation should be included to provide a window of time to respond. Consent forms should be written at an 8th-grade reading level to increase parent and caregiver comprehension. Forms should be distributed at the beginning of the year. The distribution method will vary, but it should be sent early enough to obtain consent and start screening 4-6 weeks after the start of the school year.
Student Assent. To ensure that students have the knowledge and information to make an informed decision, teachers, school leaders, or other appropriate personnel should describe the screening objective, type of questions, steps for data security, and the degree to which there is response anonymity to students. The specific way this information is described to a student will vary (e.g., read aloud to student; or written in user-friendly language). Moreover, depending on developmental stage and communication preference, students may indicate their assent for participation differently. If students do not assent to complete the screener, they should be allowed not to complete the form. A trusted adult could follow up with the student to understand why they are uncomfortable completing the screening form.
Considerations for English Learners. Rodrigues et al. (2016) highlight that families whose primary language is not English are less likely to receive screening or direct services than their English-speaking counterparts. To mitigate this, districts should provide the parental consent and screening forms in all languages represented in their community. Districts may also provide a screening informational session through interpretation to families whose primary language is not English.
Considerations for Students with Disabilities. Students with disabilities may require a different protocol for student assent and screening to ensure their rights and the validity of the collected data. Students who receive accommodations through their individualized education programs should receive those same accommodations during the assent and screening process (Breaux & Smith, 2023).
Step 8. Complete a Universal Screening Action Plan with each School.
Step 9. Introduce the Social, Emotional, and Behavioral Initiative to the School Community.
(Beginning of the first screening year and ongoing throughout the year and in subsequent years)
Step 10. Train teachers on how to collect universal screening data.
(On a professional development day before the first day of school)
- Informal screening occurs every day. Teachers may not feel qualified to complete a behavior screener due to the misconception that professional training as a mental health professional is required. On the contrary, teaching involves observing students, collecting data on those students, and using those data to inform instruction. The screening measure is simply a more systematic way to gather the observational data that teachers already collect every day (Gregory & Oetting, 2018; Kamphaus et al., 2007).
- Document behavioral concerns before screening. Another teacher-reported concern is that the information they enter into the screening tool will result in negative feedback from parents. To address this concern, teachers are trained to document all behavioral concerns about students starting at the beginning of the year and before completing the screening form. The behavior screener should not be the first documentation of significant behavioral concerns. Teachers should share early concerns about student behavior with those students’ families. Often, parent-teacher nights (e.g., curriculum nights, parent-teacher conferences, parent report card nights) are scheduled prior to the first screening and provide teachers with an opportunity to share early behavioral concerns.
- Objective ratings. Rating student behaviors and concerns can be complex and introduce bias. To combat bias, teachers should ensure they remain objective, have documentation to support their ratings of significant risk, and not discuss student ratings with their colleagues or the students’ previous teachers. Research supports that, overall, teachers can remain objective on screening measures (Barger et al., 2022); yet, given the overrepresentation of Black students and students with disabilities among recipients of teacher-initiated office discipline referrals, schools must vigilantly track racial bias on adult-completed screeners (Weathers, 2023). They may do this by analyzing school-wide screening data by race, disability, and gender to examine trends in overrepresentation or underrepresentation of risk (Jackson, 2021). If, within a school, students of one demographic are disproportionately at risk on a screener, the school team should collect additional data to understand and address the root causes of the disproportionality. In these situations, a Tier 1 response to the data could be to support teacher coaching on equitable classroom practices.
- Documenting internalizing behaviors. Externalizing behaviors are often easier to document because they tend to be more observable (Nikstat & Riemann, 2020). In contrast, students with internalizing concerns may not be as disruptive, but these behaviors still present a concern and should be noted. Teachers should receive training on how to observe, operationalize, and document internalizing concerns. Then, they can be coached on how to share information about early behavioral concerns with families.
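One concrete way to carry out the disproportionality check described under Objective ratings is to compute a risk ratio for each demographic group: the group’s at-risk rate divided by the at-risk rate among all other students, where a ratio well above 1.0 flags possible overrepresentation. A minimal sketch, assuming screening results are available as a list of records; the field names and data layout are hypothetical:

```python
from collections import defaultdict

def risk_ratios(records, group_field):
    """Compute a risk ratio per group from screening records.

    records: list of dicts such as {"race": "...", "at_risk": True}
    (field names hypothetical). The ratio compares each group's
    at-risk rate to the at-risk rate of all remaining students.
    """
    totals = defaultdict(int)
    flagged = defaultdict(int)
    for rec in records:
        group = rec[group_field]
        totals[group] += 1
        flagged[group] += int(rec["at_risk"])

    all_n = sum(totals.values())
    all_flagged = sum(flagged.values())
    ratios = {}
    for group, n in totals.items():
        rate = flagged[group] / n
        others_n = all_n - n
        others_rate = (all_flagged - flagged[group]) / others_n if others_n else 0.0
        ratios[group] = rate / others_rate if others_rate else float("inf")
    return ratios
```

A team might run this separately by race, disability status, and gender after each screening window; groups with elevated ratios would prompt the root-cause follow-up described above rather than any automatic conclusion.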
Step 11. Conduct the screening.
(Timing described in detail in Step 5)
The logistics of the day of the screening will depend on several factors, including the respondent and the time allocated for the screening. For the purposes of this article, we focus on students and teachers as possible respondents, although parents may also be respondents in some educational settings. (1) Respondent. If students are the respondent, they may complete the screening after the first month of school. For teacher-completed screeners, teachers should have 30 school days in the classroom with the student before completing the screening to allow ample time to learn their students’ behaviors and dispositions (Lane et al., 2011). (2) School schedule. For student-completed screeners, we recommend that all students complete the screener during the same class period (for example, a social studies or advisory class). For a teacher-completed screener, administrators should use a staff meeting or professional learning day to complete the measure. Scheduling the teacher-completed screening during a time already allocated for another meeting or training may increase teacher buy-in because teachers would not be asked to complete the screener during a planning period or their personal time. For student-completed screeners at the secondary level, the school should select the period of the day during which most students are in the building to ensure the screening is as universal as possible. The day, subject, and period will be determined by the administrator during summer planning and should be recorded in the universal screening action plan.
Step 12. Score the screening forms.
(Immediately after screening)
After screening is complete, the data team indicated on the universal screening action plan will score the forms. It is important to promptly score and return screening data to the school to facilitate rapid data-based decision-making (Schildkamp, 2019). This quick turnaround will allow the school to use the data while they are still relevant to the students’ needs. In addition, it is critical to immediately follow up with the students who score in the highest risk range.
Step 13. Review school-, grade-, gender-, race-, and classroom-level data and decide how to improve Tier 1 supports.
(Within 1 week after each screening time point)
Step 14. Review student-level data and decide how to provide Tier 2 or Tier 3 support.
(Within 1 week after each screening time point)
When reviewing student-level data, school teams should prioritize follow-up with two groups: students who scored in the at-risk range across multiple subscales and students who scored in the highest-risk range on the total score or within a single subscale (Roth & Erbacher, 2022). More information about each student will be collected to confirm the student’s level of risk. Confirming that a student is at risk requires data review and clinical decision-making (Sheldrick & Garfinkel, 2017). Here are suggested steps for confirming the level of risk. (1) Engage the Teacher. If the teacher was the respondent on the screener, the team should meet with the teacher to review their responses for the students who scored in the high-risk range. (2) Review Student File. The team should confirm the district’s consent process for accessing the student’s file and then review the file for additional indicators of concern (e.g., high ODRs, high absences, frequent visits to the nurse, low grades). The team also should review current interventions the student may be receiving. Many students with screening scores in the high-risk range on the teacher- or student-completed screeners may already receive Tier 2 or 3 services. (3) Meet with Students. After evaluating the student’s information, a team member should meet with the student to confirm the student’s level of risk. The trusted adult who meets with the student may ask the student about their responses on the screener and about any existing social or emotional concerns. (4) Team Discussion. The multidisciplinary team may then review all the data (e.g., screening scores, file review, a summary of the discussion with the teacher, a summary of the discussion with the student) to determine whether a student needs Tier 2 or 3 support. (5) Parent Consent. Students recommended for Tier 2 or 3 support should have parental consent prior to receiving those services.
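The two prioritization rules above can be sketched as a simple filter over scored screening data. The cut scores and data layout below are placeholders for illustration; in practice, the publisher’s norms for the chosen measure define the at-risk and high-risk ranges:

```python
def prioritize_followup(students, at_risk_cut=60, high_risk_cut=70):
    """Flag students for follow-up per the two priority rules.

    students: {student_id: {"total": score, "subscales": {name: score}}}
    (layout hypothetical). Rule 1: at-risk scores on two or more
    subscales. Rule 2: a highest-risk score on the total or on any
    single subscale. Cut scores here are placeholders, not norms.
    """
    flagged = {}
    for sid, scores in students.items():
        subs = list(scores["subscales"].values())
        multi_at_risk = sum(v >= at_risk_cut for v in subs) >= 2
        highest_risk = (scores["total"] >= high_risk_cut
                        or any(v >= high_risk_cut for v in subs))
        if multi_at_risk or highest_risk:
            flagged[sid] = {"multi_at_risk": multi_at_risk,
                            "highest_risk": highest_risk}
    return flagged
```

The output only nominates students for the confirmation steps (1)–(5) above; it does not replace the file review, student meeting, and team discussion that confirm the level of risk.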
- Universal Behavior Screening Data Review Excel Data
- Data-based Decision Making with Universal Screening
- Checklist for Following up with Students
- Checklist for Student Meetings Excel
- List of Targeted Screeners
- Parent Screening Results Letter
- Universal Behavior Screener Student Interview Form (student self-report screener)
- Universal Screening Data Review Questions Example
- Universal Screening Data Review Questions
Step 15. Complete the second screening.
(Winter/Spring)
The last step is to complete a second (and possibly third) screening later in the academic year. If a district completes two screenings per academic year, it should complete the second screening 4-6 weeks after the start of the second semester. If a district determines that three screening time points will provide the data it needs for data-based decision-making, then the second screening would occur mid-year and the third near the end of the academic year (Yu et al., 2022). This timeline will vary by the district’s calendar.