SAfETy

Addressing the gap in school climate assessment tools.

Background

Creating a positive and supportive school climate has been linked to improved perceptions of school safety, better student mental health and academic performance, and a reduction in disciplinary referrals.

However, efforts to assess school climate have proved challenging. Many existing evidence-based surveys used to assess climate are cumbersome and time consuming, and the resulting data is often difficult for schools to use. Data collected during school walkthroughs are rarely recorded in ways that allow schools to assess and track climate issues.

“School climate is the product of a school’s attention to fostering safety, promoting a supportive academic, disciplinary, and physical environment, and encouraging and maintaining respectful, trusting, and caring relationships throughout the school community.”

- U.S. Department of Education

Project overview

Dr. Sarah Lindstrom Johnson’s research focuses on the design and implementation of preventive interventions in schools, with a particular interest in using technology to bring programs and systems change to schools. She has developed several measures used to assess school climate domains and components, and initial testing of a shortened assessment has shown a positive response from school administrators, who were able to assess their schools and gather useful data. That data can then be used to create action plans to address school climate concerns.

Using Dr. Lindstrom Johnson’s expertise and 3C Institute’s technology solutions, we will create SAfETy (School Assessment for Environmental Typology), an interactive software system that will allow for the collection of school observation data from multiple observers, aggregation of that data, and creation of reports that include action plans and recommendations for improving and monitoring school climate concerns.

The goal for Phase 1 of this project was to create a software prototype of the observational data entry system and mockups of the data displays and data-driven recommendations for improving school climate. User-centered testing will then be conducted to determine the feasibility of the proposed SAfETy product.

My role

I worked with the project team to develop wireframes depicting the flow of both the mobile data entry system and the web-based interactive toolkit, including data displays and generated reports. We then fleshed these out into high-fidelity mockups of the web-based software, with data entries and scores that would be accurate to the live site in order to capture a genuine experience for the users who would be testing the product. The mockups created for the mobile data entry system were used to build a prototype of the system using 3C’s proprietary survey system, Quest.

September 2020 - ongoing

Role

UX/UI

Visual Identity

Tools

Adobe XD

Illustrator

Team

Dr. Sarah Lindstrom Johnson – Co-Principal Investigator

Dr. Melissa DeRosier – Principal Investigator

Roy Goulet – Chief Technology Officer

Chris Hehman – Solution Architect

Kelly Kocher – Project Manager

Kim Pifer – Content/Editorial

Kelly Fish – Design

Dr. Deb Childress – Research

Thea Cox – Research

Identifying users

Users were initially identified in the research strategy as school staff, e.g., administrators, teachers, PBIS (Positive Behavior Interventions and Supports) team members, and other school personnel. Taking this information, we worked to identify who would use each tool that makes up the SAfETy system: the mobile data entry system and the web-based interactive toolkit. Each would come with its own context of use and constraints.

“Teacher” flow

The project team referred to the mobile data entry system as the “Teacher” flow. Its users would most often be school staff, who would either record observations assigned to them by school administration or record unassigned observations when they notice issues.

Observers using the mobile data system would be on their feet, moving around a school, with limited time and a potentially spotty internet connection. With this in mind, if the project were funded for Phase 2, the eventual vision for the data entry system would be an installed mobile application. For the Phase 1 prototype, however, we used the web-based Quest system.

“Admin” flow

The project team referred to the web-based interactive toolkit as the “Admin” flow. The anticipated users for this tool are school administrators and/or 1-2 identified personnel from the school team. This tool had several anticipated features to consider:

  • Assigning observations to system users
  • Rostering new system users
  • A to-do list for addressing issues
  • Data comparisons across multiple reporters
  • Reviewing data collection changes over time
  • Recommendations and action plans based on collected data


Due to the complex nature of the tool, the team believed users would be unlikely to access it primarily on a mobile device, so we focused on a desktop experience for the wireframes.

Developing the “Teacher” flow

Documenting safety concerns

Using content from the school climate administrator walkthrough tool developed by Dr. Lindstrom Johnson, I created wireframes using 3C’s survey system, Quest, as a framework for how the data collection experience would flow. Ideally, the application would support a variety of data collection strategies: assigned/intentional vs. “trickle-in” observations, observations from a variety of individuals, walkthroughs of the whole school completed all at once or in strategic sections throughout the year, and the ability to add section-specific data related to issues as needed.

Based on our discussions, there were several key features I needed to address in this flow:

  • An observer will open the survey and see a clear “Add Observation” button to initiate the survey.
  • Previous observations will populate on the landing page and will include key information such as area type, date recorded, and the location title given by the observer.
  • Upon entering the survey, the observer will be asked to choose the type of location from a list of defined spaces (populated from the existing paper version of the admin walkthrough).
  • The survey will have visibility conditions: based on the location type, the observer will be presented with a series of items associated with that location (see the sketch after this list). For each item, the observer can also mark its priority and enter additional specific information.
  • Observers will be able to take a limited set of photos for each observation to provide more specific data. Observers will also be warned to avoid taking photos with people or recognizable faces in them to protect students and staff.
  • Survey items will be organized into manageable chunks of 1–2 per page rather than one long page containing all of the items.
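
To make the visibility-condition idea concrete, here is a minimal TypeScript sketch of how location types could map to their associated items. The location types, item prompts, and field names are hypothetical placeholders for illustration, not the actual walkthrough content or Quest’s implementation.

```typescript
// Hypothetical sketch of visibility conditions: each location type exposes
// only the items associated with it. Names and prompts are illustrative.
type LocationType = "Hallway" | "Classroom" | "Restroom" | "Exterior";

interface ObservationItem {
  id: string;
  prompt: string;
  highPriority?: boolean; // marked by the observer
  notes?: string;         // additional specific information about the item
  photos?: string[];      // limited set of photo references
}

const itemsByLocation: Record<LocationType, ObservationItem[]> = {
  Hallway:   [{ id: "h1", prompt: "Lighting is adequate" },
              { id: "h2", prompt: "Signage is visible and current" }],
  Classroom: [{ id: "c1", prompt: "Rules and expectations are posted" }],
  Restroom:  [{ id: "r1", prompt: "Facilities are clean and functional" }],
  Exterior:  [{ id: "e1", prompt: "Entrances are monitored or secured" }],
};

// Returns only the items relevant to the observer's selected location.
function itemsFor(location: LocationType): ObservationItem[] {
  return itemsByLocation[location];
}
```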

Known vs unknown reporters

After many rounds of wireframes, the team began discussing reporter types beyond the assigned observers from the school team. Specifically, a consultant suggested the potential to gather observations from students.

Questions regarding having different reporter types included:

  • How would student reporters be rostered and assigned tasks?
  • Should they all be assigned or should the system allow anonymous reports?
  • How would students receive guidance for how to conduct observations?


After internal discussion and input from consultants, we determined the reporter types for initial testing to be:

  • Staff
  • Admin staff
  • Student
  • Other (with an open-ended text field for custom entry)
  • Anonymous/Decline to answer


The reporter information was determined to be less important than the aggregated data from the observations. We also raised the concern that identifying students as reporters could negatively affect their interpersonal relationships with peers.

The team narrowed the data collection experience to two flows: Known/Assigned and Unknown/Anonymous. Known users would be school staff who were rostered into the system and could be assigned observations; their assessments would include the full suite of items from the walkthrough. Anonymous users would be able to indicate their role (or decline to answer) and would receive a simplified survey of only three items, assessing the domains of safety, environment, and engagement, along with open-ended text fields to allow for further input.
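
As a rough sketch of how the two flows could differ in survey composition, the snippet below serves the full item set to known/assigned reporters and a fixed three-item survey to anonymous reporters. The item identifiers and field names are assumptions for illustration, not the production data model.

```typescript
// Hypothetical sketch: known/assigned reporters get the full walkthrough
// item set; anonymous reporters get three domain-level items plus open text.
interface SurveyDefinition {
  items: string[];             // item ids to present
  openEndedFollowUps: boolean; // whether open-ended text fields are included
}

const ANONYMOUS_ITEMS = ["safety-overall", "environment-overall", "engagement-overall"];

function buildSurvey(isKnownReporter: boolean, fullItemSet: string[]): SurveyDefinition {
  return isKnownReporter
    ? { items: fullItemSet, openEndedFollowUps: true }
    : { items: ANONYMOUS_ITEMS, openEndedFollowUps: true };
}
```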

Regarding how reporter types outside of school staff would receive guidance for performing observations, the team will pose those questions during the Phase 1 user testing.

Known flow
Anonymous flow

Developing the “Admin” Flow

Accessing observation data and climate reports

My initial, internally developed wireframes for the “Admin” flow were basic and needed more input from the external team. The goal of this first round of wireframes was to begin to conceptualize the observation data tab and a rough vision of how and where that data would be aggregated into reports.

Setting to-do priorities

After the observation data is recorded by a reporter, how is that information used to proactively address issues in the school? In the admin flow, we determined the need for a “To-do” tab that would help administrators gather details from the reported issues and make arrangements to have them addressed.

How would this to-do list be populated? Automatically adding every area of concern from every observation could lead to an overwhelming and cumbersome list. Instead, administrators should be able to go into each individual observation report, review high-priority items, and determine which issues should be prioritized and added to their to-do tab via a toggle for each issue.

In the teacher flow mockups, reporters have the ability to mark whether an issue is high priority. Those flags would be noted in the observation reports available via the admin flow. But what if some items that aren’t marked as high priority by a reporter should still be considered high priority? The team concluded that the system should also be able to tag issues as high priority based on predetermined parameters, even if the reporter didn’t note the item as such. “High priority indicated by reporter” and “High priority indicated by system” were added to the observation report to give administrators even more detailed information about each observation.
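
A minimal sketch of how the two priority flags could be derived is shown below. The “predetermined parameters” are represented here as a simple set of always-high-priority item ids; that rule and the item names are purely illustrative, since the actual parameters were not defined in the Phase 1 mockups.

```typescript
// Hypothetical sketch of reporter- vs. system-indicated priority.
interface ReportedItem {
  itemId: string;
  reporterMarkedHighPriority: boolean;
}

// Placeholder for the predetermined parameters: items the system always
// treats as high priority regardless of the reporter's own flag.
const SYSTEM_HIGH_PRIORITY_ITEMS = new Set(["broken-exterior-door", "exposed-wiring"]);

function priorityFlags(item: ReportedItem) {
  return {
    highPriorityByReporter: item.reporterMarkedHighPriority,
    highPriorityBySystem: SYSTEM_HIGH_PRIORITY_ITEMS.has(item.itemId),
  };
}
```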

Rostering users

Initially, the team thought there should be a “Reporters” tab where admins would be able to add system users, monitor and edit their assigned observations, and assign new observations.

However, over the course of our process the team decided that system users would be rostered via the Manage tab, a feature that had already been built through 3C’s previous work on their proprietary software, so there was no need for mockups of this flow. The Reporters tab was then dropped from our mockups.

Assigning observations

In the teacher flow, we showed how reporters could be assigned observations. We knew those observations would be assigned to system users by administration via the admin flow, so my next priority for the mockups was to conceptualize how that would work.

These assigned observations needed the flexibility to be as broad or specific as the admin staff required, so we included settings for the admin to determine details such as the following (a rough data-model sketch follows the list):

  • Timeframe
  • Text area for a descriptive name
  • Locations (specific areas or a complete school observation)
  • Which reporter(s) would be assigned
  • The ability to allow anonymous reporters
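
The sketch below shows what an observation assignment record might capture, mirroring the settings above. The field names and types are assumptions for illustration, not the actual system schema.

```typescript
// Hypothetical data shape for an observation assignment.
interface ObservationAssignment {
  name: string;                          // descriptive name entered by the admin
  startDate: string;                     // timeframe start (ISO date)
  endDate: string;                       // timeframe end (ISO date)
  locations: string[] | "whole-school";  // specific areas or a complete school observation
  assignedReporterIds: string[];         // rostered reporters assigned to this observation
  allowAnonymousReporters: boolean;      // whether anonymous reports are accepted
}
```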


The admin would be able to monitor these assignments through the “Observation Assignments” tab, viewing their overall status or drilling down into a specific assignment to see further details such as overall progress and per-reporter progress, as well as to adjust the timeframe.

Filtering observations

Considering that a school may accumulate a large number of reports, and that admins will want the ability to drill down to specific time frames and locations throughout the year, we revisited the observation report screen with filters including date range, reporter type, location type, and priority.
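
Functionally, this filtering can be thought of as a simple predicate applied to the report list, as in the sketch below; the filter and field names follow the list above but are otherwise assumptions.

```typescript
// Hypothetical observation-report filter; names are illustrative only.
interface ObservationReport {
  date: Date;
  reporterType: string;  // e.g. "Staff", "Student", "Anonymous"
  locationType: string;  // e.g. "Hallway", "Cafeteria"
  highPriority: boolean;
}

interface ReportFilters {
  from?: Date;
  to?: Date;
  reporterType?: string;
  locationType?: string;
  highPriorityOnly?: boolean;
}

function filterReports(reports: ObservationReport[], f: ReportFilters): ObservationReport[] {
  return reports.filter(r =>
    (!f.from || r.date.getTime() >= f.from.getTime()) &&
    (!f.to || r.date.getTime() <= f.to.getTime()) &&
    (!f.reporterType || r.reporterType === f.reporterType) &&
    (!f.locationType || r.locationType === f.locationType) &&
    (!f.highPriorityOnly || r.highPriority)
  );
}
```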

Another feature that was brought forward by the team was a “Why is this important” accordion below each item of concern. This would provide users with further details to explain why certain physical and social observations are important to address as well as resources to help.

Generating a climate report

The last major piece of the admin flow was the climate report tab. The flow would begin with an overall view of the three main domains: Engagement, Safety, and Environment. A user could then click into one of these domains to view a detailed report including change over time, items of concern specific to that domain, and recommendations for how to address concerns.

My initial mockups of this tab included a couple of directions for how construct scores under each domain (Engagement, Safety, Environment) could be visualized.

My next iteration of the climate score expanded on the different views for the climate report.

The report would start with a high-level view showing the overarching domain and construct scores. This view would also provide insight into the anonymous feedback that was gathered. Because the number of components differs across the three domains, we wanted a visual that would weight each domain evenly. After some exploration, I landed on a radar graph to display the top-level score. This would clarify a school’s overall climate score by visualizing the scores of the three domains that contribute to it.

Once a user clicked “View domain scores” they would see the second-level view, a more detailed display of each domain and its constructs. A third, more drilled-down view, accessed via “View detailed domain report”, would show the score’s change over time and strategy recommendations for improving that domain.

This iteration brought further refinement to each level of the climate report.

After input from the team, the scores for the Overall Climate Score and the Overall Area (or domain) Scores would range from 0 to 100, with defined ranges indicating poor, average, and good/excellent, visualized with a speedometer graphic. The team also wanted less emphasis on the anonymous feedback, so I moved that information below the top-level and area score cards.
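
The banding behind the speedometer graphic reduces to mapping a 0–100 score onto labeled ranges, as in the sketch below. The cut points shown are placeholders; the actual ranges were defined by the research team.

```typescript
// Hypothetical score banding for the 0-100 climate scores.
type ScoreBand = "Poor" | "Average" | "Good/Excellent";

function scoreBand(score: number): ScoreBand {
  if (score < 40) return "Poor";    // placeholder threshold
  if (score < 70) return "Average"; // placeholder threshold
  return "Good/Excellent";
}
```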

For the second-level “Area scores” page, we expanded on the construct scores by indicating the impact (low, medium, high) of each individual score on the overall domain score. This allows users to see how different items influence the score in different ways; some scores matter more than others to the overall score. This level also shows the relevant areas where these components are observed.
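
One way to express the low/medium/high impact idea is as weights on the construct scores when rolling them up into a domain score, as sketched below. The weight values and construct structure are purely illustrative, not the project’s actual scoring model.

```typescript
// Hypothetical weighted roll-up of construct scores into a domain score.
type Impact = "low" | "medium" | "high";

interface ConstructScore {
  name: string;
  score: number;  // 0-100
  impact: Impact; // how strongly it contributes to the domain score
}

const IMPACT_WEIGHT: Record<Impact, number> = { low: 1, medium: 2, high: 3 };

function domainScore(constructs: ConstructScore[]): number {
  const totalWeight = constructs.reduce((sum, c) => sum + IMPACT_WEIGHT[c.impact], 0);
  const weightedSum = constructs.reduce((sum, c) => sum + c.score * IMPACT_WEIGHT[c.impact], 0);
  return totalWeight > 0 ? weightedSum / totalWeight : 0;
}
```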

For the third-level detailed domain report, we added a graph showing data gathered from observation reports, indicating which items were high priority versus satisfactory.

My next and final iterations of the climate report mockups encompassed instructional language and rules for the climate report. A page was added to the flow indicating that a certain amount of data is required in order to generate a climate report; this would then lead the user to create an observation assignment to gather the needed data. A date range picker was also added to the top level of the climate report so the user could generate a report based on data from a specific time period.
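
The gating rule can be sketched as a simple check on the number of observations in the selected date range, as below. The minimum count used here is a placeholder, since the actual threshold was not specified in the mockups.

```typescript
// Hypothetical check run before generating a climate report.
const MIN_OBSERVATIONS_FOR_REPORT = 10; // placeholder threshold

function canGenerateReport(observationDates: Date[], from: Date, to: Date): boolean {
  const inRange = observationDates.filter(
    d => d.getTime() >= from.getTime() && d.getTime() <= to.getTime()
  ).length;
  return inRange >= MIN_OBSERVATIONS_FOR_REPORT;
}
```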

Language was provided to me to give the mockups more framing for each area of the report. Tooltip language was also fleshed out to provide users with detailed information regarding the domain components.

Current status

The project team is currently conducting Phase 1 user testing to determine the feasibility of the proposed SAfETy product.