
Evaluating Edtech Tools

Methods districts can use to decide whether an edtech tool is worth the investment

Overview

As districts approach their budgeting season each year, they spend significant time evaluating their digital investments in devices and platforms for students and teachers. Especially for paid platforms, districts typically have to decide whether to renew a subscription, purchase a new product, upgrade from the free tier of a tool to its paid version (common with "freemium" products), or offboard a product that may not be meeting the district's needs. Ultimately, district leaders are trying to understand the efficacy of a product within their own unique context and on behalf of the learners they are serving.

When evaluating whether an edtech tool is meeting its intended goals, district leaders need to weigh a range of considerations. These span technical considerations, such as integration with current platforms (e.g., a learning management system [LMS]) and compatibility with existing district devices; functional considerations, such as alignment with students' learning goals and adaptability to varying student skill levels; and equity considerations, such as accessibility features and support for multilingual learners, among many others.

Many districts ask the following questions in their evaluation process:

  • Engagement: Are all of the intended users engaging with the tool?

  • Adoption: Is the tool being used as intended or prescribed?

  • Impact: Is the tool producing or correlated with targeted outcomes?

  • Satisfaction: Do users find the tool valuable in meeting their needs?

There are a number of methods that districts can use to collect evaluation data and information about a tool. Each method has its benefits and its own specific considerations, so districts should use a combination of the following methods to paint a complete picture of a tool's use and impact.

  • Self-Reported Experience Data: Users can self-report their level of engagement and experience with a tool and/or their perception of its impact. This data can be collected through surveys, focus groups, and other forms of one-way or two-way communication.

  • Observation Data: Information about how a tool is being used can be collected by directly observing users in context. This data provides consistent, firsthand information about actual use cases (e.g., from classroom to classroom or from school to school). This can also be embedded into other ongoing observation and coaching processes.

  • Tool-Specific Reports: Many technology tools will automatically generate reports related to their usage (e.g., who has used the tool, what they have used the tool for, and for how long). Some may also provide direct outcome-related reports such as formative assessments, student engagement, or behavior reports.

  • Goal-Related Outcome Reports: Depending on the identified need or goal for the tool, there are likely related reports that can be pulled to correlate with tool usage (e.g., attendance, interim assessment, family contact logs). These reports, coupled with additional data points to add context, can more closely link the tool's use with the need or goal it was selected to address (see the sketch after this list).
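
To make the last two methods concrete, here is a minimal sketch, in Python with pandas, of how a district analyst might join a tool-specific usage export with a goal-related outcome report. The file names and column names (student_id, minutes_used, interim_score) are hypothetical placeholders, not fields from any specific product.

```python
import pandas as pd

# Hypothetical usage export from the tool: one row per student.
usage = pd.read_csv("tool_usage_report.csv")      # columns: student_id, minutes_used
# Hypothetical district outcome report tied to the tool's goal.
outcomes = pd.read_csv("interim_assessment.csv")  # columns: student_id, interim_score

# Join the two reports on a shared student identifier.
merged = usage.merge(outcomes, on="student_id", how="inner")

# Simple correlation between usage and the targeted outcome.
# Correlation is not causation; treat it as one signal among many.
print(merged["minutes_used"].corr(merged["interim_score"]))

# Compare average outcomes across usage tiers for a more readable summary.
merged["usage_tier"] = pd.qcut(merged["minutes_used"], q=3,
                               labels=["low", "medium", "high"])
print(merged.groupby("usage_tier", observed=True)["interim_score"].mean())
```

Comparing outcomes across usage tiers, rather than relying on a single correlation number, makes it easier to see whether heavier use is actually associated with the targeted outcome.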

The EdTech Systems Guide, from the Massachusetts Department of Elementary and Secondary Education’s Office of Education Technology, has a wealth of resources around edtech evaluation, including articulating the “why” and “how,” identifying and managing resources, conducting data analysis, and more. The guide offers free resources and an accompanying workbook to help guide districts through the full process.


Strategy Resources


Edtech Stakeholder Evaluation Survey

This survey can be used to collect feedback from school stakeholders (e.g., teachers, students, staff...).

Edtech Grading Rubric

This grading rubric (modified from LearnPlatform's EdTech Grading Rubric) helps collect feedback from educators around...


Equity Focus

For all data sources, districts should disaggregate their data to examine how a tool is being used across different contexts and with different populations (a brief analysis sketch appears at the end of this section).

For example, districts can consider the following questions:

  • Are students from low-income backgrounds engaging at the same level with an edtech tool as their higher-income peers?

  • Are schools that serve higher percentages of students of color or students learning English experiencing noticeably different results?

When clear discrepancies arise, it is critical to develop a deeper understanding of the experiences of the specific students, educators, or families who are not demonstrating the same outcomes. Explore the 6 Steps to Equitable Data Analysis from Edutopia for ideas on how to achieve this.
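
As one illustration of disaggregation in practice, the sketch below (again in Python with pandas) breaks engagement data out by subgroup and by school so that discrepancies like the ones above become visible. The file name and demographic fields (economically_disadvantaged, el_status) are hypothetical placeholders; actual field names will depend on the district's student information system.

```python
import pandas as pd

# Hypothetical export combining tool usage with demographic fields.
# columns: student_id, school, minutes_used,
#          economically_disadvantaged, el_status
df = pd.read_csv("tool_usage_with_demographics.csv")

# Engagement for economically disadvantaged students vs. their peers.
print(df.groupby("economically_disadvantaged")["minutes_used"].describe())

# Average usage by school and English learner status, to surface schools
# where engagement diverges noticeably for students learning English.
by_school = (
    df.groupby(["school", "el_status"])["minutes_used"]
      .mean()
      .unstack("el_status")
)
print(by_school)
```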