

Measurement in Action: Using existing data and staff to measure personalized learning


Fauteux, Mohammed, Rychel

Leadership Public Schools, TLA, Distinctive Schools

Over the last school year, The Learning Accelerator (TLA) embarked on two researcher-practitioner partnerships to demonstrate how our Measurement Agenda, and specifically our District Guide to Blended Learning Measurement, could be put into action. TLA collaborated with Distinctive Schools in Chicago, IL, and Leadership Public Schools in the Bay Area, CA to measure their personalized learning initiatives in order to determine what and how to scale to other schools. Both of these demonstration projects used minimal time, money, and effort, with both TLA and the school systems capitalizing on existing resources to answer the most pressing questions in the most rigorous way possible.

We’ve identified five key takeaways from these demonstration projects that we hope will help other school systems realize that much can be done (at relatively little cost) to conduct sound measurement and make data-informed decisions about the teaching and learning experiences we offer.

1. One size doesn’t fit all

No matter how similar or aligned one school system’s goals may appear to be with another’s, each system’s measurement needs are unique. In our demonstration projects, both Distinctive Schools and Leadership Public Schools wanted to understand more about their personalized learning initiatives in order to thoughtfully and successfully scale to other schools, yet each project involved different activities and deliverables and required different levels of contribution from TLA.

For this project, Distinctive Schools created a single document aligning their personalized learning framework, their personalized learning observation walkthrough, and their system-level observation tool. Creating this alignment document shifted their thinking about the walkthrough tool, from a rubric toward a continuum, and evolved their practice toward using the tool as part of a reflective process. The major output from Leadership Public Schools was a logic model (or theory of change); the activities identified in this logic model were used to develop an observation checklist for their personalized learning implementation.

In short, apparently similar goals were achieved in different ways. Both systems used these outputs to refine rubrics that they will use to determine how personalized learning is being implemented in their classrooms, and to guide analyses of student and educator outputs and near-term outcomes that will inform and improve both implementation and efforts to scale to new schools.

2. The data (likely) already exist

One of the most eye-opening lessons for TLA and for both school systems was that no new data needed to be collected for the demonstration projects. In other words, systems often have more data about implementation and success than they realize. The very idea of collecting more, new, or different data is often daunting for educators and systems alike, whatever their capacity for collecting and analyzing data. We were all pleasantly surprised to find that vast amounts of data were already being collected and stored by each system, and that those data could be accessed relatively easily to inform programmatic decisions.

Examples of existing data used in these demonstration projects:

  • Teaching and learning practices
  • Teacher outcomes: teacher-reported progress on specific practices identified in the system’s PL Framework (visualized as a "heat map"; see Figure 1 below)
  • Student outcomes: student attendance

Figure 1: Example of teacher-reported progress on specific PL practices visualized as a heat map
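For readers who want to reproduce this kind of view, the heat map above is essentially a teacher-by-practice grid of self-ratings. Below is a minimal sketch of assembling that grid from survey records; all teacher names, practices, and ratings are hypothetical illustrations, not data from either school system.

```python
# Each record: (teacher, PL practice from the framework, self-rating 1-4).
# These values are invented for illustration only.
responses = [
    ("Teacher A", "Student goal-setting", 3),
    ("Teacher A", "Flexible grouping", 2),
    ("Teacher B", "Student goal-setting", 4),
    ("Teacher B", "Flexible grouping", 1),
]

# One row per teacher, one column per practice.
teachers = sorted({t for t, _, _ in responses})
practices = sorted({p for _, p, _ in responses})

# Fill each cell with that teacher's rating for that practice.
grid = [
    [next(r for t, p, r in responses if t == tch and p == prc)
     for prc in practices]
    for tch in teachers
]

for tch, row in zip(teachers, grid):
    print(tch, row)
```

Once the grid exists, any spreadsheet's conditional formatting, or a plotting library, can render it as a color-coded heat map like the one shown in Figure 1.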


3. The data are locally relevant

School systems tend to have practical questions about their own (localized) implementation and their own educators and students. When embarking on measurement activities, they care little about generalizability beyond their population, and even less about statistical significance. Their interest lies in making informed decisions about their own programs, and in reducing the risk of failing to support students or of using resources ineffectively. What we learned from these projects is that systems’ questions often match the data and information they already have; when they don’t, systems often have the capacity to gather the right data.

4. It doesn’t take a village...

Measuring blended learning initiatives can be accomplished in a lean, low-cost way. During our initial project meetings, each school system was concerned that it did not have enough resources (time, human capacity, infrastructure) to complete measurement activities rigorously. In fact, we were able to identify existing data collection, data review, and program design procedures that measurement activities could build on, so that little extra time was required of either the system or TLA. We used project plans and meeting agendas that established an explicit, shared understanding of roles, responsibilities, and expectations. In all, we spent approximately five hours in online working meetings throughout the year for each project. Individually, each school system and TLA added a similar amount of time finding and sharing data, conducting analyses, and reviewing each other’s work. We learned that a small group of invested “champions” can move measurement along and answer pressing, somewhat complex questions when paired with the right supports.

5. ...but it does take a team

Perhaps the most obvious lesson we learned is that school systems do have a legitimate need for targeted, specific capacity when it comes to measurement. Systems can benefit greatly when research or other support organizations make evidence generation and data-informed decision-making accessible to them in “light touch” ways. School systems want the thought partnership and data analysis that researchers or measurement professionals can directly provide. In addition, researchers and others can share tools and resources for measurement, or offer a different perspective on measurement activities. In our demonstration projects, both systems had the data, but not the analytic capacity, to pilot rapid-cycle evaluations of their personalized learning initiatives (for both educator and student outcomes). TLA was able to do some preliminary analysis for Leadership Public Schools, and pointed both Leadership Public Schools and Distinctive Schools to the Ed Tech Rapid Cycle Evaluation Coach. External partners like NWEA and Lea(R)n can also help supplement system capacity by conducting more rigorous analysis and sometimes even providing comparison data. School systems benefit tremendously from the availability of timely, inexpensive supports such as these.

TLA engaged with Leadership Public Schools and Distinctive Schools to demonstrate the power and value of targeted, cost-effective support to school systems measuring their blended and personalized learning initiatives. What we learned through the process is that the task is not only feasible, but meaningful and invaluable both to systems and to the sector, growing our understanding of effective blended learning implementation for the benefit of all students.


Mike Fauteux is the Director of Innovation at Leadership Public Schools, which includes LPS Richmond. There he manages personalized learning innovations and partnership development.

Saro Mohammed is a Partner at The Learning Accelerator.

Amanda Rychel is Executive Director of Art In Motion for Distinctive Schools. Previously she was Vice President of Strategy at Distinctive Schools, which includes CICS West Belden.