While students have always come to school with varying levels of experience and academic mastery, multiple years of learning disruption have dramatically increased the challenge of understanding and responding to learner progress. Taking action to address gaps and heterogeneity requires a clear picture of where students are and where they need to go next. That said, leaders of schools and systems are wondering how they can take an asset-based approach to measuring student learning – identifying areas of progress and the associated enabling systems and structures – rather than a deficit stance that focuses primarily on gaps or loss.
This insight addresses that question, offering a concrete process and methodology for investigating learner growth during periods of disrupted learning.
Reframing Thinking: From System-Level Learning LOSS to Individual Learning GAINS
From the outset of the pandemic, conversations about ‘learning loss’ emerged as a key concern at the national level. Researchers predicted group-level performance gaps on standardized assessments, highlighting the deleterious effects of the pandemic. However, negative or reduced rates of learner performance indicate slower achievement and growth – not a true loss of knowledge. As such, the term ‘learning loss’ can be a misnomer. Instead, at The Learning Accelerator (TLA), we have adopted the term ‘unfinished learning’ to better center and reflect learners’ experiences and acknowledge the realities of pandemic conditions.
Unfinished learning describes the notion that learning has not yet been completed, that students have not yet had access to material that should have been presented, and that they have not yet demonstrated the intended level of mastery. Instead of focusing on what students have lost or the gaps that have emerged, unfinished learning reflects what may be possible and places the onus for improvement on the system itself (The Education Trust).
By focusing on unfinished learning, districts can examine student growth over time, allowing them to better identify which students (or groups of students) have made the most progress. From there, the goal is to identify and understand the factors that contributed to student growth so that they can be extended to all students. But how might you do this?
Measuring Unfinished Learning in YOUR System
TLA had the opportunity to work with Lindsay Unified School District (LUSD) to study unfinished learning at the end of the 2021-22 School Year. Through this collaborative partnership, we learned how to take on the challenge of understanding patterns of growth within and between groups of students, as well as identifying systems of support. We’ve captured this approach in a five-step process that you can use – regardless of your current level of research experience – to better understand the degree to which learning remains ‘unfinished’ in your own context.

Decide If You Are Ready to Study Unfinished Learning
Before jumping in, make sure that you have the necessary data to measure growth in your context. Research or data teams can use the flowchart below to self-assess and determine whether it is possible to engage in a larger study of unfinished learning or if a different approach might be more appropriate.

To measure unfinished learning, you will need:
Assessment data captured multiple times during each year (i.e., formative, benchmark data rather than a single summative score such as a state assessment);
Data that can be easily compared over time; and
The ability to align assessment data with the correct student groups.
For example, LUSD used both the Scholastic Reading Inventory (SRI) and Curriculum Associates’ iReady platform to examine learners’ reading growth using the Lexile Framework. While the assessments are different, the measures are comparable. However, the expected annual growth in Lexile level is higher in elementary than in middle or upper grades because younger students naturally make more progress. To make comparisons over time, LUSD therefore grouped students according to their expected level of progress – for example: elementary (3-5), middle (6-8), and secondary (9-12).
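This kind of grade-band grouping is easy to operationalize. The sketch below, in Python, follows the band boundaries from the LUSD example; the `grade_band` helper and the student IDs are hypothetical, and you would adjust the bands to your own context:

```python
# Hypothetical sketch: map each student's grade level to a comparison band
# so that growth is judged against the expected progress for that band,
# not a single district-wide benchmark. Bands follow the LUSD example.
def grade_band(grade: int) -> str:
    """Return the comparison band for a numeric grade level."""
    if 3 <= grade <= 5:
        return "elementary"
    if 6 <= grade <= 8:
        return "middle"
    if 9 <= grade <= 12:
        return "secondary"
    return "other"  # K-2 or out-of-range grades need separate handling

# Illustrative (student ID, grade) pairs
students = [("A102", 4), ("B215", 7), ("C330", 11)]
bands = {sid: grade_band(g) for sid, g in students}
```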
You might not yet have access to the data you need. In the meantime, you can begin collecting data on student progress using the strategies below.
Step 1: Develop Your Research Questions
Developing research questions before you begin will help to guide both your investigation and how you think about your results. We used the questions below to guide our work with LUSD. You may use these same questions or modify them to better fit your context.
How did student growth vary by age group/grade level as well as between different sub-populations over time? Look at your benchmark data from the past few years and observe patterns.
What enabling systems and structures appeared to have contributed to students’ growth over time? Identify and examine various supports that different groups of students may have received.
What enabling systems might you consider implementing or extending to accelerate student growth in the future? Based on your analysis, identify successful enabling systems and structures to move learning forward in your own context.
Step 2: Collect Your Data
While there are many different ways to measure student learning, numeric and standardized data such as that found on benchmark assessments offer a concrete indicator of student growth. To determine what data you should collect and whether it will allow you to conduct the analysis, use these self-assessment questions:
Do you want to examine changes in scores over time in any subject where you have consistent data?
If so, determine which school year(s) you want to use in your analysis and then make sure that you have consistent data for each of those years (meaning data that is actually comparable). If you are making comparisons across school years, you may also want to use data from the same students so that you have an accurate picture of their progress over time.
Which students should be included in your analysis?
A sample is a group (or groups) to be investigated. With ‘unfinished learning,’ the sample could be students for whom you have consistent data across multiple years; those from specific age groups or buildings; or those who have participated in different learning contexts such as remote, hybrid, or in-person.
Do you want to look at any specific subgroups within your sample such as students classified as learning English or those receiving special education services?
Before slicing your data into several segments, make sure to take sample size into account. If the number of students in a particular group is not large enough, you will not be able to make valid observations at the subgroup level. For example, LUSD wondered about learners living in foster care; however, the number of these students was so small that no valid inferences could be drawn from the limited information available.
Step 3: Organize Your Data
Often overlooked, careful data organization will enable you to measure student learning in a logical and consistent manner. Once you know what you need to collect, begin pulling data sources into a single online location. You may have to collect data from multiple sources, which will require several data files. For example, demographic data might live in a student information system or human resources database, and assessment data might reside in a platform such as iReady or NWEA MAP.
Unless you have a visual analytics platform (such as Tableau) that can analyze data across multiple systems, you will want to get all of your raw data files into a well-organized and secure location that adheres to your school or district’s student data privacy policy. This will make your analysis much easier and ensure that you have properly protected your data. To conduct your analysis, you will then want to input or import your data into your chosen database or statistical package such as Excel, Google Sheets, SPSS, or R.
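As a minimal sketch of what combining sources looks like, the example below joins demographic and assessment records by student ID. The field names, IDs, and scores are hypothetical; your SIS and assessment exports will differ:

```python
# Hypothetical sketch: merge demographic and assessment records into one
# analysis table keyed by student ID.
demographics = {
    "A102": {"grade": 4, "ell": True},
    "B215": {"grade": 7, "ell": False},
}
assessments = {
    "A102": {"fall": 640, "spring": 690},
    "B215": {"fall": 710, "spring": 745},
}

# Inner join: keep only students present in both data sources.
merged = {
    sid: {**demographics[sid], **assessments[sid]}
    for sid in demographics.keys() & assessments.keys()
}
```

The same join is a single merge or VLOOKUP in a spreadsheet or statistical package; the point is that every row in the combined table must line up on a reliable student identifier.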
Step 4: Data Analysis
Once you have organized your data, you are ready to analyze it. You do not need complex statistics to understand how your students have progressed over time: compare average (mean) scores at each assessment window, then examine the percentage of change between those scores.
First, using your chosen spreadsheet or statistical package, calculate the average score for each group and subgroup at each point in time for which you have data. Next, calculate the percentage of change between each assessment window. Percentage change is particularly useful when comparing growth patterns across different assessments, because it puts scores on a common scale. This Sample Analysis provides an example of how to do these calculations.
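The core calculation can be expressed in a few lines. This sketch uses hypothetical Lexile-style scores for one group across three assessment windows; the window names and values are illustrative only:

```python
# Sketch: mean score for a group at each assessment window, then the
# percentage of change between consecutive windows. Scores are invented.
from statistics import mean

windows = ["fall", "winter", "spring"]
group_scores = {
    "fall":   [620, 650, 700],
    "winter": [660, 680, 720],
    "spring": [700, 720, 760],
}

# Average score per window
means = {w: mean(group_scores[w]) for w in windows}

# Percentage change between each pair of consecutive windows
pct_change = {
    f"{a}->{b}": round((means[b] - means[a]) / means[a] * 100, 1)
    for a, b in zip(windows, windows[1:])
}
```

The same arithmetic is what a spreadsheet formula such as `=(new - old) / old * 100` computes between two benchmark columns.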
You may also consider using a frequency table in your analysis such as the one below. A frequency table displays the mean and median score for the indicated group or subgroup, the total number of responses, the standard deviation, and the minimum and maximum scores. This can be accomplished using a statistical package or with built-in spreadsheet functions (e.g., AVERAGE, MEDIAN, STDEV, MIN, and MAX in Google Sheets).
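A summary table like this can also be produced directly with Python's standard `statistics` module. The scores below are hypothetical values for a single subgroup:

```python
# Sketch: summary statistics for one subgroup, using invented scores.
from statistics import mean, median, stdev

scores = [620, 650, 680, 700, 760]

summary = {
    "n": len(scores),
    "mean": mean(scores),
    "median": median(scores),
    "stdev": round(stdev(scores), 1),  # sample standard deviation
    "min": min(scores),
    "max": max(scores),
}
```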

Finally, create data visualizations to illustrate patterns and trends and to communicate the findings of your analysis clearly. Data visualizations can take many forms. To show changes in scores over time, we recommend line graphs; bar charts are better suited to comparing percentage of change across groups.
Step 5: Make Sense of Your Findings
Now that you have completed your analysis, you are ready to make sense of your findings so that you can connect what you have observed in the data with actionable ideas and strategies for addressing unfinished learning.
To better understand what you found in the statistical analysis of your quantitative data, collect additional qualitative data through interviews or focus groups with educators, counselors, leaders, staff, community partners, or students. These stakeholders will be able to provide valuable feedback about their experiences and the support they received or provided.
After conducting any interview or focus group, make sure that you debrief immediately with your research team to discuss initial impressions and identify any themes that emerged from the conversation. Later, as you review your transcript, audio or video recording, or notes from the session:
List out any additional themes, categories, or major ideas.
Identify important quotes that illustrate key concepts, making sure to properly label each participant quote so that you can connect the feedback with their context.
Make sure to also look for common responses among all participants as well as varied responses among different subgroups.
As a last step, compare the results from your quantitative data analysis with your focus group/interview data. Use the guiding questions below to make sense of your findings.
Does the qualitative data support the quantitative data or does it differ?
What are the similarities and/or differences in the quantitative and qualitative data?
How does the qualitative data provide insight for or explain differences in learning performance among groups of students?
Next Steps: Implementing Data-Driven Strategies to Address Unfinished Learning
Once you have collected your data, analyzed it, and made sense of your findings, you are ready to develop strategies to address unfinished learning in your school or district. By answering the following questions, you can start to identify and develop strategies.
What enabling systems or strategies allowed students to perform well and how will you use those enabling systems to assist all students?
How will you work to ensure all students have access to successful strategies in the future?
How will you comprehensively address different developmental needs and personalize experiences to effectively engage every learner?
Final Thoughts
Measuring unfinished learning takes time; however, the resulting data can be a real asset to your school or district. The pandemic and other disruptions to learning have exacerbated existing academic gaps and created new challenges for all learners. By taking an asset-based approach and measuring unfinished learning, you can create both an evidence base and a springboard to strengthen existing enabling systems and develop new ones that support student learning.