Surveying Teachers to Identify Edtech Professional Learning Needs

Using survey data to plan targeted and relevant professional learning opportunities

Overview

Professional development and learning can be powerful drivers of technology integration and are critical supports for both the initial rollout and long-term use of edtech tools. As such, edtech leaders must consider how and when they work directly with teachers and other stakeholders to develop the skills and knowledge needed to implement edtech tools effectively. Between tightly packed schedules that limit opportunities for professional learning and ever-growing edtech portfolios, leaders need help understanding which tools, content, knowledge, and skills to prioritize.

This challenge of determining priorities for edtech professional learning was one the Director of Instructional Technology for Chicopee Public Schools (CPS), Nick Duell, sought to address through participation in the EdTech Peer Learning Cohort offered by the Massachusetts Department of Elementary and Secondary Education (MA DESE)’s Office of Educational Technology in partnership with The Learning Accelerator (TLA) during the 2022-23 school year. Duell, a former CPS chemistry teacher and principal, was the first person to serve in the district’s new Director of Instructional Technology role. He spent his first year developing the foundations for robust, equity-driven edtech selection, implementation, and evaluation systems. A critical early step in this process involved gathering information to help him prioritize his efforts, guide his decisions, and determine how best to support the district’s teachers.

To gather this information, Duell created a brief, seven-question survey to collect information about the edtech tools his teachers were using and how they used them, as well as their overall comfort levels and familiarity with specific devices and programs. His process for designing this survey and analyzing the data collected demonstrates practices that edtech leaders can use to determine their teachers’ professional learning needs.

Use a Survey to Collect Data from a Large Group

Surveys are only one of many tools that can be used to gather the information needed to plan edtech-focused professional learning opportunities. School or district edtech leaders may also consider convening a committee, organizing focus groups, or conducting empathy interviews to gather information about teachers’ professional learning needs; however, those approaches require considerably more capacity and time to develop and facilitate and may be better suited for adding nuance to survey findings. Surveys, by contrast, are particularly well suited for collecting data that must be representative of a large group, making them ideal tools for measuring teachers’ attitudes, behaviors, and needs across a large school or even an entire district. For Duell, the only full-time employee in the CPS edtech department, a survey was the only feasible way to quickly gather this information from teachers across Chicopee’s 15 schools.

Target Survey Questions

The design and distribution of a survey are critically important. These tools do not easily lend themselves to direct follow-up with individuals, and if they’re perceived as too cumbersome or lengthy, they can quickly result in respondent frustration, incomplete responses, and low response rates. As leaders develop survey questions, they should consider the following:

  • Is the survey aligned to a specific goal? The purpose of the survey should be clear. Before survey designers can develop a survey, they must have a clear goal for what they want to learn or a specific set of information they hope to collect.

  • Are the survey questions targeted and relevant? Shorter surveys with explicit purposes are more likely to be completed, so survey designers should aim for a survey that participants can finish in 10 minutes or less.

  • Are the survey questions clear and concise? Surveys should only ask one question at a time. Survey designers should create separate, specific questions about each aspect of the topic they are exploring – rather than combining multiple questions into one – to make the collected data more accurate and useful.

  • Do the survey questions avoid influencing respondents’ answers? Surveys are meant to gather information about how respondents feel and think. Survey designers should present questions, possible responses, and other pertinent information in neutral ways that do not encourage respondents to answer in a specific manner.

Duell’s survey aimed to uncover two pieces of information about his teachers’ professional learning needs. First, he was interested in learning which of the edtech products CPS pays for were used most often by the district’s teachers. Second, he was interested in understanding how teachers would rate their proficiency or familiarity with each tool they reported using. Together, he believed, these two pieces of information could uncover both bright spots, where teachers often use and feel proficient with the tools the district invests resources in, and challenges, where tools the district pays for are under-utilized and teachers report feeling less comfortable using them.

He could have collected this information using several separate questions but chose instead to use a matrix question, in which the tools paid for by the district were listed in a series of rows. Respondents were asked to rate their proficiency as “beginner,” “intermediate,” or “advanced” – or to choose a “not applicable” option for tools they did not use – within corresponding columns. Choosing this question style and providing an option for teachers to indicate that they do not use specific tools allowed Duell to display all of the tools paid for by the district at once, using one targeted and concise question that encouraged teachers to answer honestly without biasing their responses.

Make Meaning of Data

Duell’s survey design was informed by a clear purpose and a desire to uncover specific information, which made his job of analyzing survey responses easier. Using a matrix-style question allowed him to quickly identify relationships between tool usage and teacher familiarity with those tools, which he ultimately broke down into four categories. The categories and how they informed his next steps are detailed below, followed by a sketch of how this kind of analysis might be scripted:

  • High usage, high proficiency: Teachers reported both frequent use of these tools and advanced proficiency with them. Tools that fell into this category were not priorities for Duell’s professional learning efforts.

  • High usage, low proficiency: Teachers reported using these tools often but rated their proficiency as low, which highlighted for Duell the need for additional professional learning opportunities and resources.

  • Low usage, high proficiency: Despite reporting proficiency with these tools, teachers reported low usage, which indicated that something other than professional learning or training might be limiting their use. While these tools did not become priorities for professional learning, Duell noted that they warranted further investigation.

  • Low usage, low proficiency: These tools, which teachers reported using infrequently and not feeling confident with, were clear priorities for professional learning and intervention.
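
The snippet below is a minimal sketch of what this categorization could look like if the matrix-question export were analyzed in Python with pandas. The tool names, column names, sample responses, and 50-percent thresholds are all illustrative assumptions rather than Duell’s actual data or method; the same rollup could just as easily be done with a pivot table in a spreadsheet.

```python
import pandas as pd

# Hypothetical export of matrix-question responses: one row per teacher per tool.
# Column names, sample values, and the 50% thresholds are illustrative assumptions.
responses = pd.DataFrame({
    "tool":        ["Tool A", "Tool A", "Tool A", "Tool B", "Tool B", "Tool B"],
    "proficiency": ["Advanced", "Intermediate", "Not applicable",
                    "Beginner", "Not applicable", "Not applicable"],
})

summary = (
    responses.groupby("tool")["proficiency"]
    .agg(
        # Share of respondents who use the tool at all ("Not applicable" = non-user).
        usage=lambda s: (s != "Not applicable").mean(),
        # Among users, the share rating themselves Intermediate or Advanced.
        proficiency=lambda s: s[s != "Not applicable"]
                              .isin(["Intermediate", "Advanced"]).mean(),
    )
    .reset_index()
)

def categorize(row, threshold=0.5):
    """Bucket each tool into one of the four categories described above."""
    usage = "High usage" if row["usage"] >= threshold else "Low usage"
    prof = "high proficiency" if row["proficiency"] >= threshold else "low proficiency"
    return f"{usage}, {prof}"

summary["category"] = summary.apply(categorize, axis=1)
print(summary)
```

The key idea is simply to pair a usage measure with a proficiency measure for each tool so that every tool lands in one of the four quadrants above.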

Drill Down Further

In addition to analyzing responses to this question, Duell’s analysis process included further triangulation – or putting the data collected into a broader context to identify relationships among and between data points. For his survey, this meant cross-referencing which of the tools from the above categories were considered high-priority by his district’s leadership team and, therefore, should be prioritized for action. Duell was also able to home in on specific groups’ needs by cross-referencing responses with the demographic information collected on the survey. For example, by filtering down to only responses from teachers within specific grade-level bands, he identified different professional learning needs for elementary, middle, and high school teachers based on the tools they used.
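
To illustrate the kind of drill-down Duell describes, the sketch below continues the hypothetical analysis: it flags which tools appear on a made-up list of district priorities and breaks usage out by grade band. Every column name and value here is an assumption made for illustration only.

```python
import pandas as pd

# Hypothetical respondent-level export; every column name and value is illustrative.
responses = pd.DataFrame({
    "grade_band":  ["Elementary", "Elementary", "Middle", "Middle", "High", "High"],
    "tool":        ["Tool A", "Tool B", "Tool A", "Tool B", "Tool A", "Tool B"],
    "proficiency": ["Advanced", "Not applicable", "Beginner",
                    "Intermediate", "Not applicable", "Not applicable"],
})

# Cross-reference against the tools district leadership has flagged as high-priority
# (a made-up list for this sketch).
district_priorities = {"Tool B"}
responses["district_priority"] = responses["tool"].isin(district_priorities)

# Usage rate for each tool, broken out by grade band; the four-category analysis from
# the previous sketch can then be repeated within any single band.
usage_by_band = (
    responses.assign(is_user=(responses["proficiency"] != "Not applicable").astype(float))
    .pivot_table(index=["tool", "district_priority"], columns="grade_band",
                 values="is_user", aggfunc="mean")
)
print(usage_by_band)
```

The same filtering can, of course, be done with the built-in reports of whatever survey platform produced the data; scripting it simply makes the analysis repeatable the next time the survey is run.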

Duell’s process demonstrates one relatively low-lift option to quickly gather data that can be used to design more targeted and relevant edtech-focused professional development opportunities. Discover more information on how he used this survey to inventory the district’s edtech tools. TLA has also shared additional information on research related to professional development.