How Schools Are Holding Edtech Products to a Higher Standard

Educational technology adoption has grown significantly in the past decade, and it’s clear that K-12 schools are now comfortable with and embrace the new technology norms. The next step for school leaders is to focus on purchasing edtech strategically, ensuring that these tools genuinely make a positive difference in teaching and learning.

Susan Uram
Director of Educational Technology at Rockford Public Schools

But effectively evaluating edtech products is no small feat. Districts must balance diverse needs, ensure data privacy and align tech initiatives with educational goals. The process involves navigating budget constraints, integrating new tools with existing systems and ensuring accessibility for all students. To shed light on how districts tackle these challenges, EdSurge spoke with three leaders in educational technology.

Susan Uram, the director of educational technology for Rockford Public Schools in Illinois, leverages her background as a classroom teacher, curriculum dean and instructional coach to bridge the gap between IT initiatives and classroom instruction. April Chamberlain, the technology and library supervisor for Trussville City Schools in Alabama, also began her career in the classroom before taking on a pivotal role in aligning technology initiatives with instructional needs. Jessica Peters, the director of personalized learning at KIPP DC, oversees the integration of educational technology across 22 schools, drawing on her experience as a classroom teacher and instructional technology coach to implement effective edtech solutions.

April Chamberlain
Technology and Library Supervisor at Trussville City Schools

Together, they provide invaluable insights into the challenges and strategies surrounding edtech procurement and implementation in their districts, including their shared excitement about their involvement with the Benchmark project. Benchmark, an ISTE research project with funding from the Walton Family Foundation and Chan Zuckerberg Initiative, aims to support districts that are trying to improve the ways in which they assess, measure and report student progress based on their needs and contexts. As part of the Benchmark project, ISTE worked with six public school districts across the United States to explore problems of practice related to assessment evaluation and selection within their districts.

EdSurge: How does your district approach edtech product evaluation and selection? And what makes the procurement process challenging?

Uram:
Rockford Public Schools is a relatively large district with 27,000 students. We balance the different needs of individual schools against a student mobility rate of almost 20 percent within the district. So we try to honor the professional choices of our educators while providing consistent education and experiences for families across the district.

Jessica Peters
Director of Personalized Learning at KIPP DC

When a new edtech product request comes in, we have checkpoints to evaluate whether the tool meets our needs. Does it duplicate something already in place? How is this tool different or better? Would a pilot provide a genuine trial? [Product evaluation] is not just about whether teachers or students like the tool. It needs to be a product worth investing time and effort into learning to use effectively.

Chamberlain: We ask those same types of questions. Our state has a multi-year program that helps us evaluate our current resources to decide if we need to recalibrate, remove or add something new. We use a multi-tiered system of support (MTSS), so it is important but challenging to have all seats at the table — all stakeholders — represented when reviewing edtech.

During the past school year, we audited the district’s programs, initiatives and projects. We had representatives from technology, student services, administration, counseling and curriculum in the room for the district meeting. Then principals turned around and conducted similar audits at the building level. First, we listed all of the edtech products being used by teachers, both instructional and operational, which revealed some surprises. We then categorized these resources by subjects like English, math, behavioral or foundational wellness, and further broke them down into the setting each product serves: Tier 1, 2 or 3. This allowed us to see the gaps and overlaps with edtech products.

Going forward, we now have a form that teachers fill out to request a new product. The teacher answers questions about the tool, such as technical details and how it aligns with or improves instruction. That completed form goes to the school-based tech team, which discusses the product and compares it to what we know is already being used across the school and district. Once approved at the school level, we move forward with a pilot to determine whether there is sustained value in implementing the new product in other settings across the school or district.

Peters: KIPP DC has a few checkpoints in place. Mid-school year, around January or February when budget planning starts, I conduct a light analysis of all our current products to identify those that are underused, ineffective or redundant. Our pilot program is generally very open to requests, although we do say no to a few things if they're extremely duplicative. Every summer, we perform a thorough efficacy analysis on all core and pilot products. Occasionally, some products bypass our data review due to initiatives from the KIPP Foundation or strong endorsements from top instructional leaders, and we have to adapt accordingly.

How can the Teacher Ready Framework support educators and district leaders in edtech product evaluation and selection?

Peters:
The tool is much more thorough than anything we've ever used and addresses almost every question that we could come up with. If we were to walk through the tool for every product, I think there would be a lot more confidence that the product is, in fact, appropriate for us to use and meets all of our standards. It is a heavy tool, so working through the whole framework is time-consuming and not really something that I could ask a teacher or the average school leader to do. But I think it's excellent for district-level evaluation.

Uram: Right out of COVID, we were overwhelmed with the thousands of products that teachers were using. We needed a better language, a framework to address all of the products. The tool helped us cut through the verbiage a vendor might use to describe a product and ask questions like, “What are the accessibility features? Where do you find them? Is there interoperability?” It makes the evaluation more fact-based and takes feelings and opinions out of the decision.

There are a lot of questions in the tool, so we have chunked together pieces of the framework and provided guiding questions based on those pieces. If a product passes through those questions, we can dive a bit deeper. [The tool] has helped us take a deep breath when we see a shiny new product before we buy it.

Chamberlain: We learned to shift questions [we ask] vendors from “Does this product do this?” to “Show me how this product does this.” The tool guides us to ask the right questions and think about what we are trying to achieve with a product, so not saying, “I want this math product,” but instead, “I want a better way to assess my third grade students on the skills that the data shows they performed low on.” It is very empowering.

Uram: We need to think about the role of technology in school and how we evaluate whether a product is improving teaching and learning. We are at an important intersection of understanding data privacy and online presence in a way that we didn’t need to before. It was different when kids were just playing Oregon Trail. There is more at risk. We ourselves have been taken down by ransomware. So making data privacy a part of the product evaluation discussion is a necessity.

Peters: The Teacher Ready Framework removes emotion from the conversation and bases it on data instead. A big success we have seen at KIPP DC is no longer basing [product purchasing] decisions on how cool something seems. Now, we conduct efficacy analyses. The tool really highlights for us what is working and worth classroom time. It has created a huge shift in the standards we hold products to.
 