Process evaluation case study: CSF EdTech
Process evaluations of 12 EdTech solutions helped the nonprofit Central Square Foundation identify which products to test for scale-up across India.
Background
The vast majority of education technology (EdTech) in India caters to students from high-income households. For EdTech to have an impact on education at scale, more products must be developed that offer vernacular languages, use appropriate cultural references, target a range of learning levels, and sell at lower price points.
EdTech is thus a core part of the strategy of the Central Square Foundation (CSF), a nonprofit working towards ensuring quality school education for all children in India. CSF is working to create a pipeline of contextualized EdTech products for low-income students, generate evidence around the efficacy of such solutions in India, and catalyze large-scale government adoption of impactful solutions.
Question
CSF sought to identify high-performing EdTech solutions, which it would adapt for implementation by the government. These solutions would be tested, and impactful products would ultimately be scaled up across the country.
To identify which products CSF should consider for potential scale-up, IDinsight conducted rapid process evaluations of 12 promising EdTech solutions. The goals of the process evaluations were twofold:
- Evaluate how well each product was functioning currently
- Evaluate how appropriate each product would be for scale-up in government schools
Approach
Evaluation strategy
We evaluated products across five research categories: design, support provided by the product-maker, product adoption, student engagement with the product, and the perception that using the product would increase learning. For each research category, products were given a rating of ‘poor,’ ‘satisfactory,’ or ‘high’ performance.
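The five-category, three-level rating scheme described above can be sketched as a small data structure. This is an illustrative sketch only; the category keys, the example product, and the `summarize` helper are hypothetical, not IDinsight's actual tooling.

```python
# Hypothetical sketch of the rating scheme: five research categories,
# each rated 'poor', 'satisfactory', or 'high' for a given product.

RATINGS = ("poor", "satisfactory", "high")

CATEGORIES = (
    "design",
    "support",              # support provided by the product-maker
    "adoption",
    "engagement",
    "perceived_learning",   # perception that use would increase learning
)

def summarize(product_ratings: dict) -> dict:
    """Count how many categories fall at each rating level for one product."""
    counts = {r: 0 for r in RATINGS}
    for category in CATEGORIES:
        rating = product_ratings[category]
        if rating not in RATINGS:
            raise ValueError(f"unknown rating: {rating}")
        counts[rating] += 1
    return counts

# Made-up example product, for illustration only.
example = {
    "design": "high",
    "support": "satisfactory",
    "adoption": "satisfactory",
    "engagement": "high",
    "perceived_learning": "poor",
}
```

A summary like `summarize(example)` makes it easy to compare products at a glance, e.g. how many categories each product rates ‘high’ on.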
Data collection
The data for the evaluation came from three sources: a) interviews and surveys with students and school staff, b) backend user data, and c) expert panel review.
1. Interviews and surveys with students and school staff
In total, we collected information from over 1,600 school staff and students in 10 states.
For EdTech solutions meant to be used in schools, research teams visited 3-5 schools per product to collect data on user perceptions and experiences. Schools were sampled purposively, targeting variation in setting (rural or urban), use levels, and implementation models. For each product, we:
- Conducted semi-structured interviews with 10-15 teachers, 3-5 administrators, and 10-15 students
- Administered paper-and-pencil surveys to 100-150 students
- Observed 3-5 product use sessions in classrooms
For the three EdTech products meant to be used at home, research teams conducted phone interviews with 183, 154, and 64 students, respectively, and about 30 parents in total. The sampling process was different for each of the three products due to operational constraints:
- For the first product, we sampled users purposively based on use behavior
- For the second product, a random sample from the population of users was drawn
- For the third product, the product company invited users to sign up for interviews
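The second product's approach, a simple random sample from the user population, can be sketched in a few lines. The user list and seed here are made up for illustration; only the sample size (154 interviews) comes from the case study.

```python
# Illustrative sketch: drawing a simple random sample of users to interview.
# The user population below is hypothetical.
import random

users = [f"user_{i}" for i in range(1000)]  # hypothetical user population

rng = random.Random(42)          # fixed seed so the draw is reproducible
sample = rng.sample(users, 154)  # sample size from the case study
```

`random.sample` draws without replacement, so each user appears at most once in the interview list.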
2. Backend user data
Where possible, we obtained backend user data from product companies, covering over 17,000 users and 3,900 schools. The data focused on the behavior of students and teachers, such as login frequency, questions attempted, and resources accessed. This data was used to understand product use trends, such as frequency of engagement.
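One engagement metric of the kind described above, average logins per week per user, can be computed from login logs. This is a minimal sketch under assumed data shapes (the log rows and the `logins_per_week` helper are hypothetical), not IDinsight's actual pipeline.

```python
# Illustrative sketch: average logins per ISO week for each user,
# from hypothetical backend log rows of (user_id, login_date).
from collections import defaultdict
from datetime import date

logs = [
    ("u1", date(2020, 1, 6)),   # ISO week 2
    ("u1", date(2020, 1, 8)),   # ISO week 2
    ("u1", date(2020, 1, 15)),  # ISO week 3
    ("u2", date(2020, 1, 7)),   # ISO week 2
]

def logins_per_week(rows):
    """Group logins by (user, ISO week), then average weekly counts per user."""
    weekly = defaultdict(int)
    for user, day in rows:
        weekly[(user, day.isocalendar()[1])] += 1
    per_user = defaultdict(list)
    for (user, _), count in weekly.items():
        per_user[user].append(count)
    return {u: sum(counts) / len(counts) for u, counts in per_user.items()}
```

Note this averages only over weeks in which a user logged in at least once; counting zero-login weeks would require a fixed study window.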
3. Expert review
Each product was reviewed by 4-6 education and technology experts. They reviewed the product based on content, pedagogy, instructional design, user experience, and backend technology.
Results
- CSF used these results to inform their EdTech strategy as they work to build a pipeline of promising, contextualized solutions to be tested and scaled in government schools.
- The published findings gave EdTech companies, foundations and governments a nuanced understanding of different aspects of products and how they interact with different use cases.
- The tools created for the rapid process evaluations have been used by state governments and other foundations to evaluate EdTech products. CSF continues to build on and enhance the framework created in this project in the subsequent versions of the EdTech Lab.
- The evaluations revealed gaps in the EdTech marketplace that could be met by funders and product companies, such as developing quality content in non-English languages.
- The process evaluations also revealed strengths and opportunities for improvement across products, such as:
- In many cases, EdTech content was at an appropriate level and products were consistently used as intended.
- Few products matched instruction to the learning level of students. Teacher and student use patterns showed that both groups sought out content aligned to student capability when it was available.
- Product companies rarely designed the products to work well for low-income students. For example, few products had quality non-English language navigation or content, a critical feature given limited English proficiency among this target group.
- Practitioners often had difficulty effectively implementing products. Despite receiving training, teachers across products and implementation models needed regular, sometimes daily, help with simple tasks such as start-up and basic navigation.