In 2016, the Luminos Fund launched its accelerated, catch-up learning program in Liberia to help address the country’s urgent education needs – including one of the world’s highest recorded rates of out-of-school children. To date, Luminos has helped 12,650 Liberian children catch up on learning and reintegrate into local government schools. In addition, Luminos has trained 497 young adults on our pedagogy and model, and supported them to deliver the catch-up program in classrooms.
During the 2021-22 school year, the Luminos program increased children’s oral reading fluency (ORF) by 28 correct words per minute (CWPM), with girls progressing 3 CWPM more than boys. Students also made substantial gains in numeracy, with a 28 percentage point improvement in addition and a 20 percentage point improvement in subtraction. Our latest report, “Liberia 2021-22 Endline Evaluation Report,” summarizes results from the 2021-22 Luminos program endline evaluation conducted by Q&A Services. 
In 2021-22, the Luminos program ran for 9 months, from November to August, in line with the Ministry of Education’s 2021-22 official academic calendar; this calendar was shifted slightly from the standard September-to-June schedule due to COVID-19. Luminos students attended class for 7 hours per day from Monday to Friday, with approximately 5 hours per day devoted to reading and 2 hours to numeracy.
Luminos supported 3,150 out-of-school students across 105 classes and five counties (Bomi, Bong, Grand Cape Mount, Margibi, and Montserrado) in Liberia. Every year, Luminos works closely with a small group of community-based partners, each of which manages a cluster of classrooms, to deliver the program.
The evaluation aimed to demonstrate the impact of the Luminos Liberia program on student literacy, numeracy, and socio-emotional outcomes during the 36-week 2021-22 program. Q&A Services assessed the literacy and numeracy levels of a random sample of students across all Luminos classes in the first two weeks of the program (baseline) and again in the final week of the program (endline). The RTI/USAID-developed Early Grade Reading Assessment (EGRA) and Early Grade Mathematics Assessment (EGMA) tools, adapted for Liberia, were used at both baseline and endline to assess students on a variety of early grade reading and math skills. A socio-emotional learning (SEL) assessment was also conducted with a subset of the student sample using the International Social Emotional Learning Assessment (ISELA) tool. For more details on the evaluation and methods used, please see the full report summary.
The results of the evaluation show that the Luminos program positively impacted student achievement in both reading and math.
On reading, students showed improvement across every EGRA subtask, including an improvement of 50 percentage points on letter identification, 46 percentage points on oral reading fluency (ORF) of Grade 2 level text, 39 percentage points on familiar words, and 33 percentage points on reading comprehension. For ORF, students could read 29 CWPM at endline, compared to 1 CWPM at baseline, an improvement of 28 CWPM.
On numeracy, students again showed improvement across every EGMA subtask, including an improvement of 35 percentage points on number identification, 33 percentage points on number discrimination, 28 percentage points on addition, 20 percentage points on subtraction, and 22 percentage points on word problems. While the program clearly impacted student achievement in mathematics, the gains were smaller than for literacy. This makes sense given that 5 hours of the Luminos school day (approximately 70% of instructional time) are devoted to literacy and 2 hours each day (30% of instructional time) are devoted to numeracy.
Overall, the evaluation shows that the Luminos Fund’s Liberia program positively impacted student reading and math outcomes across all EGRA and EGMA subtasks in the 2021-22 school year. The average student improved by 28 CWPM within the 9-month program, with girls improving 3 CWPM more than boys – an impressive result given the program’s short timeframe. Results from the SEL assessment show improvement in self-concept, particularly for girls, suggesting a possible impact of the Luminos program on broader student development; however, further research is required. Compared with similar programs in Liberia and globally, the Luminos program shows strong learning outcomes year after year, particularly in literacy.
To read the full report summary, including additional background on our Liberia program and a more detailed overview of the evaluation and methods used, click here.
Simpson, A. “Luminos Fund Endline Evaluation 2021-22, Liberia,” Q&A Services, December 2022.
Ernesta Orlovaitė is Associate Director of Programs at the Luminos Fund. Ernesta oversees the design and delivery of Luminos’ program in Ghana, collaborating closely with the government and local implementing partners. She also guides Luminos’ efforts to strengthen its capacity for data-based decision-making and drive better outcomes for our students. Previously, Ernesta worked as a Product Manager at Google, leading product design and development teams in Switzerland and Japan.
Launching an education program in a new country is an unforgettable experience. As a seemingly endless list of tasks gets shorter, emails and calls give way to something much more tangible: printing primers, delivering teacher training, and, finally, ushering excited children to the classrooms for their first lesson. The first day in class is the singular focus in the weeks and months before program launch – getting everything ready just in time is a monumental undertaking.
Yet at the same time, the first day is just that – the first day in a long journey of learning to read, write, and do basic maths; of learning to learn; and of falling in love with it. In that journey, every day matters. It’s that journey, joyful and child-centered, that transports Luminos students from zero to functional literacy and numeracy in just one school year.
On March 8, 2022, as dusk fell over the hills of Ashanti, Ghana, where we celebrated the launch of the Luminos program to 1,500 out-of-school children, we were already thinking about our next big goal: an external evaluation of our first year in Ghana.
Kicking off an independent program evaluation in Ghana
We knew we’d want to use the national assessment instruments to evaluate our program. These instruments have been extensively tested and translated into Asante Twi, the language of instruction in Luminos classrooms this year. Using them would also allow us to compare our students’ progress against learning achievements in formal schools. But we also knew that Luminos classrooms are, in several important ways, different from government schools, making assessment delivery significantly more challenging. So, we got involved – and here’s what we learned.
Lesson 1: Attend enumerator training and provide rapid feedback
EARC ran a five-day enumerator training in Kumasi, the capital of Ashanti, the week before data collection started. Under the guidance of field coordinators, enumerators visit classrooms to administer EGRA/EGMA and record student responses.
Enumerator training is a critical component of the data collection process, so the day after our program officially launched, I arrived at the Bethel Methodist Primary School to observe a practice EGRA/EGMA delivery to Grade 2 students. After the first round, I had several pages’ worth of feedback, and so did the two field coordinators from EARC. Huddling together in an empty classroom, we discussed our reflections from that first attempt.
Some mistakes the enumerators made were mundane and would go away with further practice. For example, with students facing enumerators, several indicated the wrong direction of reading – from right to left. In the second attempt, not a single enumerator repeated the mistake.
A more serious issue – and one that’s difficult to catch when you don’t speak the language of assessment – was not sticking to the assessment script. With the two field coordinators fluent in Asante Twi, however, we identified and addressed the problem right away. As one of the coordinators emphatically put it while pointing at the enumerator manual, “Read this, and you will go to heaven.”
Knowing that EARC is an experienced partner, and seeing most enumerators administer EGRA/EGMA with fluency and precision, I wasn’t too concerned about the technical aspects of the evaluation. What worried me was how our students would experience the assessment.
Luminos works with some of the most vulnerable children in Ashanti. A typical student enrolled in our program would be an 11-year-old who is unable to read even the simplest of words. She might have been kept out of school because her family could not afford a school uniform. She might be tired because that morning she had been working on the family farm. She might be distracted because she hadn’t had lunch before coming to class. The experiences of Luminos children are very different from those of 11-year-olds at the Bethel Methodist Primary School. Few children enjoy assessments. I was worried our students would hate them and fail to demonstrate the extent of their true knowledge.
As I continued observing enumerators, I kept bringing it up – the importance of building rapport with the student, of making the assessment feel like a (granted, rather boring) game, of creating a welcoming environment, and of treating Luminos children with the same level of respect the enumerators were treating each other and me. As the day progressed, I demonstrated the behaviours that I observed and wanted to correct, celebrated behaviours I wanted to replicate, told heart-warming stories about our students, and gave passionate elevator pitches on rapport building. I might have overdone it, but that’s a small price to pay if, in return, our children showed off all their skills and had fun while doing so.
Lesson 2: Get access to raw data and analyse it daily
EARC used Tangerine, a mobile data collection tool, to record EGRA/EGMA observations in our classrooms. There are numerous advantages to using tablets in data collection, including significantly higher data quality: no more fiddling with timers or trying to decipher overly confusing question skip logic. A wonderful side effect of using digital tools is the opportunity to analyse data daily to identify and immediately address issues in assessment administration.
In fact, raw data can be analysed even before assessments start. With practice at the Bethel Methodist Primary School completed, we walked back to the training venue for the first assessor accuracy test. All 20 enumerators completed the same assessment delivered by the two field coordinators. By the time I reached Accra the next morning, I had everything I needed to perform a quick enumerator accuracy analysis.
After three days of training, our enumerators had an average accuracy rate of 92%. Measured against the 90% accuracy goal used in the 2015 national EGRA/EGMA, my initial impression of the EARC team’s experience was confirmed: they were doing well and would do even better by the time training finished.
Digging deeper into the accuracy data, I noticed a few interesting patterns. EGRA, it turns out, is significantly more challenging to administer than EGMA (90% and 97% enumerator accuracy, respectively), with the phonemic awareness subtask, at an appalling 69% accuracy rate, giving everyone a headache. On the other hand, having worked with challenging EGRA/EGMA data before, I was pleased with the highly consistent task timings. If we are to scale the raw non-word reading scores by time-to-completion, we better trust that the time-to-completion metric is accurate – and now I knew we could.
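The time scaling mentioned here is typically a per-minute rate: correct items are divided by the recorded time-to-completion and converted to a 60-second basis. As a minimal sketch (the function name is ours, for illustration):

```python
def fluency_score(correct_items: int, seconds_elapsed: float) -> float:
    """Correct items per minute, the usual fluency metric for timed
    EGRA subtasks such as non-word or oral reading."""
    return correct_items * 60.0 / seconds_elapsed

# A child reading 14 non-words correctly in 42 seconds scores 20 per minute.
rate = fluency_score(14, 42)
```

With a rate like this, a child who stops early is not penalised relative to one who uses the full time – which is exactly why the time-to-completion metric itself has to be trustworthy.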
That night, I shared my reflections and the names of the 2-3 enumerators who needed individual support with the EARC team. The next day, my feedback was incorporated into the training.
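An accuracy analysis of this kind can be sketched in a few lines. The snippet below is a minimal illustration, not the actual analysis: it assumes item-level records pairing each enumerator’s marking with the field coordinators’ gold-standard scoring (the record format and names are invented), and computes agreement rates per enumerator and per subtask.

```python
from collections import defaultdict

# Hypothetical item-level records: (enumerator, subtask, marked, gold).
# In practice these would come from the tablet-based data export.
records = [
    ("enum_01", "phonemic_awareness", "correct", "correct"),
    ("enum_01", "phonemic_awareness", "correct", "incorrect"),
    ("enum_01", "addition_l1", "correct", "correct"),
    ("enum_02", "phonemic_awareness", "incorrect", "incorrect"),
    ("enum_02", "addition_l1", "correct", "correct"),
]

def accuracy_by(records, key_index):
    """Share of items where the enumerator's mark agrees with the gold
    score, grouped by the chosen key (0 = enumerator, 1 = subtask)."""
    agree, total = defaultdict(int), defaultdict(int)
    for rec in records:
        key = rec[key_index]
        total[key] += 1
        agree[key] += rec[2] == rec[3]
    return {k: agree[k] / total[k] for k in total}

by_enumerator = accuracy_by(records, 0)
by_subtask = accuracy_by(records, 1)

# Flag enumerators below the 90% accuracy goal for individual support.
needs_support = [e for e, acc in by_enumerator.items() if acc < 0.90]
```

The same grouping trick surfaces both views of the data at once: which enumerators need coaching, and which subtasks (like phonemic awareness) are hard for everyone.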
Once assessments began in our classrooms, I continued analysing the raw data. Rather than trying to gain insights into the baseline learning achievements of our students, I scoured it for issues that the EARC team could address right away. As I shared my reflections with field coordinators (“I rather doubt there were 256 boys present in our classroom in Aframso”), they passed the feedback along to the assessors. We did end up with one more classroom recording the attendance of 207 boys (again, a typo), but I expect we would have seen quite a few more if not for the quick feedback loop.
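Daily sanity checks like this do not need to be sophisticated. A rough sketch, assuming each day’s export includes per-classroom attendance counts (the field names here are invented; the threshold follows from Luminos classes enrolling roughly 30 children each):

```python
# Hypothetical daily classroom records from the data-collection export.
daily_records = [
    {"classroom": "Aframso", "boys_present": 256, "girls_present": 14},
    {"classroom": "Abotreye", "boys_present": 12, "girls_present": 16},
]

# With ~30 children per Luminos class, counts far above that are almost
# certainly recording errors worth flagging the same day.
MAX_PLAUSIBLE = 40

def flag_attendance(records, limit=MAX_PLAUSIBLE):
    """Return (classroom, field, value) for implausibly high counts."""
    flags = []
    for rec in records:
        for field in ("boys_present", "girls_present"):
            if rec[field] > limit:
                flags.append((rec["classroom"], field, rec[field]))
    return flags

issues = flag_attendance(daily_records)
# issues → [("Aframso", "boys_present", 256)]
```

A one-line report of such flags each evening is enough for field coordinators to chase down typos before the assessors move on to the next community.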
Lesson 3: Plan meticulously for logistics

As enumerator training finished, my “build rapport” mantra gave way to a fixation on logistics. Our classrooms are very different from a typical primary school in Ashanti. We work in some of the most marginalised communities – many don’t have a phone signal, some can only be reached by motorbike (and it better not rain!), and few can be found on a map. Visiting 60 remote classrooms in five days is a tall order when merely finding these communities can be a challenge.
Our goal was to ensure that the EARC team completed the assessment in five days. That weekend, Angie Thadani (Luminos Senior Director of Programs) and I sat down and listed all the different ways data collection might go wrong, from enumerators not being able to reach teachers over the phone (definitely happened) to them failing to reach the assigned communities (also definitely happened). For each issue, no matter how outlandish, we identified a solution (or three). By the end of the day, we had a Plan B, a Plan C, and a set of simple mechanisms to improve the chances of Plan A succeeding.
The single most effective solution was connecting people. Nothing beats a phone call (once it finally goes through) in which the supervisor tells the enumerator how to get to the community, where to stop and ask for directions, and what kind of vehicle is needed to traverse the terrain. In low-connectivity contexts, WhatsApp is another must-have tool, great for planning the next day’s classroom visits once everyone’s back at the base.
Flexibility is another key ingredient. For example, our teachers and supervisors worked together to start teaching earlier in the day where possible (in Ghana, Luminos classes take place in formal school buildings and thus start in the afternoon once the other students have departed) so that EARC assessors would not have to travel after dark. As assessors became more familiar with the landscape, data collection schedules changed as well – on some days, a single team might assess two classrooms, while on others, reaching a single community could take hours and hours.
Finally, when all else fails, there’s luck. I planned to observe the first day of assessments, arranging to meet EARC assessors in Abotreye at midday. Abotreye is not the hardest-to-reach community we work in. Nevertheless, to get there in time, I had to get up at 3 AM, spend hours in a (thankfully air-conditioned) car, and even push it on a particularly bad stretch of road. But I made it to Abotreye in time. The assessors, however, didn’t. Luck came afterwards – I ran into them a few hours later, alone with their backpacks (with no vehicle in sight), seemingly pondering their options. We picked them up, drove them to the nearest classroom, and left a few hours later as they were finishing the day’s work.
Working with such a strong evaluation partner was an incredible experience. At the Luminos Fund, we knew we could trust EARC to deliver high-quality EGRA and EGMA assessments in our classrooms in Ghana. But we also knew that the context we work in is unusual.
Informed by our understanding of the unique features of our classrooms and guided by their extensive experience administering learning assessments in Ghana, EARC completed the Luminos EGRA/EGMA baseline on time. We have yet to receive the final dataset and the accompanying report, but the raw data is already telling a story – one that we will share next time.