Even if you ARE part of government, evidence-based policy change is hard. It takes strategy, patience, and courage.

I have had the privilege of collaborating with researchers at South Africa’s Department of Basic Education (DBE) for over a decade, iteratively experimenting with ways to improve foundational literacy. At a recent Centre for the Study of African Economies (CSAE) panel, my long-time collaborator Nompumelelo (Mpumi) Mohohlwane, the new Director for Reading there, offered insights on how evidence-based change actually happens from inside government. What follows draws on her remarks — along with reflections from Stephen Taylor, who leads the DBE’s Research, Monitoring, and Evaluation unit and this research program.

The research

The starting point was the Early Grade Reading Study (EGRS), an evaluation of a structured pedagogy program. Teachers received detailed lesson plans and graded reading materials for home-language literacy instruction. In the first study, we experimentally compared training alone with training plus on-site coaching. Coaching was about twice as effective as training alone, and more cost-effective. But on-site coaching was still not scalable by government.

That raised an obvious next question: can coaching be delivered more cheaply? In a follow-up experiment, we compared in-person coaching with virtual coaching delivered by phone. Unfortunately, only in-person coaching worked; the cheaper alternative had no impact. We have since run another experiment exploring whether coaching can be provided by school-based government employees rather than external experts; it has had limited success so far (more on that soon).

We have not yet cracked the scalability puzzle in South Africa. Each time, the more resource-intensive version turned out to be the more cost-effective one. Bringing coaching into government is not a tweak but a systems-level reform — it would mean changing the promotion and performance criteria of civil servants, for example. Reforms of that kind, as Stephen puts it, require “a broad enough and unified enough political and technical leadership to drive home the reform”.

But the research program has produced impact far beyond any single finding.

Target multiple people, at multiple levels

Influencing policy from within requires knowing whom to engage and when. The minister of education is often not the right person to target. Ministers have limited bandwidth, potentially short tenures, and less operational power than one might expect. The Director General has more control over implementation but cannot get into the details. Someone further down the hierarchy may be the one writing the policy document that actually gets implemented, or controls the relevant budget line. And engaging provincial government is essential, since adoption decisions are often decentralized.

Keep showing up

You never know when evidence is going to stick. It depends on reaching the right person at the right time — and you cannot predict which encounter that will be. One presentation of the RCT results is not enough. According to Stephen, while some senior policy-makers took the RCT evidence seriously, most “have been more influenced by the cumulative conversations and collaborations of constantly discussing the evidence and the programme components over years.”

The team has presented results of the research many times, in many venues: to central government officials, to provincial education departments, and even back to the schools where the program was implemented. There was a large showcase event at the DBE. They even got the president of South Africa to mention the study in a State of the Nation address.

Which of these was most effective at shifting thinking? I don’t know. Maybe it was that conversation with the head of a provincial department the evening after a presentation.

Look beyond a single program

The most significant policy impact has been indirect. We did not find a scalable model of coaching (yet), and did not push an unsuccessful program. But the body of evidence produced its own ripple effects, through three channels.

High-quality materials, with organic uptake. The most durable product of the research is not a finding but an artifact: teacher lesson plans and learning and teaching support materials. These have taken on a life of their own. Some provinces are now running versions of the structured pedagogy program — some directly shaped by our studies, others via partnerships with organizations that were themselves influenced by the research. Demand also comes from below: during qualitative research, teachers have asked me whether their school could get access to the lesson plans and reading materials.

Capacity and partnerships in the implementation and research ecosystem. A decade of iterative research built people and institutions, not just evidence. Each new field experiment funded implementers, producing improved materials, deeper operational experience, and growing demand for experts in this space. Within Stephen’s research unit, a whole cohort of researchers have developed their skills by analyzing the data, and are now asking their own research questions.

Ripple effects on broader policy debates. The assessments that powered the RCTs produced a by-product: the first reading benchmarks for each of South Africa’s home languages, now adopted as national standards. The accumulated evidence also strengthened the case for teaching early reading in home language — a politically sensitive choice that evidence helped anchor.

Mpumi is now the Director for Reading at the DBE. I expect big things.


Thanks to Stephen Taylor for his input and for the quotes used in this post.