
This is a series of occasional blogs by BESA members and is part of their paid membership service. These views are not necessarily those of BESA and a published blog does not constitute an endorsement.

In this blog, Dr Georgie Hart – Education Director at Sparx – reflects on her recent experience of being a panel member at Nesta’s ‘How to transform the education system through testbeds’ event, part of London EdTech Week.

EdTech has become a global phenomenon in recent years and we think it is having a positive impact on the education system, but can we prove it?

During London EdTech Week, I spoke at Nesta’s event exploring how Nesta and the Department for Education could build a successful testbed for the English education system. I was invited to sit on a panel of experts – namely Owen Carter (impactED), Cecilia Waismann (MindCET) and Nancy Wilkinson (Nesta) – because of my background in operational research and my commitment to delivering true impact to schools through my work at Sparx.

Before the event, I asked myself four key questions:


  1. Why do we need more efforts to experiment with and evaluate EdTech?

Impact, not engagement

If we hadn’t experimented with Sparx Maths, we would have built the wrong solution. An essential part of developing EdTech is getting something wrong, throwing it away, iterating and collaborating with the end user. Furthermore, if we hadn’t evaluated its efficacy, we would have released something that everyone loved but that didn’t actually improve outcomes for students and wasn’t steeped in pedagogy.

It’s so easy for EdTech companies to mistake engagement for impact. But just because teachers, parents and students are engaged doesn’t mean it’s actually working. 

We can think ‘we’re getting so much great feedback, so surely our product MUST be working’, but that is not always the case. I know it can be tough to collect and confront the cold hard data that says otherwise, but I believe we have a moral obligation to do so. It’s the only way to ensure you ultimately create something truly impactful, as we have at Sparx.

You wouldn’t give a child a drug that hasn’t been through a rigorous trial to ensure it worked and didn’t have adverse effects, so why do we think it’s OK to skip that step in education? I actually think it’s morally wrong not to be doing trials within education.


  2. What are some of the main challenges to experimentation and evaluation of EdTech?

Funding, logistics and ethics

For most EdTech companies, the challenge will be around their board having the fortitude for evaluation. Sparx has a double bottom line (financially AND socially focused); most don’t have that luxury. If you are venture-capital backed, it would be unusual for your board to support a £60k hire or an outsourced project concentrating on evaluating impact, when that money could buy you two more salespeople.

Secondly, there are the logistics. How can you design a double-blind randomised controlled trial testing the impact of a student having homework versus not having homework? That wouldn’t be blind, let alone double-blind. Also, consider the carnage that could arise in the classroom when Bill sees that his friend Ben hasn’t got any homework and he has! So how can you actually test the impact? At Sparx we’ve developed a really effective vehicle for doing this and now run randomised controlled trials nearly every half term, either to determine our impact or to answer open questions within educational research. But this takes expertise and investment; it’s difficult to do and difficult to justify commercially.
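To make the contamination problem concrete: one standard way around it is to randomise whole classes (clusters) rather than individual students, so classmates like Bill and Ben always end up in the same arm. The Python sketch below is purely illustrative – the class names are made up, and this is not a description of Sparx’s actual trial machinery.

```python
import random

def assign_classes(class_ids, seed=2024):
    """Randomly split whole classes into intervention and control arms.

    Randomising at the class (cluster) level means classmates always
    share a condition, avoiding Bill-and-Ben contamination. The fixed
    seed makes the allocation reproducible for auditing.
    """
    rng = random.Random(seed)
    shuffled = list(class_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

# Hypothetical class names, for illustration only.
groups = assign_classes(["7A", "7B", "7C", "7D", "8A", "8B"])
print(groups)
```

One trade-off worth noting: cluster randomisation reduces the effective sample size compared with student-level randomisation, which is part of why these trials take real expertise to design and power properly.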

Thirdly, there are the ethics. Depending on the nature of the research, this can range from getting consent from parents and teachers to setting up an ethics committee. Sparx runs every aspect of every trial past our own ethics committee, which includes Julian Huppert – Director of the Intellectual Forum in Cambridge and a former MP particularly noted for his work on data privacy. We’ve found this impartial oversight vital for ensuring the child remains at the core of what we’re doing. Beyond any necessary non-disclosure (for scientific purposes), we’re as transparent as possible. Schools always know what we’re researching and can opt out if they wish – parents can also take their child out of the trial.


  3. What might a more systematic effort to experiment and evaluate EdTech in English schools look like?

A standard for schools

There is an excellent guide from the Institute for Effective Education, which supports teachers and schools in making more evidence-based decisions. It’s a great place to start. Wouldn’t it be great if customers were driving the demand for more proven, impact-centric solutions?

But I think we should go a step further and develop some sort of certification kitemark that helps schools easily determine the quality of the evidence behind a solution.

I know there is the Kokoa standard – designed by an expert organisation from Finland as a global quality standard for the pedagogical design of education technology solutions – and Educate’s EdWards quality mark, but is there one nationally (or internationally) recognised stamp that schools can trust?


  4. What types of support would be most useful for EdTech companies to help them generate more robust and insightful evidence?

Better data on what normal progress looks like, and how to determine whether the intervention has improved on this baseline.

    • We’ve designed and embarked on a five-year study to understand what progress students can expect to make in maths between Year 7 and Year 11 without Sparx Maths, as there is very little information out there already. If we don’t know what normal looks like, how can we assess whether we are making things better (or worse)?

Additional funding to allow for evidence to be collected, such as the EdTech Innovation Fund.

    • As I mentioned earlier, it can be difficult for a company to justify the resource, expertise and funding required for experimentation and evaluation when they need money to be sustainable.

Initiatives to encourage collaboration between suppliers and the end user.

    • At Sparx, it was incredibly important to have partnerships with schools from day one – these collaborations have been key to our collective success. Partnerships should also involve people from every aspect of the user journey: for us, this was headteachers, teachers and students. There could be an initiative or funding scheme available to encourage and/or match schools and EdTech companies.

Initiatives that support start-ups to have evidence-based design processes.

    • Encouraging initiatives such as UCL’s Educate EdWards programme, which gives start-ups access to support and expertise in educational research in order to become more evidence-based.
    • Also, platforms such as EdTech Impact and LendED can help EdTech companies show their impact, and help schools easily see which product will suit them best.

A shared-learning, partnership-led approach to evidence gathering.

    • We should encourage companies with similar ethics and values to get together. I don’t think a competitive approach is the best one for EdTech. It feels wrong – we should work together and share our learnings and experiences. I think we’d all deliver more impact sooner if we did. And aren’t we all in this to improve the education system for schools, teachers and students?


So what is my main takeaway for you today?

The most important element for me is that when I hear people talking about evidence, it often feels like they are referring to a one-off exercise. If you do it as a one-off, you have missed the point. The point is that we need to encourage an evidence-based design culture to ensure impact-centric EdTech. That means it is not a tick-box exercise for marketing, but a continuous process (from day one and forever). Why? Because that is what leads to better solutions that make a real difference and improve education. And that is what EdTech is all about.


For more information on the educational research we have conducted at Sparx, please head to www.sparx.co.uk/publications. There you will find examples of how we can prove the impact Sparx Maths has on student progress and attainment; results of a student questionnaire finding that confidence in maths, enjoyment and perceived importance of maths were key factors in students’ motivation; and how the length of homework affects attainment.