Acting on Evidence


This blog post is in response to a piece written by Dr Deborah Netolicky. It would be worth reading it to give you the context of my response below. But if you want the quick version:

  • Social Ventures Australia, the Commonwealth Bank and the Education Endowment Foundation released the Aussie Teaching & Learning Toolkit, which draws on a large body of research in an attempt to rank educational practices by their effect (or lack thereof), their cost, and the “security” of the research behind them.
  • Fairfax Media published a piece about the Toolkit entitled The 10,000 Pieces of Research That Will End the Homework Wars.
  • I said, “Nonsense” in my section on the TER Podcast, and lots of others said similar, and Dr Deborah Netolicky blogged about it far more eloquently than I did.
  • SVA responded in a blog post saying, “Chill out” (it was far more reasoned than that, but you get the gist).
  • Dr Deborah Netolicky then responded with more words of caution around the use of meta-analyses on education research as well as the value of being publicly challenged.

Phew… you with me?

As has already been pointed out, there are many who hold reservations about the validity of meta-analyses. Dylan Wiliam pointed out several issues in a comment on a post I wrote last year about Visible Learning in the Aussie documentary Revolution School.

But even if we assume – just for a moment – that we could place 100% faith in the “padlock” system, we are then presented with the question of how we act in accordance with evidence.

There are countless examples in society of where, even when presented with fairly substantial evidence, people still make “interesting” decisions – whether they be jurors in a courtroom, parents who choose to run the gauntlet with measles, smokers, or dare I say it, leaders of the free world.

So it’s interesting to note that, in his E4L blog post, John Bush states:

“We do not envision the Toolkit as a resource that should dictate or direct professional decisions in schools. Instead, we hope school leaders and teachers will use it to start discussions with their peers and to help inform their professional judgement with research evidence.”

For what it’s worth I find these words encouraging – not that E4L – or anyone else for that matter – need or want my blessing – but I believe these words also serve to highlight one of the fundamental issues in education.

In my current research I’m reflecting on the point that schools have never had more research, policies and programs aimed at student well-being, yet the NSW Centre for Education Statistics & Evaluation presents findings that suggest these are simply not being implemented or, if they are, that they are missing the mark.

This could be because, as Stephen Ball suggests in his 2006 book Education Policy and Social Class, the disparity between policy, programs and student experience arises because such documents, whilst providing goals or outcomes, rarely tell you what to do. They may provide links to further resources or options for action, but a response from individual schools still needs to be put together. The contextual nuance means that the success or otherwise of these responses is hard to predict. He states that the enactment of such texts “relies on things like commitment, understanding, capability, resources, practical limitations, cooperation and (importantly) inter-textual compatibility” (Ball, 2006, p. 47).

Furthermore, Ball suggests that the more ideologically abstract a policy is – and one might argue this describes the concept of well-being, and perhaps learning too – the less likely it is to be accommodated into the practice of a school.

To be clear, I’m not anti the Toolkit, just as I’m not anti Visible Learning or meta-analyses per se; rather I’m urging – as I think most are now – a careful, contextually appropriate and nuanced approach to school improvement.

