Late last year, the University of Western Sydney’s Whitlam Institute commissioned a national survey of over 8,000 educators on the impact of the annual National Assessment Program for Literacy and Numeracy (NAPLAN).
There were very few surprises in the results, which found that many teachers believed NAPLAN had negatively affected their practice and narrowed the curriculum:
- More than 70 per cent of respondents agreed that NAPLAN means they ‘teach more to the test’ and spend more class time on areas they knew would be tested
- Just over two thirds believed the focus on NAPLAN had reduced the timetabled time for subjects other than literacy and numeracy in their schools
- 64 per cent agreed that there had been a reduction of ‘face to face’ time with students
- 55 per cent thought NAPLAN had narrowed their range of teaching strategies.
We should be shocked and dismayed to learn that over half of our primary schools are preparing their students with weekly practice tests five months in advance of NAPLAN. It’s not the test but the teaching that counts here, and good teachers know this.
The most depressing finding for me was that fewer than 50 per cent of teachers spent any time looking through the data at their school to drive improvement.
It defies logic that we have introduced a national assessment measure designed to diagnose students on basic skills to drive improvement, when over half of our schools aren’t actually using the data.
The obvious question is: why are we spending an inordinate amount of time preparing kids for NAPLAN when many of our teachers and schools don’t actually use the data to inform their practice?
At our 2012 Ann D Clark lecture, Professor Andy Hargreaves warned educators to ‘beware the tool’:
‘Beware when you bring in a tool, even if you think you are using the tool optimistically, because, at some point in the future, people will take the tool and use it for a completely different purpose that you never imagined it would be used for.’
This is true of NAPLAN. We have to put the tool (the data) in its rightful place. We have stigmatised NAPLAN and turned it into something it is not. Far too much emphasis is placed on NAPLAN as a judgment of school performance, rather than on helping to inform the bigger, richer picture of how our students are tracking. And, most importantly, how we are helping them to improve.
I want to make a point that seems to get lost in the discussion of declining performance, of withholding some students from sitting the tests to boost school scores, and so on: these tests, particularly in our NSW experience, are tests of basic skills. They are located within the curriculum and are designed to determine what a student should be able to achieve in literacy and numeracy at a particular age level. We are not asking kids to sit an astrophysics test for MIT here. The tests simply check whether students have attained the skills they should have attained through the teaching of the curriculum. Again, it’s not the test – it’s the teaching that’s important here.
The agreement reached by State and Federal Education Ministers last year to move NAPLAN testing online in 2016 will hopefully improve the usefulness of the data for informing learning and teaching. ‘Just-in-time’ data can give teachers and leaders the means to better diagnose students’ learning needs and areas for improvement, so they can be more responsive in literacy and numeracy.
But we are beginning from a narrow starting point. We need to find new tools to assess our students across a broader range of areas, including the development of key 21st century skillsets like critical thinking and collaboration. I will be very interested to hear what new ACARA head Rob Randall presents this month on this very idea.
The point I want to make here is that regardless of how good the data is, if we aren’t using it, it’s no good.
Data, in and of itself, like technology or any other tool in a teacher’s kit, won’t make the difference – it’s neutral. If we want to get serious about school improvement, we need to actually use the data to inform our work and change our practice.
Hopefully, UWS’s survey will help influence the necessary changes so our schools can use NAPLAN data to this effect.