NAPLAN: friend, foe or on the fence?

Late last year, the University of Western Sydney’s Whitlam Institute commissioned a national survey of over 8,000 educators on the impact of the annual National Assessment Program – Literacy and Numeracy (NAPLAN).

There were very few surprises in the results, which found that many teachers believed NAPLAN had negatively affected their practice and narrowed the curriculum:

  • More than 70 per cent of respondents agreed that NAPLAN means they ‘teach more to the test’ and spend more class time on areas they know will be tested
  • Just over two thirds believed the focus on NAPLAN had led to timetable reduction for other subjects outside of literacy and numeracy in their schools
  • 64 per cent agreed that there had been a reduction of ‘face to face’ time with students
  • 55 per cent thought NAPLAN had narrowed their range of teaching strategies.

Parents can buy off-the-shelf practice tests. While these are not officially endorsed publications of the NAPLAN program, and are produced independently of Australian governments, their very existence proves that these are high-stakes tests. Is this sending the right message?

We should be shocked and dismayed to learn that over half of our primary schools are preparing their students with weekly practice tests five months in advance of NAPLAN. It’s not the test that counts – it’s the teaching that is critical here, and good teachers know this.

The most depressing finding for me was that fewer than 50 per cent of teachers spent any time looking through the data at their school to drive improvement.

It defies logic that we have introduced a national assessment measure designed to diagnose students on basic skills to drive improvement, when over half of our schools aren’t actually using the data.

The obvious question is: why are we spending an inordinate amount of time preparing kids for NAPLAN when many of our teachers and schools don’t actually use the data to inform their practice?

At our 2012 Ann D Clark lecture, Professor Andy Hargreaves warned educators to ‘beware the tool’:

‘Beware when you bring in a tool, even if you think you are using the tool optimistically, because, at some point in the future, people will take the tool and use it for a completely different purpose that you never imagined it would be used for.’

This is true of NAPLAN. We have to put the tool (the data) in its rightful place. We have stigmatised NAPLAN and turned it into something it is not. Far too much emphasis is placed on using NAPLAN to judge school performance rather than on informing the bigger, richer picture of how our students are tracking and, most importantly, how we are helping them to improve.

I want to make a point that seems to get lost in the discussion of declining performance, of withholding some students from sitting the tests to boost school scores, and so on: these tests, particularly in our NSW experience, are tests of basic skills. They are located within the curriculum and are designed to determine what a student should be able to achieve in literacy and numeracy at a particular age level. We are not asking kids to sit an astrophysics test for MIT here. The tests simply check whether students have attained the skills they should have attained through the teaching of the curriculum. Again, it’s not the test – it’s the teaching that’s important here.

The general agreement by State and Federal Education Ministers last year to move NAPLAN testing online in 2016 will hopefully improve the usefulness of the data to inform learning and teaching. ‘Just-in-time’ data can provide teachers and leaders with the means to better diagnose students’ learning needs and areas for improvement so they can be more responsive in the areas of literacy and numeracy.

But we are beginning from a narrow starting point. We need to find new tools to assess our students across a broader range of areas including the development of key 21st century skillsets like critical thinking and collaboration skills. I will be very interested to hear what new ACARA head Rob Randall presents this month around this very idea.

The point I want to make here is that regardless of how good the data is, if we aren’t using it, it’s no good.

Data, in and of itself, like technology or any other tool in a teacher’s kit, won’t make the difference: it’s neutral. If we want to get serious about school improvement, we need to actually use the data to inform our work and change our practice.

Hopefully, UWS’s survey will help influence the necessary changes so our schools can use NAPLAN data to this effect.

16 thoughts on “NAPLAN: friend, foe or on the fence?”

  1. I find this to be a strange post. The first half is critical of the system and the second says we should use it anyway. Why would teachers use the data if they don’t believe it is in their students’ interests? Some teachers realise the system can’t adapt to their needs and, as a result, will not bend to meet the needs of ACARA heads or system heads like you.

    1. Thanks for your comment Tim. I might not have explained clearly enough. It is not about the data. NAPLAN, like any other data set, is just a tool teachers use. My point was about ensuring we are using the tool in the right way. NAPLAN, particularly with the planned changes to make the data more accessible for teachers, can be a useful tool (if used correctly) to help teachers gain an understanding of where their students are in the fundamental areas of literacy and numeracy. But this is only a part of the picture and we need to look beyond simple measures to identify ways to assess and track students’ performance more broadly.

  2. Greg,

    Thanks for the post.

    As I see it, the basic problem with NAPLAN data is that it’s not “just in time” data to use as a basis for inquiring into either our instructional practice as teachers or the differentiated curriculum content used to engage and provoke learning. You simply cannot wait five months for data before acting. We have far better online tools and data systems for this now. What NAPLAN does give us is some great big-picture data views, and it enables some triangulation of data sets to ensure some reliability.

    Perhaps, as you suggest, the computerisation of the test will enable better “just in time” data.

    I think you allude to a bigger issue – the use, or non-use, of data sets in schools to inform inquiry processes around adjustments to instruction and curriculum content.

    I am engaging in embedding data informed inquiry practices across the school (Data Wise). I have posted a clip on my blog featuring a conversation with a faculty member of the Harvard Data Wise team after he visited Elsternwick PS late last year.

  3. Greg, many teachers are choosing not to analyse the NAPLAN data because they know it’s weak data for making judgments about individual students and schools. NAPLAN gives population data, not diagnostic data. The data can be useful for asking ‘whole population’ questions such as: Are girls performing better than boys? Are urban students performing better than rural students? Governments spend a lot of money on education and they are entitled to feedback, but we can get the kind of information they need by testing a sample of Year 3, 5, 7 & 9 students every three years or so. Testing every student every year is a massive waste of taxpayers’ money — estimated to be over $100 million annually.

    To improve learning, we have to improve teaching. To improve teaching, we have to invest in teachers. Imagine the outcome if the $100 million was spent on teacher professional development instead. At the moment, we are not investing in teachers; we are depressing them by using very weak data for high-stakes purposes. Many highly respected educators I see listed in the right-hand column of your blog would not agree with the way NAPLAN is being used/abused.

    The NAPLAN data are not robust enough to be used as diagnostic data and there are huge errors of measurement (I guess that’s why the technical reports aren’t published). The work of Prof Margaret Wu and others has demonstrated serious problems with the data (see especially Papers 1 & 2). Wu has shown that approximately 16% of students will appear to have gone backwards between testing years when in fact they have made normal progress. It’s an appalling situation.

    The other problem, of course, is that the data are not available until 5 months after the testing. It’s fairyland stuff to think that the tests can be delivered online any time soon. No schools have adequate numbers of computers or good enough networks to cope with the demand. In Victoria, the Ultranet fell in a heap and failed on the first day! That was over 2 years ago and it’s still plagued with problems. Most teachers are not using it. Well, that was only Victoria — imagine trying to get students doing the NAPLAN tests right across Australia. The current technology in schools will not cope with such a demand.

    I love my job. I’m passionate about education and have been teaching for 47 years. I’ve never seen anything as destructive to good teaching and learning as NAPLAN. It makes me very sad, but it also makes me angry to see what it’s doing to kids and teachers and principals. There is a woeful lack of positive return and an alarming degree of destruction — destruction of morale, dedication and good will.

    David Hornsby

    1. I agree with many of your points David. Any tool or dataset is subject to misuse and NAPLAN, as you say, is being used to make judgements about schools and populations, but this is not the reason it was introduced. Like its predecessor – the NSW basic skills tests – NAPLAN was designed as a diagnostic tool to measure individual students on aspects of literacy and numeracy every two years. While there are some flaws, notably the delay in getting the data into teachers’ hands in time to make a difference, we can’t throw the baby out with the bathwater here. I agree wholeheartedly, as you can see from my previous posts, that we must work to improve teacher quality, but it’s not an either/or. Good teaching relies on evidence, and NAPLAN can provide part of the richer picture of tracking individual students against their past performance and also against national and international benchmarks. It’s about how we choose to use it.

      1. Thanks Greg, I appreciate your comments.

        You say that NAPLAN was designed as a diagnostic tool, but NAPLAN is NOT diagnostic. The small number of test items are drawn from a huge pool of thousands and thousands of possible items, and even then it’s falsely assumed that the small number of items are tested validly and reliably. Researchers in Queensland and Victoria have shown, for example, that the spelling items are not assessing spelling validly or reliably! (See Buchanan & Bartlett, 2012; Willett & Gardiner, 2009; Snowball, 2012. I can send you their papers if you wish.)

        I’m a Victorian, but what I know about the NSW basic skills tests tells me that it was far superior to NAPLAN. It’s NAPLAN that worries me — not rich, authentic assessment.

        You also say that NAPLAN was not introduced to make judgements about schools, but that’s exactly what it’s being used for. You can’t be happy about that.

        Finally, you say that NAPLAN can track individual students against their past performance, but that is demonstrably not so. PLEASE read Margaret Wu’s work, and a paper I wrote with Margaret (Paper 2): NAPLAN can NOT be used to track students accurately or reliably.


        Interesting, isn’t it, that politicians and senior educational bureaucrats are concerned about an apparent decline in standards in recent years — a decline that seems to coincide with the introduction of NAPLAN. I hope someone does some research on that.

      2. David, I can see you are very passionate about this issue and the work you have done with Margaret Wu makes very valid points. I think we are on the same page here. Too often, we make way more out of NAPLAN than it merits. While politicians may make NAPLAN high stakes – good teachers don’t. If every politician was as questioning as you are about the validity and use of the data, then we would be less likely to see it being misused. It’s not the conclusions we draw from the data that matter, but how we interrogate the data and the questions we ask to inform practice. Even if more work can be done to refine the tool and make it more useful for teachers, I think teachers and school leaders can still draw value from an external national assessment for their own local contexts. Hattie nails it when he talks about effect sizes: everything has an effect on improving learning but it is about degrees. High stakes = low effect. Great discussion, thanks for your input.

      3. Thanks Greg. I agree that good teachers see the problems with NAPLAN. Sadly, though, many are harassed by their principals, because the principals are harassed by their superiors. It’s all such wasted effort.

        You have an impressive educational record, Greg. I respect your work and your willingness to discuss issues so openly. Thank you.


  4. I agree, we should all be dismayed when teachers spend inordinate amounts of time teaching for the NAPLAN tests. Yet this has become the distressing reality in our schools.

    After viewing the reality in our schools, I have totally changed my thinking about this testing, and now see it as the greatest limiter of student achievement and the greatest barrier to teachers making positive changes to pedagogy.

    I see the test results as having benefits only for giving a broad picture at the whole-school level, and even then only when considered alongside results over three or more years.

    I can’t see how teachers or administrators can consider NAPLAN results a diagnostic tool. There are considerable margins of error, and the tests are not designed with reliable subscales. The OECD Review of Evaluation and Assessment in Australia (2011) makes this point regarding NAPLAN. Many teachers are indeed being encouraged to use the data from the tests in ways that were never intended.

    The way forward I believe is to concentrate teachers’ efforts on something that has been shown to make a difference – formative assessment accompanied by effective feedback. Unfortunately, endeavours to enable this change can be severely hampered by mixed messages about NAPLAN data.

    1. My reading of the OECD Review of Evaluation and Assessment was more positive about NAPLAN, while acknowledging its limitations and propensity for misuse. I agree we can’t lose sight of the main game here. If teachers, school leaders and/or administrators are more concerned about NAPLAN than overall performance or improving learning and teaching we have a major problem. Scrapping NAPLAN won’t necessarily change that, so we have to ensure the profession educates policy makers and parents in this regard and clear up some of those mixed messages.

  5. Hi Greg,
    As usual a great read. As a recent arrival from the UK, and its test-driven education system (the National Curriculum literacy and numeracy tests there are done at Years 2, 6 and 9), I can see many parallels between the systems.

    As a parent, it was easy to see how UK schools became very focused on the test in Year 6, as the primary school league tables were published focusing on the value added between Years 2 and 6 – and so many principals and parents saw the Year 6 results as the key measure of success. As a result, many schools massively slimmed down the curriculum in the three-month run-up to the Year 6 tests in order to boost the results.

    The diagnostic data from the NAPLAN tests could be really useful to teachers – if the analysis were delivered fast enough to serve as assessment for learning, rather than too-late summative data. Or even if parents were given the data in a form that allows them to help their children, rather than the current format, which is difficult for many parents to understand and arrives too late to be really useful.

    Because of the focus on the high stakes nature of the tests, I always reassured my children that they should try their best, but not worry too much about the result (I’ll admit, I told them that the test was really of their teacher/school, not them). It helped them to cope with the stress that surrounded the test period in their school (which impacted teachers, leaders, and other parents).

    Perhaps the Scottish model – testing a sample of students instead of the whole cohort in order to create the national and regional reports, without allowing individual schools to be ranked – would work just as well in terms of creating the metrics the system needs. Then the focus could move on to discussing what diagnostic tests and data teachers actually need (but which don’t need to be published externally), and in what ways teachers can be supported in the effective use of learning and assessment data.

    If only 50% of teachers are using the current assessment data, there’s a big cultural gap to bridge, one way or another.


  6. Hi Greg, Tim, Mark, David, Barbara and Ray
    Thanks for the rich discussion. I agree with most of your comments and know ‘proper’ student-focused change will eventually come! My experience as a retired K(P)–Y12 teacher, now consulting in curriculum differentiation and breaking open the Disability Standards for Education, leads me to believe many school leaders are influenced by the demands of national data trends but could be more focused on the points already raised above: teacher professional learning in formative assessment and feedback (Hattie); quality whole-school implementation of differentiation; and the removal of pathognomonic beliefs.
    School admin personnel who attend my seminars comment on the ‘lack of time’ factor; however, that is a time management issue that needs addressing. Quality leadership in schools produces quality student learning.

  7. Since you ask Greg … it looks like you are firmly sitting on the fence … I’m happy to be a foe at the moment because NAPLAN only tests that which is ‘easy to test’ and the feedback comes too late for teachers to do any refining of personalised learning for individual students.

    The data can certainly identify deficiencies in a teacher’s program/strategies, and some teachers may use their data effectively to improve their teaching in the following year with (most likely) a new cohort of students.

    Like you and a number of commenters here, I am looking forward to an assessment framework which also tests what is ‘important to test’, and you have rightly alluded to the 21C skill set … This is laudable, and will lead to a diversified, more relevant curriculum … one which assesses what Australia needs in our future citizenry … Creativity, Collaboration and Critical Thinking … Bring on an assessment framework which tests for these 3 Cs … as well as the vital Literacy and Numeracy skills …

    How might this happen, you ask? Well, I was holding my breath for a while with the (now old) news that Barry McGaw and the OECD were working on such an assessment framework … but I turned blue and the trail went cold … now it seems the best we can hope for is a quicker response rate from online testing of the (same old, same old) literacy and numeracy … and a further narrowing of the curriculum and even more ‘teaching to the test’.

    And please … don’t suggest I hold my breath for Gonski to enlighten Australia’s 19th century assessment framework. We are simply not measuring key competencies.

    On a related issue … PISA and TIMSS also lead us down the garden path by testing only what is easy to measure … as far as science is concerned, more important than comparative PISA/TIMSS data – and more telling about our future as an innovative, tech-savvy nation – is the simple percentage of our primary students who go on to complete secondary and/or tertiary studies in Science/Technology … so when Minister Garrett gives an earnest interview about how important NAPLAN/PISA/TIMSS data is … he is talking BS.

    This generation of students needs to be inspired to be life-long learners, not a generation of pig-farmers who only know how to weigh pigs.

    1. I can’t disagree with your position. I’m not sitting on the fence but making observations about how things are in schooling today.
      It is not just assessment that needs change but the whole learning and teaching construct, one rooted firmly in an understanding of today’s world. This requires a whole new narrative.
      In the short term I think we need to stop thinking about assessment and focus on the learner’s performance. Assessment would form only a part of our full understanding of the learner’s performance. When we talk assessment, we default to narrow instruments and their potential misuse by competing agendas.
      In the meantime, our best hope is great teaching by intelligent teachers who know the issues.
