Dr Ken Boston, former director-general of the NSW Department of Education and former head of England’s curriculum authority, has been in Australia sharing his views on national testing and league tables.
While he supports national testing as a diagnostic rather than deterministic tool, Dr Boston believes league tables are in themselves a ‘crude’ methodology – lacking depth and failing to capture the diversity of schools and the context in which they operate.
Speaking to the Australian Primary Principals Association this week, Dr Boston said he favours:
“…rich reports which explain why a school may be performing less well, not just simplistic league tables. Don’t massage the data, no jiggery pokery, no smoke or mirrors, just present the data as it is. My belief is this would offer greater public accountability than league tables.”
No-one is opposed to greater transparency in education so long as it makes a real difference to students and their learning.
What has become evident is that league tables are more a measure of government performance in educational reform. It may tick the boxes for voters but does little to address the substantive issues and challenges facing Australian education today.
The comments of two western Sydney principals in the Australian (12/8/09) strike at the heart of this matter.
‘They [NAPLAN results] don’t show how the school is performing in terms of how it’s providing for the local community.’ (Phil Walker, Kings Langley Public School)
‘If you don’t work on their social and emotional wellbeing to start with, then they don’t have the facilities to learn.’ (Cheryl Walsh, St Bernadette’s Primary, Lalor Park)
League tables indicate which schools are meeting national minimum standards (or not) but they don’t offer explanations or provide solutions to improve the quality of learning and teaching.
For me, the focus of school improvement needs to be on the collection and analysis of a broad set of data, including school participation in the community and student well-being.
To do this well, systems and schools need the skills and instruments to be able to effectively analyse data and frame questions that lead to improved practice and learning outcomes.
They need a practice of inquiry and reflection on learning which is too often ignored in a climate of test, test and test. Testing then drives the learning agenda, not the nature and quality of the teaching. I referred to Elmore’s ideas on this in a previous post. A school community committed to improving outcomes for all students knows its students’ performance in both broad and deep ways. In their professional kit bag you will find data on a range of areas.
Walk into any high-performing school and you will see an openness to inquiry, public accountability and a willingness to inform parents on their child’s progress. Providing teachers and schools with the skill set and tools to improve learning will score ticks in all the right boxes.
5 thoughts on “Ticking the right boxes”
My, Greg, that’s quite a can of worms you’ve opened. Good job too. I’d like to hear more of your thoughts on how those things can be assessed. Satisfaction is a curly one. Would there be agreed criteria on what should or should not lead to satisfaction? If not all participants are themselves reasonable, how do we manage that data reasonably?
Nice to see you on twitter too!
Peter, good point. The devil is always in the detail, and while we use words like satisfaction, we do so to indicate the substantive areas and ways we want to look at information.
The issue is around finding both relevant and timely data, across a range of domains, that shows the performance of schools, teachers and students, in order to give an accurate picture of student learning. This involves a process of collaboration with staff and a dialogue on what these imprecise terms can mean. We also need to be clear about the purposes for which this data is used. We have no better example of failure than the recent discussion around league tables.
The central issue for me is always around reliable data, and my experience tells me that reliable data involves a both/and approach. This means that while we are clear about purposes and processes, it is not necessary that all data tools be developed collaboratively. This is not new territory. There are many schools and systems that have moved into this area and have developed a degree of sophistication around collecting and reporting on this data.
Our approach has always been to have a dialogue about good learning and teaching first, and to discuss data in this context, rather than talk about data and how we want it reported (e.g. league tables). This means, then, that we locate assessment, testing and student performance within the learning process, not external to it.
Good teachers know the difference.
Are these polls a start to measuring “satisfaction”?
* National English curriculum poll
* General Curriculum poll
* School holiday poll
* Website Features Poll
* Artificial food colourings poll
* Children’s gym poll
* Changes to HSC poll
* SMS alert from schools poll
* Technology and bullying poll
* HECS discount poll
You put this very well, Greg. Professional dialogue is the most important aspect of teaching and learning, and it is what I am finding drives whole-school improvement and action. Allowing teachers to have open and honest dialogue about what is actually happening within the classroom allows for the natural flow of teachers wanting data not only about student performance, but about their own teaching. When teachers reflect through constructive, positive questions and criticism, they are more than willing to take the next step in their own professional learning. It’s really a cycle of improvement: student improvement, shown through assessment and data, allows for continued professional growth.
Thanks for the thorough response Greg. I wonder what would happen to the excellent processes and perspectives you support if you moved on to other challenges.
So much of how we manage this power depends on people: not so much how much they know or their specific skill sets, but who they are. That seems to underlie the details more than anything. When we assess America’s future in negotiating world stability, we consider who Obama and McCain are more than the processes they undertake. I’m not comfortable with that, but it seems to be how we most effectively summarise how that role will be filled.
When you use the phrase ‘good teachers’, what assessment do you use? Could you put it to paper so well that a stranger using it to assess the same teachers would generate identical results? I think it’s you that makes that such a good process, and that duplicating it is difficult.
League tables have terrified me, but I’ve never felt more sure that they are managed well than in reading your take on how they fit into the big picture. If someone asked me how to use league tables well, I wouldn’t ask them to read your blog, or suggest that they talk to you; I’d say ‘fly Greg to your school’. I expect they’d learn how you make decisions mostly through who you are in those processes.
The problem with league tables is the flat transference of power and social leverage, simultaneously, into many (often unskilled) hands. This in turn can leverage the community in their decision making. When this tips off a cycle of hoop-jumping and box-ticking, it’s hard to stop. Many parents don’t care for a variety of tests; they just care about the ones that count. And who decides which ones count? The next school… and the next school… and the university… and the job…
As a primary educator I realise that no matter how I honour deep and broad assessment tools, the bottom line is that the next school probably won’t. They have good reason not to. Their next school won’t either, because the universities don’t.
I like that you’re fighting the good fight against that.