A learning eco-system

Last week the Gonski Institute for Education held a forum to discuss school assessment. Clearly there are growing concerns around the world regarding the negative impact of high-stakes testing. So much so that Singapore has now taken the step of banning high-stakes testing of students below the age of 11. Australia hasn’t taken this route yet, but as one primary principal suggested, NAPLAN should be relegated to the recycling bin.

Opportunities such as these are valuable, but I think we’ve got hold of the wrong end of the stick. Why, you ask? Because assessment and testing are often viewed as an add-on to the learning; something done at the end of a course of work, a term or the final year of schooling. For me, assessment and testing have to be viewed in the broader context of a learning ecosystem. In that ecosystem, teachers continually use data and feedback to accelerate growth so that nothing falls out of balance. In an algorithm-driven world, predictive analytics opens a whole new way of looking at student and teacher performance.

Assessment and testing are an integral part of the wider learning framework, in which each member of the learning community contributes to its sustainability. 

Some years ago, I had the pleasure of visiting Monet’s garden in Giverny, France. I often use the garden as a metaphor in my keynote presentations to describe the learning framework. Like Monet’s garden, it is in a state of ‘permanent evolution’.

Just as Monet did, passionate teachers oversee its design and ensure the right conditions for growth. It is complex and hard work but the value as Monet instinctively knew, comes in seeing the masterpiece as a whole, not by elevating individual elements.


4 thoughts on “A learning eco-system”

  1. Hi Greg,
    High-stakes testing is symptomatic of the accountability era of dry economics that has tended to drive governments around the first world. Despite the protestations of schools for at least two decades, to my knowledge we have failed dismally to persuade policy makers to shift the focus of assessment and data towards driving the learning and teaching agenda.

    What is very sad for a country such as Australia is that we have had, and continue to have, an almost insatiable desire to follow everything from the UK and the USA when it comes to school education.

    The real issue, the way I see it now in my 29th year as a teacher, 13 of them as a Principal, is that we have become very good at assessing and testing and collecting data to meet the needs of the individual at the zone of confusion or zone of proximal development; however, we continue to fail in knowing how to address what the data is telling us. The real game as I see it is knowing how to teach to meet the needs of the individual student. The best way I have seen to combat the teaching deficit is to employ master teachers to work alongside classroom teachers to address the data that is revealed.

    Teachers are found out by assessments such as NAPLAN in that students are quite often unable to transfer their knowledge of a concept when it is presented as a question in a different form. For example, in primary school probability, Stage 2 students are usually given examples involving coins and heads and tails. NAPLAN asks questions that address the same outcomes the teachers have covered, but in a form that might require the probability knowledge to be applied differently, such as to lotto balls. Hence, the biggest challenge is teachers modelling and explicitly teaching in such a way that students can apply their understanding of a concept when it is expressed in a different way in tests such as NAPLAN.

    Therefore, the way I see NAPLAN is that it is more of an IQ assessment than what it replaced, the Basic Skills Test. As much as we deride politicians and policy makers, they need some sort of instrument that accounts for the dollars spent on schools and shows that teachers are actually doing their job. We need to get better at explaining the narrative so that we can address what we need to address while appeasing politicians and policy makers, who need an instrument that provides accountability to their constituents.

    The garden in Giverny analogy is correct to a point, but quite often the conditions for growth really mean teachers knowing how to teach. Knowledge of the zone of confusion or proximal development is only part of the equation; the real improvement agenda should be about the pedagogical content knowledge (PCK) of teachers. The Singapore, Shanghai and Finland teaching models have expert teachers observe, analyse the data with teachers, and plan with those teachers how to address what the data is telling them. Almost like a master and apprentice. I honestly believe school budgets can be tweaked to redistribute money away from programs such as Reading Recovery and plough it into master teachers who could most influence the PCK of teachers working in the zone of confusion. This could equally be applied in a primary school setting, and with a subject-centric focus in secondary schools.

    1. Great insights which resonate with me. The master teacher construct will require a maturity to be continuously open to feedback and to new ways of organising learning.
      No standardised assessment regime that I know of has ever resulted in sustained, scalable, embedded improvement of systems and individual schools. It is a work in progress.

  2. Great article Greg – agree that testing should be used as a tool by teachers. It is not the be-all and end-all and should be used to help guide teaching and learning.

    The great thing – as a parent – is that you at least get an objective test to see where your child is at. I do think this is critical.

    1. Thanks for your comments. NAPLAN and the like are static processes whereby schools get the report long after the test date. What we need is real-time data and feedback that relates to the learning experience itself, not post-event outcomes. Technology already provides a good opportunity to do this, with wider ranges of data to help understand student progress. You could, for example, look at a test result alongside a student’s attendance pattern: erratic attendance could depress performance on the test without reflecting the student’s overall capabilities.
