Archive for the ‘Data’ Category

The speed of things

According to Yong Zhao, one of the biggest flaws of PISA is that it “directs the world’s attention to the past instead of pointing to the future.” Yet education systems and policy makers rely on international assessments such as PISA to gauge student performance in maths, science and reading.

In World Class Learners, Zhao observes that, based on data comparisons, countries that perform extremely well on international tests, such as China and Taiwan, tend to score lower in perceived entrepreneurial capabilities. The good news is that Australia scores relatively high when it comes to entrepreneurship.

Harvard Business Review had an interesting article last week on the fastest moving digital economies. The authors developed an index to gauge how countries compare in terms of their readiness for the digital economy. Looking at performance over five years (2008-2013), they assigned 50 countries to four trajectories: Stand Out, Stall Out, Break Out and Watch Out.

Australia currently sits in the ‘Stall Out’ quadrant – having achieved a high level of evolution but losing momentum and at risk of falling behind. Singapore, Hong Kong, Ireland, Israel, Estonia, the US and New Zealand are among the countries in the ‘Stand Out’ quadrant. These are countries that continue to invest in world-class digital infrastructure, encourage entrepreneurship and have governments that support and encourage the growth of the digital ecosystem.
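The four trajectories can be read as a simple two-axis classification: a country’s current level of digital evolution against its momentum over the period. A minimal sketch of that idea, where the scale, thresholds and scores are hypothetical illustrations rather than the HBR authors’ actual methodology:

```python
def classify_trajectory(level, momentum, level_cut=50.0):
    """Assign a country to one of the four HBR-style trajectories.

    level: current state of digital evolution (hypothetical 0-100 scale).
    momentum: rate of change over the five-year window (positive = gaining).
    The cut-off values here are illustrative assumptions only.
    """
    if level >= level_cut:
        # Highly evolved: still gaining, or stalling?
        return "Stand Out" if momentum >= 0 else "Stall Out"
    # Less evolved: rapidly improving, or at risk?
    return "Break Out" if momentum >= 0 else "Watch Out"

# Hypothetical scores for illustration only
print(classify_trajectory(80, 2.5))   # gaining at a high level -> Stand Out
print(classify_trajectory(75, -1.0))  # losing momentum -> Stall Out
```

The same two-axis framing is what makes the idea of a ‘Digital Education Index’ below thinkable: level of evolution on one axis, momentum on the other.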

The authors’ advice for countries like ours is to invest in innovation, look globally for new markets and find ways of attracting ‘talented young immigrants’ to revive innovation quickly.

If as Zhao says schools must transform into global enterprises capable of educating globally competent entrepreneurs, then we need new measures that look forward – beyond traditional boundaries.

I wonder whether we need to be looking at schooling in the same way countries assess readiness for the digital economy. Do we need a Digital Education Index based on key drivers? For example, when schools deliver a curriculum (especially electives), is that delivery aligned with ‘Stand Out’, or is the delivery of traditional subjects keeping them aligned with ‘Stall Out’?

Digital is no longer about hardware or software.  It isn’t about the number of computers or iPads in classrooms.  When we talk about digital education it encompasses the mindsets, policies, users, trends and infrastructure that support this dynamic and ever-evolving ecosystem.  An ecosystem that our learners are a part of and will inevitably shape based on their needs and ever-changing expectations.

The authors of the HBR article predict that the “next billion consumers to come online will be making their digital decisions on a mobile device – very different from the practices of the first billion.”

How will schooling be different from last year or even last week so we don’t end up in “Stall Out”?


The ‘Fitbit’ for education

The words ‘evidence-based’ are at the centre of the ongoing push for student learning and school improvement. This is a welcome focus that puts the evaluation of both learning and teaching at the centre of the work schools and systems do. We also need to consider what this evidence base is and how robust the evidence really is.

Over January, I was watching the Australian Open tennis. Several of the top players have new technology embedded in their racquets. A computer in the handle of the racquet captures the entire match in real time – every shot, its speed, the angle of returns, response times and so on. At the end of each match the player and his or her team sit down and analyse the data in order to improve the player’s performance. You can only imagine the size of the data and the power it brings to the process. Using big data is not new. What is new, however, are the ways we can collect and aggregate the data in real time.

My daughter gave me a Fitbit for Christmas. From what I gather I wasn’t alone in receiving this gift. Everywhere I go I see people with Fitbit bracelets collecting all sorts of data about the wearer’s physical activity during the day and sleep patterns at night. This capability is very quickly revolutionising the medical field, the health and fitness business and our lifestyles.

The collection of personal data is not confined to physical data: the New Yorker article ‘We Know How You Feel’ explores how computers are now able to accurately read our emotions. Many companies are starting to use this information to target advertising based on our emotional reactions, and we will likely see the advent of the “emotion economy”. Think about the applications of ‘emotional data’ for marketers, business, health providers, even the education sector.

We are at a point where data and technologies are stretching our thinking on both the nature and use of evidence. What really surprises me is how much incidental discussion and reflection takes place amongst those who use the technology. The ideas and suggestions from others on how a user might use the data, and even improve their stats, widen the knowledge pool.

By comparison, the evidence base for learning and teaching is very thin. Most of the evidence is test-based, after-the-fact analysis. High-stakes testing distorts the improvement picture because much of the data is open to misuse and misunderstanding. Assessment tasks are generally used to make judgements about student performance and ignore the critical role of the teacher. Paper-based tests are still the lifeblood of the student assessment process, yet they are often removed from the actual teaching-learning process they seek to evaluate. Even at formal parent-teacher feedback sessions the data sets are limited, and the exchange is a one-way process from teacher to parent.

We need to identify how we collect data on the teaching process. While we are seeing increasing instances of teacher observation, instructional walks and data walls, much of the data collected is not in real time and not always linked to student achievement. Good teachers are already constantly assessing student understanding during class time. The question is, how can we use technology to assist teachers in collecting this data more accurately and effectively? Imagine if we could collect real time information about students’ emotional reactions to the learning and teaching process and use this to inform teachers’ work in real time.

I think we have much to learn about evidence of improving teaching and learning from the developments in big data and the ways that data is captured and analysed. The power of the evidence collected lies not in the volume of the data or the power of the device, but in the analysis of the data. Individuals and teams can access the data when, where and how they want and need it. It helps provide real-time feedback to the user and encourages collaboration. Ultimately it has the capacity to put evaluation of learning and teaching in the centre of the process, not as an outcome at the end of the process.

I’m not sure if there is an education Fitbit yet, but I’m sure one will be here soon. As the devices get more powerful, more embedded and more connected, it won’t be too far into the future. Having this sort of capability for all learners, but particularly teachers, will, I think, be a game changer for our understanding of teaching in a contemporary world. We will see a dramatic shift in collaboration and real-time intervention.

It is clear that these developments are unstoppable. I hope the education profession will embrace these new capacities and show how they can and should be used. We can’t afford to let the “emotion economy” do it for us.

Innovating workplaces

The slow and steady demise of manufacturing in Australia has sparked interesting debate in recent times over competitiveness in a global economy. I was interested in the discussion following Toyota’s recent announcement and whether workplace arrangements had jeopardised the big car manufacturers’ presence in Australia.

The need for contemporary practices also affects the education sector. It seems these discussions have always been framed around productivity and performance, but I think we are still looking at the problem through the wrong lens.

Daniel Pink proposes an interesting theory of 20th century motivation versus 21st century motivation and the changing nature of work in a knowledge age. The knowledge economy requires a new mindset and skillset. Innovation is key, and key to innovation is human capital.

I heard Professor Bill Harley from the University of Melbourne talking recently about the need for workplace innovation in Australia.  He said research around the world shows that there are three things that make productive workplaces:

  1. Employees have an appropriate skillset (teachers up-skilling and re-skilling)
  2. Approaches that allow people to collaborate and solve problems (de-privatised practice)
  3. A motivated workforce at every level (managing and rewarding performance)

Professor Harley reflected on the fact that a strategic approach to implementing these practices has been absent from Australian workplaces.

The practices that have prevailed in education over the past century are obstructions to innovation. We need to change our practices by changing culture.  The three points Professor Harley refers to demonstrate the shift from industrial to knowledge, from convention to evidence.

Ironically, Toyota is one of the companies recognised for its innovative culture.  There are numerous case studies on what drives Toyota’s success but it comes down to investment in its people (skillset) and organisational capabilities (problem-solving and intrinsic motivators).

Listening to Professor Harley made me think about education in terms of our manufacturing industry. Only for us, it will be our students, not car manufacturers, who will walk away in search of something more relevant and rewarding.

A different level of insight

Following on from last week’s blog post on big data, I had the great pleasure of meeting researcher and educator George Siemens recently. George is the Associate Director of the Technology Enhanced Knowledge Research Institute at Athabasca University in Canada. He was also one of the first people to facilitate MOOCs.

George has been immersed in learning and online networks for such a long time that he presents a different level of insight. He shared some of that insight when I asked him about the opportunities big data offers education.


Framing the right questions

In the past few weeks I’ve read at least three articles on ‘big data’. We are moving rapidly from knowledge capture to data generated insight and innovation.  I think that the questions being posed for business in the age of data can be equally applied to education.

How can we create value for our students and teachers using data and analytics? Data is helping companies like Google and Amazon develop new models of delivery, providing customers with personalised, targeted information about their likes and dislikes, and with opportunities they may not previously have known about. Can this sort of data help education develop new models of personalised delivery? The answer for me has to be yes, or we risk irrelevancy in the schooling space.

Schooling will benefit from looking at the innovative businesses that are capitalising on the opportunities being powered by the Internet – companies that are learning from the data and tools available and transforming what they do and how they do it. Imagine if schools had access to student data from pre-kindergarten, or if primary schools shared student data with high schools. We wouldn’t have to re-invent the wheel or start from square one because a student changed schools. Critical information would be available for teachers, who could then pick up the ball, so to speak, and identify new learning challenges. Imagine capturing data on career progression ten or more years after students exit school and using that data to inform planning and learning opportunities for current students.

There is a great article in this month’s Harvard Business Review about using data to drive growth.  It’s well worth a read.  The authors pose five key questions for businesses.  These are questions that deserve our immediate attention.

1. What data do we have?
2. What data can we access that we are not capturing?
3. What data could we create from our operations?
4. What helpful data could we get from others?
5. What data do others have that we could use in a joint initiative?

Good data helps us frame good questions and good questions will help us find new ways of individualising content and personalising learning.  We need to be working smarter not harder in a connected online world.  

Know your learners

Here’s a question – do you believe all students can learn? If you said yes and you’re a teacher or leader, are there examples at your school of students who aren’t achieving gains in their learning? How do you reconcile the two? Here’s another question – if you were asked to list ten things that you knew about each learner in your class or school, could you? More importantly, would they know you knew these ten things about them? If you said yes, then you are doing well at knowing your learners. If you said no, then you would be wise to read Lyn Sharratt and Michael Fullan’s book “Putting FACES on the Data”.

These are the questions that Lyn Sharratt asked us to reflect on when she was here earlier this month. This is Lyn’s second visit to Parramatta and we are grateful for her assistance in helping us put faces on our own data. It’s a strategy that takes personalised learning to a much deeper level because it requires us to continually and collectively analyse student learning and plan the next sequence. It sounds simple but, as Lyn says, it is hard, hard work. It requires a relentless focus on a shared goal.

As former superintendent of Curriculum and Instruction in the York Region, Canada, Lyn says that literacy became their goal and their system mantra. They asked themselves what they expected of their literacy graduates and, once they determined this, worked backwards. Stephen Covey refers to this as beginning with the end in mind. It required coming up with a definition that everyone could live with from K-12. ‘Literacy’ was defined as language and mathematical competency. They then asked: what are the foundational literacy skills necessary in the 21st century? These were the ability of graduates to think, understand, analyse and critically reflect.

Lyn says they worked hard at embedding the definitions and professional learning so that every single teacher was working toward the same goal – literacy. It paid off; they achieved significant gains in Year 1 reading levels. They analysed data relentlessly and looked closely at what was working in the ‘high focus schools’. As Lyn and Michael drilled down, they discovered these schools hadn’t taken their eyes off literacy. In the midst of flux, they were able to stay focused. The other schools blamed everything from a change in principal to a leaky roof for their inability to maintain focus.

Lyn’s experience shows that implementation is often our Achilles’ heel. We have a tendency to move on to something new every year rather than stay the course. As Lyn puts it, we need to move beyond the modelling stage to the doing, otherwise nothing actually happens in schools. This means looking at the data, knowing the learner and asking what comes next. We want our learners to be independent, but we need teachers and leaders to be interdependent when it comes to implementation. If something is fully implemented in your school, it means, according to Lyn, that 90% of teachers are doing it as part of their practice. The short of it is we all need to know the same things about our work. We all need to know our learners.

On the last page of Lyn’s workbook is the quote: You can’t lead where you won’t go.  Lyn has given us permission to say no to the things that won’t make a difference to students and to go where we may not have been before.

NAPLAN: friend, foe or on the fence?

Late last year, the University of Western Sydney’s Whitlam Institute commissioned a national survey of over 8,000 educators on the impact of the annual National Assessment Program – Literacy and Numeracy (NAPLAN).

There were very few surprises in the results which found that many teachers believed NAPLAN had negatively affected their practice and narrowed the curriculum:

  • More than 70 per cent of respondents agreed that NAPLAN means they ‘teach more to the test’ and spend more class time on areas they knew would be tested
  • Just over two thirds believed the focus on NAPLAN had led to timetable reduction for other subjects outside of literacy and numeracy in their schools
  • 64 per cent agreed that there had been a reduction of ‘face to face’ time with students
  • 55 per cent thought NAPLAN had narrowed their range of teaching strategies.

While the survey isn’t an official publication of the NAPLAN program and was produced independently of Australian governments, its findings demonstrate that these are high-stakes tests.

Parents can buy off-the-shelf practice tests. Is this sending the right message?

We should be shocked and dismayed to learn that over half of our primary schools are preparing their students with weekly practice tests five months in advance of NAPLAN. It’s not the test but the teaching that is critical here, and good teachers know this.

The most depressing finding for me was that fewer than 50 per cent of teachers spent any time looking through the data at their school to drive improvement.

It defies logic that we have introduced a national assessment measure designed to diagnose students on basic skills to drive improvement, when over half of our schools aren’t actually using the data.

The obvious question is: why are we spending an inordinate amount of time preparing kids for NAPLAN when many of our teachers and schools don’t actually use the data to inform their practice?

At our 2012 Ann D Clark lecture, Professor Andy Hargreaves warned educators to ‘beware the tool’:

‘Beware when you bring in a tool, even if you think you are using the tool optimistically, because, at some point in the future, people will take the tool and use it for a completely different purpose that you never imagined it would be used for.’

This is true of NAPLAN. We have to put the tool (the data) in its rightful place. We have stigmatised NAPLAN and turned it into something it is not. NAPLAN is given far too much emphasis on judging school performance rather than on helping to inform the bigger, richer picture of how our students are tracking. And, most importantly, how we are helping them to improve.

I want to make a point that seems to get lost in the discussion of declining performance, withholding some students from sitting the tests to boost school scores, and so on: these tests, particularly in our NSW experience, are designed to test basic skills. They are located within the curriculum and are designed to determine what a student should be able to achieve in literacy and numeracy at a particular age level. We are not asking kids to sit an astrophysics test for MIT here. This is a check to see whether they have attained the skills they should have attained through the teaching of the curriculum. Again, it’s not the test – it’s the teaching that’s important here.

The general agreement by State and Federal Education Ministers last year to move NAPLAN testing online in 2016 will hopefully improve the usefulness of the data to inform learning and teaching. ‘Just-in-time’ data can provide teachers and leaders with the means to better diagnose students’ learning needs and areas for improvement so they can be more responsive in the areas of literacy and numeracy.

But we are beginning from a narrow starting point. We need to find new tools to assess our students across a broader range of areas including the development of key 21st century skillsets like critical thinking and collaboration skills. I will be very interested to hear what new ACARA head Rob Randall presents this month around this very idea.

The point I want to make here is that regardless of how good the data is, if we aren’t using it, it’s no good.

Data, in and of itself, like technology or any other tool in a teacher’s kit won’t make the difference – it’s neutral. If we want to get serious about school improvement, we need to actually use the data to inform our work and change our practice.

Hopefully, UWS’s survey will help influence the necessary changes so our schools can use NAPLAN data to this effect.
