In response to recent media coverage of flat or backward NAPLAN results, I engaged in a correspondence with a reporter.  Here’s what I wrote:
The perspective I can offer focuses on how schools get the data, as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (from screenshots of our software). That said, my point is not to flog our software, but to highlight the value of EASY ACCESS to data insights and how, without this, the lack of growth is not a surprise but is, in fact, what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows 6 years of NAPLAN Band achievement across Years 3, 5, 7 & 9. You can see that the real story here is one of No Growth – the results are essentially flat. This is the story your report told today. The reason I see this slightly differently is that we have schools that are just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to easily see this data (and the next screens). So the point is: without an easy way to unpack the band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements? Instead, schools and teachers have worked very hard either doing the same things they have always done or guessing at what needs fixing.
2) Unpacking the data – from spotting skill problems to identifying subskills
No matter how hard teachers work, doing more of the same doesn’t necessarily address gaps in their students’ skills. Another visualisation takes the data from the massive spreadsheets and presents it in a way that goes from seeing the problem to seeing what needs targeting. Here, “traffic light colours” signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know where to target their teaching:
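For readers who prefer code to a screenshot, here is a minimal sketch of the traffic-light drill-down idea. The skill names, scores and thresholds below are invented purely for illustration; this is not our actual data model or implementation:

```python
# A minimal, illustrative sketch of the "traffic light" drill-down:
# colour each skill by how far the cohort sits below an expected score,
# then "click the bubble" to list the weakest subskills behind a flagged skill.
# All names, scores and thresholds here are hypothetical.

EXPECTED = 0.70  # assumed expected proportion of students demonstrating a skill

skills = {
    "Number & Algebra": {
        "cohort_score": 0.52,
        "subskills": {"Fractions": 0.41, "Place value": 0.58, "Patterns": 0.66},
    },
    "Measurement & Geometry": {
        "cohort_score": 0.74,
        "subskills": {"Units of measurement": 0.71, "Shape": 0.78},
    },
}

def traffic_light(score, expected=EXPECTED):
    """Map a cohort score to a traffic-light colour."""
    if score >= expected:
        return "green"
    return "amber" if score >= expected - 0.10 else "red"

for skill, data in skills.items():
    colour = traffic_light(data["cohort_score"])
    print(f"{skill}: {colour}")
    if colour != "green":
        # Drilling down: list the subskills, weakest first,
        # so teachers can see what to target.
        for sub, score in sorted(data["subskills"].items(), key=lambda kv: kv[1]):
            print(f"  -> {sub}: {score:.0%}")
```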
3) Give teachers Insight into the students right in their classes!
 
NAPLAN data is often 1–2 years old by the time it reaches school and public attention, which makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), schools only find out about the results toward the end of their year with those students, and here we are almost upon the 2018 NAPLAN while MySchool has only now been updated with 2017 NAPLAN data. How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the “Teacher Dashboard”, where a school’s NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher. Yes, the data may still be a year old, but now the teacher can accommodate and differentiate what they do based on their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather report card for writing, I can see which 4 students went backward from their 2015 to their 2017 NAPLAN tests and which 5 achieved above expected growth targets. Then, when I click the NAPLAN Skill Focus card (and its flip side), I get details of the top 4 (then 8 when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest. Again, clicking on a card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.
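To make the “slicing and sorting” concrete, here is a rough sketch of the kind of query that sits behind a class-level dashboard view. The column names, scores and the expected-growth threshold are assumptions for illustration only, not our actual code or data:

```python
# An illustrative sketch of slicing NAPLAN results for one class:
# filter to the teacher's class, flag each student's writing growth
# between two test years, and sort so the students needing help surface first.
# Column names, scores and the growth threshold are hypothetical.
import pandas as pd

results = pd.DataFrame({
    "student":      ["Ava", "Ben", "Cam", "Dee", "Eli"],
    "class":        ["7A", "7A", "7A", "7B", "7B"],
    "writing_2015": [478, 512, 430, 465, 540],
    "writing_2017": [470, 575, 455, 500, 538],
})

EXPECTED_GROWTH = 40  # assumed expected scale-score growth between the two tests

def class_view(df, class_name):
    """Return one class's students with growth flags, weakest growth first."""
    cls = df[df["class"] == class_name].copy()
    cls["growth"] = cls["writing_2017"] - cls["writing_2015"]
    cls["went_backward"] = cls["growth"] < 0
    cls["above_expected"] = cls["growth"] > EXPECTED_GROWTH
    return cls.sort_values("growth")

print(class_view(results, "7A"))
```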

So, to sum up, I see a big part of the problem being that classroom teachers have not been able to easily access the right kind of information in order to use the NAPLAN data (albeit a “snapshot” and a “diagnostic assessment being used as a high-stakes test” – two legitimate complaints against NAPLAN). In fact, we have run into the situation where one of the leading state associations for schools takes the approach of helping schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only this year getting such access. We work with them to take charge of their remediation programs and initiatives, and we expect to see upward trends as they continuously improve their teaching and learning practices.

I’d love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government – not something your reporting has ever done, but these bash-ups tend to be what’s buzzing in the media. Perhaps a better, more productive approach is to use smart software to provide data insights?