The crucial last-mile of learning ‘analytics’ delivery.

Teachers make the biggest impact on learning when they have the right information, all the time. It's 2018, last time I looked, and this is a simple idea. An 'all the time' capability to access and gain insight from all teaching and learning data is what every educator wants, needs and deserves. So what are the options?

Teaching and learning data has become a critical input into school improvement programs. Knowing where more effective and efficient programs can be implemented requires clear, timely evidence to support teacher instincts. Getting a flexible and 'current' data capability into the hands of teachers remains at the top of the most-wanted, and still elusive, list for many schools. Whoops, I should have included economy on that list as well.

Some schools have structured some of their school data into BI platforms. From what we see and hear, many of these schools are yet to deliver these 'dashboard' systems the last mile, that is, right into the hands of every teacher in their specific teaching and learning context. The last mile is delivered when teacher adoption and ownership drive the initiative. Delivering that last mile to teachers, and the cost of doing so, is what we all need to focus on. By the way, we use BI as a generic acronym rather than in reference to any particular product.

Many schools aligned to BI platforms have linked administrative data, like SIS systems, that spans many areas, even scheduled maintenance. As one Principal recounted, "We have managed to build an expensive rear-view mirror that our drivers are not looking into. They actually wanted a new heads-up display."

People who don't necessarily follow the 'learning data' conversation should think about how they learn and realise that every place with a feedback loop likely presents an opportunity to apply automated learning technology to collect and surface meaningful data. Save the innovation thinking for teaching, because there are now recombinant platforms that let teachers collect and surface meaningful data to support it. Literatu Learning Ledger is a great example.

Megabytes of meaningful and current learning data sit in dozens of surrounding systems like Canvas, Mathletics, Flourishing, Mathspace, Edrolo, Essential Assessment and the routine PAT series. Most of this data is never unified into a single view or BI initiative. We have found over 30 learning and pastoral data sources in schools, many of them critical to any whole-of-student view, that simply don't make it onto the BI radar. Integrating all of them into BI would be expensive and laborious to maintain: add a new data source and the BI data team starts again.
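To make that integration burden concrete, here is a minimal Python sketch of the 'one adapter per source' pattern that unifying this data implies. Every name in it is a hypothetical illustration rather than Literatu's or any BI product's actual schema; the point is simply that each new source means another adapter to build and keep working.

```python
# A minimal sketch (not any vendor's actual schema) of the "one adapter per
# source" work a unified whole-of-student view implies.
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Protocol

@dataclass
class StudentRecord:
    student_id: str
    scores: dict = field(default_factory=dict)      # e.g. {"pat_maths_scale": 128.0}
    engagement: dict = field(default_factory=dict)  # e.g. {"lms_logins_7d": 5}

class SourceAdapter(Protocol):
    def fetch(self) -> list[dict]: ...                    # pull raw rows from one system
    def normalise(self, raw: dict) -> StudentRecord: ...  # map them to the common record

def build_whole_of_student_view(adapters: list[SourceAdapter]) -> dict[str, StudentRecord]:
    """Merge every source into a single record per student."""
    merged: dict[str, StudentRecord] = {}
    for adapter in adapters:  # every new data source = a new adapter to write and maintain
        for raw in adapter.fetch():
            rec = adapter.normalise(raw)
            target = merged.setdefault(rec.student_id, StudentRecord(rec.student_id))
            target.scores.update(rec.scores)
            target.engagement.update(rec.engagement)
    return merged
```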

Personally, I think much of what BI is doing out there in schools is akin to what PowerPoint does in meetings. It makes a few points, has nice UI transitions and charts, and pretty much sends everyone to sleep when the room is dim. Many teachers are conservationists at heart, saving their energy for the next big idea every time they hear the word 'analytics'. Teachers know data is important to inform teaching and learning; they just don't see pre-configured charts helping them improve student learning in their daily context. We do acknowledge that there is a layer of every school's team that loves BI charts; we get that too.

Delivering data ‘the last mile’ means getting every teacher involved, cracking the ‘keep it simple’ mantra around access and personalising insights for each teacher. Value to the teacher must emerge through a whole-of-student view. The ‘last mile’ of delivery is where data has to really work for each educator.

The step up from BI is AI, where data works much harder and smarter. Do teachers see the average grade chart (after they have just done the grading), or do they get an alert identifying where the learning gap is widening for specific skills? I think the latter is their need and preference. These are the big challenges we are working hard to solve without introducing another thing to do or another system to learn. No one needs that. Teachers ask us for data that comes to them. That's the difference we see as the one worth making.
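To illustrate the difference, here is a hedged Python sketch of 'data that comes to the teacher': rather than charting a class average, it flags the specific skills where the gap to an expected mastery level has widened since the last check. The field names, targets and thresholds are assumptions chosen for the example, not our product's actual logic.

```python
# Illustrative only: flag skills where a class's gap to an expected mastery
# level is growing, instead of reporting a single class average.
# Targets, thresholds and field names are assumed for the example.
from statistics import mean

def skill_gap_alerts(results, expected, previous_gaps, widening_threshold=0.05):
    """results: {skill: [student scores 0..1]}, expected: {skill: target 0..1},
    previous_gaps: {skill: gap measured last time}. Returns alert messages."""
    alerts = []
    for skill, scores in results.items():
        gap = expected.get(skill, 0.8) - mean(scores)
        if gap > previous_gaps.get(skill, 0.0) + widening_threshold:
            alerts.append(f"Gap widening in '{skill}': now {gap:.0%} below target")
    return alerts

# Example: the alert surfaces the skill, not the average grade.
print(skill_gap_alerts(
    results={"fractions": [0.55, 0.60, 0.50], "measurement": [0.80, 0.85, 0.90]},
    expected={"fractions": 0.75, "measurement": 0.75},
    previous_gaps={"fractions": 0.10, "measurement": 0.00},
))
```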

We are on a mission to give each educator a unified whole-of-student view of engagement, performance, learning and growth in the simplest and most engaging way possible. Welcome to Learning Ledger. Ledger is alive and well and ready to go for every school. We already have great data integrations that turn on with Ledger, and a heap of opportunities to open more data sources along with personalised student and parent Learning Ledger views.

Join the Evolution!

“If data is not about improving learning and teaching in a class or student context, why are teachers looking at it?”

Mark Stanley Sept 12, 2018

 

Get Insights from NAPLAN data – in 3 Screens

In response to recent media coverage of flat or backward NAPLAN results, I corresponded with a reporter. Here's what I wrote:
The perspective I can offer is one that focuses on how schools get the data as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (from screenshots of our software). That said, my point is not to flog our software, but to highlight the value of EASY ACCESS to data insights and how, without this, the lack of growth is not a surprise but is, in fact, what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows 6 years of NAPLAN Band achievement across Years 3, 5, 7 & 9. You can see that the real story here is one of No Growth – the results are essentially flat. This is the story your report told today. The reason I see this slightly differently is that we have schools that are just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to easily see this data (and the next screens). So the point is this: without an easy way to unpack band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements? Schools and teachers worked very hard, either doing the same things they have always done or guessing what needed fixing.
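For anyone who wants to run the same sanity check on their own export, here is a rough Python sketch. The column names are placeholders I have assumed, not the real NAPLAN export format; the idea is simply to average band achievement by test year and year level and see whether the trend moves at all.

```python
# Illustrative sketch: mean NAPLAN band by test year and year level,
# to see whether results are trending up or essentially flat.
# Column names ("test_year", "year_level", "band") are assumed placeholders.
import pandas as pd

df = pd.read_csv("naplan_results.csv")  # hypothetical school export
trend = (
    df.pivot_table(index="test_year", columns="year_level",
                   values="band", aggfunc="mean")
      .round(2)
)
print(trend)                # rows: the 6 test years; columns: Years 3, 5, 7, 9
print(trend.diff().mean())  # average year-on-year change per year level; near zero = flat
```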
2) Unpacking the Data – from Skill problems to identifying Subskills 
No matter how hard teachers work, doing more of the same doesn't necessarily address gaps in their students' skills. A second screen shows how the data from the massive spreadsheets can be presented in a way that goes from seeing the problem to seeing what needs targeting. Here, "traffic light colours" signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know where to target their teaching:
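For a rough sense of what sits behind a traffic-light view, here is a small Python sketch. The thresholds and skill names are illustrative assumptions of mine, not official NAPLAN cut-offs or our software's actual rules.

```python
# Illustrative only: map each skill's average percent-correct to a traffic-light
# colour so weak skills stand out before drilling into subskills.
# Thresholds (60% / 80%) are assumed for the example, not official cut-offs.
def traffic_light(avg_correct):
    if avg_correct < 0.60:
        return "red"     # clear problem: target this skill
    if avg_correct < 0.80:
        return "amber"   # watch: some students struggling
    return "green"       # on track

skills = {"Number & Algebra": 0.54, "Measurement & Geometry": 0.71, "Statistics": 0.83}
for skill, avg in skills.items():
    print(f"{skill}: {traffic_light(avg)} ({avg:.0%} correct)")
```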
  ​
3) Give teachers Insight into the students right in their classes!
 
The fact that NAPLAN data is often 1-2 years old by the time it reaches schools and public attention makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), schools find out about the results toward the end of their year with those students, and here we are, almost upon the 2018 NAPLAN, with MySchool only now updated with 2017 NAPLAN data. How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the "Teacher Dashboard", where a school's NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher. Yes, the data may still be a year old, but now the classroom teacher can accommodate and differentiate what they do based on their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather report card for writing, I can see which 4 students went backward from their 2015 to 2017 NAPLAN tests and which 5 achieved above expected growth targets. Then, when I click the NAPLAN Skill Focus card (and its reverse side), I get details about the top 4 (then 8 when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest. Again, clicking on the card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.
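For a sense of the slicing involved, here is a hedged Python sketch. The column names, the 2015-to-2017 comparison fields and the expected-growth figure are assumptions for illustration, not the dashboard's actual implementation.

```python
# Illustrative only: cut a school's NAPLAN data down to one class, flag who went
# backward vs. who beat an expected growth target, then sort students by one skill.
# Column names and the expected_growth value are assumed for the example.
import pandas as pd

def class_view(df: pd.DataFrame, class_name: str, skill: str, expected_growth: float = 40.0):
    cls = df[df["class"] == class_name].copy()
    cls["growth"] = cls["score_2017"] - cls["score_2015"]
    went_backward = cls.loc[cls["growth"] < 0, "student"].tolist()
    above_target = cls.loc[cls["growth"] > expected_growth, "student"].tolist()
    by_skill = cls.sort_values(skill)[["student", skill]]  # weakest on this skill first
    return went_backward, above_target, by_skill
```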

So, to sum up, I see a big part of the problem as being that classroom teachers have not been able to access the right kind of information easily enough to make use of NAPLAN data (albeit a "snapshot" and a "diagnostic assessment being used as a high-stakes test" – two legitimate complaints against NAPLAN). In fact, we have run into a situation where one leading state association for schools helps schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only this year getting such access. We work with them to take charge of their remediation programs and initiatives, and we expect to see upward trends as they continuously improve their teaching and learning practices.

I'd love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government – not something your reporting has ever done, but these bash-ups tend to be what's buzzing in the media. Perhaps a better, more productive approach is to use smart software to provide data insights?