How to structure K-12 learning data to create student learning stories – Part 2


Open up learning data so that teachers are free to see it

Part two of this three-post series looks deeper inside the data closet of K-12 schools and the possibilities it holds. Lurking in every corner is data from multiple sources that, once unleashed, competes for attention. All of a sudden, BIG DATA syndrome lurches toward an unsuspecting audience. Keeping data terminology simple is of paramount importance. Post 3 will discuss how to run with the energy you unlock in this post and will deliver a framework for teachers.

I often ask teachers who come along to “data for learning” sessions in schools, “What data really matters to you?” and “What data visibility would help you to help your students?” Surprisingly, most teachers don’t lead with “All I need is Power BI and a consultant to develop charts and data analytics”. Resoundingly, teachers see their priority as doing the best they can to help their students achieve the absolute best they can. As true as that is, I never really get an answer. “What have you got?” is the most common response. It’s like a standoff: “Show us what you have and we will take a look.” It is a reasonable response, considering many schools don’t have a system for organising, packing and unpacking data in a way that everyone feels culturally a part of. This, unfortunately, describes many of the to-and-fro data machinations schools go through.

More often than not, data in schools is not imagined as a fountain of collaborative knowledge. It is thought of as secret silos of collective recount, just another thing to manage and transpose. Not all teachers know where to go to access data, or indeed what they could ask for. Online systems like OARS and SCOUT, and many others, open a limited window into data that external providers want you to be happy with. These systems have multiple logins and access points to remember and never really bring data back into a practical frame of reference for teachers. Data is described and displayed as envisioned by developers. For example, SCOUT can tell you about one student at a time, not a class or cohort. In most schools, fewer than 20% of teachers access these systems, which means 80% of teachers either don’t have the time or find access too hard. Not a good run rate for any measure of progress.

So, how do we progress? There is an element of unconscious incompetence when it comes to knowing what data exists and what value it holds; another way of saying this is that many teachers don’t know what they don’t know. Until a ‘Kondo’ declutter event happens, both in the physical data closet and in our thinking, there will be no new windscreen through which teachers can explore a hunch or find a new direction. The current rear-vision mirror, useful for lane changing in traffic, will persist. I recommend a simple Kondo-style approach to data organising, starting with a common framework to build momentum.

Data Domains

We can’t advance data conversations without a framework that underpins the data. We can’t keep talking about ‘stuff’, nor can we declutter our ‘stuff’ into a containerised system if we don’t have containers. I believe there are six primary Data Domains in every school. That’s six ways to declutter our data.

  1. Diagnostic  
  2. Behavioural – Attendance – Compliance  
  3. Pastoral wellness – Participation  
  4. Academic growth
  5. Targeted skills development – Formative: reading, writing, maths – and the ‘apps’ for that
  6. Observation – Cognitive capture

The Data Sources I highlighted in Post 1 all slot neatly into these Domains. In all categorisation challenges, the key phrase to remember is ‘less is more’.
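For readers who like to see the structure written down, here is a minimal sketch of the six Domains as fixed containers, with a few Data Sources filed into them. It is purely illustrative Python: the Domain names come from the list above, while the source names and the mapping are my own placeholder assumptions, not any school’s actual catalogue.

```python
from enum import Enum

class DataDomain(Enum):
    """The six Data Domains every Data Source is filed under."""
    DIAGNOSTIC = "Diagnostic"
    BEHAVIOURAL = "Behavioural / Attendance / Compliance"
    PASTORAL = "Pastoral wellness / Participation"
    ACADEMIC_GROWTH = "Academic growth"
    TARGETED_SKILLS = "Targeted skills development"
    OBSERVATION = "Observation / Cognitive capture"

# A hypothetical declutter: each incoming Data Source is filed into exactly one Domain.
source_domains = {
    "NAPLAN": DataDomain.DIAGNOSTIC,
    "PAT": DataDomain.DIAGNOSTIC,
    "Attendance roll": DataDomain.BEHAVIOURAL,
    "Wellbeing survey": DataDomain.PASTORAL,
}

# The container question replaces the 'stuff' question.
diagnostic_sources = [s for s, d in source_domains.items() if d is DataDomain.DIAGNOSTIC]
print(diagnostic_sources)  # ['NAPLAN', 'PAT']
```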

Finally, I get to build out my metaphor for Post 2. De-boned, these Domains represent the six faces of a Rubik’s Cube, and the cube represents the complete student learning data story of each school.

When teachers ask, “What teaching and learning data do we have?”, the answer is that we have six Data Domains. Imagine each one as a face of a Rubik’s Cube.

Data Sources

NAPLAN, PAT and ALLWELL are great examples of Diagnostic Data Sources. Every Data Source logically belongs in one of the six Data Domains; a Diagnostic source becomes a row or box on the Diagnostic face of the cube. Each Data Source has a context, a year and other surrounding metadata that makes it either unique or simply more of the same thing. Let’s colour the Diagnostic Domain orange and make each Diagnostic Data Source a row on that face. The rows could be organised by year, or by more granular learning breakdowns like type of test – Reading, Writing, Comprehension (that’s the easy part, done by data people).

There it is. We have a base structure into which the initial clutter of Diagnostic Data Sources is organised. When teachers want Diagnostic information, there is one Domain in which all Diagnostic Data Sources live, and everyone uses the same terminology. Make sense? Now it’s time to go one more level.

Data Elements

The clutter and detail found in most Data Sources actually sits in the Data Elements – the pieces of information and specific content contained inside each Data Source. NAPLAN, for example, is a very rich source of many Data Elements: from Band to Scaled Score to question-level correctness, multiple Data Elements exist within each Data Source. Don’t worry about the elements for now. Good data storage will offer you choices around these Data Elements, hopefully in a big, tick-box menu format.
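To make the Source/Element distinction concrete, here is a small hypothetical sketch of one Diagnostic Data Source and the menu of Data Elements it exposes. The element names follow the NAPLAN examples above; the class and field names are mine and purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    """One Data Source: filed under a Domain, with its tick-box menu of Data Elements."""
    name: str
    domain: str
    year: int
    elements: list[str] = field(default_factory=list)

naplan_2018 = DataSource(
    name="NAPLAN",
    domain="Diagnostic",
    year=2018,
    elements=["Band", "Scaled Score", "Raw Score", "Question correctness"],
)

# The tick-box menu: teachers pick the elements they want; the structure stays the same.
chosen = [e for e in naplan_2018.elements if e in {"Band", "Scaled Score"}]
print(chosen)  # ['Band', 'Scaled Score']
```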

Your initial burning question can now be expanded confidently, knowing you have the right ‘stuff’ in the right place:

  1. I am after Diagnostic information (Domain)
  2. From NAPLAN 2018 (Data Source)
  3. …and I would like to see Band, Scaled Score and Raw Score for Reading (Data Elements).

Now, all you have to decide is the volume of data you want (a minimal query sketch pulling all three levels together follows the list below). Do you want this information for:

  • A Student or Students?
  • A Class, or your Classes? (Because you are a teacher with a roster, that would really help.)
  • A Cohort / House / Year or other aggregation?
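Put together, the three-step question plus the volume decision reads like a single query. The sketch below is a hypothetical helper, not any real platform’s API – `fetch` and the scope labels are assumptions – but it shows how little a teacher actually has to specify once the framework is in place.

```python
from typing import Iterable

def fetch(domain: str, source: str, elements: Iterable[str], scope: str) -> list[dict]:
    """Hypothetical query: Domain -> Data Source -> Data Elements, at a chosen volume.

    'scope' is the volume decision: a student, a class, or a cohort/house/year.
    A real platform would resolve this against its own store; here we simply echo the request.
    """
    return [{"domain": domain, "source": source, "elements": list(elements), "scope": scope}]

# "I am after Diagnostic information, from NAPLAN 2018, and I would like Band,
#  Scaled Score and Raw Score for Reading, for my Year 7 English class."
rows = fetch(
    domain="Diagnostic",
    source="NAPLAN 2018",
    elements=["Band", "Scaled Score", "Raw Score (Reading)"],
    scope="Class 7EN1",  # hypothetical class code
)
print(rows)
```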

How to think about data across Domains and Sources, and then run free

Imagine this new Rubik’s Super-Cube in your hands. You have data from all six Data Domains in sight. For a student you can see across every Domain of data, just as you can for a class or cohort. The cube metaphor implies that all volumes of data across all six Domains are available at any level of inquiry. Be it Student, Class, Cohort, Subject, House or Year, the query is the same; the only difference is the amount of information you want to look into.

With this metaphorical magic in your head, you can now frame any question using your general intelligence, something you have and computers don’t.

“Can I see Pastoral data on Personal Development (a Data Source row from the Pastoral Domain – yellow) against the Reading results from NAPLAN 2018 (a Data Source row from the Diagnostic Domain – orange)?” I usually add ‘please’, but computers don’t know that word either.

Imagine twisting your Pastoral – Personal Development Data Source row across the face of the Diagnostic – Reading Data Source. You have your view side by side and the cognitive ability to run with it. As with a real Rubik’s Cube, your twist-and-turn combinations are up to your imagination.
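Under the hood, that twist is simply a join on the student: one Data Source row from the Pastoral face lined up against one from the Diagnostic face. Here is a hypothetical illustration using pandas; the column names and sample values are invented for the example.

```python
import pandas as pd

# Hypothetical rows from two Domains, keyed by student.
pastoral = pd.DataFrame({
    "student_id": [101, 102, 103],
    "personal_development": ["High", "Medium", "Low"],   # Pastoral Domain (yellow)
})
diagnostic = pd.DataFrame({
    "student_id": [101, 102, 103],
    "naplan_2018_reading_band": [7, 5, 4],               # Diagnostic Domain (orange)
})

# The 'twist': line the two Data Source rows up side by side, one row per student.
side_by_side = pastoral.merge(diagnostic, on="student_id")
print(side_by_side)
```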

 

The only remaining question is which Data Elements you would like to see from each Data Source. This defines the level of detail you want from each source to quench your insatiable thirst for learning growth insights.

Everything improves from a solid and consistent starting framework. These examples are just some simple starting gymnastics that you can do with organised Data Domains and Data Sources.

To recap the main points in this post:

  1. There are six Domains of data in K-12 schools.
  2. In each Domain, there are multiple Data Sources. Start by finding a few, but understand that when a new Data Source arrives, it will fit into a Domain. That’s the Kondo ‘less is more’ way!
  3. In each Data Source, there are Data Elements to see. This is the subject of Post 3.

The real excitement comes when your school has a platform approach to data, one that allows everyone to confidently explore data up, down and across the cube, the way they want to. I started at the top of this post with an image of the end game.

In my final post, I will talk about how all of these cube formations, with Domains and Sources completely mixed, still make sense. Sometimes the answers you seek come from pieces of data across multiple Domains, Sources and Elements. Why not? The structure of the data should support the interests you have.

When you have a framework in motion, you can construct any combination of inquiry. The most important feature of any system is the ease with which you can snap back to a fully solved puzzle and start again. That would be like having Marie Kondo come back into the room and declutter all over again. Yes indeed, this is mandatory!

Mark Stanley is CEO and Founder of Literatu: www.literatu.com

The crucial last-mile of learning ‘analytics’ delivery.


Teachers make the biggest impact on learning when they have the right information, all the time. It’s 2018, last time I looked, and this is a simple idea. Having ‘all the time’ access and insight across all teaching and learning data is what every educator wants, needs and deserves. So what are the options?

Teaching and learning data has become a critical input into school improvement programs. Knowing where more effective and efficient programs can be implemented requires clear and timely evidence to support teacher instincts. Getting a flexible, current data capability into the hands of teachers remains at the top of the most-wanted, and still elusive, list for many schools. Whoops, I should have included economy on that list as well.

Some schools have structured some of their data into BI platforms. From what we see and hear, many of these schools are yet to deliver these ‘dashboard’ systems the last mile, that is, right into the hands of every teacher in their specific teaching and learning context. The last mile is only delivered when teacher adoption and ownership drive the initiative. Delivering that last mile to teachers, and the cost of doing so, is what we all need to focus on. By the way, we use BI (business intelligence) as a generic term rather than in reference to any particular product.

Many schools aligned to BI platforms have linked admin data, like SIS data, that spans many areas, even scheduled maintenance. As one Principal recounted, “We have managed to build an expensive rear-view mirror that our drivers are not looking into. They actually wanted a heads-up display.”

People who don’t necessarily follow the ‘learning data’ conversation should think about how they themselves learn: every place where there’s a feedback loop likely presents an opportunity to apply technology that automatically collects and surfaces meaningful data. Save the innovation thinking for teaching, because there are now platforms that allow teachers to collect and surface meaningful data to support it. Literatu Learning Ledger is a great example.

Megabytes of meaningful, current learning data are found in dozens of surrounding systems like Canvas, Mathletics, Flourishing, Mathspace, Edrolo, Essential Assessment and the routine PAT series. Most of this data is not unified into a single view or BI initiative. We have found over 30 learning and pastoral data sources in schools, many of them critical to any whole-of-student view, that simply don’t make it onto the BI radar. Integrating all of them into BI would be expensive and laborious to maintain; add a new data source and the BI data team starts again.

Personally, I think much of what BI is doing out there in schools is akin to what PowerPoint does in meetings. It makes a few points, has nice UI transitions and charts, and pretty much sends everyone to sleep when the room is dim. Many teachers are conservationists at heart, saving their energy for the next big idea every time they hear the word ‘analytics’. Teachers know data is important to inform teaching and learning; they just don’t see pre-configured charts helping them improve student learning in their daily context. We do acknowledge that there is a layer of every school’s team that loves BI charts; we get that too.

Delivering data ‘the last mile’ means getting every teacher involved, cracking the ‘keep it simple’ mantra around access and personalising insights for each teacher. Value to the teacher must emerge through a whole-of-student view. The ‘last mile’ of delivery is where data has to really work for each educator.

The step up from BI is AI, where data works much harder and smarter. Do teachers see the average-grade chart (after they have just done the grading), or do they get an alert identifying where the learning gap is widening for specific skills? I think the latter is their need and preference. These are the big challenges we are working hard to solve without introducing another thing to do or another system to learn. No one needs that. Teachers ask us for data that comes to them. That’s the difference we see as the one worth making.
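As a toy illustration of that difference, the sketch below checks each skill’s class result against a target and raises an alert only where the gap has widened since the last assessment point. The thresholds, data and function name are all assumptions on my part; a real system would run something like this against live assessment data and push the alert to the teacher.

```python
def widening_gaps(previous: dict[str, float], current: dict[str, float],
                  target: float = 0.7) -> list[str]:
    """Return alert messages for skills below target where the gap grew since last time."""
    alerts = []
    for skill, score in current.items():
        prev = previous.get(skill, score)
        if score < target and score < prev:
            alerts.append(f"Gap widening in '{skill}': {prev:.0%} -> {score:.0%} (target {target:.0%})")
    return alerts

# Hypothetical class-level skill mastery from two assessment points.
last_term = {"Fractions": 0.66, "Inference": 0.74, "Spelling": 0.81}
this_term = {"Fractions": 0.58, "Inference": 0.72, "Spelling": 0.83}

for alert in widening_gaps(last_term, this_term):
    print(alert)  # Gap widening in 'Fractions': 66% -> 58% (target 70%)
```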

We are on a mission to give each educator a unified whole-of-student view of engagement, performance, learning and growth in the simplest and most engaging way possible. Welcome to Learning Ledger. Ledger is alive and well and set to go for every school. We already have great data integrations that turn on with Ledger and a heap of opportunities to open more data sources along with personalised student and parent learning ledger views.

Join the Evolution!

“If data is not about improving learning and teaching in a class or student context, why are teachers looking at it?”

Mark Stanley Sept 12, 2018

 

Get Insights from NAPLAN data – in 3 Screens

In response to recent media coverage of flat or backward NAPLAN results, I engaged in correspondence with a reporter. Here’s what I wrote:
The perspective I can offer is one that focuses on how schools get the data as opposed to beating up the test, the schools or the government.
I can tell this story in three pictures (from screenshots of our software). This said, my point is not to flog our software, but to highlight the value of EASY ACCESS to data insights and how, without this, the lack of growth is not a surprise, but is, in fact, what we should expect.
All the screens are of actual NAPLAN data, but anonymised so as not to compromise confidentiality.
1) Flat results.
This visualisation shows six years of NAPLAN Band achievement across Years 3, 5, 7 and 9. You can see that the real story here is one of no growth – the results are essentially flat. This is the story your report told today. The reason I see this slightly differently is that we have schools that are only just starting to use our software, so 2017/18 is THE FIRST YEAR they have been able to easily see this data (and the next screens). So the point is: without easy access to unpack the Band scores into skills and subskills, how were schools and teachers EXPECTED to make improvements? Schools and teachers worked very hard either doing the same things they have always done or guessing what needs fixing.
2) Unpacking the Data – from Skill problems to identifying Subskills 
No matter how hard teachers work, doing more of the same doesn’t necessarily address gaps in their students’ skills. A second visualisation shows how the data from the massive spreadsheets can be presented in a way that moves from seeing the problem to seeing what needs targeting. Here, “traffic light” colours signal problems in specific skills, and clicking one of the bubbles reveals the subskills that were assessed. NOW teachers know exactly what to target in their teaching.
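For readers curious about what sits behind a view like that, here is a rough sketch of the traffic-light idea: map each skill’s class mastery rate to a colour and keep the subskill detail one click (here, one dictionary) deeper. The thresholds and the mastery figures are invented for the example, not taken from the screenshots.

```python
def traffic_light(mastery: float, amber: float = 0.6, green: float = 0.8) -> str:
    """Map a class mastery rate to a traffic-light colour (thresholds are assumptions)."""
    if mastery >= green:
        return "green"
    if mastery >= amber:
        return "amber"
    return "red"

# Hypothetical skill -> subskill mastery rates for one class.
skills = {
    "Reading: interpreting texts": {"Inference": 0.48, "Main idea": 0.71},
    "Numeracy: number": {"Fractions": 0.83, "Place value": 0.62},
}

for skill, subskills in skills.items():
    overall = sum(subskills.values()) / len(subskills)
    detail = {name: traffic_light(rate) for name, rate in subskills.items()}
    print(skill, traffic_light(overall), detail)
```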
3) Give teachers Insight into the students right in their classes!
 
The fact that NAPLAN data is often one to two years old by the time it reaches school and public attention makes it hard to use. The tests assess skills from the preceding year (e.g., Year 3 assesses Year 2 skills), schools find out about the results toward the end of their year with those students, and here we are almost upon the 2018 NAPLAN while MySchool is only now updated with 2017 NAPLAN data. How is a classroom teacher meant to help the students in their classes today?
In the last screen animation, you can see the “Teacher Dashboard”, where a school’s NAPLAN data is sliced and sorted for the actual students sitting in front of a classroom teacher. Yes, the data may still be a year old, but now the classroom teacher can accommodate and differentiate what they do based upon their students. In the animation, notice that both the data in the cards and the list of students in the right column change as I switch between classes (at the top of the dashboard). When I click on the NAPLAN Weather Report card for Writing, I can see which 4 students went backward from their 2015 to 2017 NAPLAN tests and which 5 achieved above expected growth targets. Then, when I click the NAPLAN Skill Focus card (and its flip side), I get details about the top 4 (then 8, when flipped) areas in each of the 4 NAPLAN domains where this particular class of students scored lowest. Again, clicking on a card sorts the students according to the skill clicked, so we can see who needs the most help and who could be extended.
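The “weather report” idea boils down to comparing each student’s two most recent scaled scores against an expected-growth figure. The sketch below is a toy simplification with invented scores and an assumed growth target, not the dashboard’s actual calculation, but it shows the kind of comparison those cards surface.

```python
# Hypothetical Writing scaled scores for the same students in 2015 and 2017.
scores_2015 = {"Ava": 478, "Ben": 512, "Cai": 455, "Dee": 530}
scores_2017 = {"Ava": 470, "Ben": 590, "Cai": 500, "Dee": 525}
EXPECTED_GROWTH = 40  # assumed scaled-score growth between test years

went_backward = [s for s in scores_2015 if scores_2017[s] < scores_2015[s]]
above_expected = [s for s in scores_2015
                  if scores_2017[s] - scores_2015[s] > EXPECTED_GROWTH]

print("Went backward:", went_backward)           # ['Ava', 'Dee']
print("Above expected growth:", above_expected)  # ['Ben', 'Cai']
```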

So, to sum up, I see a big part of the problem being that classroom teachers have not been able to access the right kind of information easily enough to use the NAPLAN data (albeit a “snapshot” and a “diagnostic assessment being used as a high-stakes test” – two legitimate complaints against NAPLAN). In fact, we have run into the situation where one of the leading state associations for schools helps schools unpack NAPLAN results through a workshop on using Excel spreadsheets! In 2018!

Our schools are only getting this kind of access this year. We work with them to take charge of their remediation programs and initiatives, and we expect to see upward trends as they continuously improve their teaching and learning practices.

I’d love to chat, or even take you through this software, as a way to point to solutions other than beating up teachers, schools or the government – not something your reporting has ever done, but these bash-ups tend to be what’s buzzing in the media. Perhaps a better, more productive approach is to use smart software to provide data insights?