4 Tips on Data Analytics for Educators

Background – The Four Types

Every educator is immersed in data every day, but if you aren’t familiar with the main categories of data and how they are analysed, please review the preceding article, Four Types of Data Analytics for Educators.  As a reference for what follows, the four types are:

  • Descriptive – what’s already happened
  • Diagnostic – why it happened
  • Predictive – what might happen in the future
  • Prescriptive – what to do about it

It might seem like working with data in any of these four ways is overwhelming, but the truth is, schools are already interacting with data; few, however, get enough value from it. So much assessing, marking, tracking, conferencing, mentoring and so on is done at schools, yet, I suggest, the value derived doesn’t match the effort invested. This isn’t anything to be ashamed of; as a profession, we are in the early days of using data to achieve our goals for students. So what can schools do?

Four Tips for Educators to Leverage Data Analytics

1. Inventory Current Data

It’s like when we’re lost in a new place: we look for a map that tells us, “You Are Here.” Unfortunately, the data swirling around schools isn’t laid out as neatly as a map. So the best first step is to take an inventory of the data sets that currently exist. These might be in spreadsheets, software, paper records, file folders, etc.  Engage staff to brainstorm a comprehensive list, then filter the list for “what’s valuable.”  Not all information contributes to the goals you have for students. This doesn’t mean you should discard everything that doesn’t contribute to your vision for student learning and success (unless you can); some data are required by sector authorities. But by identifying “your valuable data,” you’re ready for the next step.

Possible Outcomes:

  • comprehensive list of all data collected / available
  • identification of the 3-5 most valuable data sets
  • an idea of where the data resides and in what formats
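
To make this first step concrete, here is a minimal sketch of how such an inventory might be kept, in Python with pandas. Every data set name, location and value rating below is a hypothetical example, not a recommendation.

```python
# A minimal sketch of a data inventory, kept as a simple CSV-friendly table.
# All data set names, locations and ratings are invented illustrations.
import pandas as pd

inventory = pd.DataFrame([
    # name,                 location,                format,        value (1-5)
    ("Semester reports",    "student admin system",  "database",    5),
    ("NAPLAN results",      "state portal download", "CSV export",  5),
    ("Reading benchmarks",  "faculty spreadsheet",   "spreadsheet", 4),
    ("Attendance records",  "student admin system",  "database",    3),
    ("Co-curricular rolls", "paper folders",         "paper",       2),
], columns=["name", "location", "format", "value"])

# Surface the 3-5 most valuable data sets, ready for the next step.
top_sets = inventory.sort_values("value", ascending=False).head(5)
print(top_sets[["name", "location", "format"]])
```

Even a plain spreadsheet in this shape answers the three outcome questions above: what exists, what’s most valuable, and where it lives.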

2. Grow Quality

It might go without saying, but if we want quality insights from our data, we need to start with quality data.  My experience suggests that, apart from school reports and government-supplied data, even the most valuable data you’re collecting will likely “have some issues.” The issues usually fall into two areas. The first is incompleteness: perhaps not everyone generates or collects the data, or they do so inconsistently. The second is irregularity in timing: for no obvious reason, assessments are sometimes skipped in certain years or for certain cohorts. When you have “holes in your data,” you miss the opportunity to see longitudinal trends.
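
A quick way to find such holes is to lay your assessment records against the grid of years and cohorts you expected to cover. Here is a minimal sketch in Python with pandas; the table layout, column names and values are illustrative assumptions, not any particular system’s format.

```python
# A minimal sketch for spotting "holes" in longitudinal assessment data.
# Assumes a long-format table with one row per administered assessment.
import pandas as pd

records = pd.DataFrame({
    "assessment": ["PAT Reading"] * 5,
    "year":   [2019, 2019, 2020, 2022, 2022],
    "cohort": ["Year 5", "Year 7", "Year 5", "Year 5", "Year 7"],
})

# Every (year, cohort) combination we expected to see.
expected = pd.MultiIndex.from_product(
    [[2019, 2020, 2021, 2022], ["Year 5", "Year 7"]],
    names=["year", "cohort"],
)
actual = pd.MultiIndex.from_frame(records[["year", "cohort"]])

# Combinations in the expected grid that never appear in the records.
missing = expected.difference(actual)
print(missing.to_frame(index=False))  # e.g., 2020/Year 7 and all of 2021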

A good way to start is to engage staff in reviewing and valuing the data sets and the benefits each can provide. Following a robust discussion about the effort involved, the benefits gained and, perhaps, the data types involved (descriptive, diagnostic, etc.), you will have raised awareness of the issues and begun what will be an ongoing dialogue. A main outcome of the process will be deciding which data sets to commit to and enlisting full participation from those who generate and use the data.  Such complete commitment places a demand on contributing staff, so it’s important that everyone sees the value, the end goal and how the data will support student success.

The second way to grow quality data is to keep the foundations of measurement in mind: validity and reliability. When an assessment actually measures what it claims to measure, it’s said to be valid. A classic example, and a pet peeve, of an invalid assessment is using a wordsearch activity to provide insight into students’ knowledge of vocabulary: recognising a row of letters bears no relationship to understanding a word’s meaning.  The second criterion is reliability. This is a tricky topic because it usually relates to differences among humans in their assessment of student performance. The classic example is when the same essay receives different marks from two readers (or even from the same reader when hungry or tired). Thus, when a school identifies a valued data source and enlists complete participation, some professional learning and teamwork will likely be required. On the positive side, this is the kind of learning that develops consistency and dialogue amongst professionals, leading to improved practice and outcomes for students.
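
One common way to put a number on that reader-to-reader consistency is Cohen’s kappa, which measures agreement between two markers beyond what chance alone would produce. A minimal sketch, assuming two teachers have independently graded the same essays (the grades below are invented):

```python
# A minimal sketch of checking inter-rater reliability with Cohen's kappa.
# 1.0 = perfect agreement; 0 = agreement no better than chance.
from sklearn.metrics import cohen_kappa_score

marker_a = ["A", "B", "B", "C", "A", "D", "C", "B"]
marker_b = ["A", "B", "C", "C", "B", "D", "C", "B"]

kappa = cohen_kappa_score(marker_a, marker_b)
print(f"Cohen's kappa: {kappa:.2f}")
```

Conventions vary, but values well below about 0.6 are often taken as a cue that a moderation session is needed.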

The third very positive outcome of growing quality data is identifying what data is missing. This isn’t about the inconsistencies and holes discussed above, but gets to the heart of a collective view of what data is important.  Rather than simply “trying really hard” to achieve your school or system’s vision for student success, you will have a better time if at least some evidence can validate that you are achieving it. So, what else would you really like to know about students’ performance? A common response is to recognise missing data related to such things as student wellbeing, critical thinking, non-cognitive skills, creativity, etc. Identifying and then developing or sourcing such additional measures is part of growing quality data. It also helps engage staff, who will see that data goes beyond a numeric or letter snapshot of students: by layering dimensions, we can build a more complete picture of what student success looks like and how to achieve it.

Possible Outcomes:

  • community buy-in on which learning data are “valued”
  • a more disciplined schedule of diagnostic assessments
  • professional learning and consensus on moderation of student work and its merits
  • identification and sourcing of measures for currently unassessed traits or skills
  • staff ownership of and commitment to growing quality data

Given a whole-school / system approach and time, you will have solid descriptive and diagnostic data on which to build predictions. Which leads to…

3. Engage Hypotheses

Even before your school has friendly data-analytics software in the hands of every teacher, consider action research, faculty-based pilots or simple whole-school initiatives that practice a “closed-loop” curriculum as a way to grow a culture of data-informed professionals. An example of this closed loop is when unexpected variations in descriptive data or learning gaps in diagnostic assessments confirm or challenge staff members’ perceptions. With a data point to work with, staff then explore the units and lessons related to those variations or gaps, followed by professional sharing of effective, targeted solutions. Sometimes solutions will need to be created, but often some staff members already have a great lesson or activity that suits the need. These solutions are then built into the curriculum and used in all appropriate classrooms / school spaces. Team teaching, peer feedback or other mentoring approaches might be needed to help all staff master the solution. After the usual instructional cycle (lesson? unit? semester?), a valid and reliable assessment determines whether the hypothesised solution was effective. This is simply the scientific method (or common sense?) that, once established in the culture, drives continuous improvement.

Possible Outcomes:

  • initiate simple approaches across the whole school (e.g., uniform beginning / ending routines for all lessons, use of open-ended questions, common writing rubrics across subjects, etc.)
  • pilot closed-loop solutions in a cohort or course
  • expand effective approaches used by only some teachers to all in a faculty
  • trial new approaches that have proved effective at other schools
  • test new measures for things like wellbeing, creativity, etc. to see if they generate the desired data insights

All these efforts will need defined data collected, then reviewed, to determine each initiative’s effectiveness.
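
As a concrete illustration of that review step, here is a minimal sketch comparing the same students’ scores before and after a targeted solution, using a paired t-test. The scores are invented, and a real evaluation would weigh effect size and context, not just a p-value.

```python
# A minimal sketch of the closed loop's final step: did the targeted
# solution move the needle? Paired t-test on pre/post scores for the
# same ten students.
from scipy import stats

pre  = [52, 61, 48, 70, 55, 63, 58, 49, 66, 60]
post = [58, 64, 55, 72, 61, 62, 65, 57, 70, 66]

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean gain: {sum(post) / len(post) - sum(pre) / len(pre):.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # small p suggests a real gain
```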

Beginning this closed-loop curriculum cycle, even before all your quality data is grown, engages staff in the practice that will ultimately make the data useful.  Even the best quality data sets are worthless if the insights they provide are not brought to life to inform teaching and learning practices.

4. Dissolve Silos

Going back to the first tip, review all the different places your school’s data resides and its formats: student administration systems, state and national log-ins, online software, paper results, spreadsheets, etc.  It’s not realistic to think that all your data will ever live in one system, but software is always under development, and being able to import, share or read data across platforms has become a requirement. However, as in any field, educational technology has its leaders and laggards. Rather than assuming you need to begin this work with a comprehensive new data-analytics system, and because software improves more quickly than school cultures change, start with the systems you’re already using and identify the aspects that give immediate insights into students’ learning. Get the most from what is immediately available. Then, as your staff grows accustomed to using valuable data insights, focus on the learning insights you want to gain. Avoid getting bogged down in evaluating the feature lists of various software platforms. Approach vendors with your goals and make them show you whether their software can easily produce quick visualisations – or at least how close they can get.  No software is perfect or ever truly finished, so look for vendors who want input and have reliable roadmaps.  Also remember: it’s the data-analytical insights you want, not bells and whistles.

Possible Outcomes:

We want to dissolve the silos of separate data sets because insights can be gained by combining information that now sits in isolation. For example (a minimal sketch follows this list), can we gain insights by:

  • comparing class grades to student performance in related diagnostic assessments
  • aligning wellbeing measures to semester grades or class rank
  • looking at whether a correlation exists between co-curricular participation and academic achievement
  • assessing intrinsic motivation or study skills in comparison to other elements of a vision for student success
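
Here is that sketch: joining two hypothetical exports on a shared student ID and testing the co-curricular question above. The tables stand in for, say, CSV exports from two separate systems; all IDs, column names and values are invented for illustration.

```python
# A minimal sketch of dissolving silos: join two exports on a shared
# student ID, then look for a relationship across the formerly separate
# data sets.
import pandas as pd

grades = pd.DataFrame({
    "student_id":   [101, 102, 103, 104, 105],
    "semester_avg": [82, 68, 91, 55, 74],
})
cocurricular = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "hours":      [40, 10, 55, 5, 30],
})

combined = grades.merge(cocurricular, on="student_id", how="inner")

# Pearson correlation; remember, correlation is not causation.
r = combined["semester_avg"].corr(combined["hours"])
print(f"co-curricular hours vs. semester average: r = {r:.2f}")
```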

Conclusion

Being able to crunch and dig deeper into the collective data you value will not happen overnight. On the positive side, smarter software is always being developed, so by the time you have grown several years of quality data and engaged staff in educational hypotheses, you will have a very good idea of the functionality and easy visualisations you need to achieve your goals. After decades of using educational technology, we know there is no end point, no “there” that we will get to; this is a space of constant evolution where sometimes we merge pieces together and at other times we’re better off starting with a clean slate. So don’t rush to get “the latest” technologies; nurture a school culture around data-informed improvement and build relationships with vendors to see who’s merely selling software and who’s targeting solutions.  Remember: unless we start now and make this a new routine, another generation of teachers and students can pass through your school missing out on the assistance quality data can provide.

4 Types of Data Analytics for Educators

Data and Education?

A quick survey of the Web turns up many helpful pages on data analytics. But because data is the currency of our now-digital economy, the articles typically focus on the business world.  Mentions of data in education usually surface one of three understandable perspectives: protecting student privacy, boosting engagement (i.e., $) and the mostly misguided use of high-stakes, though meagre, metrics to represent human learning.

Furthermore, some educators can be hesitant about “data.” Maybe data seems like numbers and graphs, while our role is to nurture growth in human beings?  But the right data is an asset in our very human task, as it can provide some evidence about how things are going. Quality data, easily accessible and effectively visualised, is one way to gain insights on such questions as:

  • Are students learning?
  • What are they learning?
  • What aren’t they learning that could help them grow?
  • What pedagogical approaches are the most effective for which students?
  • How can our school and system use these insights to keep getting better?

This post will provide background on the four types of data analytics and offer four tips for educators. Let’s start with a graphic that we can unpack:

[Graphic: 4 Types of Data Analytics]

What are the Four Types of Data Analytics?

A handy way to understand data analytics is to view it in terms of what it can do for us:

  • Describe – what’s already happened
  • Diagnose – why it happened
  • Predict – what might happen in the future
  • Prescribe – what to do about it

The first handy thing to notice is that the “D”s are about the past and the “P”s are about the future. The other useful dimension is the common-sense relationship between value and complexity: the simplest data is also the least useful, whereas the most complex has the potential to add the most value.

Now let’s see how the four types of analytics align with data sources schools already know. I’ll also highlight how complexity and value factor in.

Descriptive Analytics

Descriptive analytics can tell us “what happened.” This is by far the most common type of data used in schools.  When teachers give tests at the end of a unit and assign a mark or compile semester reports, they are generating descriptive data. Other kinds of descriptive data schools might collect are attendance records, participation in co-curricular activities, literacy progressions, feedback surveys, etc.

Value and Complexity

Teachers and schools invest extraordinary resources collecting such information, so it’s unfortunate that these data only provide a simple “backward glance”.  However much detail a teacher might consider before assigning a result, in the end it’s typically an A–E grade or a 0–100 percentage.

Applying Analytics

Probably the main applications of these data sets are ranking students and potentially serving as extrinsic motivation.  However, careful analysis of such descriptive data across a school can help identify variability (and success?) among teachers and faculties, identify patterns for specific student cohorts, and compare academic achievement in the context of things like school engagement and wellbeing.
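
As a minimal sketch of that kind of whole-school look, here is how semester results might be aggregated by teacher to surface variability. The names and scores are invented illustrations.

```python
# A minimal sketch of scanning descriptive data for variability:
# mean, spread and count of semester results per teacher.
import pandas as pd

results = pd.DataFrame({
    "teacher": ["Smith", "Smith", "Smith", "Jones", "Jones", "Jones",
                "Lee", "Lee", "Lee"],
    "score":   [78, 82, 75, 61, 58, 66, 74, 90, 83],
})

summary = results.groupby("teacher")["score"].agg(["mean", "std", "count"])
print(summary.sort_values("mean", ascending=False))
```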

Diagnostic Analytics

Diagnostic analytics offer more in-depth insights into student performance or ability. In Australia, when people think of diagnostic data in education, what comes to mind are standardised or “high-stakes” tests such as the National Assessment Program – Literacy and Numeracy (NAPLAN) or ACER’s Progressive Achievement Tests (PAT).  In the primary years, many schools use commercial benchmarking programs (such as PM, F & P and Probe). Some online programs / apps also provide diagnostic data. In the classroom, many teachers use formative assessments and rubrics to assess students’ skills. What these all have in common is that, just like a medical doctor reviewing blood tests or x-rays, diagnostic assessments provide greater detail about what’s going on.

Value and Complexity

This greater detail we can call “granularity”. So the real benefit of, say, a NAPLAN Reading assessment is not to find out what Band a student is in, but to gain insight into a student’s strengths and gaps in core skills related to the domain such as “connecting information,” “applying comprehension” or “inferencing.” With this greater degree of complexity, you can see how the value increases.  Rather than try to help that student “read better,” we can target teaching to fill gaps and extend strengths.
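
A minimal sketch of what using that granularity can look like: rolling item-level results up by skill strand so gaps and strengths stand out. The strand names echo the examples above; the student and responses are invented.

```python
# A minimal sketch of diagnostic granularity: instead of one overall
# band, report percent correct per skill strand to target teaching.
import pandas as pd

items = pd.DataFrame({
    "student": ["Ana"] * 6,
    "strand":  ["connecting information", "connecting information",
                "applying comprehension", "applying comprehension",
                "inferencing", "inferencing"],
    "correct": [1, 1, 1, 0, 0, 0],
})

# Strength in "connecting information", gap in "inferencing" - detail a
# single overall score would hide.
by_strand = items.groupby("strand")["correct"].mean().mul(100)
print(by_strand.round(0).astype(int).astype(str) + "%")
```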

Applying Analytics

Taking advantage of the diagnostic assessments used in a school is a challenge, mostly because the data is buried and not easily visualised. When teachers, faculties, grade-level teams and leaders can quickly see such gaps and strengths across core skills at the student, class and cohort levels, each stakeholder is empowered to support targeted solutions.

Predictive Analytics

Predictive analytics can offer insights into “what is likely to happen.” It uses the results of descriptive and diagnostic analytics to predict future trends. As such, predictive analytics forecasts likely outcomes to be pursued or avoided.  Many schools employ data consultants to conduct statistical reviews of past performance in high-stakes measures, especially Year 12 results such as HSC, VCE and Advanced Placement tests.

Value and Complexity

Clearly, because solid predictive analytics is based on equally solid descriptive and diagnostic data, it is more complex to gather and analyse.  Schools often outsource this analysis to data experts because of the statistical expertise required and the savvy needed to integrate appropriate data sets. Equally complex is interpreting the analysis so that patterns aren’t seen as causative when they might only correlate (e.g., high participation in co-curricular activities may correlate with high-achieving students, not cause the high achievement).

Applying Analytics

Besides employing data scientists, schools can begin using predictive analytics by identifying what information they most value about students and their learning. This might require introducing new descriptive and diagnostic measures and then taking several years to grow these data sets.
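
In the simplest case, prediction can start as a linear trend fitted between an earlier measure and a later outcome. A minimal sketch, with invented scores and hypothetical measures; a real model would need validation, far more data and far more care.

```python
# A minimal sketch of a first predictive step: fit a linear trend from
# an earlier diagnostic measure to a later result, then forecast for a
# new student.
import numpy as np

year9_reading = np.array([55, 62, 70, 48, 66, 73, 59, 81])  # diagnostic score
year12_result = np.array([61, 68, 75, 52, 70, 80, 63, 88])  # final result

slope, intercept = np.polyfit(year9_reading, year12_result, deg=1)
predicted = slope * 65 + intercept  # forecast for a Year 9 score of 65
print(f"predicted Year 12 result: {predicted:.0f}")
```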

Prescriptive Analytics

The fourth type of analytics results in actual prescriptions of “what action(s) to take” to solve a problem in the future or to take full advantage of promising trends seen in the predictive analysis. Many schools engage in pedagogical initiatives such as Reading Recovery, Maker Spaces, STEM, writing across the curriculum or wellbeing programs, but such decisions typically lack a robust foundation in hard data. It’s not that such initiatives are bad, but when they aren’t targeted at an evidence-based issue with aligned pedagogical solutions, measuring success is ad hoc at best. One downside of initiatives lacking credible success criteria is the drain on staff morale and the change fatigue they create. We are all motivated when we perceive ourselves as being effective.  Without measures, this is difficult.

Value and Complexity

Like predictive analytics, prescriptive analytics is also based on data from each of the other types and thus requires both quality data and insights (see the next article on Tips for Educators).  Obviously, the value-add of such prescriptive analytics is great, but the complexity needed to combine and model the data is comparably great.

Applying Analytics

Both the value and complexity of prescriptive analytics suggest not only the participation of experts, but also technology. In fact, this is where many tech companies get involved and apply Artificial Intelligence to the rapid modelling of data.  “Big data” sets are needed for algorithms to apply both unstructured and structured analysis to tease out reliably demonstrated outcomes. A good example is the increasing sophistication and accuracy with which tech giants like Google and Amazon make suggestions based upon constructed profiles of each user.  In this way AI can move beyond simple “if/then” predictions toward recommending specific actions likely to lead to the desired outcome. Extensive work is being done in this area at the university level.

Where to Next?

As you can see, most schools would collect, use and benefit from each of the four types of data analytics.  Yet, it’s the rare school that has a mature approach to managing and employing the data circulating through its software, systems and folders. Now that you have more background on the types of data and analytics, please read the next article offering 4 Tips on Data Analytics for Educators.