Background – The Four Types
Every educator is immersed in data every day, but if you aren’t familiar with the main categories of data and how they are analysed, please review the preceding article, Four Types of Data Analytics for Educators. As a reference for what follows, the four types are:
- Descriptive – what has already happened
- Diagnostic – why it happened
- Predictive – what might happen next
- Prescriptive – what to do about it
It might seem like working with data in any of these four ways is overwhelming, but the truth is that schools already interact with data constantly; still, few get enough value from it. So much assessing, marking, tracking, conferencing, mentoring, etc. is done at schools, yet, I suggest, the value derived doesn’t match the effort invested. This isn’t anything to be ashamed of; as a profession we are in the early days of using data to achieve our goals for students. So what can schools do?
Four Tips for Educators to Leverage Data Analytics
1. Inventory Current Data
It’s like when we’re lost in a new place. We look for a map that tells us, “You Are Here.” Unfortunately, the data swirling around schools isn’t laid out as well as a map. So the best first step is to take an inventory of what data sets currently exist. These might be on spreadsheets, in software, on paper, in file folders, etc. Engage staff to brainstorm a comprehensive list. Then filter the list for “what’s valuable.” Not all information contributes to the goals you have for students. This doesn’t mean discarding what doesn’t contribute to your vision for student learning and success (unless you can); some data are required by sector authorities. But by identifying “your valuable data,” you’re ready for the next step.
Possible Outcomes:
- comprehensive list of all data collected / available
- identification of the 3-5 most valuable data sets
- an idea of where the data resides and in what formats
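For schools comfortable with a little scripting, the inventory itself can be kept as structured data rather than a wall of text, which makes the “filter for what’s valuable” step trivial. A minimal sketch in Python – the data set names, locations and value ratings below are entirely hypothetical:

```python
# A hypothetical data inventory: each entry records what the data set is,
# where it lives, its format, and a staff-agreed value rating (1-5).
inventory = [
    {"name": "external literacy/numeracy results", "location": "state portal", "format": "CSV", "value": 5},
    {"name": "semester grades", "location": "student admin system", "format": "database", "value": 5},
    {"name": "reading benchmarks", "location": "shared spreadsheet", "format": "xlsx", "value": 4},
    {"name": "uniform infringements", "location": "paper files", "format": "paper", "value": 1},
]

# Filter for "your valuable data": the few sets rated most highly by staff.
valuable = sorted(
    (d for d in inventory if d["value"] >= 4),
    key=lambda d: d["value"],
    reverse=True,
)

for d in valuable:
    print(f'{d["name"]}: {d["location"]} ({d["format"]})')
```

Even this toy structure captures the three outcomes above: the full list, the most valuable subset, and where each data set resides and in what format.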
2. Grow Quality
It might go without saying, but if we want quality insights from our data, we need to start with quality data. My experience suggests that, other than school reports and government-supplied data, even the most valuable data you’re collecting will likely “have some issues.” Usually the issues relate to two areas. The first is incompleteness: perhaps not everyone generates or collects the data, or they do so inconsistently. The second is irregularity in timing: for no obvious reason, assessments are sometimes skipped in certain years or for certain cohorts. When you have “holes in your data,” you miss the opportunity to see longitudinal trends.
A good way to start is to engage staff in reviewing and valuing the data sets and the benefits to be gained. Following a robust discussion about the effort involved, the benefits gained, and perhaps even the data types involved (descriptive, diagnostic, etc.), you will have raised awareness of the issues and begun what will be an ongoing dialogue. A main outcome of the process will be deciding which data sets to commit to and enlisting full participation from those involved in generating and using the data. Such complete commitment will place demands on the staff who contribute, so it’s important that everyone sees the value, the end goal, and how the data will support student success.
The second way to grow quality data is to keep the foundations of measurement in mind: validity and reliability. When an assessment actually measures what it claims to measure, it’s said to be a valid measure. A classic example, and a pet peeve, of an invalid assessment is using a wordsearch activity to provide insight into students’ knowledge of vocabulary: recognising a row of letters bears no relationship to understanding a word’s meaning. The second criterion is reliability. This is a tricky topic because it usually relates to differences among humans in their assessment of student performance. The classic example is the same essay receiving different marks from two readers (or even from one reader when hungry or tired). Thus, when a school identifies a valued data source and enlists complete participation, some professional learning and teamwork will likely be required. On the positive side, this is the kind of learning that develops consistency and dialogue amongst professionals, leading to improved practice and outcomes for students.
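To make inter-marker reliability concrete, here is one quick check a faculty could run after a moderation session: compare two markers’ scores on the same set of essays. The marks below are invented for illustration; the two summary figures show how far apart the markers are on average and how often they agree within one mark:

```python
# Hypothetical marks (out of 10) given by two markers to the same ten essays.
marker_a = [7, 5, 8, 6, 9, 4, 7, 6, 8, 5]
marker_b = [6, 5, 9, 4, 9, 5, 7, 7, 6, 5]

# Average absolute gap between the two markers.
mean_gap = sum(abs(a - b) for a, b in zip(marker_a, marker_b)) / len(marker_a)

# Share of essays where the markers agree within one mark.
within_one = sum(abs(a - b) <= 1 for a, b in zip(marker_a, marker_b)) / len(marker_a)

print(f"Average gap: {mean_gap:.1f} marks; agreement within one mark: {within_one:.0%}")
```

If the average gap shrinks from one moderation session to the next, that is evidence the professional learning is working.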
The third very positive outcome of growing quality data is to identify what data is missing. This isn’t in relation to inconsistencies and holes as discussed above, but gets to the heart of a collective view of what data is important. Rather than simply “try really hard” to achieve your school or system’s vision for student success, you will have a better time if at least some evidence can validate that you are achieving it. Thus, what else would you really like to know about students’ performance? A common response is to recognise missing data related to such things as student wellbeing, critical thinking, non-cognitive skills, creativity, etc. Identifying and then developing or sourcing such additional measures is part of growing quality data. It also helps engage staff who will see that data goes beyond a numeric or letter snapshot of students, that by layering dimensions, we can build a more complete picture of what student success looks like and how to achieve it.
Possible Outcomes:
- community buy-in on which are the “valued” learning data
- a more disciplined schedule of diagnostic assessments
- professional learning and consensus on moderation of student work and its merits
- identification and sourcing of currently un-assessed traits or skills
- staff ownership of and commitment to growing quality data
Given a whole school / system approach and time, you will have solid descriptive and diagnostic data to build predictions. Which leads to…
3. Engage Hypotheses
Even before your school has friendly analytical data software in the hands of every teacher, consider action research, faculty-based pilots or simple whole-school initiatives to practise a “closed-loop” curriculum and grow a culture of data-informed professionals. An example of this closed loop is when unexpected variations in descriptive data, or learning gaps in diagnostic assessments, confirm or challenge staff members’ perceptions. With a data point to work with, staff then explore the units and lessons related to those variations or gaps, followed by professional sharing of effective targeted solutions. Sometimes solutions will need to be created, but often some staff members already have a great lesson or activity that suits the need. These solutions are then built into the curriculum and used in all appropriate classrooms and school spaces. Team teaching, peer feedback or other mentoring approaches might be needed to help all staff master the solution. After the usual instructional cycle (a lesson? a unit? a semester?), a valid and reliable assessment determines whether the hypothesised solution was effective. This is simply the scientific method (or common sense?) which, once established in the culture, drives continuous improvement.
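The “did it work?” step at the end of the loop can start as simply as comparing cohort results on the agreed assessment before and after the targeted solution. A sketch with invented scores:

```python
from statistics import mean

# Hypothetical cohort scores on a valid, reliable assessment,
# before and after a targeted lesson sequence was rolled out.
before = [52, 61, 48, 70, 55, 58, 63, 50]
after = [60, 66, 55, 72, 61, 57, 70, 58]

gain = mean(after) - mean(before)
print(f"Mean before: {mean(before):.1f}, after: {mean(after):.1f}, gain: {gain:.1f}")

# A crude decision rule for the hypothesis check; a fuller analysis would
# also ask whether the gain exceeds normal year-on-year variation.
if gain > 0:
    print("Evidence supports the solution; keep it in the curriculum.")
else:
    print("No gain observed; revisit the hypothesis.")
```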
Possible Outcomes:
- initiate simple approaches across the whole school (e.g., uniform beginning / ending routines for all lessons, use of open-ended questions, common writing rubrics across subjects, etc.)
- pilot closed-loop solutions in a cohort or course
- expand effective approaches used by only some teachers to all in a faculty
- trial new approaches that have proved effective at other schools
- test new measures for things like wellbeing, creativity, etc. to see if they generate the desired data insights
All these efforts will need defined data to be collected and then reviewed to determine each initiative’s effectiveness.
Beginning this closed-loop curriculum cycle, even before all your quality data has been grown, engages staff in the practices that will ultimately make the data useful. Even the best-quality data sets are worthless if the insights they provide are not brought to life to inform teaching and learning practices.
4. Dissolve Silos
Going back to the first tip, review all the different places your school’s data resides and their formats: student administration systems, state and national log-ins, online software, paper results, spreadsheets, etc. It’s not realistic to think all your data will ever exist in one system, but software is always under development, and being able to import, share or read data across platforms has become a requirement. As in any field, though, there are leaders and laggards in educational technology. Rather than thinking you need to begin this work with a comprehensive new data analytics system, start with the systems you’re already using and identify aspects that contribute to gaining immediate insights into students’ learning, because software improves more quickly than school cultures change. Get the most from what is immediately available. Then, as your staff gets accustomed to using valuable data insights, focus on the learning insights you want to gain. Avoid getting bogged down in evaluating the feature lists of various software platforms. Approach vendors with your goals and ask them to show whether their software can easily produce the visualisations you need – or at least how close they can get. No software is perfect or ever truly finished, so look for vendors who welcome input and have reliable roadmaps. And remember: it’s the data analytical insights you want, not bells and whistles.
Possible Outcomes:
The reason we want to dissolve the silos of separate data sets is because we hope insights can be gained by combining information that now sits in isolation. For example, can we gain insights by:
- comparing class grades to student performance in related diagnostic assessments
- aligning wellbeing measures to semester grades or class rank
- looking at whether a correlation exists between co-curricular participation and academic achievement
- assessing intrinsic motivation or study skills in comparison to other elements of a vision for student success
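Once two of those currently siloed data sets sit side by side, the question “is there a correlation?” can be answered directly. A sketch computing a Pearson correlation between weekly co-curricular hours and semester averages, using only the Python standard library – all figures invented for illustration:

```python
from statistics import mean

# Hypothetical per-student figures once two silos are joined:
# weekly co-curricular hours and semester average mark.
cocurricular_hours = [0, 2, 5, 1, 4, 3, 6, 2]
semester_average = [58, 64, 71, 60, 69, 66, 70, 62]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(cocurricular_hours, semester_average)
print(f"Correlation: {r:.2f}")  # values near +1 suggest a strong positive association
```

Of course, correlation is not causation: a positive figure here raises questions for staff dialogue rather than settling them.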
Conclusion
Being able to crunch and dig deeper into the collective data you value will not happen overnight. On the positive side, smarter software is always being developed, so by the time you have grown several years of quality data and engaged staff in educational hypotheses, you will have a very good idea of the functionality and easy visualisations you need to achieve your goals. After decades of using educational technology, we know there is no end point, no “there” we will finally get to; this is a space of constant evolution where sometimes we merge pieces together and at other times we’re better off starting with a clean slate. So don’t rush to get “the latest” technologies; nurture a school culture around data-informed improvement and build relationships with vendors to see who’s merely selling software and who’s targeting solutions. Remember, unless we start now and make this a new routine, another generation of teachers and students may pass through your school missing out on the assistance quality data can provide.