Metrics in Learning Analytics

Before life in the United States was rudely interrupted (to put it extremely mildly) last week, I teased the concept of metrics in learning analytics. Where dimensions frame a problem space, metrics are (literally) the things I look at to model the problem space. In other words, metrics are how I make sense of an analysis.

A problem space needs a model that describes the space (its dimensions) and enables me to make sense of it, or "perform an analysis."

More than counting.

My analysis is always going to be defined by my metrics. Often, I need to count or tally things, but COUNTS ARE NOT ANALYSIS. In presentations, I often talk about how an xAPI statement is an observation. I make the association explicit: a given metric will likely have one or more xAPI statements supporting its observations (observations = metrics = xAPI statements).
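To make the "counts are not analysis" point concrete, here is a minimal sketch of xAPI statements treated as observations and rolled up into a metric. The statements are heavily simplified (a real Learning Record Store returns full xAPI JSON), and the actor emails and activity URLs are illustrative; only the `completed` verb URI is a standard ADL verb.

```python
# Sketch: xAPI statements as observations, rolled up into a metric.
# Statements are simplified; the actors and activity IDs are hypothetical.
statements = [
    {"actor": "mailto:a@example.com",
     "verb": "http://adlnet.gov/expapi/verbs/completed",
     "object": "https://example.com/course/safety-101",
     "result": {"score": {"scaled": 0.90}}},
    {"actor": "mailto:b@example.com",
     "verb": "http://adlnet.gov/expapi/verbs/completed",
     "object": "https://example.com/course/safety-101",
     "result": {"score": {"scaled": 0.60}}},
]

# A count only says how many observations exist...
completions = sum(1 for s in statements if s["verb"].endswith("/completed"))

# ...while a metric interprets them: here, mean scaled score at completion.
scores = [s["result"]["score"]["scaled"] for s in statements if "result" in s]
mean_score = sum(scores) / len(scores)

print(completions, mean_score)
```

The count answers "how many?"; the metric starts to answer "how well?" — and only the latter feeds an analysis.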

Here’s an example of how I generally approach learning and job-task analysis (relating learning to doing). While these dimensions of the learner, the learning activity, and the learning experience might make sense to a learning professional, it’s important to remember that they aren’t (and shouldn’t be) the only way to do learning analytics. The model is complex enough that even if 80% of readers copy and paste it without further forethought, it will hopefully still point them in better directions than they were heading before reading (the whole “making better” thing).
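One way to read the learner-by-learning-activity-by-learning-experience framing is as keys for slicing the same observations. A minimal sketch (the records and field values here are hypothetical, not the author's actual model):

```python
from collections import defaultdict

# Hypothetical observations keyed by the three dimensions:
# (learner, learning activity, learning experience, score).
observations = [
    ("learner-1", "quiz-3", "onboarding-course", 0.8),
    ("learner-1", "sim-1",  "onboarding-course", 0.6),
    ("learner-2", "quiz-3", "refresher-course",  0.9),
]

# Grouping along any one dimension asks a different question
# of the same data; here, "how does each activity perform?"
by_activity = defaultdict(list)
for learner, activity, experience, score in observations:
    by_activity[activity].append(score)

avg_by_activity = {a: sum(v) / len(v) for a, v in by_activity.items()}
print(avg_by_activity)
```

Swapping the grouping key to `learner` or `experience` reframes the question without changing the observations, which is the point of treating dimensions as the frame rather than the analysis itself.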

“All models are wrong, but some are useful.”

Anyone putting together a learning analytics strategy is, more or less, arbitrarily framing a box in which learning happens. Only through the pinholes of such a theoretical box can learning be observed, and only by putting those observations together can an analysis be performed.

[Image: a simple model of learning analytics dimensions (learner × learning activity × learning experience), depicted as pinholes in a theoretical closed box]

So what should those pinholes be? It seems like a straightforward question, but working with stakeholders to define metrics in learning analytics requires patience, planning, and practice. I’ll share methods for getting to these metrics in the coming weeks.

Barring further interruptions, tomorrow I’ll review MakingBetter’s work over the past year. On Friday, expect some catch-up on what I’ve been up to with Elsevier, ADL, and IEEE. Lots. To. Share.