

“Question-storming” is an exercise I tapped into back in the very early xAPI days. I hit a wall; one I suspect most of y’all come up against when you’re ready to start doing something meaningful with xAPI. I struggled to articulate questions that felt like they made xAPI worth doing, and I would get discouraged about it.

A headshot of Aaron sporting a thick "fu-manchu" mustache.
Speaking of question-able choices, what about my mustache circa 2016?

When I look at what analytics solutions do (not data visualization tools – I’m talking web analytics), the analytics are geared towards defined funnels and targeted actions that map to KPIs. That’s a little too simplistic for what I plan to infer from the datasets y’all and I will build, together, with xAPI.

xAPI can provide feedback that is more attuned to how people think and relate information. It requires a different perspective on which questions are important to answer with data than how analytics work most everywhere else in technology. It’s no wonder we feel frustrated when we first try to design, engineer, or architect beyond replicating the patterns established in eLearning or video.

Waxing Philosophical

All the learning environments we create online, all the social media we use, the apps we build, the laptop or tablet or phone you’re reading this on… it’s all built on a system that, at its very metaphorical base, is one of files and folders. One might think of hard drives as file cabinets, etc.… nice, neat and tidy. There are rights to knowing things in a folder vs only knowing about a specific file. There are ways to reference information precisely. This is important and useful stuff for referencing things technically.

The thing is that everything I read and learned and ultimately synthesized through my master’s program at University of Wisconsin informs me that humans organize information in relation to people, to places, to shared experiences. These relationships are not limited to strict hierarchies, like file and folder structures, as much of our technology is. A hypothesis I have is that after two decades of constant online living, we as people get hyperfocused on the box we’re in, when we should be looking at all the things outside our box that we could play with.

Constants like time, for example, exist beyond the limits of your, and my, creativity (or lack of it). Time is a handy place to start question-storming because it’s a concept that everyone involved in making the data, or taking meaning from it, can understand similarly.

Prioritize Research

At Elsevier, my product’s manager, Lauren, fleshed out a quarterly OKR that puts applied research and development at the center of our team’s work. Each quarter, we decided, we’ll come up with a question to answer with our data set that we don’t quite know how to answer. It’s a chance for us to improve our capacity and capability for learning analytics as a cross-functional product team that includes software and quality engineers, product managers, UX designers, commercial folks, content folks, and users in multiple roles.

It’s the investment in drinking our homebrew that excites me most. We are all new at doing this. We get smarter about learning analytics when we establish trust among everyone who depends on the same set of data, regardless of why it may be important for different reasons to different people.

Yes, I’m talking ‘bout Praxis.

Question-storming “How Long Does It Take…?”

So, with time as our ally, Team Apollo at Elsevier began question-storming “How long does it take to get through an activity?”

Seems like a really simple question, right? Give it a few seconds to get your own doubts going. Turn those into questions.

  • Did he mean “learning activity” or, like, a survey?
  • What does it mean to “get through” an activity, specifically?
    • All the way to the last frame of linear content?
    • Do you need to have seen all the content?
    • Must you have passed in order to be considered complete?

Now… reframe the main question to reflect all the permutations of these bulleted conditions on how to interpret it. That is the question-storming.
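To make the permutations concrete, here’s a minimal sketch of how two different interpretations of that same question produce two different numbers from the same data. The statements below are simplified, hypothetical xAPI-style records (real statements carry an actor, object, result, and full verb IRIs; the verbs `initialized`, `terminated`, and `completed` come from the ADL vocabulary). None of this is Team Apollo’s actual implementation — it’s just the question-storming, in code:

```python
from datetime import datetime, timedelta

# Hypothetical, simplified xAPI-style statements for one learner and
# one activity: just a verb and an ISO 8601 timestamp. The learner
# opens the activity on day one, leaves, and finishes it the next day.
statements = [
    {"verb": "initialized", "timestamp": "2016-05-01T09:00:00"},
    {"verb": "terminated",  "timestamp": "2016-05-01T09:10:00"},
    {"verb": "initialized", "timestamp": "2016-05-02T14:00:00"},
    {"verb": "completed",   "timestamp": "2016-05-02T14:25:00"},
]

def parse(ts):
    return datetime.fromisoformat(ts)

def elapsed_first_to_completed(stmts):
    """Interpretation 1: wall-clock time from the very first touch to
    the 'completed' statement -- spans days if the learner walks away."""
    start = parse(stmts[0]["timestamp"])
    done = next(parse(s["timestamp"]) for s in stmts
                if s["verb"] == "completed")
    return done - start

def summed_session_time(stmts):
    """Interpretation 2: only time spent inside sessions, pairing each
    'initialized' with the next 'terminated' or 'completed'."""
    total = timedelta()
    start = None
    for s in stmts:
        if s["verb"] == "initialized":
            start = parse(s["timestamp"])
        elif s["verb"] in ("terminated", "completed") and start is not None:
            total += parse(s["timestamp"]) - start
            start = None
    return total

print(elapsed_first_to_completed(statements))  # over a day
print(summed_session_time(statements))         # 10 min + 25 min
```

Same learner, same statements: one reading says the activity took more than a day; the other says thirty-five minutes. Which answer is “right” depends entirely on which reframed question your stakeholders agreed to ask.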

Next post, I’ll highlight what we produced to answer our research question(s), what statements we used… I’ll go full nerd. I might get into how the results set a research agenda for the year. Meanwhile, in the comments, maybe share your takes:

For now, what are other ways to interpret “How long does it take to get through an activity?”



2 responses to “Question-storming”

  1. Without a didactic purpose there is no reason to even ask “How long does an activity take?”, let alone to explore its possible variations.
    With a didactic purpose, only a small set of variations makes sense, so again there is no need to explore all of them.
    Example: If a teacher plans the time for an online exam, the time from exam start to exam submit may be worth looking at, but that’s irrelevant for homework.

    1. Ingo! What a treat to hear from you! My point in the exercise is to suss out a didactic purpose that all stakeholders recognize. So there is a point in asking… because that’s the question people ask. My point is to help refine that into a question that is worth investigating. A first step.