1.1 What questions can learning analytics help answer?

Reviewing learning analytics data can provide insight into questions such as:

  • What is the average grade for each assignment in my course? (see the sketch after this list)
  • Which pages are students viewing in my site?
  • When are students most active? least active?
  • Which students have late and/or missing assignments?
  • How does an individual student's participation compare with their peers?
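
The first question above, for example, can be answered directly from an exported gradebook. As a minimal sketch, assuming the gradebook is exported to a CSV file with one row per student-assignment pair, a few lines of Python could compute per-assignment averages. The file name and column labels are illustrative assumptions, not the export format of any particular tool.

    import csv
    from collections import defaultdict

    # Assumed layout: one row per student/assignment pair with a numeric score.
    # The file name and the "assignment" and "score" columns are hypothetical.
    totals = defaultdict(lambda: [0.0, 0])  # assignment -> [sum of scores, submission count]

    with open("gradebook_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["score"]:  # skip missing or ungraded submissions
                totals[row["assignment"]][0] += float(row["score"])
                totals[row["assignment"]][1] += 1

    for assignment, (total, count) in sorted(totals.items()):
        print(f"{assignment}: average {total / count:.1f} across {count} graded submissions")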

CONSIDERATIONS

When asking questions of learning analytics data, keep in mind that the terminology used in the tools may not accurately describe the behaviors that generate the data. For example:

  • Many learning analytics tools use labels like "views" or "watches," but the behaviors they represent are usually more accurately described as "accesses" or "plays." A student may access a syllabus file only to confirm it opens or to download it for later reading, and a student may play a video and then leave the room. Conversely, a group of students may watch a video together, but if only one of them plays the video for the group, the others' data would report 0 "watches."
  • One metric may represent a variety of behaviors. For example, counted "participations" in Canvas' New Analytics can represent any of the following events (see the sketch after this list):
    • Announcements: posts a new comment to an announcement
    • Assignments: submits an assignment
    • Collaborations: loads a collaboration to view/edit a document
    • Conferences: joins a web conference
    • Discussions: posts a new comment to a discussion
    • Pages: creates a page
    • Quizzes: submits a quiz
    • Quizzes: starts taking a quiz

(Analytics Page Views and Participations, 2021)
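
To make the rollup concrete, the sketch below tallies a single "participations" count per student from a hypothetical event log exported as a CSV. The file name, column names, and event labels are assumptions for illustration and do not reflect Canvas' actual data model.

    import csv
    from collections import Counter

    # Hypothetical event labels standing in for the participation events listed above.
    PARTICIPATION_EVENTS = {
        "announcement_comment", "assignment_submission", "collaboration_load",
        "conference_join", "discussion_post", "page_create",
        "quiz_start", "quiz_submission",
    }

    participations = Counter()
    with open("course_events.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["event_type"] in PARTICIPATION_EVENTS:
                participations[row["student_id"]] += 1

    for student, count in participations.most_common():
        print(f"{student}: {count} participations")

Note how eight different behaviors collapse into one number; a high or low count alone says little about which of those behaviors produced it.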

These complexities don't necessarily make the data unusable, however. We'll cover how to work with the data despite these limitations in Module 5: Learning Analytics in Context: Review, Amend, Apply.

Questions Unizin Consortium faculty are asking of learning analytics

The Unizin Consortium Structured Conversations initiative met with faculty at multiple Unizin Consortium institutions to find out what kinds of data they would like to see and why. The following are some of the faculty requests, each paired with considerations to keep in mind when reviewing data for that purpose. This is only a small selection of the kinds of questions learning analytics can help answer; many others arise when considering a specific course or courses.

 


I want to know . . . which specific pages students are viewing, because I think it would provide context for in-class participation and help me see what students think is important and not important.

Consideration: Many learning analytics services currently report which pages in the learning management system students are clicking through. Getting a sense of which content students are – and aren't – accessing can be an effective way to identify possible areas for coaching or changes to course design. As described in the Review, Amend, Apply section, students should also have an opportunity to provide their context (i.e., why they are or aren't accessing certain content) to avoid wasting time applying inappropriate interventions.

I want to know . . . how much time students are engaging with certain types of content or materials, rather than just whether they "viewed" it.

Consideration: Learning analytics in isolation can't report whether a student is "engaging" with content; in fact, it's even possible that a student could open content without truly "viewing" it (e.g., opening and immediately closing a document just to confirm it opens, or leaving it open while working on something else). Learning analytics provide indicators, not certainties, and are therefore best used alongside traditional methods for measuring engagement with course content.

I want to know . . . whether a student is really doing what the analytics say they are (e.g., spending hours on end in the course site) so that I understand to what extent the data can be trusted.

Consideration: Learning analytics will always be affected by technical and human error, whether during their generation or their interpretation. It is critically important to recognize that while learning analytics can be powerful indicators, they are significantly less accurate when considered without student narratives. Consider being transparent with students about how the analytics are used so that they understand the purpose of reviewing course activity. If students know the learning analytics will be used with them to improve their learning, not to penalize them, they will be less likely to generate false activity.

I want to know . . . how the data from different semesters compare, because I want to gauge whether my teaching methods are improving.

Consideration: Comparing data between semesters is a powerful way to evaluate the effect of changes made to a course. Possible indicators include an increase in students accessing certain course material after a short quiz on that material was added, or a dramatic positive or negative change in grades earned after new content was introduced. Because many factors can affect semester-to-semester outcomes, it's recommended that student feedback on the relevant changes be included when evaluating their effect.
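
As a minimal sketch of such a comparison, suppose an instructor records, for each semester, whether each student accessed a particular reading. The data and names below are hypothetical; a real comparison would also account for differences in enrollment and other course changes between terms.

    # Hypothetical per-semester records: student ID -> accessed the reading at least once.
    fall_access = {"s01": True, "s02": False, "s03": True, "s04": False}
    spring_access = {"t01": True, "t02": True, "t03": True, "t04": False}  # after adding a short quiz

    def access_rate(records):
        """Share of students who accessed the material at least once."""
        return sum(records.values()) / len(records)

    change = access_rate(spring_access) - access_rate(fall_access)
    print(f"Fall: {access_rate(fall_access):.0%}  Spring: {access_rate(spring_access):.0%}  Change: {change:+.0%}")

Even with a clear numeric difference, the consideration above still applies: pair the comparison with student feedback before attributing the change to the course revision.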