Putting “Learning” Back in Analytics

Without question, the momentum behind analytics in higher education has been building. Institutions are hungry to detect patterns in their vast quantities of data, and the marketplace has answered by providing more and more tools for analysis. Unfortunately, this momentum has collapsed different types of analytics under one overarching, seemingly singular term, creating the mistaken impression that, to paraphrase Gertrude Stein, “analytics is analytics is analytics is analytics.”

One example of this is the emergence of learning analytics, a recent approach focused on the analysis of student learning data such as academic grades and student interactions with online course materials. Organizations such as Educause, along with others in the United Kingdom and Australia, have sought to define it, yet many discussions about learning analytics still lump it together with other forms of analytics. Most tend to equate it with “operational analytics,” an approach that examines institutional data to discover patterns in enrollment, retention, and other areas not directly tied to student learning.

This conflation obscures the considerations unique to institutions seeking to deploy a learning analytics solution. Likewise, an institution can go down the road of devoting costly effort and resources to tools bearing the name “analytics” without ever learning whether its students are succeeding, which students need support, or which courses are delivering instruction efficiently.

Understanding Analytics

In a recent webinar featuring Intellify, we sought to build a framework with which institutions can unpack the meaning of learning analytics and develop a strategy. Building on the excellent work already done in this area by the Open University of the Netherlands, we defined learning analytics and described how it differs from operational analytics based on four simple questions:

Who is the audience that will review the output of the analysis (audience)?
Why do you need to analyze the data (rationale)?
What data and systems are required for the analysis (data sources)?
How is the data being analyzed (methods and models)?

As shown in the table below, applying these questions to learning and operational analytics reveals considerable overlap between the two approaches, but also some clear areas of differentiation.

Comparison of Learning and Operational Analytics

Who? (Audience)
• Learning analytics: Mainly aimed at instructional staff and instructional designers.
• Operational analytics: Primarily aimed at institutional leaders, such as provosts and enrollment managers.

Why? (Rationale)
• Learning analytics: Drive specific teaching and learning strategies, including interventions and curricular improvement.
• Operational analytics: Understand how the institution is performing overall and identify any concerning patterns.

What? (Data Sources)
• Learning analytics: Learning Management Systems, Student Information Systems, Student Engagement Systems, Intelligent Tutoring Systems.
• Operational analytics: Learning Management Systems, Student Information Systems, Constituent Relationship Management Systems, Enterprise Resource Planning Systems.

How? (Methods & Models)
• Learning analytics: Predictive Modeling, Prescriptive Modeling, Diagnostic Modeling, Descriptive Modeling, Student Engagement Analysis.
• Operational analytics: Predictive Modeling, Prescriptive Modeling, Diagnostic Modeling, Descriptive Modeling, Historical Data Mining.

The two approaches overlap in some of the models and systems they use: each leverages the more standard analytic models (e.g., predictive and prescriptive), and each relies on data from learning management systems and student information systems.
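To make that overlap concrete, here is a minimal sketch of the kind of predictive model both approaches share. It is written in Python with entirely invented data and hypothetical features (weekly LMS logins, assignments submitted, GPA); it reflects no vendor's actual product.

```python
# Minimal illustrative sketch of "predictive modeling," the technique the
# two approaches share. All data and features below are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-student features drawn from an LMS and an SIS:
# weekly LMS logins, assignments submitted (of 10), and current GPA.
X = np.column_stack([
    rng.poisson(5, n).astype(float),
    rng.integers(0, 11, n).astype(float),
    rng.normal(3.0, 0.5, n),
])

# Synthetic label: lower engagement and GPA make "at risk" more likely.
signal = -0.3 * X[:, 0] - 0.4 * X[:, 1] - 1.0 * X[:, 2]
y = (signal + rng.normal(0, 1, n) > np.median(signal)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# In a learning analytics setting, the flagged students would go to
# instructors for follow-up; in an operational setting, the same method
# might predict enrollment or retention instead.
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The point of the sketch is that the math is shared; what distinguishes learning analytics from operational analytics is who receives the output and what question it answers.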

On the other hand, learning and operational analytics differ most in their audiences and rationales. Learning analytics targets instructional staff, while the audience for operational analytics is more likely to be staff focused on enrollment or the overall health of the institution. Likewise, learning analytics concentrates on student and curricular improvement, while operational analytics looks for patterns in the financial, enrollment, and other institution-wide aspects of operations.

Selecting a Learning Analytics Solution

Understanding the meaning and boundaries of learning analytics as part of the broader analytics landscape is important, but how should an institution go about choosing a solution? We have seen two solutions that show promise:

  • Intellify: While impressive in all components, Intellify shows strength in its focus on the audience and the source systems for learning analytics. It helps institutions develop and capture metrics to support their questions, and it captures data across a wide range of source systems, including courseware providers, educational apps, and publishers.
  • Explorance: Like Intellify, Explorance makes a strong showing across our “who, why, what, how” framework, but it is especially strong in understanding and adapting to the needs of the learning analytics audience. Specifically, Explorance grasps that, because the learning analytics process is iterative, its solution has to be responsive to any new questions users discover along the way. Its product, called “Blue,” has a robust dashboard that can also display course evaluation data, standardized using its proprietary text analysis engine.
