Learning Analytics Functional Taxonomy
UW-Madison needed a practical, easy-to-understand framework for discussing the educational benefit of learning analytics. Faculty and instructor fellowships explored and adopted a pragmatic framework* of learning analytics applications because it easily distinguishes between purposes. This framework looks at learning analytics from a top-level view; it is intended to answer the commonly asked question “why are people using learning analytics?”
Other questions include:
- What is the main focus? [Does it focus on the learner, event or content?]
- Who is the primary stakeholder? [learner, teacher or institution?]
- What type of feedback is provided? [reflective, adaptive or predictive?]
The six functional categories can be explored in the tabs below.
Access Learning Behavior
Definition:
Learning analytics can collect user-generated data from learning activities and offer trends in learning engagement. Analyzing those trends can reveal students’ learning behavior and identify their learning styles. This approach measures engagement and student behavior rather than performance, giving instructors insight into how their students interact with their course materials.
For example, an instructor would be able to see when, how long, and how often a student accesses different activity types in Canvas.
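A minimal sketch of how such engagement trends might be computed from a raw access log is shown below; the column names and data are hypothetical stand-ins for illustration, not Canvas’s actual export format.

```python
import pandas as pd

# Hypothetical access-log export: one row per page view.
# Column names are illustrative, not Canvas's actual schema.
events = pd.DataFrame({
    "student":       ["ana", "ana", "ben", "ben", "ben"],
    "activity_type": ["quiz", "page", "quiz", "discussion", "page"],
    "timestamp":     pd.to_datetime([
        "2024-02-01 09:00", "2024-02-01 09:20",
        "2024-02-02 14:00", "2024-02-02 14:45", "2024-02-03 08:10",
    ]),
    "minutes_spent": [12, 5, 9, 30, 4],
})

# When, how long, and how often each student touches each activity type.
summary = (events
           .groupby(["student", "activity_type"])
           .agg(visits=("timestamp", "count"),
                total_minutes=("minutes_spent", "sum"),
                last_access=("timestamp", "max")))
print(summary)
```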
Key Features:
- Derive impact of behavior (for example, correlations between behavior and outcomes)
- Identify behavior profiles/approaches to work; can be used to provide guidance, support, and resources
Defining Characteristics:
- Unique: Measures engagement and student behavior rather than performance
- Focus: Event-centric (the main focus of the analysis is on the interactions of a learner)
- Feedback Type: Reflective (i.e. presenting historic data analysis)
- Primary Stakeholder: Teacher, Learner
- Guiding Questions:
- Understanding students
- What do students do?
- Student support
- Efficiency
- Modeling learner success (course level)
- Learner self-assessment
Examples of Access Learning Behavior Approach:
- Check my Activity
- JISC Case Study B: Analyzing Use of the LMS at the University of Maryland, Baltimore County.
- Prototypical example at the University of Maryland, Baltimore County: provided information about how students used the course website.
- Key takeaway: students who used the website more earned a C or higher.
- The pattern was consistent over time.
- Allowed students to compare their engagement with the website to that of other students in the course.
- Using Learning Analytics to Assess Students’ Behavior in Open-Ended Programming Tasks
- Stanford study tracking students while coding in a computer science class.
- Identified seven canonical approaches.
- Preliminary work characterizing approaches; not predictive.
- Recommended possible utility: letting students know where the problems lie (e.g., getting stuck in the middle rather than the end) and providing resources.
- Learning latent engagement patterns of students in online courses
- Applying classification techniques on temporal trace data for shaping student behavior models
- LAK 2017 Conference Proceedings: [pg 100] Identifying Non-Regulators: Designing and Deploying Tools that Detect Self-Regulation Behaviors
- Predictive Analytics at Nottingham Trent University: JISC Case Study 1
Evaluate Social Learning
Definition:
Learning analytics can be applied to investigate a learner’s activities on any digital social platform — such as online discussions in Canvas — to evaluate the benefits of social learning. This measures and tracks student-to-student and student-to-instructor interactions to help understand if students are benefiting from social learning in their course.
For example, an instructor might apply social network analysis to their online discussions and identify students who bridge groups as knowledge shepherds; they could also identify students who may not be connecting with others as much as expected.
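As an illustrative sketch, the code below applies the networkx library’s betweenness centrality to a hypothetical discussion reply graph; the student names and edges are invented for the example.

```python
import networkx as nx

# Hypothetical reply graph: an edge means one student replied to another.
# Names and edges are invented for illustration.
G = nx.Graph()
G.add_edges_from([
    ("ana", "ben"), ("ana", "cai"), ("ben", "cai"),  # study group 1
    ("dev", "eli"), ("cai", "dev"), ("cai", "eli"),  # study group 2
])
G.add_node("gus")  # posted once but never exchanged replies

# High betweenness centrality flags potential "knowledge shepherds" who
# sit on the paths between otherwise separate groups.
bridges = nx.betweenness_centrality(G)
print(max(bridges, key=bridges.get))         # -> 'cai'

# Isolated nodes are students not connecting with others as expected.
print([n for n, d in G.degree() if d == 0])  # -> ['gus']
```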
Key Features:
- Focus is on digital aspects of learning
- Derives meaning exclusively from learner/learner & learner/teacher interactions
Defining Characteristics:
- Unique:
- Focus on the digital (not face-to-face) aspects of social interactions in learning.
- Interactions are clickable and trackable.
- Focuses on the student-to-student or student-to-instructor interactions in these environments.
- Allows reflection
- Focus: Learner-centric (the application of the analytics is specifically on an individual as a learner)
- Feedback Type: Reflective (i.e. presenting historic data analysis)
- Primary Stakeholder: Teacher
- Guiding Questions:
- Understanding students
- What do students do?
- Measuring student learning
- Modeling learner success (course level)
- Learner self-assessment
Examples of Evaluating Social Learning Approach:
- SNAPP: Realizing the affordances of real-time SNA within networked learning environments
- Discourse-Centric Learning Analytics
- Bringing order to chaos in MOOC discussion forums with content-related thread identification
- MOOCs and social networking within MOOCs.
- Focus on distinguishing signal from noise:
- There is so much interaction that it is hard to tell what is relevant and what is not.
- Looks at information chaos/overload.
- Models to sort discussion threads by relevance to learning.
- Raises a question about the algorithm for what is deemed “relevant” (and who has input into that algorithm)
Improve Learning Materials & Tools
Definition:
Learning analytics can track a student’s usage of learning materials and tools to identify potential issues or gaps, and offer an objective evaluation of those course materials. This allows instructors to make deliberate decisions about modifying approaches. Using aggregate student data, instructors can see ways to improve the process of learning or the structure of their course.
For example, learning analytics might show that a large percentage of students in a course struggle with a newly introduced topic based on quiz answers.
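A minimal sketch of this kind of check, assuming a hypothetical table of quiz responses tagged by topic:

```python
import pandas as pd

# Hypothetical quiz responses: one row per student per question.
responses = pd.DataFrame({
    "question": ["q1", "q2", "q3"] * 4,
    "topic":    ["limits", "derivatives", "derivatives"] * 4,
    "correct":  [1, 0, 0,  1, 0, 1,  1, 1, 0,  0, 0, 0],
})

# Share of students missing each topic; a high miss rate on a newly
# introduced topic signals material that may need reworking.
miss_rate = 1 - responses.groupby("topic")["correct"].mean()
print(miss_rate.sort_values(ascending=False))
```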
Key Features:
- Uses aggregate student data to adjust instructional practices and materials
- Primary focus is on outcomes (performance) and process
Defining Characteristics:
- Unique: Primarily for instructor use
- Focus: Content-centric (the primary emphasis of the application is on the curriculum, course content, or materials.)
- Feedback Type: Reflective (i.e. presenting historic data analysis)
- Primary Stakeholder: Teacher
- Guiding Questions:
- Are materials meeting intent?
- Understanding ourselves as teachers
- Efficiency
- Course design
- Effective outcome alignment
- Measuring student learning
Examples of Improving Learning Materials & Tools Approach:
- Assignments.org
- Students (middle/high-school math) complete problems outside of class and submit them online; the teacher gets a report on how students did and uses that information to change the instruction that follows, instead of giving feedback to every student individually.
- Based on past student performance, allowing modification of the future design of the learning experience.
- Concern: the value of data about individuals vs. in aggregate.
- Informing learning design with learning analytics to improve teacher inquiry (Persico & Pozzi)
- LAK 2017 Proceedings: MAP: Multimodal Assessment Platform for Interactive Communication Competency
- LAK 2017 Proceedings: [pg 131] Lessons Learned from a Faculty-Led Project: Using Learning Analytics for Course Design
Individualized Learning
Definition:
Adaptive or individualized learning systems apply learning analytics to customize course content for each learner. Furthermore, user profiles and other data sets can be collected and analyzed to offer more personalized learning experiences. This approach uses continuous feedback to help individual students in their learning.
For example, if an instructor tests students on three topics and a student shows mastery of two topics, but not the third, a program may be able to deliver additional material to the student regarding the topic that has not been mastered, rather than delivering further material/practice questions on concepts the student already has a grasp on.
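The sketch below illustrates the routing rule from this example under stated assumptions; the mastery threshold, topic names, and module names are hypothetical, not any particular adaptive product’s logic.

```python
# Hypothetical mastery threshold; real adaptive systems tune this per topic.
MASTERY_THRESHOLD = 0.8

def next_material(scores: dict[str, float]) -> list[str]:
    """Return remedial modules only for topics below mastery."""
    return [f"review:{topic}" for topic, score in scores.items()
            if score < MASTERY_THRESHOLD]

# A student who mastered two of three topics gets additional material
# only for the unmastered one.
print(next_material({"fractions": 0.90, "ratios": 0.85, "percents": 0.55}))
# -> ['review:percents']
```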
Key Features:
- Driven by information about learners’ prior experience/characteristics (learning style preference, content knowledge etc.)
- Focus is on continuous feedback in real time
Defining Characteristics:
- Unique:
- Primary focus is learner-centric nature with emphasis on individual learners.
- Instructor can be hands-off once it’s set up.
- Real-time component for learner. Does not need to wait for instructor to respond.
- Focus: Content-centric (the primary emphasis of the application is on the curriculum, course content, or materials.)
- Feedback Type: Adaptive (i.e. presenting real-time data analysis)
- Primary Stakeholder: Learner
- Guiding Questions:
- Understanding students
- Student support
Examples of Individualized Learning Approach:
- LAK 2017 Proceedings: [p88] Piloting Learning Analytics to Support Differentiated Learning through LearningANTS
- Math-tutoring example comparing software to face-to-face tutoring; the study tried to determine whether outputs from the system were better than those of tutors.
- Conclusion: the system outperformed traditional tutoring.
- Premise: students must approach tutors, whereas the system pushes recommendations out to students.
- Using Data to Understand How to Better Design Adaptive Learning (Liu et al.)
- Baseline knowledge example – remediating to get to the expected baseline. Response to student performance.
- A Fuzzy Logic-based Personalized Learning System for Supporting Adaptive English Learning (Hsieh et al.)
- Adaptive Learning (Kerr)
- Development of an adaptive learning system with two sources of personalization information (Tseng et al.)
Predict Student Performance
Definition:
Based on existing data about learning engagement and performance, learning analytics applies statistical models and/or machine learning techniques to predict later learning performance. By doing so, students likely at risk can be identified for targeted support. The focus is on using data to prompt the instructor to take immediate action to intervene and help a student course-correct before it is too late.
For example, if a student’s behavior and performance in a course suggest that the student is struggling, an instructor has an opportunity to intervene. Predictive analytics can also help instructors identify students who are doing OK but may need some additional motivation to do better in the course (a C student who could be a B student).
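As an illustrative sketch (not any production early-alert system), the code below fits a logistic regression on hypothetical engagement features from a past term and buckets current students into a red/yellow/green status in the style of Purdue’s Signals; the features, data, and cutoffs are all invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student engagement features from a past term:
# [logins_per_week, assignments_submitted, avg_quiz_score]
X_past = np.array([[5, 8, 0.82], [1, 3, 0.41], [4, 7, 0.75],
                   [0, 2, 0.30], [6, 9, 0.90], [2, 4, 0.52]])
passed = np.array([1, 0, 1, 0, 1, 0])  # historical outcomes

model = LogisticRegression().fit(X_past, passed)

# Score current students early in the term and bucket them into a
# red/yellow/green status; the thresholds here are invented.
current = np.array([[1.0, 2.0, 0.45], [3.0, 6.0, 0.70], [6.0, 8.0, 0.88]])
for p in model.predict_proba(current)[:, 1]:
    status = "green" if p > 0.7 else "yellow" if p > 0.4 else "red"
    print(f"P(pass) = {p:.2f} -> {status}")
```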
Key Features:
- Learners can be provided with feedback to encourage study habit behaviors
- Focus is squarely on “actionable intelligence”
Defining Characteristics:
- Unique:
- Common themes: LMS + demographic data used.
- Focus on taking action
- Focus: Learner-centric (the application of the analytics is specifically on an individual as a learner)
- Feedback Type: Predictive (i.e. presenting predicted future state or events)
- Primary Stakeholder: Teacher
- Guiding Questions:
- Understanding students
- Student support
- Modeling learner success (course level)
- Learner self-assessment
Examples of Predicting Student Performance Approach:
- Signals: Applying Academic Analytics
- Purdue University mined performance data from the LMS and used that data to send information to students at potential risk.
- Emphasis on actionable performance data.
- Used a red/yellow/green warning system to indicate status to students.
- Comments about the ability of faculty to use it appropriately; concerns about students being overwhelmed by the amount of data and warning signals.
- Could detect at-risk students two weeks into the semester.
- Open Academic Analytics Initiative – JISC Case Study E
- Day-0 predictions using student data generated before students begin their courses.
- Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment
- Forecasting student achievement in MOOCs with natural language processing
Visualize Learning Activities
Definition:
This approach traces learning activities performed by users in a digital ecosystem to produce visual reports on learner behavior as well as performance. The reports can help both students and teachers boost learning motivation, adjust practices, and improve learning efficiency. This is about facilitating awareness and self-reflection in students about their learning patterns and behaviors.
For example, a learning analytics tool may help a student see how much time she is spending on certain activity types compared to her peers, and how that might relate to performance measures.
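A minimal sketch of such a peer-comparison view, using matplotlib with invented hours-per-activity data:

```python
import matplotlib.pyplot as plt

# Hypothetical hours per activity type: one student vs. course average.
activities = ["readings", "quizzes", "discussions", "videos"]
student    = [2.0, 1.5, 0.5, 3.0]
peer_avg   = [3.5, 1.2, 2.0, 2.5]

# Side-by-side bars make gaps between the student and peers visible.
x = range(len(activities))
plt.bar([i - 0.2 for i in x], student, width=0.4, label="you")
plt.bar([i + 0.2 for i in x], peer_avg, width=0.4, label="course average")
plt.xticks(list(x), activities)
plt.ylabel("hours this week")
plt.legend()
plt.show()
```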
Key Features:
- Making learning visible for students and instructors
- Facilitates awareness → self-reflection → sense-making → impact
Defining Characteristics:
- Unique:
- Supposedly the simplest of the applications (novice/low expertise).
- Expertise requirements are nominally “low” but depend on LMS skill and on the ability to get the data, manipulate it, and know how to use it for what purpose.
- Awareness → self-reflection → sense-making → impact
- Focus: Event-centric (the main focus of the analysis is on the interactions of a learner)
- Feedback Type: Reflective (i.e. presenting historic data analysis)
- Primary Stakeholder: Teacher, Learner
- Guiding Questions:
- Understanding students
- What do students do?
- Student support
- Efficiency
- Measuring student learning
- Modeling learner success (course level)
- Learner self-assessment
Examples of Visualizing Learning Activity Approach:
- Learning Analytics Extension for Better Understanding the Learning Process in the Khan Academy Platform
- Learning Analytics Dashboard Applications (Verbert et al)
- LeMO: a Learning Analytics Application Focusing on User Path Analysis and Interactive Visualization
- LAK 2017 Conference Proceedings: [pg 36] Supporting Classroom Instruction with Data Visualization
- GLASS: A Learning Analytics Visualization Tool