List of Figures and Tables

in Emerging Trends in Learning Analytics
Figures
3.1 Frequency of technologies involved. 30
3.2 Frequency of research types. 34
3.3 Frequency of subject matters at different educational levels. 35
4.1 Number of authors per selected article. 50
4.2 Distribution of country or geographic area of study. 50
4.3 Distribution of research design. 51
4.4 Distribution of participants’ grade level. 54
4.5 Distribution of data collection types. 55
4.6 Distribution of employed data analysis methods. 56
4.7 Research roadmap for learning analytics. 59
6.1 Tensions in the digital environment. 90
7.1 Learning analytics process. 102
7.2 Samples of learning dashboards. 105
7.3 Learning analytics process model (adapted from Verbert et al., 2013). 106
7.4 The abstract architecture of PLDs. 107
7.5 The PLD1 tab: A=The button named Open/Close PLD; B=Profile Info; C=The tab named Basic Usage; D=Titles of Indicators; E=Individual Performance; F=Class Means. 108
7.6 The PLD2 tab: A=The tab named Learning Objects; B=Titles of Indicators. 109
7.7 The PLD3 tab: A=The tab named Discussion Activities; B=Titles of Indicators. 110
7.8 The PLD4 tab: A=The tab named Assessment; B=Titles of Indicators. 110
7.9 The PLD5 tab: A=The tab named Recommendations; B=Titles of Indicators; C=Recommendation Messages. 111
7.10 Employed process for systematic review. 113
7.11 Distribution of learning dashboards across educational levels by target users. 113
7.12 Distribution of learning dashboards by educational levels. 114
7.13 Distribution of learning analytics indicators used in learning dashboards. 116
9.1 Conceptual model of UMBC’s data warehouse known as report exchange (REX). 162
9.2 Fall 2013 freshmen & transfer student retention by Bb LMS risk profile. 163
9.3 Fall 2017 freshmen & transfer persistence prediction by days enrolled & LMS use. 164
9.4 Spring 2015 LMS adaptive release use & percentage items accessed by students. 167
9.5 ASU’s results with ALEKS in college algebra. 168
10.1 Student, learning, and curriculum profiles of the learning analytics framework (adapted from Ifenthaler & Widanapathirana, 2014). 182
10.2 Students’ frequency of use of the different resources in the learning management system for each week of the semester. 191
10.3 LeAP cockpit including learning outcomes and resources (on the left), information about reminders (top right), and students’ option to decide on their anonymity (bottom right). 195
11.1 Interface of social presence visualization (Yamada et al., 2016). 202
11.2 Interface of the visualization of changes in social presence. 203
11.3 Interface of chat, concept map, and members’ login time display (Yamada et al., 2016a). 207
11.4 The results of structural equation modeling for the relationships among social presence, cognitive presence, and perceived contribution. 219
12.1 Block display of the active learner dashboard. 227
12.2 Active learner ranking. 227
12.3 Active learner process and active learner distribution. 228
12.4 Summary of students’ answers regarding active learner ranking. 228
12.5 Overview of slide summarization. 230
12.6 Examples of selected pages. 231
12.7 Average quiz scores (*P < 0.05, **P < 0.01, ***P < 0.001). 232
12.8 Achievement ratios: percentages of students who previewed more than 80% of slide pages. 233
12.9 A case study of lecture support. 234
12.10 Real-time heat map of browsed pages. 235
12.11 The second section of the reporting tool showing weekly keyword rankings. On the left, rankings of noun keywords for the first three weeks are shown. Additional dynamic features are shown on the right. 240
12.12 The third section of the interactive reporting tool, showing temporal changes in adjective usage. The frequencies of individual words are shown as a stacked bar chart on the left; the same view for three groups of words (positive, negative, and other) is shown on the right. 241
13.1 Evolutionary tree of tools. 252
14.1 Learning analytics cycle (Khalil & Ebner, 2015). 273
14.2 UML of abstract factory pattern for chart.php. 278
14.3 Sequence diagram of widget creation. 279
14.4 Widget for logins over time with filter set for the duration of the SIMOOC. 280
14.5 Widget comparing post frequency across three different MOOCs. 281
14.6 Widget for quiz attempts, contrasting users whose state is “in progress” with those whose state is “finished.” 282
Tables
3.1 Articles reviewed and the journals these articles were published in. 28
3.2 Types of data collected in the reviewed research. 31
3.3 Types of analytics techniques used. 33
4.1 Rank and title of journals included in the literature search. 47
4.2 Search results for included and excluded articles. 48
4.3 Aim of selected articles. 53
5.1 Learning analytics objectives and approaches for ODL institutions. 69
5.3 Benefits of learning analytics: predict at-risk students and student failure. 74
5.4 Benefits of learning analytics: deepen understanding of student behaviors. 75
5.5 Benefits of learning analytics: enhance support to students. 76
5.6 Benefits of learning analytics: enhance student participation. 77
5.7 Benefits of learning analytics: provide feedback and intervention. 78
7.1 Learning analytics indicators and data visualization techniques used in learning dashboards. 114
10.1 Decision matrix. The last column shows the final prioritization as the product of four aspects: students’ willingness to use a feature, perceived learning support, technological effort of implementation, and organizational effort. 184
11.1 Social presence indicators and rule-based procedures for text analysis for social presence (Yamada et al., 2016). 205
11.2 Activities for project-based learning (PBL). 209
11.3 Part of the task assignment matrix form. 211
11.4 Sample of group research results (authors did not correct students’ misspellings). 212
11.5 Community of Inquiry Instrument (Swan et al., 2008). 216
11.6 Indicators of Cognitive Presence (Shea et al., 2010). 218
11.7 Descriptive data for social and cognitive presence, perceived contribution to the collaborative task, and perceived effects of the visualization interface. 218
12.1 Criteria for active learner point (ALP). 226
12.2 Number of students who browsed the slides. 231
12.3 Synchronization ratio of each group with three minutes of allowable delay. 237
12.4 Example results of weekly keyword extraction. Weekly keywords are extracted from the journal entries for the course “Information Science.” For weeks 1 through 14, the actual topics taught in class and the extracted weekly keywords are listed, showing that topic-related words were extracted and general words were eliminated successfully. 239
12.5 Example lists of strongly associated words for the words “difficult” and “cipher.” For “difficult,” the most strongly associated weekly keywords are shown; for “cipher,” the most strongly associated adjectives are shown. The lists reveal which topics students consider difficult and how students feel about the cipher topic. 241
12.6 An example result of impression abstraction by NMF. In this example, we obtained four groups of adjectives, each with a clearly distinct composition. 242
