5. Introduction to Learning Analytics

This page is used to share links to relevant resources that each of you locate.  After you have located relevant resources and posted annotations on your wikifolio each week, paste the link and the annotation for the one you think your classmates will find most relevant in the appropriate place below.  Put your name in parentheses so we know who added it.  Please be tidy and succinct.  If it becomes apparent that a new category is needed, consider adding it.  Please note that if multiple people are editing simultaneously, your edit may be lost; copy the page text before you save, just in case.  If you want to post files such as articles, save the file to a public folder or as a public file in Google Drive and paste the link here.

 Quick Links:   Student Wikifolios    Classwikis        Class Discussion Links

Aggregators

These are general resources that aggregate other resources and may be good hunting grounds for your peers.


http://www.educause.edu/library/learning-analytics

This website is a collection of links on learning analytics that covers a wide variety of topics.


Articles

If you want to post files such as articles, save the file to a public folder or as a public file in Google drive and paste the link here.  Consider posting a formal reference in the annotation in case your peers want to cite the article in their paper.

Schreurs and de Laat (2014) present a thorough review of analyzing learning with network analysis tools and of theoretical understandings of learning as a network. The paper also presents a Network Awareness Tool (NAT) intended to identify the real-time emergence of interaction patterns and display those interactions to participants, instructors, and researchers in a visualized network graph.
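As a rough illustration of the kind of analysis such a tool performs, the sketch below counts how often pairs of participants co-occur in interactions and uses the counts as edge weights for a social network graph. This is a toy example with invented names and log format, not the NAT itself:

```python
from collections import Counter
from itertools import combinations

# Hypothetical interaction log: each entry lists the participants in one
# exchange (e.g., one forum thread or chat). Data are invented for illustration.
interactions = [
    {"alice", "bob"},
    {"alice", "bob", "carol"},
    {"bob", "carol"},
    {"alice", "bob"},
]

# Count how often each pair interacted; the counts become the edge weights
# of the network graph a tool like the NAT would visualize.
edge_weights = Counter()
for participants in interactions:
    for pair in combinations(sorted(participants), 2):
        edge_weights[pair] += 1

for (a, b), weight in edge_weights.most_common():
    print(f"{a} -- {b}: {weight}")
```

Feeding the weighted edge list to a graph-drawing library would then produce the kind of visualization the paper describes.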

Jones, S. J. (2012). Technology review: The possibilities of learning analytics to improve learner-centered decision-making. Community College Enterprise, 18(1), 89-92.

This 2012 article distinguishes between academic analytics and learning analytics. In relation to higher education, it references the SoLAR definition of academic analytics as data gathered to move the institution forward from an organizational standpoint; learning analytics, by contrast, relates directly to the learner.

Koch, F., & Rao, C. (2014). Towards massively personal education through performance evaluation analytics. International Journal of Information and Education Technology, 4(4), 297-301.

  • Provides "a framework to provide affordable alternatives for data collection, information services, and Analytic models about the classroom environment. This development advances the state-of-the-art by introducing an alternative to analyse the education performance based on differentiated multi-dimensional data and large data sets of relevant data and information."
  • (shared by Amanda Mason-Singh)


Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149-163.

In this study the authors conducted a "current state analysis" of trends and patterns of usage of a learning management system at a university. One of the reasons I chose this resource was its focus on LMS data. Something I found interesting is the authors' description of limitations within the LMS and the adjustments they made to compensate for them. I also found their findings interesting, especially what they derived about how users spend their time in the LMS.

Hickey, D. T., & Willis III, J. E. Research Designs for Studying Individual and Collaborative Learning with Digital Badges.

This paper does a good job of pointing out the many potential uses of badges for and in research. I especially think their point in 4.6 is salient, so I copy it here:

4.6 Research WITH Badges and FOR Ecosystems

Eventually researchers are likely to begin using the evidence in digital badges to systematically study and improve entire learning ecosystems. In this way it seems possible that digital badges might ultimately transform the entire learning analytics movement. But this seems unlikely to even get started until clear research design principles for summative and formative studies using the evidence in badges emerges.

This is one of the few methodology-oriented papers I have found on LA and badges.

Rubel, A., & Jones, K. M. L. (2014). Student privacy in learning analytics: An information ethics perspective. Forthcoming in The Information Society (2015).

Rubel and Jones give a general overview of learning analytics and its development, and provide a list (from Long and Siemens) of goals for learning analytics:

  • better institutional decision making and resource use; 
  • improved learning for at-risk students;
  • increased institutional transparency; 
  • transformative change to teaching methods; 
  • better insight into networked knowledge; 
  • data-driven experimentation for administrative problems (e.g., enrollment and retention); 
  • increased “organizational productivity and effectiveness”; 
  • value-ranking of faculty activity; 
  • comparative learning metrics for students (e.g., how a student compares to her peers in a particular area). 

They then provide a deep discussion of privacy in information contexts and how lessons previously learned about privacy and data science can be applied to the educational context of learning analytics. They present the question of privacy from the perspectives of "privacy in respect to whom," "privacy about what," "proper accounting of benefits and burdens," and "awareness and control." The conclusion covers four direct problems to consider when designing learning analytics research:

We have argued that before we conclude that learning analytics is justified, proponents must address four narrow problems related to the use of student data, and we have posited partial answers to each: 1) learning analytics systems should provide controls for differential access to private student data; 2) institutions must be able to justify their data collection using specific criteria–relevance is not enough; 3) the actual or perceived positive consequences of learning analytics may not be equally beneficial for all students, and the cost, then, of invading one student’s privacy may be more or less harmful, and we need a full accounting of how benefits are distributed between institutions and students, and among students; finally, 4) in spite of legal guidelines that do not require institutions to extend to students control over their own privacy, they should be made aware of collection and use of their data and permitted reasonable choices regarding collection and use of that data. We also argued that there is a wider question concerning learning analytics: the practice may diminish student privacy to the detriment of a student’s autonomy, which is related to some important values underwriting higher education. To the extent that higher education is important as a function of autonomy, learning analytics is justifiable just to the extent that it does indeed promote autonomy.

(p. 22)

Kelkar, S. (in review). Platformizing “learning”: MOOC platforms, learning research and new forms of expertise. DigitalSTS: A Field Guide and Handbook. This is part of an online journal, currently in peer review. Link to papers here - you'll need to create a login. The format is that draft papers are presented for feedback before being finalized. This one looks at the intersection between computer scientists who have recently discovered "learning" as an area of interest and "learning scientists" who were already addressing the field. A fascinating take on "jurisdictional negotiations" in an emerging field. (Other papers in this handbook may also be of interest; the reviews are open until June 15, and I understand that non-participants are permitted to review.) Note: Submitted by Linda Gilbert. I chose this reference because (1) it illustrates a "boundary dispute" that is relevant if we're talking about the history of this field and how it connects to others, and (2) the journal itself, and its format, may be of interest.

Choudhury, T., & Pentland, A. (2003, October). Sensing and modeling human networks using the sociometer. In Proceedings of the Seventh IEEE International Symposium on Wearable Computers (p. 216). IEEE.

(From Una Winterman) The sociometer can be used to measure interactions between different people, tracking the development of social networks in person (as opposed to online) based on the proximity, frequency, and length of time people talk and interact with each other, along with recording the conversations. Choudhury and Pentland were seeking to better understand how communications related to “(i) diffusion of information (ii) group problem solving (iii) consensus building (iv) coalition formation etc.” 
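The pairwise quantities the sociometer derives can be sketched in a few lines: aggregate, for each pair of people, how often and how long they talked. This is a toy illustration with an invented log format and data, not the authors' actual sensing pipeline:

```python
# Hypothetical sociometer-style log: (person_a, person_b, start_s, end_s)
# records one detected face-to-face conversation episode, in seconds.
# Names and data are invented for illustration.
episodes = [
    ("dana", "eli", 0, 120),
    ("dana", "eli", 300, 420),
    ("eli", "fay", 500, 530),
]

# Aggregate frequency and total talk time per pair -- the raw quantities
# from which interaction-network ties can be inferred.
stats = {}
for a, b, start, end in episodes:
    pair = tuple(sorted((a, b)))
    count, total = stats.get(pair, (0, 0))
    stats[pair] = (count + 1, total + (end - start))

for pair, (count, total) in stats.items():
    print(pair, "episodes:", count, "total seconds:", total)
```

Pairs with high frequency and long total duration would correspond to strong ties in the resulting social network.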

Websites

These are links to specific websites.


Website: acrobatiq, powered by Carnegie Mellon (http://acrobatiq.com/) (shared by Dianne Parrish)

(**Disclaimer**: Dr. Hickey pointed out this website to me outside of class. I felt it was interesting enough that I wanted to share it here!)

This is an example of a company that has developed “adaptive courseware” that provides educators with predictive learning analytics, allowing for interventions when and where needed.  The data are visualized on what they call the “Learning Dashboard,” which faculty can personalize to show the data they want to see, in the order they want to see it.


Videos, Podcasts and Slideshares

These are narrated resources (may be appropriate for listening while commuting).


Other

These are things that don't fit in the other categories.