Posted by: Gillian | January 12, 2011

Learners using their LMS data

Last night John Fritz of UMBC gave a presentation in George Siemens's open learning analytics course that could be of interest to many higher education learners.  John gives students access to the data that is captured on most learning management systems (e.g. Blackboard, Moodle) and that is traditionally analysed by administrative staff to help develop future courses.  John's point is that the data can be useful to the learners, so why not let them have it? The question is not straightforward, so: what use can you make of it, and what are the problems?

The good

I can almost guarantee that if I want to hear from a 'missing' student, I just put a zero in their attendance grade. Students read gradebooks.  It is therefore reasonable to assume that students care about course performance. Students might be very interested in the statistics that show how often they have logged on, what they read, how many posts they made and to whom, along with the wealth of other data that might have been collected, to shed light on how their effort correlates with a particular grade.  If that information can then be compared with similar data from all the other students in the class, you get a better picture of your performance (particularly if you never meet face-to-face).

For confident adults, unfazed by the idea of being in charge of their own learning and development, the information can be used in a number of ways. You might get clues as to how to improve if, for example, you ignored the discussions and then discover that all the top grades went to those who contributed significantly to them.  At a more sophisticated level, if you take a number of courses, you should be able to appraise your own learning skills and develop or modify them according to your needs. You will, at the very least, have some statistics to share with a learning mentor when discussing your learning plans. You will be in possession of data that lets you take charge of developing yourself as a lifelong learner.

The bad

Security is the first and largest concern.  21-year-olds clean up their Facebook pages before applying for jobs. The fear that one might have to hide course statistics may seem ludicrous to today's overworked HR professionals, but if you really wanted to know whether an applicant is the good networker and team worker they claim to be, wouldn't you consider looking at available course discussion statistics?  That would be so much more cost-effective than running assessment centres where everyone is on their best behaviour. And if an employer is paying for a course, there could seem to be a very thin line between checking clocking-in times at the place of work and checking participation statistics in the course.

A second concern may be less obvious to the learners themselves: universities are both regulated and in a competitive marketplace.  Once those statistics are public, regulators and accrediting bodies, in an attempt to quantify their standards, could start to demand certain levels of participation in particular areas. And how long would it be before some enterprising hack compared statistics between Econ101 at two different institutions and drew a set of conclusions for publication?

A third concern is that being over 18 does not make one a confident learner, nor able to accurately interpret the data captured by an LMS and work with the results. A learner normally has only one chance at a given situation.  Universities can afford to aggregate results, hypothesise and try out new approaches; if a university gets it wrong, last-minute tweaks to the grading scheme can, in extremis, minimise the damage. If a learner misinterprets the data there is no such option, and there are many ways in which they could do so. For example, a complete novice might fail to take into account that most of the rest of the class are on their fifth (or tenth) course and be unduly harsh on themselves.  Or a learner who gets by perfectly well on one course without doing the readings might come very unstuck in another where they have less background knowledge.

A further, and important, issue is that there are ever more part-time learners, who tend to use a wide variety of media to communicate with one another and carry out research. Although some universities try to force people to conduct all learning through the LMS, citing quality assurance, grading requirements or proof of attendance for funding, the reality is that this does not happen.  Adults will use whatever means fit their circumstances to communicate with others.  Working adults often have access to excellent, relevant resources behind their work firewall, and they will also work with software incompatible with university requirements.  That is life.  It means that the data sets gathered by LMSs will be imperfect.

Gaining the good and avoiding the bad

To work, sharing statistics with the whole class needs to be done in an atmosphere of trust, possibly even with written rules on disclosure.  Certainly, if employers start asking for course data, there should be pre-agreed terms, as the current consensus seems to be that they are paying for the outcomes, not the journey.  There should also be (as there is at UMBC) a support structure that helps students and staff understand what the raw data means and helps students use that data set as a tool for their wider learning and development.  Course leaders should be able to take that data independently of 'institutional analytics' teams and use it to examine their own efficacy as teachers and facilitators.

In principle, I see no problem at all with students having access to data collected on their own LMS participation. How this is done, the support structures that are put in place and the rules about sharing will need to be factored into the learning design, but John has already shown this to be fully possible. Ideally, this is not a case of universities offloading responsibility for student development but of putting, or keeping, the learner in charge in a professional, supportive teaching environment.  That environment will also have to take into account all the other media that students can and will continue to use when the LMS is inconvenient, inappropriate or unavailable.


  1. Hi Gillian,

    Really like your post, and I share all your critical remarks. There is one concern that I would like to add to the bad (or ugly):

    The abundance of statistical data available from these systems might wrongly be taken as a comprehensive analysis of a learner's performance. This reduces learning progress to the student's hits in the log files. It ignores the fact that most learning is still done in untrackable cafeterias and 'real' social networks! Relying too heavily on this data thus produces a distorted image, and I doubt that even pedagogic experts would avoid this trap when easily accessible quantitative data is available at the click of a button, whereas a thorough qualitative assessment would require them to get off their seat.

    • Wolfgang: I fully agree that there is a very real danger of administrators and regulators using the readily accessible quantitative data without looking at the whole picture. Thank you for your thoughts and for dropping by.

  2. […] Post by Gillian Palmer. […]

  3. I fully share Wolfgang's apprehensions. In fact, they are the same apprehensions I have as an educator and a regular participant in online training. There is a fine line between analysing data and overstating trends or statistics.
    What I do think is a contribution is that the topic is being discussed and put on the table with teachers and students (us). Otherwise, data analysis might remain in the hands of administrators, who do not always consider what is best for all participants.
    In short, it is pleasing to find someone who shares the ideas I have been questioning.

    • Hello, Kami! Tomorrow I am going to write more on this topic, since it is important for everyone. 🙂 Gillian

  4. Hello,
    I agree with your analysis. Learners may have different preferences for mediums to use, or may prefer to do much of their learning offline. Inability to acknowledge this when analyzing their LMS data may lead to wrong conclusions.

    • Hello, Stefaan. Your last sentence is really important for all perspectives on this topic, I think. Factoring learners' personal preferences into a data set is, in my view, one of those essential processes that changes learning analytics from being a science to being an art – but perhaps (if we get it right) an informed art?

  5. […] more interesting is the resulting discussion with other participants. Gillian Palmer indicated some serious concerns, which I share. More generally, I see the danger of reducing students’ interactions to mouse clicks and […]

What are your views on this?
