Mon 13th March 2017 - 16:52

Join OEE’s Q&A with learning analytics guru Dr Bart Rienties!

On 22 March 2017, don’t miss the unique opportunity to take part in a Q&A live chat with Dr Bart Rienties, one of Europe’s leading experts in learning analytics!

The live chat session is organised as part of OEE’s March focus on learning analytics. We invited Dr Rienties to tell us more about how the Open University is leading the way with learning analytics, and about the benefits and challenges of using learning analytics in education.

Dr Rienties is the Programme Director for Learning Analytics within the Institute of Educational Technology. His research interests also include computer-supported collaborative learning and the role of motivation in learning.

Do you have any questions about learning analytics? Perhaps you have worked with learning analytics and you’d like to share your experience? Save the date and take part in OEE’s Q&A live chat with Dr Rienties!

The event will take place on this page on 22 March 2017 from 1:00 PM to 2:00 PM CET. Before taking part, make sure you’re logged into your account (creating an account is free).


All Comments

Wed 22nd March 2017 - 11:57
Looking forward to this session :-)
Wed 22nd March 2017 - 11:59
Hello Bart and hi everyone :)
Wed 22nd March 2017 - 11:59
Welcome to this month’s Open Education Europa LiveChat! We’re joined by Bart Rienties today, who is one of Europe’s leading names in learning analytics. This is an open text-based LiveChat – any comments or questions from the community are welcome :-)
Wed 22nd March 2017 - 12:00
Hi everyone :-)
Wed 22nd March 2017 - 12:00
So let’s get started – hi Bart! And hi to all of our community members who are joining us today!
Wed 22nd March 2017 - 12:01
So Bart - Can you tell us a little bit about yourself and the kind of work that you do?
Wed 22nd March 2017 - 12:02
Yes, good question... During the day I am Professor of Learning Analytics at the Institute of Educational Technology at the Open University UK, where I am Programme Director for Learning Analytics, a role that pushes the boundaries of learning analytics research. The research group consists of two professors (Prof Denise Whitelock and me), two senior lecturers (Dr Doug Clow, Dr Rebecca Ferguson), three recently recruited lecturers (Dr Simon Cross, Dr Wayne Holmes, Dr Thomas Ullmann), two post-docs (Dr Jekaterina Rogaten, Jenna Mittelmeier), and six PhD students (Shi Min Chua, Garron Hillaire, Jenna Mittelmeier, Victoria Murphy, Quan Nguyen, Vasudha Chaudhari). During the night I am a sporty person who likes gaming as well :-)
Wed 22nd March 2017 - 12:03
At the OU we have a wide portfolio of learning analytics projects that push the boundaries of learning analytics in education. For example, our learning gains project compares longitudinal learning gains at the OU, Oxford Brookes, and the University of Surrey. Our Leverhulme programme has 18 PhD students who push the boundaries of Open vs Closed World learning. Our ESRC IDEAS project uses learning analytics and learning design principles developed at the OU, and we test whether these principles can be applied in a South African context to better understand how international students are studying at a distance. In our TESLA project we work together with 16 partners to build an adaptive trust e-assessment system for assuring e-assessment processes in online and blended environments.
In terms of my own research interests, I conduct multi-disciplinary research on work-based and collaborative learning environments and focus on the role of social interaction in learning.
Wed 22nd March 2017 - 12:04
So what about you all? What are you doing with Learning Analytics at your institution?
Wed 22nd March 2017 - 12:04
Hello Bart, greetings from Germany.
Do you think the reports of quizzes in e.g. Moodle are useful for learning analytics? (We are using Moodle with more and more formative e-assessment.)
Wed 22nd March 2017 - 12:05
It would be great to hear about other experiences with learning analytics :)
Wed 22nd March 2017 - 12:05
Welcome Silke :-)
Wed 22nd March 2017 - 12:06
I'll be 'introduced' to learning analytics soon in coursework but wondered over what timescale you collect data before performing an analysis - months, a year, or several years? Janet
Wed 22nd March 2017 - 12:06
@Kirberg: Yes, formative assessment scores and the behaviour of students during quizzes are really useful. See our paper in Computers in Human Behavior, which argues that these kinds of data are the best for predicting learning outcomes.
Wed 22nd March 2017 - 12:07
Hi all, I work for ETS Global. We have an English language learning course, English Discoveries, with a Teacher Management System where learning data is available.

Wed 22nd March 2017 - 12:07
Hi to Daniela and Janet, thanks for joining!
Wed 22nd March 2017 - 12:08
Many Thanks for the papers, Bart!
Wed 22nd March 2017 - 12:09
Hello Janet. At the OU we collect data on a daily basis. Some institutions also collect weekly or monthly data, but the more fine-grained our data, the more we can help push the boundaries of how to support our students. See also our 2015 Computers in Human Behavior paper on how different levels of data and data quality can influence what we can predict.
Wed 22nd March 2017 - 12:10
Hello everybody, and thanks Bart for this opportunity to exchange on learning analytics. I am currently a PhD student in Paris, working on online forums in general, and I am also a member of the e-learning team at École Polytechnique. There, I am in charge of implementing a data-analysis pipeline to collect, aggregate, and analyse data from several platforms, mainly Coursera. I also run a MOOC, "How to write and publish a scientific paper", on Coursera. Being on both sides (instructor and analyst), I wonder: what are the most recent and relevant measures to follow a learner's path? Thanks
Wed 22nd March 2017 - 12:10
@Bart Rienties I have seen that the Open University has been developing the OU Analyse dashboard. Could you tell us a little bit about that project, in relation to Janet's question?
Wed 22nd March 2017 - 12:11
Yes, OU Analyse is our flagship LA tool. It uses a range of advanced statistical and machine learning approaches to predict students at risk, so that cost-effective interventions can be made (Herodotou et al., 2017; Hlosta, Herrmannova, Zdrahal, & Wolff, 2015; Wolff, Zdrahal, Herrmannova, Kuzilek, & Hlosta, 2014). Have a look to get an idea of the types of information we provide to teachers, and feel free to register for a free demo.
The primary objective of OUA is the early identification of students who may fail to submit their next tutor-marked assignment (TMA); typically four to six TMAs are requested from students in each module. Predictions of students at risk of not submitting their next TMA are constructed by machine learning algorithms that make use of two types of data: a) static data: demographics such as age, gender, geographic region, previous education, and number of previous attempts on the module; and b) fluid data: the students' interactions within the VLE hosting the module.
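The kind of pipeline described above, combining static demographics with fluid VLE activity to flag students at risk of not submitting their next TMA, can be sketched roughly as follows. This is a minimal illustration using scikit-learn with invented feature names and toy data, not the actual OU Analyse code:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical training data: static demographics plus fluid VLE clicks.
students = pd.DataFrame({
    "age_band":       ["<25", "25-34", "35+", "<25", "25-34", "35+"],
    "prev_attempts":  [0, 1, 0, 2, 0, 1],
    "vle_clicks_wk1": [120, 5, 80, 2, 60, 10],
    "vle_clicks_wk2": [90, 0, 70, 1, 55, 3],
    "submitted_tma":  [1, 0, 1, 0, 1, 0],  # 1 = submitted the next TMA
})

features = students.drop(columns="submitted_tma")
target = students["submitted_tma"]

# Encode static categorical data, scale fluid numeric data, then classify.
model = Pipeline([
    ("prep", ColumnTransformer([
        ("static", OneHotEncoder(), ["age_band"]),
        ("fluid", StandardScaler(),
         ["prev_attempts", "vle_clicks_wk1", "vle_clicks_wk2"]),
    ])),
    ("clf", LogisticRegression()),
])
model.fit(features, target)

# Flag students whose predicted probability of submission is low.
risk = 1 - model.predict_proba(features)[:, 1]
at_risk = features.index[risk > 0.5].tolist()
```

In practice a real model would be trained on past presentations of the module and evaluated on held-out cohorts before any intervention is triggered.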
Wed 22nd March 2017 - 12:12
In OU Analyse we provide our teachers with interactive dashboards, which have been designed in close collaboration with experienced teachers, who expressed their needs in terms of what kinds of learning analytics data they would like to access through OUA and who evaluated earlier versions of the dashboard. The overall aim of the dashboard is to provide teachers with information about how their module compares to the previous year's presentation in terms of students' engagement with the VLE and also, by combining evidence from the VLE and demographics, to give them access to predictions about whether their students will submit their next assignment. OUA is designed as a tool that informs teachers about their students' behaviour and motivates them to take action when students are flagged as being at risk of not submitting their next assignment. The broader objective is to increase students' retention and completion of their studies.
Wed 22nd March 2017 - 12:13
@MATTIAS: yes, great question. In part the answer depends on the types of data that your institution collects, and the richness of the trace data. For example, in our learning gains project we work with three institutions, and the level and quality of the data influence how we can measure learning gains and student journeys.
Wed 22nd March 2017 - 12:15
@Bart Rienties It's really useful to find out the process you went through to create the predictive models, and how important it is to liaise with the teachers. Have you had any feedback from the teachers (or the students) about the impact LA has had on learning?
Wed 22nd March 2017 - 12:17
Thanks for your answers, Bart. So does analytics avoid the danger of spurious correlation? Janet
Wed 22nd March 2017 - 12:17
Yes, good question MaryClare. The process was a bottom-up one. Our colleagues at the Knowledge Media Institute initially worked with one or two module teams to identify where the key bottlenecks were in their modules, and afterwards they developed four LA engines to test whether the students were able to overcome these bottlenecks. The paper by Wolff, Zdrahal, Herrmannova, Kuzilek, and Hlosta (2014), "Developing predictive models for early detection of at-risk students on distance learning modules" (Machine Learning and Learning Analytics workshop, Learning Analytics and Knowledge 2014, Indianapolis), provides a good introduction to this process.
Wed 22nd March 2017 - 12:19
Afterwards, we slowly scaled up this approach to 5, 6, 7, and now 20 modules per implementation, whereby we work intensively with large modules (>500 students per module) to help teachers identify which students need some additional support. At the same time, we have trained over 300 teachers (associate lecturers in OU jargon) to use these tools... Quite a challenging process, but definitely worth it...
Wed 22nd March 2017 - 12:19
At the same time, we run more general institutional analytics on all 400 of our modules and give our tutors feedback four times per module on which students are doing well and which students might need a bit more support :-)
Wed 22nd March 2017 - 12:21
In terms of feedback from teachers, the results were a bit mixed. Some teachers really liked the tools, and some teachers were a bit afraid of the analytics and how this might influence their role. Have a look at the amazing work of Dr Herodotou et al :-) Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., Hlosta, M., & Naydenova, G. (2017). Implementing predictive learning analytics on a large scale: the teacher's perspective. Paper presented at the Proceedings of the Seventh International Learning Analytics & Knowledge Conference, Vancouver, British Columbia, Canada.
Wed 22nd March 2017 - 12:23
Hi Bart, when we talk about learning analytics we can't not mention privacy. How do you reconcile learners' privacy with data collection at the Open University? Rumen
Wed 22nd March 2017 - 12:24
Hi Rumen! Again a great question and an essential question
Wed 22nd March 2017 - 12:25
Privacy is a key concern. We have argued for the last four years that a clear ethics policy needs to be put in place to make sure that we are engaging with learning analytics ethically, and that students are aware of why and how we are collecting data about them and their behaviour. The OU has been leading on this, as we were the first university to have a clear ethics policy on learning analytics. Dr Sharon Slade has been leading this work for years, and in her recent work with Prof Paul Prinsloo, "An elephant in the learning analytics room - the obligation to act", they indicate that although ethics can be challenging to implement in terms of learning analytics, we also need to act when our analytics indicate that students are struggling (Prinsloo & Slade, 2017).
Wed 22nd March 2017 - 12:28
Thanks a lot for your answer Bart. I'll take a look at the links you just referred to.
Wed 22nd March 2017 - 12:28
What do you think about the analysis of items in quizzes - is that a useful supplement to other data collections?
In these analyses, the focus is on the questions of a quiz, not on the learners (in Moodle again).
Wed 22nd March 2017 - 12:31
Hello Silke, yes, this is again an interesting question. There is a forthcoming special issue on this by Samuel Greiff in Computers in Human Behavior that brings together 5-6 papers on this issue, including our research. Our OU analysis has mostly focussed on total scores, but I know that Prof Greiff has also focussed on specific items, which could be a great treasure trove for identifying where students have misconceptions.
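The item-level analysis discussed here, looking at the questions rather than the learners, is often done with classical test theory statistics such as item facility and discrimination (this is roughly what Moodle's quiz statistics report computes). A toy sketch with made-up response data:

```python
import numpy as np

# Hypothetical 0/1 response matrix: rows = learners, columns = quiz items.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
])

total = responses.sum(axis=1)

# Facility index: proportion of learners answering each item correctly.
# Very high or very low values suggest the item is too easy or too hard.
facility = responses.mean(axis=0)

# Discrimination: correlation between an item and the rest of the quiz.
# Low or negative values flag items that may be ambiguous or miskeyed.
def discrimination(item):
    rest = total - responses[:, item]
    return np.corrcoef(responses[:, item], rest)[0, 1]

disc = [discrimination(i) for i in range(responses.shape[1])]
```

An item with reasonable facility but poor discrimination is a natural candidate for review, since it suggests strong students are not outperforming weak students on that question.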
Wed 22nd March 2017 - 12:35
Any other questions :-)? Surely there must be many :-)
Wed 22nd March 2017 - 12:37
The work you’re doing is obviously on a big scale. How could learning providers in smaller organisations get started with learning analytics? What are the first steps?
Wed 22nd March 2017 - 12:39
Yes, good point Rumen. In the other institutions that I work with, it is mostly small steps by individuals or departments. To me, curiosity is an essential first step. For example, the work I have been doing over the last ten years at Maastricht University with Dr Dirk Tempelaar is in a way very small-scale, but at the same time it can really help the institution if one teacher becomes enthusiastic about learning analytics :-)
Wed 22nd March 2017 - 12:41
Most institutions focus first on large-scale modules (see the University of Michigan) and then try to determine for which types of students and modules learning analytics might actually make a difference. I would agree with a recent keynote by Tim McKay at LAK2017: start by focusing on big modules with a strong diversity of students (i.e., if the students are fairly similar, why would you need LA?).
Wed 22nd March 2017 - 12:44
The Open University is known for its distance learning courses. Do you think this model particularly fits learning analytics? How can learning analytics be used to measure learning gain in more traditional learning organisations?
Wed 22nd March 2017 - 12:45
How about analytics 'packages' that can be used in schools? Are they useful, and is the 'lone' teacher in a position to make an analysis on the basis of a set of data provided by one student, or a small class... or in your opinion is this best avoided, perhaps until there is longitudinal data available?
Wed 22nd March 2017 - 12:46
Personally I think that all institutions (whether distance, blended, or f2f) could benefit from data. For example, at the University of Surrey we are working to collect data on how and when students go to the library, post messages in the online learning environment, or use clickers in the classroom, in order to follow their attendance over time. As nearly all institutions now have a virtual learning environment, even basic analytics can help teachers identify which students might need a bit of extra support...
Wed 22nd March 2017 - 12:48
@Jturn: yes, a very good question. On the one hand, it might be difficult for an individual school teacher to become a data expert in everything. On the other hand, there is great potential and good practice in schools as well. From both a political and an institutional perspective there is a clear understanding that institutions will need to develop a thorough understanding of the opportunities and limitations of learning analytics. For example, in our recent EU report on "Research evidence on the use of learning analytics: implications for education policy", led by Dr Rebecca Ferguson, we highlight that the time is now right for learning analytics. There are some good examples of school usage of LA in there :-)
Wed 22nd March 2017 - 12:48
I heard that you have just got back from the Learning Analytics and Knowledge Conference in Vancouver. How was it?
Wed 22nd March 2017 - 12:50
Yes, LAK17 was amazing, and many institutions and individual researchers are really pushing the boundaries of learning analytics. There seems to be an increased focus on tracking students' emotions to understand the complex processes of learning. I guess a buzzword from LAK is multimodal (i.e., tracking many different aspects of learners at the same time).
Wed 22nd March 2017 - 12:51
Learning Analytics is a pretty fast-moving field. Other than going to conferences, I was wondering how you stay up-to-date with developments?
Wed 22nd March 2017 - 12:52
@rumen: To me, following some of the key LA people on Twitter is probably a good way to keep up to date :-)
Wed 22nd March 2017 - 12:54
As we're reaching the end of this LiveChat, maybe you can look into the future of learning analytics for us, Bart :-) Where do you see learning analytics at the OU and in wider education in five years' time?
Wed 22nd March 2017 - 12:54
Thanks for the tip, Bart! :)
Wed 22nd March 2017 - 12:55
Another good place to check is the LACE Evidence Hub. A recent review by Ferguson and Clow (2017), entitled "Where is the evidence? A call to action for learning analytics", of 123 case studies of learning analytics indicates that there is some evidence that learning analytics has a positive impact on student experiences. So if you have some good evidence, please share it on our Evidence Hub :-)
Wed 22nd March 2017 - 12:56
Hi Bart, you mentioned that knowing the key people in Learning Analytics on Twitter is a good idea... Care to share some of their handles?
Wed 22nd March 2017 - 12:56
Learning analytics is going to be essential for the OU to be successful in providing flexible and personalised learning opportunities for our students over the next five years. With learning analytics we can provide just-in-time feedback for students (e.g., OU Analyse), as well as improve our learning designs. For example, recent research and practical experience at the OU indicate that learning design has a fundamental influence on our students' learning behaviour, their satisfaction with the module, and, most importantly, pass rates. Rienties and Toetenel (2016) linked 151 modules taught at the OU in 2012-2015, followed by 111,256 students, with students' behaviour using multiple regression models, and found that learning designs strongly predicted Virtual Learning Environment (VLE) behaviour and the performance of students. The primary predictor of academic retention was the relative amount of communication activities (e.g., student-to-student and student-to-teacher interaction). We can use these kinds of data to really improve the design of our teaching and learning practices...
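The type of analysis described here, regressing an outcome such as retention on learning design features across modules, could look roughly like this. The data and variable names are entirely synthetic and purely illustrative, not the Rienties and Toetenel (2016) dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_modules = 151  # same count as the study, but the data here is made up

# Hypothetical learning design features: proportion of study time per
# activity type in each module's design.
communication = rng.uniform(0.0, 0.4, n_modules)  # student-student/teacher
assimilative = rng.uniform(0.2, 0.8, n_modules)   # reading, watching

# Synthetic retention rates, constructed so that communication activities
# dominate (mirroring the finding quoted above), plus a little noise.
retention = (0.5 + 0.8 * communication + 0.05 * assimilative
             + rng.normal(0.0, 0.02, n_modules))

X = np.column_stack([communication, assimilative])
reg = LinearRegression().fit(X, retention)

# reg.coef_[0] (communication) should come out much larger than
# reg.coef_[1] (assimilative), identifying the dominant design predictor.
```

In a real study the feature set would come from mapped learning designs and the outcome from institutional records, with the usual checks on confounders and model assumptions.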
Wed 22nd March 2017 - 12:58
@hpod16: yes, follow @dgasevic, @HDrachsler, Stephanie Teasley, and Rebecca Ferguson :-)
Wed 22nd March 2017 - 12:58
Thank you very much for the good information, and success with your research!
Wed 22nd March 2017 - 12:58
Thank you Bart, and Mary Clare, this was a really helpful opportunity.
Wed 22nd March 2017 - 12:59
Thanks all for joining... Do ping me at @DrBartRienties on Twitter if you have any further questions :-)
Wed 22nd March 2017 - 13:00
Thanks a lot for all these insights, advice, and references!
Wed 22nd March 2017 - 13:00
Thank you all for joining and thanks especially to Bart :-)
Check back to OEE next month for our LiveChat on using social media in learning.
Wed 22nd March 2017 - 13:00
Bye all :-)