Case study for the ABLE Project: Achieving benefits from learning analytics
Learning analytics is a phenomenon emerging throughout Europe, with great potential to help educational institutions provide better learning. The ABLE project believes learning analytics is ultimately only as useful as the action it generates. OEE interviewed project partners Rebecca Edwards, Ed Foster and Tinne De Laet to learn more about the ABLE project and the benefits of learning analytics.
Can you tell us some details about the team and the work you do?
The ABLE project consortium consists of Nottingham Trent University (United Kingdom), KU Leuven (Belgium) and Leiden University (The Netherlands). The team is made up of experts from different research fields, pedagogical and educational sciences, data science, and practitioners (student counsellors and teachers). Across the team there is a strong interest in the transition from secondary education to the first year of university.
We define learning analytics in a fairly mainstream way. Learning analytics refers to using and analysing existing institutional data and learning traces to provide insights into students’ engagement with their studies. These insights can be used in different ways by different agents, for example, to provide managers and planners with more information about the student cohort, or directly by individual students or staff to sustain or even enhance engagement appropriately. Learning analytics is ultimately only as useful as the action it generates. Without meaningful action, it's merely interesting statistics.
We often refer to the definition of our late colleague Erik Duval, learning analytics expert from KU Leuven,
“Learning analytics is about collecting traces that learners leave behind and using those traces to improve learning.”
Can you tell us about your project and the issues you wanted to address with it?
The ABLE project aims to research strategies and practices for using learning analytics to support students during their first year at university. Our work is focussed on developing the technical aspects of learning analytics and on how it can be used to support students.
This project was devised during the early stages of learning analytics usage in higher education. It aims to pre-empt the potential issues in the area. We seek to make recommendations to maximise the potential benefit of learning analytics tools by considering both resource design and best-practice recommendations around use.
Why is learning analytics useful?
Learning analytics is particularly useful because it allows both students and staff to make evidence-based decisions.
It has the potential to identify students most at risk of early drop-out, or those failing to achieve their academic potential, and to direct support to those most in need. However, we strongly believe learning analytics is only as useful as the actions it instigates, and it therefore needs to be properly embedded into institutional support practices.
You focused on supporting students during their first year of university. Why in particular did you choose first year students?
The first year of university presents a number of challenges. Students must get to grips with the core skills and approaches needed to be effective in higher education. They often struggle to understand the changes in rules and expectations associated with the transition and can find it difficult to seek support.
We strongly believe learning analytics could play an important role in identifying those students at risk. It can provide a tool to help students manage their learning, including access to information that motivates them to seek support where needed. We also believe learning analytics is not limited to at-risk students: high-potential students can be pushed forward through learning analytics interventions as well.
What were some of the bigger challenges you faced during your project and how did you overcome them?
At this point in the project, one of the main challenges is obtaining the data required for analysis. This is particularly difficult because the three institutions involved are at different stages of their learning analytics journey and must adhere to different institutional and national policies on privacy and data protection. The bigger challenge associated with this is that institutions need to change their internal systems and processes to use learning analytics effectively.
We envisage the greatest challenge of all to be integrating learning analytics tools into a wide range of working practices.
Did you face any issues regarding ethics and privacy?
Yes, naturally this is a complex area given the nature of learning analytics.
We are striving to balance the utility of using student data against students' need for, and right to, privacy. From the beginning, each partner has tried to ensure full compliance with the legal requirements. Jointly, we have considered best practices beyond the legal requirements, including how best to disseminate information to staff and students.
Can you tell us more about the stages of implementation of the project?
So far, the stages of implementation have been guided by how far each of the three institutions has progressed in embedding learning analytics into its working practices.
At the start of the project, Nottingham Trent University had an early-stage learning analytics solution in place across the whole institution. The knowledge gained from developing and integrating this resource was shared with the other partners in the first stages of the project.
Within the project, KU Leuven has developed a new learning analytics dashboard designed to support live interaction between the student and the study advisor.
The dashboard is currently used within 12 programmes at KU Leuven. Drawing on the experience from learning analytics interventions at Nottingham Trent University and KU Leuven, both KU Leuven and Leiden University are now preparing the deployment of the KU Leuven dashboard.
Each stage of the project feeds into the next: information learnt along the way has helped to guide plans for wider-scale integration and embedding of learning analytics resources. We plan to disseminate our findings at the end of the project.
What are the measurable results of the project?
Our project is still ongoing; however, the midterm results, which are focussed on student and staff perceptions of usefulness, are very promising.
We recently produced a report that forms the basis of our evaluation strategy for testing the impact of the learning analytics tools we have developed.
However, the true impact will only become clear at the end of the project. All findings to date are detailed on our website.
How did you personally benefit from this project?
Being part of a collaborative team has been extremely valuable. We’ve had the chance to share experiences and knowledge we would not normally have access to, at a level of detail that allows for a much deeper understanding of the needs of a global learning analytics tool.
It has been interesting to learn that while educational processes may differ across institutions, many of the challenges students face remain the same.
What is your best memory from the project?
It has been genuinely exciting to be working in this field. A particular highlight was the joint workshop we ran at the European First Year Experience Conference 2016. We’re aiming to relive this experience at the 2017 conference.