LAEP2016: Tues am

The LAEP project and the LACE project with JRC-IPTS are jointly running a two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam.

These are liveblog notes from the Tuesday morning session.


Implications and Opportunities of Learning Analytics for European Educational Policy (LAEP)

Riina Vuorikari, JRC-IPTS

Riina gives some introduction and background.

Why is JRC-IPTS organising this? The JRC is the Joint Research Centre of the European Commission, the in-house research service; there are seven institutes. The Institute for Prospective Technological Studies (IPTS) is one of those, based in Sevilla.

Looking at ICT in education, through various studies and projects, like LAEP. It’s to find evidence to help policymakers make better policy at the European level. Have invited ministries of education, school networks, etc, to understand how to support it. Policies already include 2013 COM on Opening up Education, E&T 2020, Digital Agenda, New skills and Jobs, EU Recommendation on Key Competences for Lifelong Learning. And Digital Single Market agenda.

Mission is to provide EU policies with independent, evidence-based scientific and technical support throughout the whole policy cycle. This is your chance to tell the funders what you want! Also work with DG Employment, Social Affairs and Inclusion.

Working on digital transformation of education and training. Working plan with many acronyms, frameworks, looking at national settings and at the European level. They work with teachers, schools, HE institutions, and the skills digital citizens need. This is a supporting study, a short one, tendered and awarded to OU UK.

Riina introduces Jonathan, who works with her.

Expert Introductions (Lightning Presentations)

Rebecca Ferguson (OU UK) introduces the day. Fast moving, four-minute presentations.

Adam Cooper (Tribal Group, UK)

Used to work on LACE and on LAEP. Now works for Tribal, a commercial venture in the UK providing student records systems and data management. Working on pilots organised by JISC, trying to get universities started with LA in practice. The pilots are useful for us as a supplier, to experiment with process and software. Using a standard data mining process to systematise how we work with clients. For adoption at scale, balancing repeatability is going to be tricky. Business understanding is very important, including the objectives, critical success factors, and who'll be involved. We translate those and go through an iterative cycle. Different data are available, and there are different levels of interest in using them. We have an ensemble of data streams, can build predictive models around those sources, linked to the business objectives, and work towards a single indicator. Exploring how to build this, and how transparent to make it.

We have software, including a dashboard. That’s not very flexible alone. We have to accommodate understanding. Our software application does a lot of the heavy lifting.
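As an aside: the "ensemble of data streams worked towards a single indicator" idea can, at its very simplest, look something like this sketch. The stream names, weights and min-max scaling here are my own assumptions for illustration, not Tribal's actual model.

```python
# Illustrative sketch: combine several per-student engagement streams
# into a single 0-1 indicator. All names and weights are invented.

def min_max_scale(values):
    """Scale a list of numbers to the 0-1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def single_indicator(streams, weights):
    """Weighted combination of per-student streams into one 0-1 indicator.

    streams: dict of stream name -> list of raw per-student values
    weights: dict of stream name -> weight (assumed to sum to 1)
    """
    scaled = {name: min_max_scale(vals) for name, vals in streams.items()}
    n_students = len(next(iter(streams.values())))
    return [
        sum(weights[name] * scaled[name][i] for name in streams)
        for i in range(n_students)
    ]

# Hypothetical data for three students:
streams = {
    "vle_logins":       [40, 5, 22],
    "library_visits":   [10, 1, 6],
    "assignments_done": [8, 2, 5],
}
weights = {"vle_logins": 0.5, "library_visits": 0.2, "assignments_done": 0.3}
print(single_indicator(streams, weights))
```

The interesting design questions Adam raises — which streams, which weights, how transparent the combination is — are exactly the parts this sketch hard-codes.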

Alan Berg (University of Amsterdam, NL)

Programme manager for LA. Amsterdam has had a three-year LA project, with 8 pilots. One pilot was to collect student activity streams from a learning record store. Four were dashboards, two were predictive models. Now looking at scaling up. Looking at data warehousing, so each faculty can decide what they do with the results. It's important to have an educational API. We can put a specialised team in.

Doing a hackathon with JISC at LAK16 on their architecture, building a generic architecture: how to cluster students, how to cluster student activity into standardised views.

Alan Berg again (Apereo)

I also work for the Apereo Foundation, on its board of directors. A consortium of 90-100 universities working on open source software. We have our own suite of LA tools, which is growing and maturing. The foundation is a do-ocracy – if you see a problem, you volunteer to solve it. Marist College donated some software in this framework. It's closely related to the SoLAR initiative around Open Learning Analytics. Gaining traction around the world. Some components are involved in the JISC architecture.

Alex Rayón Jerez (Deusto University, Spain)

Research group in three areas – LA, game-based learning, remote experimentation. One of the most important issues is which questions to ask of the educational data. Our activity is data science. We start with proper questions for data we have gathered in a single data hub, and build domain knowledge. We work first with STEM learning, and with competency-based learning. We have some EU projects – SCALA, KODETU, Make World. Making predictive models for our students. Also SNOLA – the Spanish Network of Learning Analytics.

Andrew Cormack (JISC)

I was a law student when I came across LA; the question “do you consent to LA?” worried me. As a student, it didn’t make sense. As a service provider, it doesn’t help either – it tempts me towards what I can get away with doing. Had a dig around in European law. Two phases: a pattern-finding stage – what factors help students do well, what factors can we include in learning – then a pattern-using phase. The pattern-finding stage is legally a legitimate purpose; it has to be necessary, and minimise individual impact. Then once you have the pattern – e.g. teaching maths at 9am on Saturday doesn’t work – that could lead to improvement in course design. Some patterns you can use without dealing with individual students – e.g. move it later on Saturday, or more wifi, or provide more support to non-traditional students. That doesn’t need personal data, doesn’t need consent. If you want to intervene with individual students – to provide support, or more challenge – then you can offer a meaningful consent choice: do you want the personalised service, or the vanilla service? In conversation with a law professor, the big change, the point, is moving from an ambition to minimise the impact on the individual to maximising it. That’s a really interesting point to take to the individual: do we agree together? There’ll be a paper in the JLA next month.

Anne Boyer (University of Lorraine, France)

Computer science prof. Big university in the east of France. Lead the KIWI research group, working on automatic analysis of digital traces. AI, modelling, prediction. User behaviour modelling and prediction to get a better understanding of learner, teacher and their interaction. Design recommender systems – what action would improve this? Project Pericles, to recommend OERs. Work on memory too. Delivery of pertinent indicators to learners and teachers. Automatic detection of learning communities. Adaptive learning to improve competencies and skills in foreign languages.

Barbara Wasson (University of Bergen / SLATE)

Representing the ministry. We are the new centre in Norway, won the national competition for the centre for learning analytics and learning sciences. Norway was first country to put digital literacy beside 3Rs, in the 2000s. Now digital competence. All our schools connected, first to do national tests digitally. National programme on assessment for learning. Lots of situations for analytics!

Centre came from government report on MOOCs in 2014. One recommendation was a research group looking at that. We started in January. Our mandate is wide – research, competence centre, input to ministry, and to educational sector.

Projects starting: MAP LA – mapping LA in Norway. Big Data in HE, starting with the University of Bergen, first to go entirely to digital exams, by the end of next year. The push came from students, who had an uproar about exam formats. Now assessments underway, not just exams. iComPAss – taking an EU project into firefighter education, with open learner models and analytics.

Charlotta Grönqvist (Sanoma, Finland)

Educational publisher. We make books and digital material for the K12 market, in 5 markets: Belgium, Sweden, Finland and Poland (?and NL). Market leader in many countries; 1m teachers, 10m pupils. We make content; more than 50% is either hybrid or digital. We use LA to improve the content, to make it better, to make life easier for the teacher, to make life more fun and efficient for the pupils, and to improve the learning impact. Process: create content, publish on a digital platform (Bingel is their most famous one), pupils exercise, they analyse the content and answers – see which questions work and which don’t, which are too hard, how long they take, why pupils answer them wrong. The aim is to improve the content. Other benefits: it’s fun for students if they get it right, and very frustrating if they answer correctly but don’t get it. Makes teachers more effective. Good for the company, to make our products better, and faster.

Challenges around changing market, and projects with universities.

Dirk Tempelaar (Maastricht University, NL)

We are one of the teachers applying tools. I’m teaching mathematics & statistics to 1st-year students, collaborating with Amsterdam and Eindhoven in SURF projects on digital testing, formative assessment and LA. My class is 1,100 first-year students. We use a blended learning approach, using e-tutorials with assessment, to personalise the learning. Since the groups are so large, they are also diverse – students with different backgrounds. In the Anglo-Saxon system you would stream, but the law doesn’t allow us to do that, and we still need to personalise. We do it using e-tutorials and assessment tools, and give feedback. At Maastricht, we use dispositions analytics (Buckingham Shum & Deakin Crick, LAK2012). The aim of the research is to find what data is most predictive at what stages. Especially in the first stages, learning dispositions in combination with background data are very predictive of learning outcomes and can help interventions. Later on, formative assessments provide useful information.
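A quick way to check which data sources are most predictive at a given stage is simply to correlate each candidate source with the outcome. A minimal sketch, with invented numbers – not Maastricht's data or method:

```python
# Toy illustration: correlate two candidate data sources (disposition
# survey vs a formative quiz) with final grades. All numbers invented.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented per-student data for five students:
disposition = [3.1, 4.5, 2.2, 4.8, 3.9]  # e.g. self-reported learning disposition
week6_quiz = [55, 80, 40, 90, 70]        # formative assessment scores
final_grade = [58, 78, 45, 88, 72]

print("disposition vs final:", round(pearson(disposition, final_grade), 2))
print("week-6 quiz vs final:", round(pearson(week6_quiz, final_grade), 2))
```

In practice you'd compare these at each stage of the course – Dirk's point is that the disposition-and-background signal dominates early on, before assessment data exists.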

Ed Foster (Nottingham Trent University, UK)

[Coming later]

Gábor Kismihók (University of Amsterdam, NL)

In business school, small RG, multidisciplinary. Eduworks network. Many disciplines from sociology to data mining. Project on goal setting: from area of human resource management, evidence this is important for success. So apply this in educational context. Set goals, measure performance, and behaviour, how they meet those goals. Looks very promising. Did pilot study here, scaling up at University of NSW in Australia. Another project in Eduworks, around Self-Regulated Learning in a MOOC context. In organisational psychology, existing measures for SRL. Want to look for evidence of these measures in large datasets from MOOCs. Ongoing project, looks promising so far. Also working on adaptive assessment, how to analyse data from these systems. Want to stress importance of privacy and ethics, a fundamental issue. This is a serious barrier.

Ian Dewes (Dunchurch Infant School, UK)

The children in my school are aged 3 to 7. Their education is different to university; we don’t have large datasets. LA is still something we’ve found useful. There are parallels to be drawn between very young children and those going to university: student retention at university means analysing early signs of dropout. Very young children can’t drop out, but because of Early Years Education (3-5 year olds) there’s a lot of play and choice in their learning. My children couldn’t sit down quietly like you lot are doing – well done. [laughter] LA helps us monitor this, with children doing their own activity. There’s a trend of increasing accountability in learning in England. We assess their learning through observations. We were struggling with this (photos, writing) until we used LA, picking up activity on iPads. “Can’t see the wood for the trees” – some children were observed doing the same thing multiple times, some were avoiding being observed. LA helped us – we can analyse the observations and use the digital trace well, and it has had a big impact on outcomes. Next step: not just existing digital traces, but converting the datasets we have into digital datasets. Children read at home with their parents; we want to digitise that so we can analyse it better. LA is something schools are doing without realising it. We are now much more aware – data privacy, data visualisation. Lots of scope to improve what we do and the outcomes for our children.

Jocelyn Manderveld (SURFnet, NL)

The Dutch national research network. Mission to support educational innovation through the use of ICT. Running an LA programme within the Innovation Programme 2015-2018, working with Dutch HEIs to get them working with LA. Several topics. Learning analytics readiness – developing instruments, and workshops to debate with IT and education departments. We see lots of researchers in the NL, but LA is not deployed at large scale across whole institutions; we are trying to scale this up. The most important thing is: what is the question, what do you want to know from your data? It’s easy to collect a big warehouse full of data, and algorithms will always give you results. We will publish a report soon looking at pedagogical models and what the right questions are to ask. Then we have privacy. We have a big task addressing these issues. It’s important not to look at the hindrance, but at the possibilities – what’s possible according to EU law and Dutch law. Another report. We have a large part on architecture and tools: developing infrastructure, and making tools available so institutions can use them. Also looking at data loggers, so students can own their own data. We are also looking at research – what data really say something about student success. At a national level, we bring educationalists, data scientists, all together to solve problems.

Jonatan Castaño Muñoz (EC JRC IPTS)

More focused on MOOCs. Quantitative methods in educational research. Example of MOOCKnowledge. Interested in complementarity with learning analytics.

Kristel Rillo (Ministry of Education and Research, Estonia)

We have a lot of data. If you torture data enough, it will confess. We can do anything, but we want to do it wisely. As a ministry, we have a very good database about teachers and students. We have a lifecycle. We haven’t used it as wisely as we could – e.g. for career planning. We want to decrease dropout and have better career planning. Vocational education is always a challenge. We have data in our systems, LMSs – only two big ones in Estonia, so we can do it in Excel. We are following the Polish experiment, or experience, to see how it can be useful for us. In general education, moving towards formative assessment. It’s good to analyse numbers, but when you have [more detailed feedback, it’s better]. The Admission Information System can see who studies e.g. ICT. The e-tax database – a need to follow the labour market. We need to respect privacy, and use the data that’s available to us. We took graduates and data from the e-tax system, employment taxes, and measured how successful they were after university. We have a lot of data! We need to do something useful with it.

Kristian Ørnsholt (Danish Ministry for Children, Education and Gender Equality, Denmark)

We’re working with central-level data and local data – on two levels. We have central data. One big initiative is a data warehouse, 3-4 years of work, available publicly on the web. It’s aimed at school leaders, with data about the goals we’re working with in the educational system. The latest initiative is some new dashboards, targeted towards parents and choice of school. We bring relevant data to choosing schools, and to when you discuss the class with your teacher – e.g. on pupils’ wellbeing, attendance rates, and so on. We have national tests, including adaptive tests. We have a centralised IT infrastructure, with an ID number for every pupil in the Danish schools. This ID number we try to distribute to exchange local data. We want to help the market, local schools and municipalities to exchange data more easily. We don’t want to be a Big Brother ministry gathering data, but to help locally – to bring those data we do have centrally more into the game locally. Working a lot on formulating standards for data exchange, and also trying to make platforms where we can share the central data alongside local data. We are starting a project on how to measure student progression within the frames of our national curriculum. It is complex; we experience a lot of demand to make standards for how to measure pupils’ progress on it. It’s huge work. Interested in hearing from other countries about how they measure that.

María Jesús García (Ministerio de Educación, Cultura y Deporte, Spain)

Head of online teacher training in Spain. Three different pathways. Online tutored courses, on a traditional Moodle platform, take 2 months, open twice a year, March and September. Our big thing is MOOCs – we are launching one per month, usually in Spanish, but launching our first in English. educaLAB moocintef. Looking at how learners interact, from Facebook groups to our ministry’s professional networking system. Whatever we do, it has to be social. We are interested in analysing what’s going on, what’s more motivational. Agreement with IPTS for MOOCKnowledge. Apart from collecting data, we want to analyse it: portfolios, debates, interactions, the digital artefacts the learners make. We write reports, but then what? That’s why I’m here.

Mark Brown (Dublin City University, Ireland)

Director of National Institute for Digital Learning (NIDL). Institutional hat too. Project called PredictED. We have national centre for data analytics, more general than for education. Partnership with my team and them. Simple project, sending email to 1st year students based on data from the VLE/LMS, Moodle. 6y data gathering to establish patterns. Based on that, sent emails weekly based on risk/activity. Very simple intervention. Will talk about the politics at institutional and national level.
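As described, the intervention is strikingly simple – weekly emails keyed to VLE activity. A crude sketch of that kind of rule (the threshold, IDs and message here are invented; the real project derived its patterns from six years of Moodle data):

```python
# Minimal sketch of a rule-based weekly nudge, in the spirit of PredictED.
# Threshold and message are invented for illustration.

LOW_ACTIVITY_THRESHOLD = 3  # assumed: weekly VLE sessions below this = at risk

def weekly_nudges(activity_by_student, threshold=LOW_ACTIVITY_THRESHOLD):
    """Return (student_id, message) pairs for students below the threshold."""
    nudges = []
    for student, sessions in activity_by_student.items():
        if sessions < threshold:
            nudges.append((student, "Your VLE activity was low this week - can we help?"))
    return nudges

print(weekly_nudges({"s01": 7, "s02": 1, "s03": 0}))
```

The hard part, as Mark goes on to say, is not the rule but the institutional interpretation of its effects.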

I wasn’t centrally involved, but my staff were. My role was to take these innovations and connect them up. The President wants us to be doing LA, and this delivered a project. CEOs and VCs – what they know at one level doesn’t have the same depth as the knowledge in, e.g., this room. Some of the claims made about the project are weak if not somewhat risky, to say the least. A national newspaper article appeared, claiming the initiative had increased achievement by 3% – a correlation-and-causation issue. Not everyone does educational research. Only about 15% of students responded to the survey, so the percentages quoted are misleading – but they are used around the institution without that level of critique.

Be critical of the focus on LA; turn it onto what we care about. An Australian report, Shane Dawson – look at the process as a way of rethinking teaching and learning.

Susan Flocken (European Trade Union Committee for Education)

The teacher unions’ point of view. We’re developing a policy document on the use of ICT in the C21st. My main point is quality – that’s the main issue for us. Focusing on the effectiveness of the teaching profession, what makes it attractive, what the role of teachers is, assuring academic freedom. ICT is only a tool, not the main purpose – it’s about learning. With learning analytics, it’s not about the marks at the end, it’s about the process: have they learned to learn? Heard a lot about the link to the economy, the labour market – but it’s about preparing for a life outside school and university. It’s important to involve those working in the field; many of those working on this only see ICT or the labour market. So ask the teachers, the professors, education staff. Working conditions, working environment – their working environment is the students’ learning environment. There are not enough skilled women in digital skills; we want to do something about that. Other inequalities too: access, special needs (teachers and learners). A big problem is the risk of privatisation and commercialisation of public education. Education is a public good. Critical point: it’s not about making money out of it. Also looking at Internet safety, and promoting open education as a means of adding value to teaching and learning. It’s not about saying teachers are not interested in ICT; we need to provide the environment, training, and CPD for teachers who have been teaching for many years.

Tim Vogelsang (iversity, Germany)

MOOC provider. iversity is a European provider. I lead the marketing and business intelligence team; my passion is analytics – which I can do publicly. Two examples: peer grading, and group creation. I’m not so interested in plotting graphs as in having a concrete problem and solving it. Peer grading: in a course with 1,000-2,000 students, many grade each other – how do you derive a grade for each person? Means or averages, or something like Google PageRank – finding the approach that best improves the learning is the challenge there. The second one is group creation. The question is how to create learning groups in a MOOC. Used k-means clustering to put people together who are similar. The real challenge is that you want similarity on some factors, but diversity on others. Maybe the same time zone, so they can collaborate, but a mixture of e.g. male and female. Except in e.g. Muslim countries that want male and female separate.
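To make the peer-grading question concrete: here's one crude way to go beyond a plain mean, iteratively trusting graders whose marks sit close to the emerging consensus – a PageRank-flavoured idea. This is an invented illustration, not iversity's actual algorithm.

```python
# Invented sketch: reliability-weighted peer-grade aggregation.
# Graders start equally trusted; trust shifts towards graders whose
# marks are close to the weighted consensus.

def consensus_grades(marks, iterations=10):
    """marks: dict grader -> dict of (gradee -> mark). Returns gradee -> grade."""
    weights = {grader: 1.0 for grader in marks}  # start with equal trust
    grades = {}
    for _ in range(iterations):
        # Weighted mean of marks for each gradee.
        grades = {}
        for grader, given in marks.items():
            for gradee, mark in given.items():
                grades.setdefault(gradee, [0.0, 0.0])
                grades[gradee][0] += weights[grader] * mark
                grades[gradee][1] += weights[grader]
        grades = {s: total / wsum for s, (total, wsum) in grades.items()}
        # Re-trust graders whose marks sit close to the consensus.
        for grader, given in marks.items():
            err = sum(abs(m - grades[s]) for s, m in given.items()) / len(given)
            weights[grader] = 1.0 / (1.0 + err)
    return grades

marks = {
    "a": {"b": 80, "c": 60},
    "b": {"a": 70, "c": 62},
    "c": {"a": 72, "b": 78},
}
print(consensus_grades(marks))
```

A plain mean is the special case where the trust weights never update; the open question Tim flags is which scheme actually improves the learning, not just the grade estimate.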

Topi Litmanen (Claned, Finland)

A startup of 20 people, in Helsinki, Dubai, Singapore and London. A learning environment that can be used for e-learning. Pilots and customers: universities providing e-learning, from medical education to dance education. We try to provide tools to make the learning process visible for the student and the teachers. In our environment, you can embed your e-learning materials, or upload videos, documents and PowerPoints; we provide automatic keywords and topics, and track everything that the learners do and how they spend their time. We do analytics on the interactions between different learners, focused on collaboration. An example screenshot: a teacher view of a course on medical education. We provide data to teachers by looking for groups of students who act similarly, or have similar motivational patterns – here, students with similar challenge evaluations, at the topic level. We try to make the learning process visible for the teacher; the information might be used for giving supporting materials, or more teaching for topics experienced as challenging. We give the data back to the learner with a learning tracker tool. The next phase is to use the data to provide suggestions for individualised learning paths: if you have a learning goal, what materials would be useful – suggest materials outside the course to support your learning process. Our differentiation from other providers is that we try to provide a solution that doesn’t have to be hard-coded in.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

Author: dougclow

