LAEP2016: Tues pm

Liveblog notes from the Tuesday afternoon session at a two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam, organised by the LAEP project and the LACE project.
The Amsterdam Royal Palace in Dam Square

LAEP Project

Rebecca Ferguson welcomes everyone back.

Doug Clow (me!) gave a short tribute to Erik Duval, a pioneer and leading figure in learning analytics, who sadly died recently.


Rebecca takes over to talk about the background to the LAEP project.

Three research questions: what’s the state of the art, what are the prospects for implementation, and what’s the potential for European policy?

Once we have a sense of what’s happening now, what do we think could happen – what’s feasible? And what’s desirable – what do we want to happen?

Project process: literature review, glossary, inventory of learning analytics, case studies, expert workshop (this event!), then final report.


Glossary

Terms developed by Adam Cooper before he moved to Tribal. [Definitions developed by the team, including Doug Clow!] Used the LAK dataset and frequently-used keywords. Available online, suggestions welcome! Help us improve the definitions, and expand the glossary.

Literature Review

Also by Adam. There are many literature reviews, but few focused on implementation. The review went through much of the literature about implementation, under five areas: underpinning technology (interoperability, sharing expertise, avoiding lock-in); policy, codes of practice and governance; skills and literacies; culture, values and professional practice; leadership and change management.

Not much work on what’s good about data warehouses; needs exploring further.

Lots of work on ethics and privacy. Lots of data has been gathered without much knowledge – e.g. the LMS/VLE is gathering data you may not have thought about. How do we tell learners? And what? Data governance. Rights of ownership (which Andrew Cormack tweeted about).


Need for open and shared analytics curriculum. Also research on use of visualisation.

Why are we educating people? To some extent we agree, but to some extent it varies by context. What are we trying to achieve, and how can we link analytics to that so it really works for people? How do we embed LA in to professional training, so teachers are confident and empowered, rather than bored by algorithms and statistics? Help to make informed decisions.


Andrew Brasher takes over. Desk research looking at the current state-of-the-art of the actual implementation of LA across the world.

Broad-but-shallow collection of informative examples across three categories. Policy documents (N=14); Practices (N=13); Tools (N=19). One-page descriptive information about each. This is one view. Please look online and add to our view on the Cloudworks LAEP Inventory space.

Case Studies

Jenna Mittelmeier and Garron Hillaire take over.

Garron starts. We worked on six case studies, collectively. Three cases: Blue Canary, UTS, Kennisnet. Blue Canary is a startup acquired by Blackboard. UTS is committed to being a data-driven university, with a centre focused on that. Kennisnet works with schools in the Netherlands.

Jenna – we also looked at Apereo, Norway’s work on SLATE, their national centre on LA, and the Open University UK ethics policy specifically on LA. These are in-depth, seeking different goals. There are key themes related to policy from these.

First is a holistic view of stakeholders – e.g. at the OU involving students, at UTS engaging with academics and teachers, in Norway taking LA research in to practice and engaging industry, teachers and students. It’s not just about those who create the policies, but about leading with those stakeholders.

Secondly, how national or institutional practices for evaluation of schools can deter students and teachers from adopting LA. The evaluation practice favoured old or outdated teaching methods, which can lead to fear of adopting new and flashy tools, such as LA.

Garron – different challenges. Data privacy and collaboration across data. There’s need for definitions about how to secure data, and at the same time, collaborating across organisations. This isn’t a checklist for policies to match, but areas of challenge around LA adoption.

Rebecca takes up again. Talk to Jenna and Garron, or Andrew, about those.

European projects (Lightning Presentations)

Anouschka Van Leeuwen, Utrecht University

Teaching analytics – not many projects focus only on the teacher perspective. For teachers to provide adaptive feedback, they need to be informed about students and their progress. My thesis, which came out last year, is also on this topic. Investigated in secondary education whether providing tools to teachers changed their behaviour. Found that LA made teachers in some cases intervene more, and in other cases give more specific feedback. Working on a postdoc project investigating functions of teaching analytics, and the best division of work between teacher and analytics. Much interpretational freedom, or advice from the system about when to intervene? Comparing different functions of LA and their effect on teacher behaviour. Using our own environment, where learner activity feeds teacher dashboards. Also a blended learning dashboard for a Higher Education context – but currently done by hand, provided weekly, seeing how this can help them iteratively. Looking for software or experience that can help.
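The weekly blended-learning dashboard is currently compiled by hand. As a sketch of the kind of automation that could help – with entirely hypothetical data shapes and field names, not anything from the Utrecht project – a per-student weekly activity roll-up might look like:

```python
# Minimal sketch (hypothetical data shape): aggregate raw VLE activity
# events into a per-student weekly count a teacher could scan.
from collections import defaultdict
from datetime import datetime

def weekly_summary(events):
    """events: iterable of (student_id, iso_timestamp) pairs.
    Returns {(student_id, iso_week_number): activity_count}."""
    counts = defaultdict(int)
    for student, ts in events:
        week = datetime.fromisoformat(ts).isocalendar()[1]
        counts[(student, week)] += 1
    return dict(counts)

events = [
    ("anna", "2016-03-07T09:00:00"),  # ISO week 10
    ("anna", "2016-03-08T14:30:00"),  # ISO week 10
    ("ben",  "2016-03-15T11:00:00"),  # ISO week 11
]
print(weekly_summary(events))  # → {('anna', 10): 2, ('ben', 11): 1}
```

A real dashboard would of course join this with assessment and demographic data, but even a roll-up like this replaces one manual weekly step.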

Baltasar Fernández Manjón, BEACONING

It’s a new EUR 6m H2020 project, using games and gamification in different domains and settings. Classroom and physical spaces, breaking the barriers. It started in January. Aim to improve LA, running large pilots; need to prove that those interventions are effective. Working with content providers, in a very diverse consortium, including serious games providers, researchers, and others. Extending LA approaches that assume a controlled environment to uncontrolled ones. Accessibility – using LA to improve accessibility of serious games for people with cognitive disabilities. Also geolocalised information, included in LA. The idea is to cover all the aspects of this very diverse environment. Aim to reuse some of the RAGE project’s infrastructure – we are the only partner in both. Idea is to build something that’s easy to deploy.

Hendrik Drachsler, LACE

Who knows about the LACE project? [most people put hands up] Hendrik cheers! It’s a Community Support Action, focused on pressing issues like interoperability. There’s an outstanding report about that on the LACE website, among many deliverables. Working heavily on ethics and privacy; just released an eight-point checklist to provide a handy tool. It’ll be presented at the LAK conference. Also working with many of you. The sustainability plan is building up a SIG on LA, so LACE becomes Learning Analytics Community Europe; have a proposal to SoLAR. European projects can align to this and become board members, a melting pot to bring all the stuff together. Proposal coming out soon, will say more about this at LAK.

Michael Kickmeier-Rust, LEA’s Box

From Graz University of Technology. Our community encountered a big loss with Erik Duval. He leaves an indelible mark on this community. I’ve learned a lot today about very cool projects. What I have is just a cardboard box. Very simple. Lea is a little character. Our project brings LA in to real educational settings, in to schools. We find situations where we sparsely have data. In most schools in Europe, we don’t have a lot of data. Learning and teaching is a social process, an analogue process. Teachers use learning apps, an LMS, perhaps Moodle. We try to provide a central hub, a platform where teachers can throw their data in to the cardboard box, and give them the best possible analytics about their students’ progress and learning. That’s not an easy task. Also focus on psychological theories to improve analytics. A good metaphor is SPSS – a powerful statistics tool that can do a lot. What we try – the little girl is a metaphor – is translating this powerful SPSS in to meaningful answers for real-life teachers.

Katerina Riviou, PBL3.0

Started in January, 3y, Erasmus+. LA, Learning Semantics, Problem Based Learning. Also interested in MOOCs. Want to use PBL with innovative use of LA and LS, maybe in a MOOC context. Five countries, four universities and one company. The final aim is to come out with pilots and recommend best practices and policy recommendations in the context of PBL.

Baltasar Fernández Manjón (again), RAGE

H2020 about serious games. 20 partners, EUR 9m. Developing the games industry in Europe. Building assets, modules and services for the game industry, which is small. Developing a new ecosystem, promoting things already happening in the US. We are building a full infrastructure to streamline the process of applying LA to games. Games have been using telemetry data for years, but in a proprietary way, mainly used for increasing revenue. Here, working on providing game trackers that can be used with e.g. Unity3D. Using the Apereo learning record store, xAPI, promoting this in an industry afraid of using standards because they don’t want data to become a commodity. If you have a game, it will be easy to put in a new tracker, so there’s an easy way to deploy it in to your services. We’re using Docker, push-button deployment. Running this in large experiments with police in the UK, Randstad, and the University of Hull in the UK. Mainly industry-oriented. Want to streamline the process, but may be useful for research.
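For context on what the trackers emit: an xAPI statement is a small actor–verb–object JSON document sent to a learning record store. A minimal illustrative statement follows – the actor, game and activity IDs are invented for illustration, not from the RAGE project (the `completed` verb IRI is a standard ADL one):

```python
# Illustrative xAPI statement a game tracker might send to an LRS.
# Actor, object and score values are made up; only the statement
# structure (actor/verb/object/result) follows the xAPI spec.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Player",
        "mbox": "mailto:player@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/games/demo/level-1",
        "definition": {"name": {"en-US": "Level 1"}},
    },
    "result": {"score": {"scaled": 0.8}, "success": True},
}

print(json.dumps(statement, indent=2))
```

Because the format is a standard, the same LRS can receive statements from a Unity3D game, an LMS, or anything else – which is the interoperability point being made here.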

Noelia Cantero, SHEILA

Supporting HE to incorporate learning analytics. Erasmus+, led by Dragan Gašević in Edinburgh. So new, there’s no logo yet. Want to impact policy development. Want many case studies. Using the ROMA approach – Rapid Outcome Mapping Approach. Four institutional strategies, one for each partner – OUNL, Carlos III Madrid, Estonia. Student stakeholders, and also the EU Association of QA agencies (?). Started in January.

Tom Broos, STELA

STELA, Erasmus+. Successful transition from secondary to higher education. KU Leuven, working with TU Delft, TU Graz, Nottingham Trent University, SEFI. Focused on transition, supporting students with LA. First-year students, in a new type of education, evolving from learning dependency to learning autonomy. Frequent high-quality feedback may help them. So we try to support that. Multidisciplinary team, also involving educational sciences, teachers, student counsellors. Our focus is holistic, taking in to account a set of courses, all requiring different learning strategies. Have to make something that can adapt to the learning strategy that’s best for that specific course. Our plan is for research, then to have something implemented in all these partner institutions seeing real benefits.

Jeroen Donkers, WATCHME

From Maastricht University. Long acronym: workplace e-assessment in competency-based higher multi-professional education. 3rd year of the project. Large project, many partners – some technical, some educational. Our research is in the domain of workplace-based learning: learning to become medical specialists, veterinary specialists, or teachers. Two partners in each domain. It’s important to get insight into how you are working and learning. The way is to use workplace-based assessment. Your teachers are your peers, walking around with you. We use an eportfolio system to collect this information. That’s the evidence and context of learning. We add on top of this system personalised student models. They are used to inform the student and supervisor. It’s MEBN (multi-entity Bayesian networks) – from general knowledge to very personalised models. We use this to produce visualisations and messages that turn up in the portfolio. Will test this in practice this year.

Daniel Spikol, PELARS

Practice-based experiential learning analytics research and support. We talked at lunch about fun. This project is making sense of how students build with physical computing projects. Try to analyse different stages of how they plan, build and reflect. Last year of the FP7 project. 11 partners, SMEs to universities. Also trying to generate results – first attempt at a data card for students to reflect on their project. Also a visualisation tool for researchers. Multi-modal data from how students move their arms, gaze, how they interact with sentiment buttons. Also reflective tools to plan, document and reflect on their work, which generate a visualisation. The idea is to enable students to learn to make better decisions in small groups. Students don’t reflect on how they got started or how they ended.



Bart Rienties takes over. We have this list – our inventory – of Policies, Practices and Tools. Sit with your group, and see if you have identified one that isn’t on this list. Put it on a green piece of paper if you’re sure it’s the best ever but we’ve not mentioned it; on yellow if it’s “OK, I heard about this policy, I’m not sure about it, it would be potentially interesting”. Green = essential, yellow = maybe.

Two observations:

Marieke Van Der Schaaf: noted a gap – teacher practices are not represented here. Another gap – in data management/data security/data protection/ownership.

Jan-Paul Van Staalduinen: work at TU Delft, in a different group to his, on aggregated data from social media – crawling data for learners from Facebook and Twitter. Also work on a programming course on edX, where they get permission and then crawl the learners’ GitHub profiles, looking for evidence of their programming improving. The learners are happier to give permission to scientists than to the government. General discussion about ethical issues.

Next exercise: reorder the post-its. Then think about which learning analytics practice you as a group want to defend.

Practice pitches

Next exercise: Pitching the practice to different groups in turn.

Pitch 1: Daniel Spikol: Fun!

Fun is important in LA. Fun leads to delight. For learning and for teaching. Teachers and learners should have fun. Empowering. Relaxing. Keeps the human factor in! (AlphaGo is not delighted that it beat Lee Sedol three times, but he is pleased to have beaten it once after that.) Intrinsic motivation. Fun as guide to your skills.

Pitch 2: Mark Brown: Address the demand side

We understand there are tools for LA, and people who argue that LA is all about the learner, and that they should be the focus. Our argument is that the teacher matters most. Extending that: at the moment there is an oversupply of tools. Too many tools! A problem of lack of demand. So to address the demand side – classic for new innovations – we’re proposing to focus on the educators, the teachers. You can define whatever level. We want to develop a toolkit for teachers to help them understand the choices available to them to improve learning through evidence. A series of accelerator events, funded by the Commission to support implementation of the toolbox.

That will address the demand side, people understanding more about it, to make more sensible, wise and informed decisions about the tools. And to influence, as stakeholders, the design of those tools in the future.

Q: In England, the system is very fragmented, hard to influence teachers because we have dispersed teacher training system, hard to influence schools. Influence over teachers is hard.

We thought about that! It’s easily resolved for the UK, you won’t be entitled to funding from later in the year because you’ll be out [laughter]

More seriously, schools are tightly bound and assessment-driven, so opportunities for teachers are quite limited now. Policy has to operate at a level that enables that. In the HE environment, there’s more opportunity. We might pitch at the HE level.

Q: Also smaller number of institutions. Thousands of schools in England alone.

Ye-es. IBM have contract for state of Georgia in the US. It’s very driven by standardised testing. But parents were key stakeholders, what they get, what they want to know – without interpretation the data you provide might be meaningless, or pose challenges.

Pitch 3: Jocelyn Manderveld: SURFnet practices!

SURFnet practices! Top down, and bottom up. Happening in the Dutch landscape. Bottom up, lots of experiments, 2012–2014. Funding in place for new ones from 2015 on. At the top, at management layers, they get the notion we should do something with this, with the quality. They’re also reluctant – they’re afraid about legal issues, security, consent etc. We are trying to bridge the gap between top down and bottom up. We organise workshops to get a dialogue started between management and researchers: data scientists, teachers, deans of faculties, etc. Is your institute ready to use LA? Get the discussion going. Focus on leadership: without leadership in the institute, it’ll stay at the bottom and won’t reach its full potential. As well, we make sure we have a simple, secure infrastructure in place. So if, at the top, they want to do something but don’t want to invest in the tooling, they can still do some experiments of their own. Then have communication between researchers and the management layer at the top. We are building this up now.

Q: To convince the top level that they should do something, do you have evidence that there’s positive things? From the experiments?

Yes, but also at this meeting the researchers talk about the data, what’s happening. We have the management saying this is interesting – and they can bring their own evidence. I’ve noticed in the last two years, we know the list from Purdue, and things in America. Typically Dutch, or European – yeah yeah, it’s American, but does it work for me? It’s better now – three people here from Utrecht University, talking to their dean about LA. They were surprised, so we’re bringing them together.

This is the mandate of SURFnet. Eventually we hope – SURFnet is steered by the universities – we hope they will say this is what we need.

Q: And use the infrastructure you provide?

We are developing that now. It’s really there for experiments, not big use. Our mission is also to deliver education services to institutes that the market doesn’t. Security and privacy.

Pitch 4: Dirk Tempelaar: Only start LA if you have rich data!

Our main idea was taken away for the interview! This is our second one. The idea: if you want to start with LA, only start if you have rich data. If you don’t, forget LA. If you base yourself on activity data – what Blackboard or an LMS brings you, click data, time on task – there’s no chance to make any interesting model from that. What you need is richer, more iterative data. We have some pilots in NL coordinated by SURF combining formative assessment with LA – assessment for learning – which brings very rich data. In my context, we have 1/3 surveys and dispositions, dispositional LA, [1/4] formative assessment, [1/6] clicks etc. Dispositions are most powerful at the start [for predicting learner success], with a bit of the others [such as demographics]. But then formative assessment is the dominating data for predictive models. The message from these studies: if you only have LMS data, don’t do LA.

Q: Do you have evidence, or is that anecdotal evidence from your approach?

Maastricht published this in Computers in Human Behavior, looking at how rich the data is, and at the learning analytics.

Final exercise: One-minute pitch for ideas

All in one large circle, with a person in the middle pitching their idea.


Pitch A: Dirk Tempelaar.

Learning analytics needs rich data. What is it? More than LMS data which has clicks and time on task data. If you only have that, forget about doing any LA. What you need is rich data sources that are complementary to this data. Two examples from my experience. Data from formative assessment, quite rich data. Disadvantage is it comes late in the course. Second source is dispositions data. Then you can predict which will drop out.

Alan Berg: I agree. Have to break the population in to subsets based on self-regulation.

I don’t think so. Our context is small-group. We bring data to tutors, who discuss it with their students. We don’t need to group students; it depends on individual action.

Hendrik Drachsler: Challenge! I would say, who is against rich datasets? Sure, everyone wants that. Research shows activity in MOOCs leads to success. There are some effects that are there. You can say a lot already from activity data.

Activity data can be dangerous! In one study, activity data had very non-linear effects. The ones not active are those at the high end, who don’t need to participate, and those at the low end. We’re only interested in drop-out – but activity data groups those together. The effects are non-linear, but if you combine the data sources you can sort that out.

HD: OK, that’s good. When we combine MOOC data with an LRS – only activity data, w/o assessment and grades – we can do very little indeed. I second it.

Ed Foster: I would back that up. Rich data makes sense. But staff need to make use of things quickly. Just that activity data is beneficial – that sense of it happening. We have seen correlation between activity and progression and attainment. Good to make things easy.

I will use activity data, but combined with disposition data. That even predicts the activity data. But when I have the first formative tests, I forget the dispositions and activity data.

Pitch B

We’ve had a lot of discussions about rich data and good analysis. One question: how do we get teachers to use data? To use LA? That’s a critical issue for this whole group. In Denmark at least, some teachers are quite conservative. How do we get the teachers to work with the data? Critical issue for policymakers. We need good data to convince them, but how do we trigger the movement? That’s my issue.

Riina: Easy. You evaluate teachers on LA. That’s the only way they do things. If it’s measured it gets done.

[some disagreement]

Someone1: It’s about starting the dialogue about improving their work from a perspective they think is relevant.

S2: Find the champions. Bill Gates, teaching awards now, world’s best teachers.

Alan Berg: U Mich, have fellowship program for LA. Sponsor specific evangelists.

We have a trigger in Denmark, demanding all municipalities buy an LMS within a year and a half. Bringing data in to the classroom. But still we need training, data, support.

Riina: My comment was funny, but has truth too.

Some municipalities setting up activity goals for these LMSs.

S3: Do teachers have time to do LA?

S1: If it’s a different thing from their regular job, it takes time. If it is the same, it doesn’t.

S3: Do we need every teacher to use LA? Is it relevant for every purpose?

Maybe one should see it as an investment in education. If you invest, you might save time later on, you know more about students, what works, what instruction methods work. You might save time in the long term.

S4: It gives you new ways to make decisions, changing your practice.

Riina: It’s a good question, does every teacher need to use LA. Maybe have different roles within a school. Like with ICT, we have some leading teachers, some have support, you have different levels of users. Maybe the same for analytics.

Pitch C

It’s teachers again. We’ve seen long lists of tools. The question is, if you give this list to teachers, what will they do with it? They look at it, ask what they should do. Our idea is to give an evaluation framework to the teacher to help them decide what tool to take for their specific context.

Bart: Can you buy it today?

Yes. Give us 3 weeks.

S5: What is the requirement for teachers?

It’s not necessarily a computer system. It’s helping us to ask the right question. So say, I’m a teacher in university, want to do something in assessment, I have a budget. But also ask what functionalities you need. Helping teachers to set requirements towards LA.

S7: Does the teacher have that decision? Not in Spain.

It depends on the teaching context. Sometimes, my budget is zero. Use open source software, or zero cost – that is one requirement.

S8: Isn’t it embedded in learning tools, and they already have trouble using those?

Then the decision is made in the wrong place. The teacher is responsible for teaching.

Ed Foster: Our institution, someone has to make decisions across the institution. There needs to be a basket of stakeholders involved, drawing in the computing angle too.

S9: The problem is an uninspiring list. The danger is to build a technical solution that makes it even more uninspiring. Challenge to find testimonials, user stories – this teacher had this problem, solved it with that tool, had this outcome. Bring it to life.

To list success of recommendations.

S10: If it’s recommended, did it solve your problem – so users can see whether it helped.

Use this data for future recommendations.

Pitch D

The teacher has already been mentioned. For us, it was a big thing, reflecting on what we heard today. It was about data, models, projects, but not about the connection with the education – it was there sometimes, but fragile. When starting from the supply side, the IT side of LA, you always have to solve something. How can you get a team of teachers to make the change? It’s also possible to start at the other side, the demand side. Our suggestion is to put in the foreground the teacher and the things he thinks are important. If you have a professional dialogue about that, maybe in the background an LA question crops up.

S10: Sometimes people don’t know what’s good for them; they can’t see into the future. We didn’t know we needed PowerPoint, now we know better [laughter] Have to educate them about what they really need.

S11: The data is in the background. The teacher’s needs are in the front, analytics in the back. There’s no technology there, the teacher is in front.

S12: Counter-argument, it’s a very economic point of view, to sell something. I support this idea to say we need to say what are the teacher’s needs. Glad you have a woman teacher, we have 90% of women in the profession.

The association of board members of primary education have on their website a day of men.

Alan Berg: I agree. But the conversation is that the tiger is coming in, how do you control the flow of water so it goes where you want? I’m not sure this is the tactic to control the market trend?

You have a choice, to anticipate market trends, or you can choose not to move immediately but to take space to have those professional dialogues. Something will come up, you will have the tool to facilitate those questions.

AB: Generational difference. Popper’s idea: old scientists die, and the new generation brings in new ideas. Probably going to see that with LA. Same as students with their telephones bringing in apps.

Pitch E


Our proposition is simple: buy this policy maker HAL 3000. No! It relates to fun!

We think we have to have a human side to LA. Lots of discussion is more focused on performance metrics, how this affects teachers, learners, policymakers. We need to remember some type of delight, motivation about education and LA systems. To empower people. It should work on empowerment. The human side – rich data is important. It’s not just the activity data.

S12: Is it feasible to make it fun?

Delight is feasible. Fun is a harder question. If you want teachers to learn to use LA, has to be delightful.

Marieke: Positive experiences, also for the teacher before they move to use more LA.

Riina: What would be the incentive?

Time management.

S13: Can we pitch LA as doing the boring administration stuff that you knew you were going to have to do when you signed up as a teacher, when you thought the rest was exciting enough to sign up for? It’s back to the 80s vision that the computer does the boring work, freeing us to do the exciting stuff. I think it’s still convincing.

S14: We go to schools, and one of the biggest incentives we could offer – providing the coolest, newest, most sophisticated rocket-science LA – was: make this system allow me to print all the record cards for all my students with one click of a button, and I’ll use your system. That was the incentive! LA per se might have big, big positive aspects for education. There are simple and trivial side aspects. We must be creative enough to find those and sell them to the teachers. The simple things are the better selling points.

Bart: What about policy? How do we go back to Brussels, how can we make LA fun, and embedding it in policy?

Doug: We must mandate fun!

Empower teachers to do something simple, give them a view to do more complex things.

Bart: Does it need policy then?

If everyone in Denmark is put in to SAP … it’s beyond policy for people to want to use it. Policy is one thing, everyday use is another.

Mark: Reconceptualising policy – it’s what you create.

Riina: One tool that policymakers love is the PISA study. Can LA beat the PISA study?

Alan Berg: You can add extra information to the PISA study, LA can do a lot of it.

Hendrik: PISA is a bad example. It’s a policy that came over us, push for kids to get good results for honour of the country. LA would make that really bad. We should get away from summative assessment, to formative assessment. Get away from ranking the kids, ranking the teachers. [laughter]

Peeple’s just been re-launched hasn’t it?

Pitch F: Mark Brown

The uncomfortable truth: there is myth and propaganda about LA. We want to present the person who matters most. It’s the teacher: the engine of innovation in education. They subvert the system to ensure it works! The glue that holds it together. We’re swamped with tools, with the supply side. Look at the demand side. We need to be respectful of teachers. What teacher hasn’t had data to inform their decision-making? Our pitch is a toolbox for teachers, and accelerators throughout Europe, for only ten million.

Alan: We can do it for nine. I come in for building communities. I did this for Apereo. Instead of thinking of teachers alone, think of all communities together. Stop thinking about subpopulations, but whole population.

Once we’re successful with toolbox for teachers and accelerators, then do one for educational leaders. It’s an enabling discourse, LA is not done to you, but you can shape and inform.

Alan: It’s a multidisciplinary team. Let’s build those. Teaching community, developers, under a wider umbrella. At the atomic level, don’t get the advantage of the sum of the parts.

Yep. You don’t always know what you don’t know. Accept the need to take an ecological view. Our toolbox, one would be identifying the stakeholders.

S15: Every group is talking about the teacher. We don’t want to put this on top of the teacher. We’re leaving out the student. The teacher puts this on top of the student! I have to start from teacher focus, to flipped classroom. In universities, teacher has higher responsibility for younger students, but for university, student takes responsibility for it. I cannot choose for myself, I want the system to buddy me.

S16: Can we talk about not a toolbox, but a sandbox. Teachers, students, whoever come together as part of the process, try to play with LA. A new way of education.

S17: When we were developing the JISC Code of Practice, the National Union of Students were an active stakeholder. They produced briefing paper for student unions, saying when your university introduces LA, here’s what’s good, here’s what’s not.

Toolbox is different from a play box. Maybe opportunities to play. How you engage in K12 is harder and more complex than in HE.

Pitch G

We do this every morning in the NL, all the children in a circle. How do we bridge the gap to institutional use? Some bottom up needed as well as top down. We need top leadership, management saying this is important, we want to use data to help students. At the bottom, you have enthusiastic people wanting to do experiments. A lot of work is happening at the bottom, but at the management level they don’t know it’s happening. Let’s start working together from bottom up to top down.

Daniel: This would be a question for policy. To make top and bottom work together. Initiatives from governments, or EU, from other things.

Listening today to all the stories, there’s a lot happening in the LA field in Europe, in different institutes. But there’s no kind of institutional approach. I haven’t heard one example of an institution saying data are important.

Bart: Open University UK does this!

Yes, but I haven’t heard this today. There are legal issues, ethical issues. But we need policy at the institutional level to stimulate people to work with this. The experiments are really good, but we have to get this to the top level. Otherwise we’ll still be debating the same topics, and in five years we’ll still be there.

Riina: This is the strategy you use at SURF. There is a policy in place in the NL that has given you a mandate.

Yes, true, but we are still looking for universities to put this in their vision for education. We are working on this; it’s interesting for them, [tentative], there are issues to address. We want data to become an important issue, in the way that assessment is an important issue.

S18: We had this big project; at the beginning, with Masters students, we did an in-house stakeholder analysis. How many departments have stakes? 22 of them. But 14 or 15 of them could kill the project at any moment. Without knowing! We have to operate in this environment. If I were to address something at the organisational level, this would be it.

Pitch H

LA is a trend whether we do something or not. As we use digital tools, the analytics parts are there. All of these are built in; as we get new versions, new products, they are there. They give nice pictures and graphs. But teachers and managers need tools and information to separate the ones that are evidence-based and support learning from those that just have pretty pictures. We think providing information about practices that work, and tools that help to improve teaching, would be helpful.

Riina: How do customers know yours is not a pretty picture?

My solution? I’m doing consultancy to implement digital solutions. I’m afraid of how educated the customers are: they don’t have the tools or knowledge to separate one system from another. That’s frightening.

Marieke: That’s close to the other discussions.

?Barbara: Who’s going to quality assure?

Mark: I had a vision of the London Underground map, trains on different paths. With that map, I have a problem, I’m colourblind. For me it’s not a single pathway, it’s more complex than that. Ecological metaphor, the layers we have to think about.

Adam: I talk about evidence. This is about providing evidence to help. Who’s going to gather the evidence? How are we going to get the contextual information? You can sort of move towards this with the case studies of this project, but the nuance needs to be there. So communities of practice [are important]. Allow self-organisation, the emergence of new practice through those communities, giving them space to change in ways that are not orchestrated or instrumental, but emerge through that community.

Bart: Final exercise: things you liked on one side, things you didn’t on the other.

[As ever, apologies to people whose name I haven’t managed to capture, or whom I have misquoted. These are very rough notes to capture some of the discussion, and may not be as accurate a transcript as they might appear.]

This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.
