LASI13 Thursday (2): Analytics for C21st skills

Second liveblogging post from Thursday 4 July 2013 at LASI13 (#lasi13), the Learning Analytics Summer Institute, at Stanford University, CA.


Analytics for 21st Century Skills

Caroline Haythornthwaite, Rebecca Ferguson, Ruth Deakin Crick, Peter Foltz

Caroline Haythornthwaite

Open discussion around analytics literacies for C21st learning. Book about 21st century skills. Skills we don’t know we need to talk about yet. What analytics literacies do people need? How can analytics tools/techs help people become literate? How can they help teach people to become literate?

Skills – some educators are fine with that term, some prefer competencies, or literacies. Literacy as reading and writing, but in future, moving into new kinds of skills, competencies and literacies. My bias is towards socio-technical ideas. Literacy – we’re reading the environment, the joint result of people, technology, context. Context is important.

New literacies for (e-)learning – multimodal (platforms, forms), multi-actor, sociotechnical, collaboration, emergence – things are changing, being comfortable with fluidity.

Peter Foltz

Need to assess higher-level thinking skills, and how they apply in the workplace. The transition from manufacturing to technology requires these kinds of skills. Emphasis in many places – Common Core, PISA, P21, ATC21s, DEAG (??). Wide range of competencies – general cognitive ability, but also problem-solving skills, leadership, interpersonal skills, communication – all over digital.

Lower-level skills like language and maths have a large interaction with these higher-level cognitive skills. Need to get people to generate these skills independently. Used in contexts related to the real world, jobs, working with others. Can’t be based on recognition, multiple choice – needs recall, at the right time. Need to communicate effectively, generate own ideas, options for actions, with team members. Multiple choice tests are not the appropriate way of getting at that.

The other way is having students write or speak to each other – the problem is in analysing that. The time-honoured technique is hand scoring of written and spoken tests. But it’s not scalable. Teachers can listen in, give feedback, but it’s very hard to do at a large scale, and to do it in ways that are consistent and reliable. Thesis is that technology needs to meet this challenge. Convert performance into measures of skills and abilities – reliable, valid, efficient, cost-effective. Based on that, help with personalisation, realtime feedback, decision-making.

Automated language analysis is their approach. Take language, distil it with mathematical/computational methods, to permit automatic assessment and feedback on written and spoken communication – fast, accurate, not just writing style/grammar checking, but getting at these complex cognitive processes.
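One family of techniques behind this kind of automated scoring (Foltz’s Intelligent Essay Assessor builds on Latent Semantic Analysis) compares a new response against hand-scored exemplars in a vector space. A minimal bag-of-words sketch in Python – a toy nearest-neighbour scorer, not Pearson’s actual system; the exemplar texts and scores are invented for illustration:

```python
from collections import Counter
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def score_essay(essay, scored_exemplars):
    """Score a new essay by similarity to hand-scored exemplars.

    scored_exemplars: list of (text, human_score) pairs.
    Returns the score of the most similar exemplar (nearest neighbour).
    """
    essay_vec = Counter(essay.lower().split())
    best = max(
        scored_exemplars,
        key=lambda ex: cosine_similarity(essay_vec, Counter(ex[0].lower().split())),
    )
    return best[1]

exemplars = [
    ("photosynthesis converts light energy into chemical energy in plants", 5),
    ("plants are green and grow in soil", 2),
]
predicted = score_essay("light energy is converted to chemical energy by plants",
                        exemplars)
# → 5: the essay is closest to the high-scoring exemplar
```

Real systems use far richer representations (LSA dimensions, syntactic and discourse features), but the core move – mapping free text into a space where similarity to scored examples is measurable – is the same.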

Some examples: writing, speaking, team interactions, and PISA 2015.

Writing example. Technologies that can do summative K-12 writing, formative writing with feedback. Make 5 or 6 revisions in an hour-long session. The teacher can read at any time, but students get feedback instantly. Also situation judgement tasks – here’s a situation, what would you do – they write, we judge, prompt with feedback. Also in tasks and simulations e.g. write an email to your boss summarising, or write a patient note explaining.

Speaking – English/language proficiency, fluency. Create a sentence with these three words, judge ability. Describe what you see in this video. Or listen to this lecture and speak a summary – understand, synthesise, produce understanding.

Team communication. Take comm streams from teams (spoken, written), predict measures like leadership, group cohesion, errors – warn instructors/commanders of problems. Live and in simulation. Also automated monitors for students and teacher in learning chat rooms. Monitoring what students say to each other, with the computer giving automatic feedback. Detect what they’re talking about; if it’s not going in the direction of the goal, prompt with e.g. a document to bring them back on track.

Comment on PISA 2015 – Programme for International Student Assessment. Leading a group developing the framework for collaborative problem-solving. From an OECD perspective, how do we incorporate new kinds of thinking to drive countries to think about different skills? The emphasis that OECD wants is how to get countries to emphasise how students can work together – a critical skill. Big challenge is how to assess this. From an analytics POV, there’ll be a lot of data out of that which’ll be very rich.

Data analytics and the focus on C21st skills are fostering a new breed of test designers, sim designers, cog scis, psychometricians. LA is central.

Questions

Stephen: Of the list, where are the biggest barriers to innovation and progress?

Peter: May be my bias. I see the psychometric community needing to change. We don’t have a lot of them here. Not a major barrier, but there have been very established techniques for how we do summative assessments. Have to bring them in to this community, see how they model progress. Move from standard IRT? to something better. Business models – I am seeing progression there. Education schools, some are making that change, others are not. Cog sci is moving in the right direction, can bring a lot, but some of it will have to be grounded in how to apply it.

Grace: We tried your intelligent essay assessor – examples shown were K12, have you been able to advance that beyond word count and limitations?

Peter: Much broader. Doing scoring now for HE. Embedded in Pearson products like My Writing Lab, My Psych Lab used in Higher Ed. Different models for scoring for HE.

Simon Knight: Any thoughts on whether ‘21st Century Skills’ is the appropriate label? There are other models – e.g. Denmark allowing internet in final exams. Not only because search engines are a C21st tool, but because they have a different model of what they’re assessing.

Peter: Always hesitant about using that term. Not well defined. We’re already 13y into the C21st. What’s the next label for these skills? Scandinavian countries are doing things differently. Labels are always used, we need them; this one puts some emphasis on getting beyond basic skills. Need a label for them. Still somewhat soft.

Caroline: Some people object to ‘literacies’ plural, that there’s only a single sense of ‘literacy’. Can argue about that at length.

John Behrens: These are general linguistic/NLP technologies, applied in lots of different contexts. Earlier ones were highly fit to specific problems, but now they’re more robust.

Peter: Opened up to a wide range of different areas.

21C Skills: Enquiry Blogger – Ruth Deakin Crick & Rebecca Ferguson

Ruth starts. Project with Simon Buckingham Shum. Ruth outlines the background, Rebecca will do a demo.

Everybody has a list of 21st century skills; they’re out there everywhere. Worked extensively with the European Commission on flourishing in the C21st (Deakin Crick et al eds 2013 Learning to Learn, Routledge). Radically changing, uncertain world. The best discussion about what these skills/competencies are comes from an OECD project – there’s a consensus that competence is more than one thing, not just a skill or knowledge, but involves real world success.

Model of competence (Deakin Crick 2012 Student Engagement in Christenson et al eds). Competence of learning to learn is key. Concentric circles model: competence in the world outside, then knowledge skills and understanding, learning power, and identity at the centre. If you want to have your hip replaced, you want a competent surgeon, not just a knowledgeable one. With ego under control, they can listen to you, understand your situation. They have learning power – know what to do when they don’t know what to do. Of course you want them to be knowledgeable, and to know where to get more. These accumulate like rings of bark on a tree into competence in the real world. Different assessment at different levels. Different data at each of these levels. Have to be literate in different ways of analysing different data. Used this in Enquiry Blogger.

Built on learning dispositions work, over 13y. (Deakin Crick 2007 Learning to Learn, Curriculum J). Reliable, valid, important dimensions: Changing and learning, critical curiosity, meaning making, creativity, resilience (productive persistence), strategic awareness (metacognition plus managing feelings), learning relationships (with and from other people – not isolated or dependent).

Uses technology to feed back the profile. The assessment event is a trigger for a coaching conversation – is this a valid assessment, why am I like this, what do I want to change. Scaffolding knowledge construction profile forwards towards a negotiated purpose in a learning journey. Many qualities map on to C21st skills/competencies. They’re holistic.

Zygmunt Bauman – really big challenge is how we design and theorise learning journey when we don’t know the outcome at the start. Scaffold the journey of authentic enquiry where learner chooses start point and produces their own end.

Learning design – Purpose = learning power + knowledge structuring processes + learning relationships [who what where when]

Rebecca takes over for a demo.

This is an example of points from social analytics workshop yesterday. Three main ways to come at it. Through the data – I have a dataset, let me play with it. Or, I’m in an institution and need to show people are learning, need proxies and analytics for those. Or, from educational background, this is what we know about how people learn, how can we build analytics around that. Not easy – where and how do we get data? This project is ongoing since 2010. Not saying this is right, it’s work in progress.

EnquiryBlogger.

Example of a post developed by a school student. Have trialled it with 13-14 year olds, postgraduates, and primary school 9-11 year olds (N=200). Built on a standard multi-user WordPress basis. Whole load of blogs connected together – dashboard for the educator. On the left, standard WordPress – write posts, reflect on what you’re learning, add comments, share with other people – helps you with learning relationships. Hope people will read and comment back. Engage in a conversation. Unusual form of writing to do in a classroom, especially for younger children who are used to writing for an audience of the teacher. On the right, the bits we’ve built – three widgets: Enquiry Spiral, ELLI Spider, Mood View.

ELLI Spider – has seven legs, each aligned with the learning power elements (changing and learning, learning relationships, etc). As you post, you can tag your blog post with categories related to these elements. Asking the learner to reflect on their learning dispositions, what they’re doing at the moment. On the side of the blog, there’s a representation – the tags on posts affect the spider. Starts red, then yellow, then green, then big green, as you post more tagged with those elements. May struggle with meaning if you’re e.g. 10 or 11. So preparation with younger children – what are learning dispositions. Developed into a story, narrative, some representation of what these learning power dimensions mean. In the version used in the primary school trial, they assign an animal to each dimension – so critical curiosity is the kitten. Can spell the animal if not the dimension name. The educator gets a dashboard with multiple spiders showing what the students think they’re doing. Can focus on group or individual contributions.
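Mechanically, the spider is driven by simple tag counts per dimension. A minimal sketch in Python – the red/yellow/green/big-green thresholds here are invented for illustration, not the actual EnquiryBlogger cut-offs:

```python
from collections import Counter

# The seven ELLI learning power dimensions, used as blog-post categories.
DIMENSIONS = [
    "changing and learning", "critical curiosity", "meaning making",
    "creativity", "resilience", "strategic awareness", "learning relationships",
]

def spider_level(post_count):
    """Map a tag count to a display level (hypothetical thresholds)."""
    if post_count == 0:
        return "red"
    if post_count == 1:
        return "yellow"
    if post_count <= 3:
        return "green"
    return "big green"

def spider_state(posts):
    """posts: one tag-list per blog post. Returns a display level per dimension."""
    counts = Counter(tag for tags in posts for tag in tags if tag in DIMENSIONS)
    return {dim: spider_level(counts[dim]) for dim in DIMENSIONS}

# One learner's posts, each tagged with the dimensions they reflected on.
posts = [
    ["critical curiosity", "creativity"],
    ["critical curiosity"],
    ["resilience", "critical curiosity", "creativity"],
]
state = spider_state(posts)
# critical curiosity → "green", resilience → "yellow", untagged dimensions → "red"
```

The educator dashboard is then just this computation run over every learner’s blog, with the per-dimension levels rendered as spider legs.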

From yesterday’s workshop – don’t need a complicated algorithm always, can just ask the learners: you tell us what you think you’re doing, that’s the data we’ll work with. Not what the computer thinks you’re learning.

Called EnquiryBlogger because it works round a framework for enquiry – choosing, observing, questioning, narrating, mapping, connecting, formalising, validating, applying. Not necessarily linear, more like a spiral. What are the stages of knowledge building. As you write your blog post, you can pull down a list of tags to say where you are in your enquiry. Then if you go to the widget and click on a blob, it takes you to the posts in that category. Means you can have a discussion about that particular aspect. Also gives you a representation to help you reflect on where you are in the process.

Final element: Mood View. Developed from a Tony Hirst idea, originally built for Facebook. At any point, you can go to your mood graph and click on it – very happy emoticon to very unhappy. Click it, it adds a point to your mood graph, and prompts you for a short blog post about why you’re feeling that. Bringing your attention to the affective element of your learning. Not always great, goes up and down. Can see change, movement, and see everyone else. Can hover over points and it gives you the blog post, then can click through. The educator can see multiple mood graphs overlaid. A reflective element for learners. Bringing attention to learning skills, challenging them to do some of these competencies, to write about their learning. Helps the educator, or someone else, to support them in the process.
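In data terms, the mood graph is a per-learner time series of (mood, note) points. A minimal sketch – the field names and the mapping of the five emoticons to a 1–5 scale are assumptions for illustration, not the actual EnquiryBlogger implementation:

```python
from datetime import datetime, timezone

# Five-point scale matching the five emoticons (very unhappy .. very happy).
MOODS = {1: "very unhappy", 2: "unhappy", 3: "neutral", 4: "happy", 5: "very happy"}

class MoodGraph:
    """One learner's mood graph: each emoticon click stores a point plus a note."""

    def __init__(self):
        self.points = []

    def record(self, mood, note=""):
        if mood not in MOODS:
            raise ValueError("mood must be 1-5")
        # Keeping a short note with each point preserves the context/meaning
        # that the emoticon alone would lose.
        self.points.append({
            "when": datetime.now(timezone.utc),
            "mood": mood,
            "note": note,
        })

    def latest(self):
        return self.points[-1] if self.points else None

graph = MoodGraph()
graph.record(5, "Our enquiry group finally got the survey working!")
graph.record(2, "Stuck on analysing the results.")
```

Hovering over a point to show its blog post, and overlaying multiple learners’ graphs for the educator, are then straightforward views over these stored points.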


Questions

Adam: You had systems thinking. Quite an interesting aspect of C21st skills.

Ruth: I happen to think that the shift into 21st century living requires us to embrace complexity and systems thinking. Not to abandon traditional ways of knowing, the more reductionist approach, but to incorporate that with a systems approach. You rightly perceived a systems thinking philosophy in the development of EnquiryBlogger. Look at the discrete parts, but also the whole process. Drawn from engineers who are into systems thinking. All tech systems are embedded in human systems.

Stephen Coller, Gates: God bless England, and a place called Pucklechurch.

Ruth: It’s very old.

Stephen: For Peter. I get nervous about psychometricians. Interested in ways to scale, like taking the definition of pornography to the Supreme Court. How to get scale so others weigh in. Rebecca, I was curious. The mood is an interesting tool. Asking a lot of someone to reflect on that. Have you talked to Facebook about their experience in trying to extend their ideas about Like?

Rebecca: No. The five categories weren’t based on any sound principle – we just had five emoticons; we’re seeing what works. We find that if people are experienced, they do engage with mood quite a lot. The primary school children, who’ve never seen a blog, struggle, and tend not to use it. Also, over short periods of time. But over longer time, more interest in nuance. We could be talking more widely about that.

Neil: Emotion aspect. Children are self-reporting their moods. Ryan and I are inferring emotional context. What insights do you have on what you can get teachers to learn about this, communicate with children? Anecdotes? As a field we want to avoid being creepy, but want to be effective in helping.

Rebecca: We never require students to use this. Some never touch the mood graph. So we don’t see it as a creepy thing – you must tell us how you are feeling. Talking to teachers about it, it’s very much time sensitive for children. If you say at the end of the day, anyone want to put something on the mood graph, they could’ve had a brilliant morning but that’s faded by home time. Next morning you’ve forgotten why you said that. The times when they encourage children to do this are when it’s related to their enquiry and so has meaning to it. Important to encourage a comment, not just the emoticon alone, or you lose the context/meaning. It’s reflecting on why you feel like that in the context.

Caroline: Not interpreting it separate from the context.

Ruth: Also done work with Chris on the internal structure of learning power. The resilience dimension operates independently of the five active ones and relationships. Hypothesis: resilience (vs fragility and dependence) is an emotional response to being open or closed to learning. Have datasets from a prison – teachers in prison were able to ID young people imprisoned for violent offences: 100% resilient but lacking the other dimensions. Emotions are hugely important in learning. How we scaffold the process, internal reflection, is critical to being open to new opportunities, especially when challenging. Could be a neat joint project there.

Ilya: Seems really neat. Very free form. By design? Creating these mood manipulables. Question is, I can imagine a bunch of uses. Can you say if you have one in mind as a hypothesis? Introducing these tools could give insight, see if they use them. Do you have in mind something expressed as cause and effect – introducing this will cause … something?

Ruth: It’s a hypothesis, not tested it. We think that it causes teachers to think differently about their pedagogy. Different relationship with their students. Has an impact on their pedagogy. And also on the student identity and self-efficacy around purpose and learning. Those’d be the hypotheses.

Rebecca: Also down to, you measure the things you care about. Usually in schools, very much pushed to measuring what the government cares about, so you can tell the government your school is good. This is saying, what do we really care about? Where should we be focusing our attention? Gives teachers tool to think about. Could use one out of the set. We think it focuses attention, brings them back to these points. It’s not just something you go through once at the beginning, and at the end, but every time.

Al Essa: We measure the things we care about. Wacky question, blame it on last night’s Scotch, urged by Phil Long. Why is learning to dance not a 21st century skill? So, a lot of creative people are in the arts – dancers, painters, singers. The problem with our schooling is we’re not providing a natural outlet for these highly creative people; we’re jamming in a type of model that doesn’t work for these students. Learning is about creativity, imagination, passion, aesthetics. Are we saying LA doesn’t apply to these people? Why is learning to dance not a 21st century skill?

Peter: We’ve been asked about the assessment of creativity. See it listed as C21st skill. From a measurement perspective, a lot of disagreement about what it means. Also steep cliff – more and more creative then off topic and you’re wrong. (!) Don’t have agreed ways of doing it. That’s why you don’t see it defined explicitly.

Ruth: We do – there is one learning power dimension about creativity, you’re right. In our large datasets, the scores get significantly lower as children go through school, and creativity is the worst. 18-21 year olds in higher ed are most at risk here. This challenge is about how to rehabilitate playfulness, intuition, and assess it differently. Go back to governments with data. Which they’ll ignore, like they do. Don’t know about dancing.

Alyssa: There are interesting things going on. We may be confounding what makes a difference. Learning power and what those reflect; step away from that. A different approach is to have the students say where they’re at. That could have very powerful effects in other contexts too. Interesting opportunity in combining where do you think you’re at with where does the system say you’re at – reconciling the two. We’ve done some work; sometimes students don’t want to talk to the data, they want to say what they’re thinking. A separate interesting piece.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.


Author: dougclow

Academic in the Institute of Educational Technology, the Open University, UK. Interested in technology-enhanced learning and learning analytics.