Liveblogs from Wednesday 2 July 2014 at LASI2014: Panels on strategy and policy, and on universal design.
George gives a quick overview of SoLAR. Individual membership too. Appreciative of sponsorship support. Universities committed resources to develop the field initially. If you want your institution to join, let us know – speak with Grace. Commitment to developing doctoral students and the field.
Secondly, call for papers for LAK15 is out. http://lak15.solaresearch.org/ Look at the timelines and the agendas, October 15th deadline.
Panel: Learning Analytics Strategy & Policy
Shane Dawson (University of South Australia), Grace Lynch (SoLAR), Harrison Keller (University of Texas at Austin), Christopher Brooks (University of Michigan), and Doug Clow (Open University)
[Light liveblogging since I’m on this panel. Also I can’t see the slides because they’re behind me.]
Shane introduces. This is like an Olympic event, and this is the 4 x 100m. Gap between research and practice. What’s the role of strategy and policy at government, federal, state, institutional and individual level?
Grace Lynch up first. SoLAR is not-for-profit. Federal Government in Australia asked for a discussion paper for policymakers and senior university staff. Paper is here [PDF]. Looking at systems-level deployment. Many initiatives at a small, unit level. Wanted to look at policy, universities. You should know what you want to know. And if you’re not prepared to take action on it, don’t ask the question. Working collaboratively. I can make numbers say anything you want. Have to be careful around that interpretation. Making informed decisions around that data. Considering a leadership summit, a recommendation made to the Federal government. Access to data. LA sophistication model, from pockets to excellence to transformative practice.
Harrison Keller. VP at UT Austin. Director of Center for T&L. 52,000 residential students in the UT system. 3000 faculty, 14000 staff. Budget $2.5bn (?!). Ambitious new infrastructure to support teaching and learning. Repositioning the Center of Teaching and Learning. Substantial resources to engage and inspire faculty. $2m this year to various initiatives with faculty in 5 areas: social learning, experiential learning, online, blended, and learning analytics. Course coordinators funded in key departments, working up digital content. Large integrated Learning Sciences division, 30-40 staff. Integrated learning analytics stack. External-facing groups too, as a business development group. Integrate different components. New governance structure for technology-enhanced education. Aim to serve more students. Working on throughput, new delivery models, and strategic partnerships with high schools and community colleges. Tremendously exciting. Students are used to recommendations, but want to raise a flag about it. Optimism about online learning, but a drumbeat of stories about data breaches. Trend towards focus on individual liberty. Wouldn’t take much to have much more restrictive policies about individual data. We need to be more careful about this. Policy and strategy challenges. Expensive to build infrastructures, but analogy with healthcare about investing in primary care: needing fewer emergency room visits. Concerns about this idea. It’s surely the case that if we can address learning needs earlier, we can improve student success. Won’t necessarily save us money, might even increase budget pressures. Improving throughput and time to degree puts pressure on the upper division. The savings may not accrue to the departments. Costs of failure go back on the students. This is a good problem to have, but the question of who pays looms large. Big question about how this develops in institutions with much faculty autonomy. Competing interpretations of FERPA, rights to access students’ data.
Raises issue of how much data is going to be discoverable, subject to open records requests, Freedom of Information Act. Our faculty are nervous about the data available already – how much more so with more detailed information about instructor and student performance? Thanks to organisers.
Doug Clow finally. [that’s me]
Senior buy in necessary but not sufficient. But get going with it anyway.
Also idea that the expertise doesn’t have to be all in a single person, can be a team of people with a diversity of skills.
Panel: Universal design for learning and learning analytics
Gabrielle Rappolt-Schlichtmann (CAST), Mike Connell (CAST), and Garron Hillaire (CAST), Skip Stahl (CAST)
Garron introduces the panel.
Gabrielle. Not-for-profit, using framework of Universal Design for Learning. (Rose & Meyer 2002, Rose, Meyer & Hitchcock 2005)
About 30y ago when CAST started, it focused on learners with disabilities and how they could get access. The focus was on fixing the learner, until a major insight: that focus was obscuring the role the environment was playing in disabling learners. So shifted focus to fixing the curriculum – how it was disabling the learners, adaptation focused on the curriculum, not the student. The tech world had exploded, so we could create flexibility through use of tech and design. Hence the idea of Universal Design for Learning. The inflexible one-size-fits-all curriculum is the barrier.
Universal – core assumption that diversity is the norm. If you design just for the middle, you’re creating a situation where not all learners have full and equal opportunities to learn. Accommodate the wide variety of learners that exist. Design for learning, offering practical steps. Look to the learning sciences, continuous improvement to improve the guidelines. Classic example is the kerb cut – a kerb that was too high didn’t work, so change to a kerb cut, universal for everyone: not just people in wheelchairs but people with baby carriages, skateboards. That gives a sense of what we mean.
Someone from Google, focus on accessibility in R&D. Rubik’s cube expert. Was world record holder at one point. Video of him solving a cube in 24s. He’s blind. The cube requires colour – but if you apply Braille, you don’t need vision to solve the puzzle. He doesn’t just have access to it, he can be among the world elite at solving the Rubik’s cube.
Variability more broadly, not just disability. Think around the edges of the population – knowledge, skills and motivation. We create designs that work for everyone if we focus on the margins: overall better for everyone. In the 1950s, the air force made strides in their tech to increase pilot situational awareness without overload. But there was trouble: the cockpits were designed for the average-size person. How many pilots actually fit? None of them actually fit the average designation. So shifted to accommodate a wider range of pilots. Wanted the most talented pilots. Solution was adaptable seats: the seat can be repositioned. In fighter planes, accommodate from the 1st to the 99th percentile of size, and that increased the talent pool dramatically.
Supports: features we can apply to accommodate a wide range. So e.g. Braille, or the physical piece to adjust the seat in the cockpit. Second piece is not just access to the content, but access to the learning. That’s the main focus. Core idea there is scaffolding. NFL games – the yellow line. Looks simple on the TV, but the tech is very complex. A moveable, digital rendering of the line a team has to reach to move on in the game. Aim to capture more novice viewers to become fans. Complex and expensive to create. Sensors on the game cameras capture pan/tilt/zoom; 3D models; colour palettes (team jerseys, field etc). Why go to huge expense for a tiny thing? To increase the audience. Also supports working memory coming back from a break. Gaining traction for a novice audience. For regular, more expert viewers, managing the game between breaks is helpful. A cognitive scaffold: makes the game-viewing challenge just right. Appears to be an integral part. The line was so successful, referees use it too – it’s more accurate than line judges moving down the field, so it’s used in replays to judge 1st down movements.
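The colour-palette piece above is what makes the occlusion work: the line is composited only over field-coloured pixels, so players and referees appear in front of it. A minimal sketch of that idea, assuming frames arrive as RGB numpy arrays (the function name, reference field colour, and tolerance are illustrative, not the broadcast system’s actual values):

```python
import numpy as np

def overlay_first_down_line(frame, line_x, field_rgb=(60, 140, 60), tol=60):
    """Composite a vertical yellow line at column line_x, but only over
    pixels whose colour is close to the field palette -- so players and
    referees occlude the line, as in the broadcast effect.
    frame: H x W x 3 uint8 array. Returns a new array."""
    out = frame.copy()
    column = out[:, line_x, :].astype(int)
    # Manhattan distance from each pixel in the column to the field colour.
    dist = np.abs(column - np.array(field_rgb)).sum(axis=1)
    field_mask = dist < tol
    # Paint only the field-coloured pixels; anything else stays on top.
    out[field_mask, line_x, :] = (255, 255, 0)
    return out
```

The real system also has to place `line_x` correctly from the camera pan/tilt/zoom sensors and a 3D field model; this sketch shows only the colour-keying step.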
The guidelines are meant to be a transactional framework. Three principles – on knowledge, skills, motivation. Goal to create resourceful, strategic and motivated learners. Three main tiers. First is about accessibility, the traditional pieces. Then around emotion too. The last two tiers deal with the access-to-learning piece, including e.g. highlighting critical features and mastery-oriented feedback.
What does this look like in a curriculum? Just-right challenge. Connecting text to images. With more text required in inquiry science learning, publishers add images, which makes it more complicated – that’s a high-level reading comprehension skill. Created a universally-designed system to connect text to the image. Click on the text, and the image changes to highlight critical features. Can investigate – changing light vs iris. Can move things, investigate. Then express what they know at the bottom.
Big project, $10m/5y center: National Center on the Use of Emerging Technologies. 91% of 8th grade students with disabilities perform at or below grade level, 60% below basic. That’s the ones who could be assessed without accommodations. Not budged over a decade. Many have focused on incremental pieces of reading, but when you explore the issue, it’s negative emotions that are the barrier. Getting remedial reading education, but not motivated to do the ubiquitous practice needed for the remediation to have an impact. Proficient readers at middle school read 45 minutes a week at school. To gain that skill, it’s hours and hours a day. How do you motivate kids to engage with text? Theory on motivation – emotional strain, negative emotions about reading, and low motivation for reading. Give them access to high-interest, age-appropriate texts. Authentic, valuable and relevant experiences, productive challenges, to drive ubiquitous independent practice. So aim to render the web as a place for anyone to read. Learning analytics is a big part of the picture. 3y in, launched our first version. Agile approach married to design-based research. Create new versions at 2w intervals, test with stakeholders, larger tests in a 9-month cycle. LA has a strong role in design. Independent reading is impenetrable, invisible to the teacher. For me to track how much I’m reading is hard.
It’s called Udio. Video demo. Four parts. Dashboard, explore – find high quality, high-interest, popular, academic, read, discuss. Create and share projects. [Interface looks very text-heavy, interestingly.]
Get info to each student
Garron: Making Data Meaningful
Followup to yesterday’s workshop. This fed in to the design of Udio. IQWST system, translated into an online environment through an authoring tool, the UDL Curriculum Toolkit.
The product has lessons; the core question was supporting understanding of key concepts, from the Common Core. When they built the supports, they have hints and model answers. A UDL principle is multiple means of action and expression: open-ended questions, can write, draw or record. Also ask how hard the question was – very easy, easy, hard, very hard. Richer data. Evaluation questions to check comprehension at the end of the chapter. A more high-stakes ‘how are you performing’ test.
Supporting key concepts, designing a dashboard, aligned with the instructional theory of change. Theory of change – give frequent, immediate feedback.
Dashboard built from Excel; the prototype was used to illustrate it to students and get them to reflect. Also, in an authoring tool, you can only use its affordances. We used a two-question in-line check with the test, so could quickly check comprehension. A student who doesn’t go through the 3 options may not benefit from support, so lower the barrier since this is low stakes. N=156 in 6th grade, urban Detroit. Includes students with disabilities.
Can middle-school students, including those with disabilities, understand a dashboard with complex info? They could do well on factual questions, 92%-99% correct. Inferences, 77%-86% appropriate. More could infer from the data correctly than could point to the data driving that inference.
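The dashboard described above rolls up per-question data (correctness, supports viewed, self-rated difficulty) into per-student summaries. A minimal sketch of that kind of aggregation, with a hypothetical log format – the event tuple shape and field names are illustrative, not Udio’s actual schema:

```python
from collections import defaultdict

def dashboard_rows(events):
    """Aggregate hypothetical response logs into per-student dashboard rows.
    Each event: (student, question, supports_viewed, correct, self_rating),
    where self_rating is one of 'very easy', 'easy', 'hard', 'very hard'."""
    stats = defaultdict(lambda: {"answered": 0, "correct": 0,
                                 "supports": 0, "hard_ratings": 0})
    for student, _question, supports, correct, rating in events:
        row = stats[student]
        row["answered"] += 1
        row["correct"] += int(correct)
        row["supports"] += supports
        row["hard_ratings"] += int(rating in ("hard", "very hard"))
    # Add a percent-correct figure for the factual-question display.
    return {s: {**r, "pct_correct": round(100 * r["correct"] / r["answered"])}
            for s, r in stats.items()}
```

A real system would also surface this back to students for reflection, which is where the comprehension and inference questions above come in.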
Does dashboard influence behaviour? [A bit.]
Examples of feedback – unintended behaviours. “my advice to me would be to keep doing what im doing” If they’re getting 100%, I’d be excited, but if 0, not so much. Behaviours aligned with the theory. Some unexpected interpretations – e.g. wanting to get 100% of the supports.
Skip – Acknowledging Each Other
Social perspective taking and students with disabilities
Focus on elementary and secondary learners. What are liabilities of those environments, positive affordances.
Frame it in the context of challenges that researchers have had in the center. Orientations about what constitutes research and interventions. US Dept of Education driven by IES, randomised controlled trials. Sampling technique: pull out a subset, do something to them, compare to another subset where that wasn’t applied, compare outcomes against an achievement standard. A lot of us in the center are interested in large datasets, digital exhaust, dwell time on activities, how that’s correlated to achievement. Some interesting findings based on large-scale correlations. Did a data analysis on 7500 students in Detroit, K-12. Initial Q: what factors are associated with greater-than-expected achievement? Data chunks looking at various subjects, against standards. Separated out students with disabilities vs those not, by formal identification. Didn’t see any differentiation between students with/without special needs. First year, year and three quarters. The statistician said, we have stats about making 2y growth for less. Got a split: students with disabilities were making 2y growth in math compared to counterparts. Something was happening. Allowed us to tunnel in further, design an RCT to ferret out the reasons for this differentiation. It’s been a challenge to convince the dept that there’s value in the analysis of large datasets; the gold standard of RCT is a challenge. We believe this is a critical part of this: we have access we didn’t have before. Second challenge is about what is an intervention. Classic: work with a subset, do something to them that changes behaviours, outcomes and so on. Working online, in an immersive virtual environment, we’re not doing something to them; they’re in an environment where things happen. A different orientation to an intervention. Argument that this constitutes an intervention. Challenges come from being from one background and orienting to another.
One environmental intervention: a simulation. Social perspective taking: putting yourself in someone else’s shoes, in a stressful situation, coming out with a perspective different from your own. Students who can take the role of another come out with higher levels of empathy, collaboration, and so on. Also know particular challenges for students with disabilities, e.g. students on the autism spectrum don’t want to engage, students with challenging behaviours. So an immersive online environment: a high school environment with a bullying scenario. ‘School Life’ from Giant Otter Technologies.
One person is being bullied. You can be the person being bullied, or best friends with the bully. Faced with decision points, the sim changes depending on your decisions. Can play more than one role. Play in pairs. Everything gets tracked. Using crowd-sourcing to make a richer array of responses in the social situations. Work with adolescents, try to incorporate their responses via crowd-sourcing, to make it more open-ended. Good example of a full-immersion virtual environment. Microcosm of e.g. Minecraft, Grand Theft Auto: players really engaged in something, and their actions have a significant impact in the world. Three primary areas: affordances of the environment itself.
Finding with parents: students are placed in online settings often because they’re leaving an environment that was toxic – bullying, lack of understanding. Predominantly students with emotional disturbances, autism spectrum, behavioural issues; much less physical disabilities. Is removing them the right answer, or simply the least effortful gesture? The context: how these students are taught. Parents are identified as learning coaches, spending on average 3.5-4h a day with the student at home at 4th grade. Interesting finding. Offloading of pedagogy to the parent, providing resources and a bit of support, but the demand is very high. Another challenge: very few organisations, agencies, are keeping any data. And for the most part, it’s when they enrol; they have no idea about when they leave. Enrolment is associated with funding, so it’s tracked. But when students fail to show up, it’s not. Enormous potential.
Panel has tech (Garron), research (Gabrielle), policy (Skip)
Chris: Built tools. Totally UCD, loved universal design, but also totally paralysed. Could always find someone we weren’t supporting right. Trying to use Agile methods didn’t work well with universal design. Built something; people said, oh no, we’ve forgotten someone. How do we address those issues with analytics solutions?
Skip: For us, the lowest common denominator is the Section 508 guidelines, for digital hardware and software accessibility. Workarounds for accessing content. Good place to start because it’s what commercial vendors into US post-secondary and K12 are looking at. Aligned to WCAG 2A refresh. Key for us – I can hear your pain. It’s a challenge to create an environment usable by everyone, but not impossible. The guidelines say: if you can’t meet it, say why not, and why it’s OK. Our vision is to applaud approximations. Less worried about your situation, since you’re aware of a challenge there. More worried about people who aren’t aware. Web Content Accessibility Guidelines or Section 508 as starting points.
Gabrielle: Don’t have to cover all of them all the time. Accessibility first, take the goal, theory of change, pick things you must do to make your system effective. Continuous improvement process. Launching this version, it’s a version, use data to improve the environment. Same effort applied to improving the environment. Advocate inclusive research activities.
Garron: Implementation team – I feel that pressure. Doing a lot around a little. Focus effort at a high level, and build around that comprehensively.
Skip: Often prioritise access issues for e.g. people who are blind or have low vision. Discovered that accommodations needed to navigate information go a long way to helping others too. They have to be there because there’s no other alternative.
Martyn Cooper: Accessible design. Moving to an Agile way of working. Wrestled with how to do that best. Find it easier to have users embedded in the project, so their feedback is within the Agile context, not just waiting for an end product to be evaluated.
This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.