Closing panel on Monday afternoon at LAK12.
Linda Baer, Donald Norris, Alfred Essa: Building Organizational Capacity in Analytics
Simon Buckingham Shum introduces: Linda is from i4Solutions (previously Gates Foundation), Donald is from Strategic Initiatives Inc, and Alfred is from Desire2Learn, one of the sponsors of LAK.
Large-scale survey of institutions and their learning analytics practices.
Linda glad to be at LAK again. Remembers last year at Banff, a lot of people who only knew each other online meeting together over coffee.
Major survey of institutions and vendors about what’s going on in the student success space. In the literature, there’s a lot swirling about defining analytics. We are broader than learning analytics – other success too. What is it to have organisations, communities, to be ready, and to get it in to the hands of people to make decisions in time to affect student success?
Have 40 institutions so far, contacting members who are best able to answer – can’t just go to anyone to fill out. Looking for more participants! Have about 20+ vendors developing solutions for students. Don did interviews at EDUCAUSE conferences. Will post results on this (LAK) website and EDUCAUSE. Big data report from McKinsey: education not high on the list of being ready, willing and able.
Will develop a toolkit for building organisational capacity for analytics.
Thomas Davenport’s books on analytics, really great. Case studies from business, organisations. Talk about what would be your goal. Campuses are awash in data. Goal to get more data available at the right level to make a difference. Look at trends, ask what happens if they continue – talk about those getting to high school and completing. Those getting to college, we admit them, say they are ready, we should support them. John Campbell on the obligation of knowing – what should we do about it?
What happens next? Coming together of research communities of practice.
Proven means for optimising student success – typology
Don picks up. Thanks Gates Foundation for sponsoring the research.
Proven means for optimising student success – answer richer than expected, created a typology. Not Maslow’s hierarchy.
1. Managing the student pipeline, at-risk students (strategic enrolment); 2. eliminating impediments to success (analyse bottlenecks, first-year experience); 3. dynamic query/alert/intervention for at-risk behaviour; 4. learner relationship management systems and processes; 5. personalised learning system environments and learning analytics (focus on the learner, not the institution); 6. data mining/big data; 7. extending success to include employability/work–life success – joining institutional datasets with employment datasets, government datasets. 110m resumes and job descriptions on Monster.com – big potential.
Captured all 40 institutions on these levels.
Linda: What we found was not so surprising.
Technology infrastructure was key. Technology audits, analytics audits. One area, technology, doesn’t exist on its own. Institutions working together, that’s where it leverages for student success.
Processes, policies and practices – what’s helping, what’s in the way. In the state of Minnesota, a thorough pipeline of K-12-to-university datasets was important. Took about 3y to get through the state legislature. Where are records, how comprehensive – very important. Capacity – e.g. someone who understands the systems … gets hired by the vendor. Where does the expertise need to reside? Across all areas. Online course that could certify you in analytics (George Siemens).
Biggest factor is culture and behaviour. Astounding to have people say they’re data-deprived. Critical.
And finally, rests with leadership. Purdue leading on improving retention.
Sample, was wide cross-section – for-profit and online not-for-profit; research univs, comprehensive univs, private colleges/univs, community colleges, systems.
Will be in conversation with SoLAR to have an international perspective.
Don: The vendors.
Don: Want to capture Business Intelligence vendors. A spate of consolidations where ERP vendors bought up BI vendors. Also LMS providers; advising/retention vendors; and analytics, consulting, advising outfits. Have a big list of prospective next vendors – analytics, consulting, generalised advising, next-gen learning. Want to expand to 40 vendors.
What is it possible for the industry to aspire to? How much tension is there in the rubber band pulling learning analytics forward? Also asking the vendors what run-of-the-mill institutions are doing.
Linda: Preliminary insights.
Linda: Every institution knows the terminology, at least. Faculty don’t always. The leading-edge institutions are showing good ROI on student success due to analytics.
In same place we were with IT – the President has to know about it. And not just them.
Variety of build/buy/mash-up strategies being used. In Minnesota, though they have datasets in multiple places, a big issue is that they’re not talking to each other (LMS, student engagement systems).
Don: Tremendous expansion in analytics. At this past year’s EDUCAUSE meeting in Oct/Nov it was bigger; will be huge this year.
Expanded solutions from BI/ERP/LMS; new vendors in many categories. Many new cloud-based applications – e.g. comparative analytics across institutions. Move to cloud across vendors. Strong market incentives for innovation, expansion to next-gen learning. More consulting services.
Linda: Institutional leaders tend to overestimate their data, information and analytics capacity!
We need affordable solutions – can’t just keep building out. Capacity to use what’s there has to move throughout the institution, in to hands who can make the difference, including students themselves. Need resources to sustain this.
Institution-based work, using legacy capacity and new ‘free-range’ tools.
Glad tidings for people in analytics. Worldwide talent gap here. Big in big data; huge in education. Difficulties in hiring.
Need ‘free-range’ personal capacity building. Continuing need for enhancements in tools, applications – the bar is being raised. Was ERP system and report writer 5y ago; has moved on. Tremendous need for consulting, know-how, institutional ‘capacity augmentation’. Constellations of cloud-based collaborators/vendor communities/federated solutions.
Build the case studies, extend the interviews. Developing FAQs, and a match-up service. Developing A Toolkit for Building Organizational Capacity, working with EDUCAUSE (Year of Data Analytics, YODA).
Hands over to Al.
Will engage in some gossip. Not talking about what we’re doing, but our observations about our clients. Analytics is strategic, there will be winners and losers. Engaging in handicapping: which institutions will succeed? Relates to capacity. As a vendor, we want to succeed, but it’s a partnership. Capacity for analytics is on three dimensions.
1 Technology, 2 Culture, 3 Business.
Technology: Institutions we think will do well. Infrastructure and IT. In my career, toiling in coal mines of IT. Dirty business, hard work. That complexity is not going away. Winners will have capacity in their IT environment to deal with complexity. Warehousing, modelling, a lot of complexity.
Culture: Traditional institutions work in silos – faculty, institutional research. Those charging ahead have everyone at the table, viewing analytics in a systematic and coherent way.
Business: Prospects of an imminent hanging concentrate the mind. For-profits get it. Non-profits are still at ‘would we/should we/can we’. Urgency is seen with for-profits. Student success: lost seat = lost revenue. Something to be learned. Thinking of education as a business, we’re still dragging our feet.
Building capacity for analytics is not just technology, it’s culture, governance. For-profits see analytics as a matter of survival.
Don: Look at 7 layers of typology above, like layers of an onion. Each one can add several percentage points to retention. Amazing how much difference systems like Purdue make by paying attention. The efforts now will seem simple in 3-5y. But by simply paying attention, treating as a matter of urgency, institutions can dramatically increase student success. This is only a beginning. Potential is huge. Starfish can go in and improve retention 5%-points in a year, shows how serious it is.
Linda: In the midst of more analysis, how to move the information out. Match-up tool so people can go in with stronger ability.
Someone 1: This is all going to take senior leadership support. Can you describe the carrots and sticks they’re using to promote retention, for e.g., as a serious problem to address?
Linda: Is an outstanding question. Different leaders do it differently. One did a webinar on transforming education. Looking at full-year offerings, not summer gap. Retention is a piece for how you review faculty. No Child Left Behind – schools and teachers assessed on how well their students do. Depending on which state, some state boards more involved. See more in online education. For the for-profits. If you’re not at a for-profit, partner with one or have a conversation. They don’t have the same barriers. Can say we’re going to do this, know where students are each week, and intervene. Knows part of John Campbell’s work is supported.
John Campbell: Administrative incentives – the key is facilitating staff to innovate. One of the biggest hurdles is just getting access to the data. It’s amazing what you can do when you can get the data, explore, enter in to a dialogue. For building capacity, part of it is having a culture that provides for this sort of innovation.
Linda: When a leader says innovation is important, they keep resource to support it even when cutting back elsewhere. People go across campus sharing great practices, faculty forums – you too can improve student behaviour. Stick rather than carrot – the issue of student debt, when it relates to students who don’t complete. Will be an accounting for how students are doing when public, federal money is going in.
Someone 2: Have you found any patterns in the kinds of data the for-profits collect vs non-profits – e.g. impact on sacred cows? Often research institutions’ presidents say how important it is that they attract and retain students. Often hard to articulate that in to something actionable that shows how research grants have impact on undergraduate courses.
Don: For-profits are not all alike, except in dedication to using analytics strategically. All use analytics from day 1 or before, find out when students demonstrating at-risk behaviour, can act in days or hours based on their behaviours. Consistently, they all do that, but have different approaches. Capella focusing on their competency model. UPhoenix doing a major redesign activity, looking at personalisation, impact on outcomes and student success, and embed that in processes. Embedding in processes is very much part of their DNA.
Linda: Not just gathering the data, but having the people understanding what we do when we see these things. It’s part of the talent base that once you have the data, you are disseminating and discussing it. Concern – invest in analytics, but not in intervention tools and mapping, improve what they’re doing. Keep it simple! Improve gateway courses. Ensure people can get help they need. Math Emporium – longstanding activity, can make a difference.
Someone from U Phoenix: Lot of things we’re doing, in the track tomorrow morning. Looking at student success. Lost seat, lost revenue – I’m not a fan of that. That’s saying we only care about one more class. Really focusing on student success and student learning as much as any other institution. Everyone’s trying to solve the same problems.
Al: For-profits interested in, non-profits not: consistently hear for-profits want to see data around faculty participation, response times – is tied to student success. Non profits aren’t asking about that data, it’s off the table and we can’t even talk about it. That’s disturbing. Motivation to teach, are they responsive – these are prima facie contributors to student success. See more openness at for-profit on that.
Linda: NSSE, CCSSE – research shows those who have a sense of urgency, an unrelenting culture or belief they can do better – even the leaders.
Maria Andersen: I work at a non-profit, looking at every seat as absolutely important. Every student we lose is revenue we lose. Curious, haven’t heard about Achieving the Dream in community colleges. Any reason?
Linda: Not by intention. Some colleges surveyed are Achieving the Dream colleges. It does give a template. Even when not funded through it, some decided to use the template anyway. Some who went for Gates funding, pursued anyway. Infusing in to the data capacity, the IR capacity. Community colleges might have 1.5 IR people on average. Can’t do data without data people.
Maria: That’s what we discovered. We can’t access our data. Don’t have resources to get better access.
Don: Collaborations. Have to get to constellations of institutions to get beyond that trap, having 1.25 FTEs to do the job of 5. There are mechanisms for that. It’s not going to be done by each institution standing on their own.
Linda: IT was not IT forever. There was a place where technology became what it is. Learn what happened when that became interesting, important, urgent. Will have comparable pathway with IR, partnering across an institution. Interesting to look at how IT formed.
Someone 4: Danger of overselling to senior managers? Was at conference in 1989 on LMS, sure we’ll have the perfect one next year (!). Part of bigger, continuing improvement?
Don: Most institutions, if intelligent investment in analytics for student success, it’ll pay for itself. The ROI is obvious. Not that you can’t fail to get an ROI, but so many examples of institutions making the investment that pays for itself, immediately, and then continuing. This is why you go after low-hanging fruit, pick things that will be demonstrable. Demonstrate capacity to do it. That’s how to avoid over-promising.
Al: As vendors, we tend to oversell analytics. Clients come in thinking they’ve just bought an analytics solution and this will solve everything. Then into a huge cycle of disappointment and recrimination. We’re now realising that each institution is in a different place, has different needs. Part of the onboarding process is let’s find out what the sweet spot is, what are small things you can get started with – a 3-month, 6-month, 1-year plan. Instant effect doesn’t happen. It’s different in every institution.
Simon Buckingham Shum: What do you know about how students feel about this?
Sheila MacNeil: Concerned about the stakeholders, want to know more about students as stakeholders – a big gap.
Simon: I asked that because I saw your tweet (!).
Linda: Pointing to John because they have students who identify with their signal – I’m a yellow. You’ve used student focus groups from the beginning?
John: I’d twist it around. If analytics is also about informing the students – give them the same data that the faculty are getting. The students have responded extremely positively. They’re part of the project. It’s not a separate analytics system, they get to see the results. We as a community have to provide that transparency about how that data is being used. [Hear, hear!]
Don: Looking at that typology. Eliminating bottlenecks for students, those are all things students can be involved in. Right down the list is about helping students succeed, the more they’re involved in that process the better.
Simon: Calls time. This panel encapsulates what makes LAK LAK. Really interesting organisational change issues. Thanks the panel.
Awards a SoLAR T-shirt to Sheila MacNeil for tweeting a picture of superman and Shane Dawson where it was hard to tell them apart. (!)
This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.