Rob Abel, Ed.D. | November 2019
"I don't ask for much, I only want trust, and you know it don't come easy."—Ringo Starr
"When it comes to data and analytics in education, trust is everything. All other progress is dependent on establishing trust in the data and in the ecosystem."—Rob Abel, Ed.D.
What a great week of meetings recently at the IMS quarterly in Redmond, including our 5th annual Learning Analytics Summit, an event that looks across K-12 and higher education in terms of data, analytics, and student success. IMS is as strong a proponent of the potential of data and analytics in education as you will find. But the "data" topic is one that requires great care—and it will require a much higher level of trust in the education sector than we have today.
To make the point, last week's EdWeek update had lead articles on how the U.S. education system may be to blame for poor adult literacy; how more than $1 billion in K-12 edtech licensing goes to waste; and an opinion on how personalized learning may be exacerbating educational inequity. And higher education has its share of image issues: the scale of student debt, a recent highly publicized admissions scandal, and so on.
Data and analytics, especially learning analytics, require levels of transparency and trust beyond anything we have seen in the past. So, how do we get there?
In my short opening keynote, I tried to capture some of the key lessons regarding data and analytics in education. I also learned a lot at the event. I'm going to summarize right here, right now:
The right approach to learning analytics is to set measured expectations for the near-term return on investment from data. Over-hyping data is not our friend in the land of edtech. The hard part about data isn't getting data. The hard part is making sense of data and taking action on it. Most institutions in their current forms are not set up to act on data. Add to that the fact that understanding how and whether learning is improved is a science still under development. Today, most of the return on investment for data in education comes from data-warehousing-type applications on relatively obvious use cases, where the key challenges are a flexible architecture (able to handle all key data sources) coupled with building clear decision-making and action paths from the data. This is the correct expectation to set today while the data science and technologies evolve.
Institutions need both IT and instructional strategies for data. Data as a strategy is not a strategy, or even the right starting point, for educational improvement. How many decades now have we witnessed that collecting data in vast quantities has only confirmed how poorly we are doing? For data to be useful, it needs to help measure the success of a set of strategies that encompasses the user experience with technology as well as the instructional models being pursued (some might call this academic or digital transformation). The usability of technology is by far the single most important determinant of whether technology can improve anything. Without a focus on integration for usability, we can pretty much guarantee the data is not going to be very helpful. Right alongside usability are instructional diversity, agility, and choice. Then, most importantly, there must be an associated instrumentation strategy for the data to be useful.
Realize that we have a long way to go in the evolution of meaningful data to improve learning. The insightfulness of data is directly related to how good the instruments of measurement are. Why is this a challenge? Well, there is a lot of questioning these days about whether our current approaches to measurement are the ones needed for societies of the future (or even for today's societies). The data can only get better if the instruments of measurement advance. The comparison I like to make here is to healthcare, where it took decades of experimentation and research to discover and make useful the technologies that provide valuable data (e.g., the electrocardiogram).
Partnering together—as institutions and with supplier partners—is essential to get to the future. Institutions will be deeply involved in edtech research on efficacy, but it will ultimately be up to suppliers to prove the efficacy of their products. In healthcare, hospitals are engaged in research studies, but it is the suppliers that must prove efficacy. The same holds in any other industry. It is up to organizations like IMS to provide the data interoperability foundation to enable this collaboration. Even the seemingly simple use case of gauging relative and appropriate "use of technology" is not simple from an R&D perspective. Thus, if you are being sold on the idea that data on usage or efficacy generated without appropriate controls is going to tell you something, well, please be careful. Instead, suppliers are going to have to get much better at not only providing data but also being very clear on what that data means, i.e., how the user should interpret it.
This is a good time to be collaborating on data analytics architecture. Each institution is going to need a scalable real-time architecture for data collection, processing, and analysis. As IMS members have been learning through various implementations over the last five years, analytics requires the power of a data warehouse and the flexibility of a data lake at the speed of real-time in-memory processing. The bad news is that this is a rapidly developing area, which almost ensures that whatever solution you pick today will evolve significantly each year. The good news is that this is where enterprise-class cloud products (e.g., SAP HANA, Microsoft Azure, AWS) are going to speed the evolution of the solution. Interoperability standards need to address specific key issues to enable the foundation for learning analytics without going so far that they limit innovation.
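To make the "warehouse power plus lake flexibility at real-time speed" idea concrete, here is a deliberately tiny sketch in Python. It is not any particular product or IMS specification—the class and field names are invented for illustration—but it shows the two-sided pattern: raw events are kept untouched and schema-flexible (lake), while curated aggregates are updated in memory as each event arrives (warehouse-style, real time).

```python
import json
from collections import defaultdict


class MiniAnalyticsPipeline:
    """Toy illustration only: raw events land in a 'lake' (append-only,
    schema-flexible), while a curated aggregate is maintained
    warehouse-style, updated in real time as events are ingested."""

    def __init__(self):
        self.lake = []  # raw event records, stored as-is
        self.active_minutes = defaultdict(int)  # (student, course) -> minutes

    def ingest(self, event: dict) -> None:
        # Keep the raw event untouched, preserving fields we may not
        # know how to use yet (data-lake flexibility)...
        self.lake.append(json.dumps(event))
        # ...and update a curated aggregate immediately (warehouse speed).
        key = (event["student"], event["course"])
        self.active_minutes[key] += event.get("minutes", 0)

    def report(self, student: str, course: str) -> int:
        return self.active_minutes[(student, course)]


pipe = MiniAnalyticsPipeline()
pipe.ingest({"student": "s1", "course": "algebra", "minutes": 12})
pipe.ingest({"student": "s1", "course": "algebra", "minutes": 8})
print(pipe.report("s1", "algebra"))  # → 20
```

A real deployment would, of course, replace the in-memory list and dictionary with durable lake storage and a queryable warehouse, but the separation of raw capture from curated aggregation is the architectural point.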
Privacy and security need to be built into the architecture, and data usage must reflect institutional data and analytics principles (such as these). Collaboration on privacy is within our grasp, as shown by the uptake of the IMS App Vetting work as a collaborative process among institutions and supplier partners. The reason this is important is that privacy is all about trust—and we must build trust together. Privacy of data in education is an especially large challenge because institutions will be held responsible for getting this right. But suppliers also have a huge stake, as edtech without trust is not going to work. The IMS App Vetting work has me feeling we are on a good path here: useful right now, and it will only get better as we go.
Beginning with student achievement in mind is the right way to develop the analytics strategy and tools. This is similar to #2 above in terms of instructional strategy, but here I am talking about addressing students' progress toward the credentials an institution provides. In IMS, we see the world of educational credentials evolving considerably. But even in today's world of state learning standards for K-12, the ability to tie data collection directly to progress toward the final objective is going to be of most use to faculty, students, parents, and administrators. If learning analytics dashboards do not tie to the educational goals, they will be hard to interpret and act on. These dashboards must be available at the right place at the right time. Think interoperable messaging and usability of data. Think data that drives instructional diversity. In addition, the interventions themselves are key data to be tracked and analyzed.
There is a lot to do, but hopefully, the above does not sound pessimistic. Realism is not pessimism. We are putting in place the technical foundation that enables a future that is going to take a while to develop. But in the meantime, we already have excellent standards for moving data around (such as OneRoster, LTI, Caliper, and CLR) in a semantically rich and transparent way. Just encouraging these standards and evolving them as a community will help us build that foundation. I know we can do it.
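As a taste of what "semantically rich and transparent" means in practice, here is a sketch of a Caliper-style learning event built in Python. The field names follow the general pattern of IMS Caliper 1.1 (JSON-LD context, typed actor and object, an action, and a timestamp), but the specific IDs and resource values here are illustrative, not normative—consult the Caliper specification for the real event taxonomy.

```python
import json
import uuid
from datetime import datetime, timezone


def make_view_event(actor_id: str, resource_id: str) -> dict:
    """Build a Caliper-style ViewEvent. The structure follows the general
    Caliper 1.1 pattern; the IDs passed in are illustrative only."""
    return {
        "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
        "id": f"urn:uuid:{uuid.uuid4()}",        # unique event identifier
        "type": "ViewEvent",
        "action": "Viewed",
        "actor": {"id": actor_id, "type": "Person"},
        "object": {"id": resource_id, "type": "DigitalResource"},
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }


event = make_view_event(
    "https://example.edu/users/554433",
    "https://example.edu/readings/chapter-1",
)
print(json.dumps(event, indent=2))
```

The point of the self-describing structure is transparency: any receiving system (or auditor) can see who did what, to which resource, and when, without reverse-engineering a vendor-specific log format.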
But we are also building a more important foundation. We are building trust in the edtech ecosystem and the ecosystem participants. When it comes to data and analytics in education, trust is everything. All other progress is dependent on establishing trust in the data and the ecosystem. So trust is our biggest challenge and opportunity. I know that I speak for all IMS member organizations when I say that I know we will do our part!
In IMS, we all learn together—and I encourage you to take advantage of these IMS quarterly meetings. The next one features the annual Digital Credentials Summit, February 10-13, near Atlanta. Join us!