Is That All There Is? Higher Education’s Struggle to Leverage Digital Teaching and Learning

Dr Keith Christopher Hampson
Dec 10, 2016

“What would become of such a child of the 17th and 18th centuries, when he should wake up to find himself required to play the game of the 20th century?”
Henry Adams, The Education of Henry Adams

“Is That All There Is?” Words and music by Jerry Leiber and Mike Stoller.

In the late 1990s and early 2000s, what we might now call “the early days” of online higher education, advocates of technology-mediated learning imagined themselves as outsiders, rebels working on the margins of higher education. Their goal, broadly, was to deliver the transformative power of Internet technology to a change-averse, centuries-old institution. Keenly aware of the naysayers within the institution who believed that technology was anathema to “real education”, our rebels adopted an us-against-them stance: the technology evangelists against the Luddites; the cutting edge against those who simply “didn’t get it.”

It would be difficult for these rebels to maintain the posture of the outsider in 2017. During the last two decades, and particularly the last few years, instructional technology has become thoroughly mainstream, a major influence on corporate, K12, and higher education. In higher education, online learning has become synonymous, fairly or not, with all that is thought to be forward-thinking and innovative in the sector. Digital learning has shifted from a little-understood, marginal activity carried out by a handful of restless academics and dishevelled tech staff working out of the university’s basement, to the single greatest hope for an institution facing unsustainable increases in operating costs, shifting student demographics, and demands for improved and demonstrable learning outcomes.

Even major news sources began to pay attention. The Atlantic, the New York Times, the Huffington Post and others have helped promote the idea that higher education is being transformed by technology. One after another, university presidents, not typically revolutionaries, proudly proclaimed that their universities and colleges were part of this transformation. Others in and outside of the academy were less enthusiastic, envisioning Internet technology and its demands as a Trojan horse that would upset all that was good and holy about this centuries-old institution. Public intellectuals argued that technology would disrupt higher education, just as it had the music recording industry, newspapers, and bookstores. We’re next, they warned.

The Potential . . .

I’ve been working in the digital higher education space since the late 1990s, first as a member of a university faculty, then during an eight-year stint as the director of a large online learning unit. I now serve as an analyst and consultant. Over the years, my views of digital higher education have evolved. But through it all, I count myself among the growing ranks of the allegiant. I think that thoughtfully designed instructional technology and media can play an important, even transformative, role in teaching and learning in higher education. I’ve seen enough evidence to state confidently that it has the potential to dramatically reduce the cost of learning, meet the needs of a wider range of people, and improve the overall quality of learning. In short, I’m a believer.

The potential is truly extraordinary. Given the unique economics of the Internet, it’s possible to produce and share instructional media with production value that rivals the best of Madison Avenue advertising. Storytelling and other creative arts can engage students in new ways. The rapidly expanding field of data analytics can help us understand how well students are learning and, when done properly, can be used to modify curriculum in real time to meet the unique needs of each learner. Dashboards can help students understand how they learn most effectively and where and when they need help. Simulations can be built that allow students to “learn by doing” in a realistic, risk-free environment. Games can increase the time students spend on task, thereby increasing their chances of mastery.

. . . And the Reality

I’m confident, then, about the potential of technology-enabled learning. However, I’ve grown increasingly less confident that the institution of higher education can play a major role in realising it. Evidence is mounting that the institution, as it is currently designed, is largely ill-suited to developing and leveraging more advanced uses of technology for teaching and learning. And given the institution’s near monopoly on widely recognised adult education in much of the West, higher education is likely inhibiting the development of more advanced forms of instructional technology and media, as well as new ways to bring these forms to people at lower cost.

Since the spread of Internet access in the latter half of the 1990s, colleges and universities have demonstrated a remarkable inability to leverage these networks and related technologies to improve the quality and cost-efficiency of learning. This is so despite the fact that universities were quick to turn to the Internet (and, before it, other technologies, such as CDs) for teaching purposes, despite the attention and investment directed at online learning during the past two decades, and despite the extraordinary advances in technology we have witnessed in other sectors over the same period.

In the late 1990s and early 2000s, a handful of pundits concluded that higher education would have no choice but to be reconfigured by the extraordinary capacity of the Internet. They put two and two together and predicted that students all over the world, once connected, would have access to the very best educators, practitioners, and intellectuals. Economies of scale would drive down costs dramatically, ensuring access to high-quality learning opportunities even for under-represented student populations. A new crop of talented professionals from education, design, and software would quickly start building digital-born instructional models that would stimulate learning in ways simply not possible in classrooms, lecture halls, and labs. Education was too important, demand was growing too quickly, and costs were falling too rapidly for the sector not to take full advantage of what the technology could obviously enable.

But for the past two decades, the institution of higher education has made few substantive changes to how it operates. While virtually every institution across the OECD has invested in digital learning, and university presidents now routinely pepper their speeches with the keywords that signal a commitment to digital education, the actual steps taken to leverage the dramatic changes in technology have remained tentative, unimaginative, marred by self-interest, and ultimately lacking in ambition. Despite endless talk of “transformation”, “revolution”, and, of course, “disruption”, initiatives with the potential to improve learning and reduce costs through technology have either failed to gain sufficient traction or been rejected out of hand because they challenged the culture, interests, and processes of the institution and its deeply ingrained conventions.

Tuition for online students has not dropped; indeed, online programs frequently carry higher fees than their on-campus versions. Students are regularly presented with digital course materials that are nothing more than repurposed classroom materials, reflecting the fact that responsibility for the design and development of course content falls largely on the shoulders of individual academics without the incentives, time, or skills required to do more ambitious work. The dominant technology in online education, the learning management system, serves primarily as a course management tool: an expensive and over-complicated filing cabinet for repurposed classroom materials. The LMS was quickly adopted across higher education not because of its capacity to transform learning, but because it fit so easily into the traditional practices, roles, and responsibilities of classroom education.

More troubling still is the mounting evidence that a common understanding has already begun to solidify in higher education about “how we do online learning”. For a surprisingly large number of professionals in higher education, simply “putting courses online”, shorthand for uploading static classroom instructional content into an LMS, is taken as evidence that an institution is a bona fide member of the digital age. After twenty years of online learning, high-quality educational media, simulations, adaptivity, game-based learning, and other experiences made possible by advances in technology and the economics of the Internet constitute a mere fraction of the total higher education experience in North America. Can it be that the value higher education extracts from the Internet has already reached its peak? Has the proverbial S-curve of innovation already flat-lined? Is that all there is? (With all due respect to Peggy Lee.)

By no means am I a tech evangelist. I don’t believe that digital learning is the silver bullet for all that ails higher education. Despite the great attention it currently receives, digital learning is just one piece of the very large and very complex puzzle of how we improve student learning outcomes. And learning takes many forms: conversation, reading, writing, travel; all are important. I certainly don’t want my daughters to learn exclusively online. But if we’re going to make digital learning part of the education mix, and I think we should, then we need to take it seriously; we need to actually begin to leverage the possibilities it affords us, which we are currently failing to do.

In a series of upcoming posts, I set out to decipher what stands in the way of using instructional technologies and digital media to significantly improve the quality and cost-effectiveness of learning in traditional colleges and universities. This effort takes the form of a series of essays (see “Notes” below), each a vehicle for the author, and hopefully the reader, to understand why higher education has yet to take substantial steps toward leveraging these new possibilities and why, in certain cases, it may never.

Notes

I like the logic of the “essay”. Wikipedia traces the word’s origins: from the late 15th century (as a verb in the sense ‘test the quality of’), an alteration of assay, by association with Old French essayer, based on late Latin exagium ‘weighing’, from the base of exigere ‘ascertain, weigh’; the noun (late 16th century) is from Old French essai ‘trial’.

--

Dr Keith Christopher Hampson

Advisor to freelancers and enterprises in the retail education industry.