A colleague and I went to a presentation by Apple Computer on using their technology in higher education. They showed us some things I’d seen before, such as lectures by some of the most famous professors, posted online for anyone to watch. My colleague was stunned and began to develop the view that higher education as we know it is over. After all, if anyone can have access to the substance of an ultra-fine education whenever they want it, why have a multitude of institutions doing the work separately across the land?
I am less sure than my colleague that the revolution is on the horizon, for several reasons. Just because someone can watch all the lectures from Verygood business school does not make them a Verygood business grad. They don’t necessarily have the scores to get into Verygood business. They won’t have spent a few years interacting with students at Verygood business. Nor will they have had their work evaluated by Verygood business professors over a period of years. If access to content were the whole ball game, then the profusion of good public libraries in the twentieth century would have spelled the end of higher education. Content is NOT king in higher education. Part of the answer is that education is about more than the information a professor can convey in a series of lectures. Grades matter. Class dynamics matter. Class participation matters. Interaction with the professor matters. Community matters.
The basic question remains, though. Why can’t I just watch all the Verygood business lectures and then do my own homegrown projects and show up for a job interview with a portfolio I have compiled? Isn’t it true that in the past many lawyers simply studied on their own and then took the bar?
Better yet, why don’t employers create their own schools to teach exactly what they want and then recruit students out of high school to attend? Why doesn’t Apple have an Apple Institute full of teachers of all things relevant to making Apple great and profitable?
The answer on both counts is that the current system makes life easier for employers. Sure, human resources departments could look at an entirely self-taught individual’s portfolio and try to make a judgment as to his or her skill. But it is much easier (and you will never go broke betting on easy) to look for a recognizable credential and select the applicants who have it for further consideration. College and university degrees provide a time-saving way to evaluate large numbers of applicants. The same is true of a law school class rank, for example. A firm might weed people out by considering only graduates in the top 15% of their class.
And why doesn’t Apple run its own college or school? Why should the shareholders of Apple foot the bill for such an exercise when they can sit back and wait for lots of promising young people to graduate (at their own expense) so they can skim the cream with no more cost than what it takes to recruit and evaluate?
No, higher education is not dead simply because the content is easy to access for potential do-it-yourselfers. Nor is it going to be replaced by expensive institutes run by employers for employers. Employers would rather let students pay their own way and then pick the winners afterward.