Educating the 21st Century cyberstudent…or not?
Don Tapscott has some radical new ideas about education. Here’s a sampling (as related by ReadWriteWeb):
- “…the age of learning through the memorization of facts and figures is coming to an end. Instead, students should be taught to think creatively and better understand the knowledge that’s available online.”
- “…Google, Wikipedia, and other online libraries means that rote memorization is no longer a necessary part of education.”
- “Teachers are no longer the fountain of knowledge; the internet is…”
- “Kids should learn about history to understand the world and why things are the way they are. But they don’t need to know all the dates. It is enough that they know about the Battle of Hastings, without having to memorize that it was in 1066. They can look that up and position it in history with a click on Google.”
(These last two are quotes directly from Tapscott, by the way, and I need to go pick up this book. It seems awfully interesting – but for now the RWW report will have to do.)
That one item – “Teachers are no longer the fountain of knowledge; the internet is…” – is among the most terrifying concepts I’ve ever run across. I don’t know exactly how he intends us to understand the pronouncement, but the Internet is not a fountain of knowledge, at least not in the absence of strong thinking skills. It’s a firehose of data, to be sure, but as I’ve noted before, data isn’t quite information, information isn’t knowledge, and knowledge isn’t wisdom. More on this later.
Way, way back in 1989, my fellow scrogue Dr. Jim Booth and I did a series of seminars in which we argued that these kinds of changes were already happening (the essay linked here was updated slightly in the mid-’90s to account for the emergence of the early Internet). At that point we weren’t talking about the Net so much, of course; we were mostly focused on how the socializing process of television was altering the function and utility of the human brain.
Thanks to television and instantaneous global communications, thanks to the electronic data base, to the video game system, word processor, hand-held calculator, digital synthesizer, computer billboard and infonet – thanks to a boggling array of modern and post-modern amusements and conveniences, humans have evolved, perhaps more rapidly and more dramatically than at any time in our history.
From a traditional perspective, we simply don’t know all the things we’re supposed to know. A number of writers and researchers have argued, quite persuasively, that American students are impoverished in basic geography, history, literature, and math skills.
However, while Jane can’t perform long division, she is pretty handy with a calculator. Maybe Johnny can’t spell, but his word processor, like mine, has a built-in spell-checker. And while Danny is probably beyond hope, Jimmy knows exactly where to go to find out all he needs to know about Mexico – especially if his computer is on-line with an interactive infonet like The Source or CompuServe.
We went so far as to argue that the moment we were in – or more accurately are in – represented a critical leap ahead in human evolution.
A cursory glance at the Geologic Timetable in Webster’s Dictionary reveals that major evolutionary and anthropological events often parallel significant geological shifts. The first evidence of humanity, for example, roughly coincides with the onset of the Quaternary Period some two million years ago. A Wake Forest University Anthropology professor we consulted recently pointed out certain major changes in human living patterns at the beginning of the Holocene Epoch – the “recent,” or post-glacial period.
It isn’t at all unreasonable to wonder whether we are in the midst of what geologists 10,000 years from now might see as the transition from Holocene to whatever comes next. The difference between the dawn of this epoch and all others before it, though, is that this time it will be engineered. The environmental changes which loom now are the exclusive product of human technology.
For the heck of it, we termed this transition from human to posthuman the “cyberlithic.”
There’s no question that our brains are being re-wired, as Tapscott believes – Jim and I made that point in those seminars, too – but I wonder how hellish the cost may prove to be. As RWW notes:
Today’s students are growing up in a world where multi-tasking has them completely immersed in digital experiences. They text and surf the net while listening to music and updating their Facebook page. This “continuous partial attention” and its impacts on our brains is a much-discussed topic these days in educational circles. Are we driving distracted or have our brains adapted to the incoming stimuli?
I know that much has been made of the digital generation’s ability to multi-task, for instance, but every piece of evidence I have seen makes clear that doing several things at once reduces overall efficiency.
Dr. Gary Small, a researcher at UCLA, is also examining how daily use of digital technology re-wires the brain. His particular concern is the erosion of social skills.
When the brain spends more time on technology-related tasks and less time exposed to other people, it drifts away from fundamental social skills like reading facial expressions during conversation, Small asserts.
So brain circuits involved in face-to-face contact can become weaker, he suggests. That may lead to social awkwardness, an inability to interpret nonverbal messages, isolation and less interest in traditional classroom learning.
Small says the effect is strongest in so-called digital natives — people in their teens and 20s who have been “digitally hard-wired since toddlerhood.” He thinks it’s important to help the digital natives improve their social skills and older people — digital immigrants — improve their technology skills.
(Of course, Small’s brainiac theories are thoroughly refuted by “at least one 19-year-old Internet enthusiast” who “lives near Pasadena” and “spends six to 12 hours online a day.” This, though, is probably something that should wait until my next missive on the sorry state of science reporting in America.)
The “brain as computer” model that Jim and I discussed only works if education does a good job of developing the processing and search functions. Yes, all the data is online, or soon will be, so maybe it’s not critical that we have it all memorized. But are we capable of finding what we need quickly and efficiently? Are we adept at sorting information from disinformation? And most importantly, are we able to think critically about the data we retrieve?
All the evidence I see around me says we’re failing on all fronts. A former colleague, an incredibly accomplished man who these days teaches undergrads for a living, observed something to the effect that “once they get past downloading music, IMing their friends and surfing porn, these kids are helpless with computers.” Maybe he exaggerates a little for emphasis, but my experience (and the volumes of research supporting it) says that the techspertise of “today’s youth” is overrated. Their lives are dominated by electronic technology, to be sure, and they can become fluent end-users, but you have to be careful about using the word “savvy.”
On top of this, the Millennial Generation has been trained to be very good at short-term tasks with readily identifiable objectives, and this has come at the expense of abstraction, critical evaluation and problem-solving skills. They’re far better in teams than my generation (which is nearly feral in its individualistic approach), and that mitigates the problem significantly when they’re allowed to work in groups. Still, the premise that the ubiquitous availability of every scrap of information in the world somehow renders obsolete old ways of knowing and learning is … suspect. At best.
In some ways it’s nice to reflect, after being so wrong about so many things in my life, on something that I was part of getting right a long time ago. Still, seeing the early emergence of an important (and perhaps obvious) trend is hardly the same thing as having a robust solution for all the problems that will result. The fact is that we do, absolutely, have new tools at our disposal that can enable us to dramatically improve the sum total of what is known. The Net can help us generate new data, and as I suggested above, with the proper kind of education we can develop that data into information, transform information into knowledge, and eventually sift wisdom from that knowledge.
But the machines won’t do it by themselves. It’s up to us to craft the policies and processes that turn the machinery to the uses we want and need, and we have barely taken the first step down that road…