Techno-determinists take a hit
This popped up in my Edupage e-mailer:
STUDY OF LAPTOP USE ON CAMPUS FINDS MIXED RESULTS
Researchers at Carnegie Mellon University found that although laptop-using students have more flexibility in when and where they study and spend more time on assignments than students who visit computer labs, their academic work shows no measurable improvement. Both groups of students get approximately the same grades. The researchers discovered that the laptop users spent considerable time on e-mail, instant messaging, and surfing the Web. (Chronicle of Higher Education, 29 November 2006 [sub. req’d])
Wait – so you adopt a technology that doesn’t address the real problem, because you haven’t bothered to understand the problem, and it doesn’t work? Wow – who could have seen that coming?
A note for the If-You-Build-It-They-Will-Come crowd: yes, they will come, assuming the “it” looks cool. But it’s not guaranteed that they’ll behave the way you expect once they arrive. As Gibson said, “the street finds its own uses for things.” Those uses tend to emphasize leisure and raw commerce. I wonder where computing and Net development would be right now without porn, gaming, and other forms of revenue-generating entertainment.
There’s no arguing the magnificence of computing and the Internet, of course. But when a new technology emerges, the first wave of commentators always promotes it as a panacea for all the world’s ills. Eventually a more informed and less euphoric comprehension of the technology’s strengths and limitations evolves, but that can take a while – especially in a world where there are so many powerful people with so much to gain from the sale of said technology.
Here we have a classic example. Our culture has been swallowed whole by the computing hype, and sure, it’s easy to see how that could happen. But when the dust clears, some things need to be emphasized. While computers and the Internets are great for gathering, distributing, processing and moving around information, electronic technologies do not assure better thinking. Or more thinking. Thinking can result, but if it does, it does so as a result of drivers that have nothing to do with the device.
In fact, I think there are any number of cases we can point to where tech seems to serve as an impediment to thinking, particularly in the world of education. Students who can’t discern the credibility of online sources, for instance. Who think that volume of undifferentiated data can cover for their inability to analyze.
Probably nowhere is the failure of digital tech more obvious than in online discussion forums. Some wingnut somewhere has information on his/her Web page to “prove” just about any wack theory you can think of, and if you attempt to assert that maybe, just maybe, a Nobel laureate’s work is a bit more credible than the conspiracy theory some guy living in his parents’ basement has cooked up, you quickly discover the dark underbelly of the computing revolution: leveling. All opinions are equal, and any attempt to assert credibility attaching to traditional sources of status – job experience, academic attainment, publications, etc. – earns nothing but scorn. This doesn’t describe all such forums, of course. But when you build it, a lot of the “they” that come are going to be trolls.
In other words, computing doesn’t automatically allow smart people to elevate people who are…less smart. Instead, it’s the perfect weapon for stupid people who want to drag intelligent folks down to their level.
Here’s hoping education administrators across the country will see this study and pause to reflect on the possibility that the technology is no better than the pedagogical context in which it’s employed.
Remember: data is not knowledge. Knowledge is not wisdom. And the culture’s need to inculcate wisdom is not a function of how portable your computer is.


Great post.
I remember using an IBM 1620 engineering computer back in 1972 that we had to program in Fortran IV. We thought it was so cool, but we didn’t learn a thing using it.
Incidentally, my HP scientific calculator has about 10,000 times the speed, memory and performance of that old IBM, which cost over $200K new.
Technology might improve, but wisdom stays the same.
Aloha,
Jeff
If by “the same,” you mean in terribly short supply, then I’m with you… 🙂
Yeah, short supply does apply. I don’t think we’re any wiser today than we were in the days of ancient Rome. 🙂
Aloha,
Jeff
Some days I think I’d argue. Others, maybe not. Maybe what happens is that the accumulation of information inherently complicates the process of developing knowledge. The more bits of data you have, the better your filters and processors and analysis functions need to be. Transforming the raw ore of information into the rare metal of wisdom is incredibly complicated, and more info automatically means a worse signal-to-noise ratio.
Maybe.
On this one I get regularly and astonishingly angry about the ‘digital divide’ belief that Africans are poor because they don’t have computers – which is why MIT is working on the $100 notebook. That said, there’s a warehouse down the road where second-hand Pentium II and III computers are available for about $50. Strangely enough, their availability hasn’t reduced our unemployment from the current 40%.
Despite countless examples throughout history, we still haven’t figured out that technological advances don’t eliminate key elements of the human condition, like class division. They merely provide new forms for it.