I have been known to say that William Gibson is arguably the most important author of the past 30 years. That’s a mouthful of an assertion, especially since we’re talking about a genre writer, I know. But even if I’m wrong, I’m not off by much. The man who more or less invented Cyberpunk, then abandoned it as quickly as he defined it, did more than simply alter the direction of science fiction; he literally helped shape the computing and Internet landscape as we know it today. That’s pretty big doings for a guy who had never so much as played with a computer before he wrote his first novel.
This story we’ve heard before, but here’s the Reader’s Digest version for those late to the party. Gibson’s Neuromancer (the first novel to ever win the SF triple crown – the Hugo, the Nebula, and the Philip K. Dick awards) introduced us to cyberspace, a “consensual hallucination” in which humans used computers to navigate around the global online network. He imagined it as an immense, three-dimensional virtual space, and as his “Cyberspace Trilogy” (Neuromancer, Count Zero, Mona Lisa Overdrive) unfolded, we also encountered killer viruses, psychic online projections of humans whose flesh was being kept technically alive in protein baths out in meatspace, and even artificial life forms that had evolved from advanced artificial intelligences created by powerful corporate interests.
Last night we watched the Final Cut of Blade Runner again, and if you don’t have this package, I can’t recommend it highly enough. 25 years on, Ridley Scott was finally able to re-craft the film as he originally intended, and the result is a stunning achievement. Scott has been one of our greatest directors for a very long time, but this may be his finest moment to date.
This viewing (probably my 35th or 40th – I lost count a long time ago) got me to thinking, all over again, about how little the film was acknowledged at the time of its release.
Samuel R. Smith, University of Colorado
Jim Booth, Surry Community College
She held out her hands, palms up, the fingers slightly spread, and with a barely audible click, ten double-edged, four-centimeter scalpel blades slid from their housings beneath the burgundy nails. She smiled. The blades slowly withdrew.
– William Gibson, Neuromancer (1984)
Pat Diener…is 26 years old, and she is going deaf. Landing her in the annals of science are the microscopic electrodes that doctors have buried deep inside her brain. Two fine platinum wires – as thin as a human hair and insulated in Teflon – run underneath the young woman’s skull, connecting the electrical circuitry inside her head to a black plastic plug that sticks out from behind her left ear. From there, Diener can wire herself into a pocket-sized “speech processor” that picks up sound and transmits it to the electrodes, enabling the brain to interpret it.
– Associated Press Wire Report, 12/2/92
The technological explosion of the last few decades has made workaday fact of once-wild science fictions like genetic engineering, space travel, laser surgery and computer-generated animation – not to mention the handy little construct used to produce this document, the IBM-compatible 386-SX personal computer.