July 27, 2004
George Orwell Meets the Matrix
by Maureen Farrell
"We appear to be edging towards an era of 'mind control' -- a time when human brains might be manipulated routinely by highly sophisticated technology." -- Nicholas Regush, ABC News, Sept 5, 2001
"I've spoken about this at academic conferences. I find that the first reaction people have is, maybe, disbelief. But if I talk for two minutes, suddenly they begin to turn somber and say, 'This is the scariest thing I have ever seen.'" – University of Kansas research professor Jerome Dobson, on technology that could make George Orwell's "Big Brother nightmare...look amateurish," the Kansas City Star, March, 7, 2003
* * *
On July 19, Denzel Washington appeared on the Late Show with David Letterman to discuss his role in The Manchurian Candidate. Saying that his character and others are carted off to be "manipulated" after being attacked during Desert Storm, Washington addressed the real-life implications behind this fictional drama.
Not only is it creepy, but it sounds out-and-out ridiculous -- sort of like when Timothy McVeigh told friends that he thought the Army had implanted a microchip in his buttocks so the government could keep track of him.
Of course everyone (except for a handful of researchers and McVeigh’s prison counselors) thought the Oklahoma City bomber was 100% Grade-A delusional, but microchip technology to monitor humans (and manipulate behavior) is very real.
While The Manchurian Candidate itself is generating plenty of buzz, with folks speculating on everything from the corporation that lies at its center ("Think of it as the Carlyle Group or Halliburton on steroids," Frank Rich wrote) to Meryl Streep’s muse (she told Entertainment Weekly that Peggy Noonan and Karen Hughes offered hyperbolic inspiration, but Noonan keeps pushing the Hillary Clinton myth), the real-life technology behind the plot is particularly intriguing.
Because regardless of who the "Prentiss family dynasty" is supposed to be (they’re just "one syllable removed" from the Bushes, Rich said), such talk is mostly a matter of educated guesswork, sprinkled with a side of speculation. And while gossip usually fades fast, implants are making their mark. Literally. For real. Even as you read this.
But, even so, could Denzel Washington’s "creepy" comment be considered a case of thriller-induced paranoia? Could he be taking his movie role a tad too seriously? After all, it’s clear that microchip technology, despite fears of Big Brother monitoring and other sinister applications, will improve countless lives.
With the future fast approaching, it’s questionable, as always, how we "ethical infants" will handle it. Yet, remembering that this journey began, to some degree, in collaboration with Nazis, it’s wise to remain somewhat suspicious.
With that in mind, here is a brief history of the technology of the future:
1950: Robert G. Heath and Dr. Russell Moore, funded largely by the U.S. military and the CIA, experiment with mind manipulation by inserting up to 125 electrodes into subjects’ brains (alongside drugs such as LSD). Heath also suggests that lobotomies be performed on patients, not for therapeutic reasons, but for the convenience of the hospital staff.
1953: John Lilly, of the National Institute of Mental Health, discovers that he can simulate a variety of emotions by placing electrodes inside a monkey’s brain. (A male monkey, for example, when given a switch to prompt orgasm, pushes the button approximately every three minutes.) Lilly's work draws the CIA’s attention and is later described in John Marks's The Search for the "Manchurian Candidate": The CIA and Mind Control (1979) and George Andrews's MKULTRA: The CIA's Top Secret Program in Human Experimentation and Behavior Modification (2001). [The Atlantic]
May 17, 1965: A front page New York Times story entitled "'Matador' With a Radio Stops Wired Bull: Modified Behavior in Animals Subject of Brain Study" features the work of Dr. Jose M.R. Delgado, inventor of the "stimoceiver," a miniature transponder implanted in subjects’ heads to control behavior and emotions. The article describes Delgado’s most famous experiment, wherein he steps into a pen with a "wired" bull and stops the raging animal, mid-lunge, via remote control. Delgado later suggests that this technology be used to curb criminal and obsessive behavior in humans and urges Congress to make "control of the mind" a national goal.
May 1, 1989: Former BBC producer and veteran foreign correspondent Gordon Thomas publishes Journey Into Madness: The True Story of Secret CIA Mind Control and Medical Abuse, connecting Jose Delgado’s views to those endorsed by Sidney Gottlieb, of CIA/MK-ULTRA fame. He explains: "Dr. Gottlieb and behaviorists of ORD [the CIA's Office of Research and Development] shared Jose Delgado's views that the day must come when the technique would be perfected for making not only animals but humans respond to electrically transmitted signals."
June 16, 1995: Time magazine features an ad for an implantable pet transponder, oddly enough, alongside an article about a militia man’s fears about the encroaching New World Order. By Aug. 2002, such devices are so commonplace that the Christian Science Monitor reports on how the military is "adopting a Big Brother approach" to "implanting microchips in cats and dogs that live on government land" in order to track down and penalize military families who abandon their pets.
Nov. 1996: "Click Here to upload your soul," advises one of many articles on British Telecom’s Martlesham Heath Laboratories’ "Soul Catcher" implant chip, which, as Personal Computer World explains, "will be implanted behind a person’s eye and will record all the thoughts and experiences of their lifetimes." Dr. Chris Winter tells London’s Daily Telegraph, "This is the end of death... By combining this information with a record of the person’s genes, we could recreate a person physically, emotionally and spiritually."
Sept. 23, 1998: Cybernetics Prof. Kevin Warwick becomes the first known human to communicate with machines via a microchip implanted in his body. Predicting that such implants will eventually replace time cards, tracking devices and credit cards, Warwick tells ABC News, "I feel mentally different." Later, he tells Salon.com, "After a few days I started to feel quite a closeness to the computer, which was very strange. When you are linking your brain up like that, you change who you are. You do become a 'borg.' You are not just a human linked with technology; you are something different and your values and judgment will change." He also admits, "It does make me feel that Orwell was probably right about the Big Brother issue."
Oct. 15, 1998: The BBC reports on "bionic brain implants" developed by American scientists. "Over several months, the implant becomes naturally 'wired' into the patient's brain as neurons grow into the cones and attach themselves to the electrodes mounted inside," the report asserts.
Dec. 7, 2000: CNN reports on Dr. Kevin Warwick’s next step: implanting a chip that interacts with his central nervous system. "This summer, a professor plans to take a step closer to becoming a cyborg -- part human, part computer -- by implanting a silicon chip that communicates with his brain," CNN says. With his wife also getting "chipped," Warwick later discusses the possibility that couples might one day read each other’s minds and experience each other’s pleasure (making faked orgasms obsolete). Their experience is recorded in the book I, Cyborg.
Sept. 5, 2001: ABC News’ Nicholas Regush warns that "mind control" could be on the horizon. "On the bright side, the powers of this science could be used to mend broken and diseased brains," he says. "On the dark side, there would be plenty of opportunity to tinker with consciousness and control human behavior in menacing fashion."
May 1, 2002: An ABC report entitled, "Scientists Develop Remote-Controlled Rats" describes a Defense Advanced Research Projects Agency (DARPA)-funded project, wherein rats, "each wired with three hair-fine electrical probes to their brains," are "directed through remote control by an operator typing commands on a computer." Lead scientist Dr. Sanjiv Talwar admits to the BBC that "the idea is sort of creepy" and tells the Guardian that remote controlled animals could be used for nefarious purposes, such as assassinations.
May 10, 2002: A family has microchips inserted into their bodies on national TV. An Applied Digital Solutions press release boasts: "VeriChip has been the subject of widespread media attention for the past few months, everything from Time Magazine to the Today Show, the Early Show, CNN's American Morning with Paula Zahn, CBS Weekend Evening News, and the O'Reilly Factor on Fox News. We're delighted that Good Morning America and CBS Evening News will cover the first-ever 'chipping' procedures on May 10th."
Aug. 15, 2002: During the height of the ‘Summer of the Abducted Child,’ the Philadelphia Inquirer runs a front page story on the new "high-tech approach to child security" -- i.e. the "chipping" of children. Pointing to Applied Digital Solutions’ "prototype for an implantable GPS unit that could pinpoint a child's location," the article asks: "Would a parent really place a device under the skin of his or her child to guard against a vague threat?" before offering ADS spokesman Matthew Cossolotto’s reply: "We have GPS units for our cars. If your car is stolen, we can locate it. Do we love our cars more than our children?"
March 7, 2003: An article in the Kansas City Star features University of Kansas research professor Jerome Dobson, "a respected leader in the field of geographic information technologies," who warns that GPS technology might lead to a form of "geoslavery" which could make "George Orwell's 'Big Brother' nightmare...look amateurish."
March 12, 2003: The BBC runs an article entitled "Scientists develop 'brain chip'," which states that "US scientists say a silicon chip could be used to replace the hippocampus, where the storage of memories is coordinated." The testing, beginning on rats and rapidly proceeding to monkeys, will ultimately be conducted on humans.
June 2003: "In a few months, researchers at the University of Southern California will test the world's first prosthetic brain part," Popular Science asserts, crediting biomedical engineer Theodore Berger with creating "a 2 mm-wide silicon chip that he hopes will one day substitute for damaged or diseased brain regions." Potential military uses for the brain chip, which is partially funded by DARPA, includes building "sophisticated electronics" and integrating them into human brains to possibly "one day lead to cyborg soldiers and robotic servants."
June 2003: In an article published on DARPA's Web site, Dr. Alan Rudolph explains how the agency's "Brain Machine Interfaces Program" will "create new technologies for augmenting human performance" by "access[ing] codes in the brain" and "integrat[ing] them into peripheral device or system operations." [BuzzFlash] Though the article is no longer available (and the term "brain interface program" is nowhere to be found), the link now directs browsers to an article on "Human Assisted Neural Devices," which also discusses accessing "codes in the brain."
Jan. 16, 2004: The headline, "Is It Possible to Download Knowledge into the Brain?: Mind-machine interfaces will be available in the near future, and several methods hold promise for implanting information" alerts readers of Better Humans to futuristic possiblities.
Welcome to the Matrix
April 14, 2004: The Associated Press blasts the headline "FDA Approves Brain Implant Devices." Citing benefits to those with physical impairments and brain disease, scientist Richard Andersen notes that "surgeons are already implanting devices into human brains -- sometimes deeply -- to treat deafness and Parkinson's disease" and says, "I think there is a consensus among many researchers that the time is right to begin trials in humans."
May 28, 2004: Reporting for the Chicago Tribune on today’s "transhumanists" (those who believe we’re in a "transitional phase between our human past and post-human future"), Margie Wylie asserts that "Humanity is on its way out."
June 25, 2004: Washington University reports that, "For the first time in humans, a team headed by University researchers has placed an electronic grid atop patients' brains to gather motor signals that enable the patients to play a computer game using only the signals from their brains."
July 12, 2004: Japanese school children will soon be tagged with tracking devices, albeit non-intrusively, an article in CNETAsia explains. "The rights and wrongs of RFID-chipping human beings have been debated since the tracking tags reached the technological mainstream. Now, school authorities in the Japanese city of Osaka have decided the benefits outweigh the disadvantages and will now be chipping children in one primary school," the article asserts.
While this is just a sampling of research and technological advances, one thing is clear: a brave new world awaits. "Humanity’s ability to alter its own brain function might well shape history as powerfully as the development of metallurgy in the Iron Age," cognitive neuroscientist Martha Farah and others wrote in the May issue of Nature Reviews Neuroscience. Yet, given technology’s potential to go haywire, warnings issued in films like 2001: A Space Odyssey, The Terminator and The Matrix have more at their heart than mere celluloid.
And movies like I, Robot and The Manchurian Candidate aside, the original Star Wars remains the ultimate in archetypal entertainment. When Bill Moyers interviewed Star Wars collaborator and legendary scholar Joseph Campbell for PBS' "Power of Myth" series, Campbell explained its underlying timeless theme. "Man should not be in the service of society, society should be in the service of man," he said. "When man is in the service of society, you have a monster state, and that's what is threatening the world at this minute. . . [Star Wars] shows the state as a machine and asks, 'Is the machine going to crush humanity or serve humanity?'"
But it’s not just filmmakers asking such questions. Scientists like Bill Joy have weighed in, too. As he explained in Wired: "Failing to understand the consequences of our inventions while we are in the rapture of discovery and innovation, seems to be a common fault of scientists and technologists; we have long been driven by the overarching desire to know that is the nature of science's quest, not stopping to notice that the progress to newer and more powerful technologies can take on a life of its own."
And so, from 2001's HAL to Darth Vader to "chipped" assassins, we’ve had a series of fictional reminders that the battle between what is cold and hard and inhuman perpetually rages against what is human and loving and kind. And, of course, it’s naive to think that the fight begins and ends with what’s up there on the screen.
As we enter this new era, wherein technology can either free or enslave us, it’s best to remain mindful of the monster that has, throughout the ages, paced hungrily through History’s darker halls.
Though it remains faceless, and for many, nameless, it exists, just as surely as love and hope and compassion exist. And it is out there, crouching, ready, once again, to devour what is uniquely and gloriously human.
Maureen Farrell is a writer and media consultant who specializes in helping other writers get television and radio exposure.
© Copyright 2004, Maureen Farrell