Category: Food For Thought
“If the human brain were so simple that we could understand it, we would be so simple that we couldn’t.” – Emerson Pugh
“How many of our most joyful memories have been created in front of a screen?”
– Jon Freeman
“Based on first-hand evidence of your own senses – the improved health and later ages at which acquaintances die nowadays as compared with the past; the material goods that we now possess; the speed at which information, entertainment, and we ourselves move freely throughout the world – it seems to me that a person must be literally deaf and blind not to perceive that humanity is in a much better state than ever before.”
– Julian Simon, author of The Ultimate Resource
“Today’s world is one in which the age-old risks of humankind – the drought, floods, communicable diseases – are less of a problem than ever before. They have been replaced by the risks of humanity’s own making – the unintended side-effects of beneficial technologies and the intended effects of the technologies of war. Society must hope that the world’s ability to assess and manage risks will keep pace with its ability to create them.”
– J. Clarence Davies, quoted in Conservation Foundation: State of the Environment, An Assessment at Mid-Decade, 1984
“Can a machine be a genuine cause of harm? The obvious answer is affirmative. The toaster that flames up and burns down a house is said to be the cause of the fire, and in some weak sense, we might even say that the toaster was responsible for it; but the toaster is broken or defective, not immoral and irresponsible, though possibly the engineer who designed it is. But what about machines that decide things before they act, that determine their own course of action? Somewhere between digital thermostats and the murderous HAL of 2001: A Space Odyssey, autonomous machines are quickly gaining in complexity, and most certainly a day is coming when we will want to blame them for genuinely causing harm, even if philosophical issues concerning their moral status have not been fully settled. When will that be?”
– Anthony F. Beavers, Ph.D., in a review of Moral Machines: Teaching Robots Right from Wrong by Wendell Wallach and Colin Allen.
“Christian Licoppe and Jean-Philippe Heurtin have argued that cell phone use must be understood in a broader context; they note that the central feature of the modern experience is the “deinstitutionalization of personal bonds.” Deinstitutionalization spawns anxiety, and as a result we find ourselves working harder to build trust relationships. Cell phone calls “create a web of short, content-poor interactions through which bonds can be built and strengthened in an ongoing process.”
But as trust is being built and bolstered moment by moment between individuals, public trust among strangers in social settings is eroding. We are strengthening and increasing our interactions with the people we already know at the expense of those whom we do not. The result, according to Kenneth Gergen, is “the erosion of face-to-face community, a coherent and centered sense of self, moral bearings, depth of relationship, and the uprooting of meaning from material context: such are the dangers of absent presence.”
– Christine Rosen, Our Cell Phones, Ourselves.
“It is now abundantly clear that we have at our fingertips all of the tools we need to solve the climate crisis. The only missing ingredient is collective will.”
– Former Vice President Al Gore.
“With the genome no less than with the Internet, information wants to be free, and I doubt that paternalistic measures can stifle the industry for long (but then, I have a libertarian temperament). For better or for worse, people will want to know about their genomes. The human mind is prone to essentialism — the intuition that living things house some hidden substance that gives them their form and determines their powers. Over the past century, this essence has become increasingly concrete. Growing out of the early, vague idea that traits are “in the blood,” the essence became identified with the abstractions discovered by Gregor Mendel called genes, and then with the iconic double helix of DNA. But DNA has long been an invisible molecule accessible only to a white-coated priesthood. Today, for the price of a flat-screen TV, people can read their essence as a printout detailing their very own A’s, C’s, T’s and G’s.
A firsthand familiarity with the code of life is bound to confront us with the emotional, moral and political baggage associated with the idea of our essential nature. People have long been familiar with tests for heritable diseases, and the use of genetics to trace ancestry — the new “Roots” — is becoming familiar as well. But we are only beginning to recognize that our genome also contains information about our temperaments and abilities. Affordable genotyping may offer new kinds of answers to the question “Who am I?” — to ruminations about our ancestry, our vulnerabilities, our character and our choices in life.”
– Steven Pinker, My Genome, My Self.
“Over the next 15 years, some of the most notable advances in computing will be in its relationship to people: distributing human mindpower to solve problems both large and small, and monitoring and ultimately altering people’s bodies and actions in ways previously impossible. These are not phenomena to be avoided so much as they are to be organized and perhaps regulated so that their ubiquity will enhance rather than debase the human condition.
Shaping them will require an informed and widespread debate with tools drawn from many disciplines. Philosophers will attempt to construct a utilitarian calculus. Computing security professionals will ask what information we want to protect, and then seek to construct a security system. Without being able to foresee every problem ahead, it makes sense to reflect on the values we consider most important, such as autonomy, privacy, and health, and to use case studies to place them appropriately in tension, so we can build and refine systems sensitive to them.”
– Jonathan Zittrain, Ubiquitous Human Computing.
“I was born for a very specific purpose. I wasn’t the result of a cheap bottle of wine or a full moon or the heat of the moment. I was born because a scientist managed to hook up my mother’s eggs and my father’s sperm to create a specific combination of precious genetic material. In fact, when (my brother) Jesse told me how babies get made and I, the great disbeliever, decided to ask my parents the truth, I got more than I bargained for. They sat me down and told me all the usual stuff, of course – but they also explained that they chose little embryonic me, specifically, because I could save my sister, Kate. ‘We loved you even more,’ my mother made sure to say, ‘because we knew exactly what we were getting.’”
– A passage from My Sister’s Keeper by Jodi Picoult, written from the perspective of Anna, a child conceived using preimplantation genetic diagnosis (PGD) to save the life of her cancer-stricken sister.
“I begin, a little sheepishly, with a question that strikes me as sensationalistic, nonscientific, and probably unanswerable by someone who’s been professionally trained in the discipline of cautious objectivity: Are we living through a crisis of attention?
Before I even have a chance to apologize, Meyer responds with the air of an Old Testament prophet. “Yes,” he says. “And I think it’s going to get a lot worse than people expect.” He sees our distraction as a full-blown epidemic—a cognitive plague that has the potential to wipe out an entire generation of focused and productive thought. He compares it, in fact, to smoking. “People aren’t aware what’s happening to their mental processes,” he says, “in the same way that people years ago couldn’t look into their lungs and see the residual deposits.”
I ask him if, as the world’s foremost expert on multitasking and distraction, he has found his own life negatively affected by the new world order of multitasking and distraction.
“Yep,” he says immediately, then adds, with admirable (although slightly hurtful) bluntness: “I get calls all the time from people like you. Because of the way the Internet works, once you become visible, you’re approached from left and right by people wanting to have interactions in ways that are extremely time-consuming. I could spend my whole day, my whole night, just answering e-mails. I just can’t deal with it all. None of this happened even ten years ago. It was a lot calmer. There was a lot of opportunity for getting steady work done.”
– Sam Anderson, interviewing “multitasking expert” David Meyer in the article In Defense of Distraction.
“The next generation, presumably, is the hardest-hit. They’re the ones way out there on the cutting edge of the multitasking revolution, texting and instant messaging each other while they download music to their iPod and update their Facebook page and complete a homework assignment and keep an eye on the episode of The Hills flickering on a nearby television. (A recent study from the Kaiser Family Foundation found that 53 percent of students in grades seven through 12 report consuming some other form of media while watching television; 58 percent multitask while reading; 62 percent while using the computer; and 63 percent while listening to music. “I get bored if it’s not all going at once,” said a 17-year-old quoted in the study.) They’re the ones whose still-maturing brains are being shaped to process information rather than understand or even remember it.”
– Walter Kirn, The Autumn of the Multitaskers.
“Now in market terms, this potential transaction makes perfect sense—matching a willing seller and a willing buyer. Both parties get what they need—tuition money, the seeds of a new child—and no one is coerced into anything. But what is the human meaning of what is happening?”
– Eric Cohen, Biotechnology and the Spirit of Capitalism.
“Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.”
– Nicholas Carr, Is Google Making Us Stupid?
“Will unguided information lead to an illusion of knowledge, and thus curtail the more difficult, time-consuming, critical thought processes that lead to knowledge itself? Will the split-second immediacy of information gained from a search engine and the sheer volume of what is available derail the slower, more deliberative processes that deepen our understanding of complex concepts, of another’s inner thought processes, and of our own consciousness?”
– Maryanne Wolf, Proust and the Squid: The Story and Science of the Reading Brain