Archive for the ‘Psychology’ Category

In her book Release 2.0: A Design for Living in the Digital Age, Esther Dyson tries to assure those who worry too much about the new electronic world that human nature will stay the same. Of course. If we mean by “human nature” our genetic structure or biological needs or fundamental emotions, no one has argued that technology will alter human nature (at least not by much). But human nature is not the issue. What is at issue are the changes that might occur in our psychic habits, our social relations, and, most certainly, our political institutions, especially electoral politics. Nothing is more obvious than that a new technology changes the structure of discourse. It does so by encouraging certain uses of the intellect, by favoring certain definitions of intelligence, and by demanding a certain kind of content. Ronald Reagan, for example, could not have been president were it not for the bias of television. This is a man who rarely spoke precisely and never eloquently (except perhaps when reading a speech written by someone else). And yet he was called The Great Communicator. Why? Because he was magic on television. His televised image projected a sense of authenticity, intimacy, and caring. It did not much matter if citizens agreed with what he said or understood what he said. This does not in itself suggest that he shouldn’t have been president or that he did his job poorly. It is to say that television gives power to some while it deprives others. It is not human nature we worry about here but rather what part of our humanness will be nurtured by technology. I have often wondered how Abraham Lincoln would have fared on television. Because of the invention of photography in the 1840s, he was the first president to be the subject of continuous comment about his looks (ugly and ungainly, many said). Would it be too much to say that Americans must be eternally grateful for the absence of television when Lincoln made his run for the presidency? Or perhaps we might say that had television existed, no such person as Lincoln could have become a serious candidate for president.

— Neil Postman, Building a Bridge to the 18th Century: How the Past Can Improve Our Future

In the secular view, suffering is never seen as a meaningful part of life but only as an interruption. With that understanding, there are only two things to do when pain and suffering occur. The first is to manage and lessen the pain. And so over the past two generations, most professional services and resources offered to sufferers have moved from talking about affliction to discussing stress. They no longer give people ways to endure adversity with patience but instead use a vocabulary drawn from business, psychology, and medicine to enable them to manage, reduce, and cope with stress, strain, or trauma. Sufferers are counseled to avoid negative thoughts and to buffer themselves with time off, exercise, and supportive relationships. All the focus is on controlling your responses.

The second way to handle suffering in this framework is to look for the cause of the pain and eliminate it. Other cultures see suffering as an inevitable part of the fabric of life because of unseen forces, such as the illusory nature of life or the conflict between good and evil. But our modern culture does not believe in unseen spiritual forces. Suffering always has a material cause and therefore it can in theory be “fixed.” Suffering is often caused by unjust economic and social conditions, bad public policies, broken family patterns, or simply villainous evil parties. The proper response to this is outrage, confrontation of the offending parties, and action to change the conditions. (This is not uncalled for, by the way. The Bible has a good deal to say about rendering justice to the oppressed.)

Older cultures sought ways to be edified by their sufferings by looking inside, but Western people are often simply outraged by their suffering—and they seek to change things outside so that the suffering never happens again. No one has put the difference between traditional and modern culture more succinctly than C. S. Lewis, who wrote: “For the wise men of old the cardinal problem had been how to conform the soul to reality, and the solution had been knowledge, self-discipline, and virtue. For . . . [modernity] the problem is how to subdue reality to the wishes of men: the solution is a technique. . . .” Philosopher Charles Taylor, in his magisterial book A Secular Age, recounts how Western society made what he calls “the anthropocentric turn,” the rise of the secular view. After this turn, Taylor says the “sense of God’s ordering presence begins to fade. The sense begins to arise that we can sustain the order [of the world] on our own.” As a result, Western society’s “highest goal is to . . . prevent suffering.”

In Western culture, then, sufferers are not told that their primary work is any internal adjustment, learning, or growth. As Shweder points out, not only is moral responsibility virtually never assigned to sufferers but to even hint at it is considered “blaming the victim,” one of the main heresies within our Western society. The responses to suffering, then, are always provided by experts, whether pain management, psychological or medical treatment, or changes in law or public policy.

— Timothy Keller, Walking with God through Pain and Suffering 

Shweder says that under the metaphor of accident or chance, “suffering is to be treated by the intervention of . . . agents who possess expert skills of some kind, relevant to treating the problem.” Traditional cultures believe that the main responsibility in dark times belongs to the sufferers themselves. The things that need to be done are forms of internal “soul work”—learning patience, wisdom, and faithfulness. Contemporary culture, however, does not see suffering as an opportunity or test—and certainly never as a punishment. Because sufferers are victims of the impersonal universe, sufferers are referred to experts—whether medical, psychological, social, or civil—whose job is the alleviation of the pain by the removal of as many stressors as possible.

But this move—making suffering the domain of experts—has led to great confusion in our society, because different guilds of experts differ markedly on what they think sufferers should do. As both a trained psychotherapist and an anthropologist, James Davies is in a good position to see this. He writes, “During the twentieth century most people living in contemporary society have become increasingly confused about why they suffer emotionally.” He then lists “biomedical psychiatry, academic psychiatry, genetics, modern economics” and says, “As each tradition was based on its own distinctive assumptions and pursued its own goals via its own methods, each largely favored reducing human suffering to one predominant cause (e.g., biology, faulty cognition, unsatisfied self-interest).” As the saying goes, if you are an expert in hammers, every problem looks like a nail. This has led to understandable perplexity. The secular model puts sufferers in the hands of experts, but the specialization and reductionism of the different kinds of experts leaves people bewildered.

Davies’s findings support Shweder’s analysis. He explains how the secular model encourages psychotherapists to “decontextualize” suffering, not seeing it, as older cultures have, as an integral part of a person’s life story. Davies refers to a BBC interview with Dr. Robert Spitzer in 2007. Spitzer is a psychiatrist who headed the task force that in 1980 wrote the DSM-III (third edition of the Diagnostic and Statistical Manual of Mental Disorders) of the American Psychiatric Association. The DSM-III sought to develop more uniformity of psychiatric diagnoses. When interviewed twenty-five years later by the BBC, Spitzer admitted that, in hindsight, he believed they had wrongly labeled many normal human experiences of grief, sorrow, and anxiety as mental disorders. When the interviewer asked: “So you have effectively medicalized much ordinary human sadness?” Spitzer responded, “I think we have to some extent. . . . How serious a problem it is, is not known . . . twenty percent, thirty percent . . . but that is a considerable amount.”

— Timothy Keller, Walking with God through Pain and Suffering

A single retinal image is certainly not adequate to the task of specifying the world, but the visual stimulus received over time by an observer in motion is adequate, Gibson argues, and so on his account the whole motivation for conceiving perception as involving inference and computation collapses. This is completely revolutionary. The brain does not have to construct a representation of the world. The world is known to us because we live and act in it, and accumulate experience.

Surprisingly, it is in the field of robotics that some of the most convincing evidence has emerged that inference, calculation, and representation are a grossly inefficient way to go about negotiating a physical environment. In his now-classic article “Intelligence Without Representation,” published in the journal Artificial Intelligence in 1991, Rodney Brooks wrote that “the world is its own best model.” Roboticists are learning a lesson that evolution learned long ago, namely, that the task of solving problems needn’t be accomplished solely by the brain, but can be distributed among the brain, the body, and the world.

Consider the problem of catching a fly ball. According to the standard view, we might suppose that the visual system provides inputs about the current position of the ball, and a separate processor (the brain) predicts its future trajectory. How we might do this is a bit mysterious, given that most of us wouldn’t be able to calculate such a trajectory consciously, with pencil and paper. The Gibsonian approach suggests we don’t need to do any such thing, whether consciously or subconsciously. And in fact what we do, it turns out, is run in such a way that the image of the ball appears to move in a straight line, at constant speed, against the visual background. It so happens that finding and exploiting this invariant, which is available in the optic flow if you run just right, puts you in the right spot to catch the ball. (The same strategy appears to be used by dogs who catch Frisbees, even on windy days.) You don’t need an inner model of the pseudo-parabolic trajectories that baseballs follow, with corrections for air resistance at different altitudes and so forth. It’s a good thing, too.

We think through the body. The fundamental contribution of this school of psychological research is that it puts the mind back in the world, where it belongs, after several centuries of being locked within our heads. The boundary of our cognitive processes cannot be cleanly drawn at the outer surface of our skulls, or indeed of our bodies more generally. They are, in a sense, distributed in the world that we act in.

— Matthew Crawford, The World Beyond Your Head: On Becoming an Individual in an Age of Distraction
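
An aside from me, not from Crawford’s book: the “invariant . . . available in the optic flow” can be made concrete in a few lines of Python. The toy simulation below (two-dimensional case, drag-free ball headed along the fielder’s line; all numbers are my own made-up assumptions) checks when the tangent of the ball’s elevation angle, as seen by a fielder standing still, rises at a constant rate. It turns out to be constant only when viewed from the landing spot, which is why “running so the image moves at constant speed” delivers you to the ball with no trajectory computation at all.

import math

G = 9.81                   # gravity, m/s^2
V0, LAUNCH = 30.0, 50.0    # assumed launch speed (m/s) and angle (degrees)

vx = V0 * math.cos(math.radians(LAUNCH))
vy = V0 * math.sin(math.radians(LAUNCH))
T = 2 * vy / G             # time of flight
R = vx * T                 # landing distance from the plate

def rise_rates(observer_x, steps=50):
    """Rate of change of tan(elevation) of the ball, for a fixed observer."""
    rates, prev = [], None
    for i in range(1, steps):          # stop short of t = T so the gap stays positive
        t = T * i / steps
        x, y = vx * t, vy * t - 0.5 * G * t * t
        tan_a = y / (observer_x - x)   # observer stands beyond the ball's current position
        if prev is not None:
            rates.append((tan_a - prev) / (T / steps))
        prev = tan_a
    return rates

for label, pos in [("at landing spot", R), ("10 m too deep", R + 10), ("25 m too deep", R + 25)]:
    r = rise_rates(pos)
    print(f"{label:>15}: d(tan)/dt varies by {max(r) - min(r):.4f} over the flight")

From the landing spot the variation prints as 0.0000; from anywhere else the ball’s image visibly accelerates or decelerates. Nulling that optical acceleration by running is the whole “computation,” which is presumably how dogs manage it too.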

Consider the person talking on his cell phone while cruising through a crowded suburban commercial district, with a motorcyclist in the lane next to him. Driving while talking on a cell phone impairs performance as much as driving while legally drunk. It doesn’t matter whether the phone is hands-free or not; the issue is that having a conversation uses attentional resources, of which we have a finite amount. It especially impairs our ability to notice and register novel things in the environment; psychologists call this inattentional blindness. Pedestrians who walk while talking on a cell phone weave more, change direction more, cross the street in a riskier way, are less likely to acknowledge others (that is, be sociable), and, in the findings of a recent experiment, are less likely to notice the clown on a unicycle who just rode past. Put a person with this level of impairment behind the wheel of a two-ton, two-hundred-horsepower car and his blindness becomes an apt topic in discussions of what we owe one another. In the attentional commons, circumspection—literally looking around—would be one element of justice.

One of the more interesting findings to come out of the research on distracted driving is that, while having a cell phone conversation impairs driving ability, having a conversation with someone present in the car does not. A person who is present can cooperate by modulating the conversation in response to the demands of the driving situation. For example, if the weather is bad, he tends to be quiet. A passenger acts as another pair of eyes on the situation he inhabits with the driver, and tends to improve a driver’s ability to notice and quickly respond to out-of-the-ordinary challenges.

— Matthew Crawford, The World Beyond Your Head: On Becoming an Individual in an Age of Distraction

I have been inputting, as they say, one bit of hard data after another into my brain all my life, some of it thruputting and outputting from the other ear, but a great deal held and stored somewhere, or so I am assured, but I possess no reliable device, anywhere in my circuitry, for retrieving it when needed. If I wish for the simplest of things, someone’s name for example, I cannot send in a straightforward demand with any sure hope of getting back the right name. I am often required to think about something else, something quite unrelated, and wait, hanging around in the mind’s lobby, picking up books and laying them down, pacing around, and then, if it is a lucky day, out pops the name. No computer could be designed by any engineer to function, or malfunction, in this way.

I have learned, one time or another, all sorts of things that I remember learning, but now they are lost to me. I cannot place the Thirty Years War or the Hundred Years War in the right centuries, nor have I at hand the barest facts about the issues involved. I once knew Keats, lots of Keats, by heart; he is still there, I suppose, probably scattered across the lobes of my left hemisphere, or maybe translated into the wordless language of my right hemisphere and preserved there forever as a set of hunches, but irretrievable as language. I have lost most of the philosophers I studied long ago and liked; the only sure memory I retain of Heidegger, even when I reread him today, is bewilderment. I have forgotten how to do cube roots, and will never learn again. Slide rules. Solid geometry. Thomas Hardy. Chinese etymology, which I learned in great volumes just a few years ago. The Bible, most of all the Sunday-school Bible, long since gone, obliterated.

— Lewis Thomas, The Youngest Science

A notable feature of the gangsterish regimes that rule in many formerly Communist countries is the apparent absence, or impotence, of any notion of a common good. Wherever communism was established by coercion, when it later collapsed and private interests were allowed to assert themselves it became clear that there was no well-established intellectual foundation for defending such shared resources as clean air and water. Many citizens of these countries now live in the environmental degradation that results when privatization has no countervailing force of public-spiritedness. We in the liberal societies of the West find ourselves headed toward a similar condition with regard to the resource of attention, because we do not yet understand it to be a resource.

Or do we? Silence is now offered as a luxury good. In the business-class lounge at Charles de Gaulle airport, what you hear is the occasional tinkling of a spoon against china. There are no advertisements on the walls, and no TVs. This silence, more than any other feature of the space, is what makes it feel genuinely luxurious. When you step inside and the automatic airtight doors whoosh shut behind you, the difference is nearly tactile, like slipping out of haircloth into satin. Your brow unfurrows itself, your neck muscles relax; after twenty minutes you no longer feel exhausted. The hassle lifts.

Outside the lounge is the usual airport cacophony. Because we have allowed our attention to be monetized, if you want yours back you’re going to have to pay for it.

As the commons gets appropriated, one solution, for those who have the means, is to leave the commons for private clubs such as the business-class lounge. Consider that it is those in the business lounge who make the decisions that determine the character of the peon lounge, and we may start to see these things in a political light. To engage in playful, inventive thinking, and possibly create wealth for oneself during those idle hours spent at an airport, requires silence. But other people’s minds, over in the peon lounge (or at the bus stop), can be treated as a resource—a standing reserve of purchasing power to be steered according to innovative marketing ideas hatched by the “creatives” in the business lounge.

— Matthew Crawford, The World Beyond Your Head: On Becoming an Individual in an Age of Distraction
