Archive for the ‘Psychology’ Category

[Note: BUMMER = Behaviors of Users Modified, and Made Into an Empire for Rent; BUMMER Platforms = Facebook, Google, etc.]

The ability to theorize about what someone else experiences as part of understanding that person is called having a theory of mind. To have a theory of mind is to build a story in your head about what’s going on in someone else’s head. Theory of mind is at the core of any sense of respect or empathy, and it’s a prerequisite to any hope of intelligent cooperation, civility, or helpful politics. It’s why stories exist.

You’ve heard expressions like “Don’t judge someone until you’ve walked a mile in their shoes.” You can’t understand people without knowing a little of what they’ve gone through.

Most animals get by without theory of mind, but people need it.

When you can only see how someone else behaves, but not the experiences that influenced their behavior, it becomes harder to have a theory of mind about that person. If you see someone hit someone else, for instance, but you did not see that they did it in defense of a child, you might misinterpret what you see.

In the same way, if you don’t see the dark ads, the ambient whispers, the cold-hearted memes, and the ridicule-filled customized feed that someone else sees, that person will just seem crazy to you. And that is our new BUMMER world. We seem crazy to each other, because BUMMER is robbing us of our theories of one another’s minds.

— Jaron Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now


When people are solitary wolves, then each individual has access to slightly different information about the world, and slightly different ways of thinking about that information. I’ve been talking about the relationship between the Solitary setting and personal character, but there are other reasons to keep the switch in the Solitary position. Consider a demonstration that is often enacted on the first day of business school. A professor shows a class a big jar of jelly beans and asks each person to estimate the number of beans. Averaging all the estimates usually results in a pretty accurate count. Each person brings different perspectives, cognitive styles, skills, and strategies to the mystery, and the average gets at the agreements between them. (This only works for single-number answers. If you ask a committee to design a product or write a novel, the result comes out like something made by a committee.)

Now suppose that the students could look at the jar only through photos in a social media feed. Different camps of people with different ideas about the number of beans would form and would ridicule each other. Russian intelligence services would add pictures of similar jars with different numbers of beans. Bean promoters would motivate trolls to argue that there aren’t enough beans and you must buy more. And so on. There would no longer be a way to guess the number of beans because the power of diversity will have been compromised. When that happens, markets can no longer offer utility to the world.

You can replace the jar with a political candidate, a product, or anything else.

— Jaron Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now

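[Note: the jelly-bean demonstration above is the "wisdom of crowds" effect. Below is a minimal Python sketch of it, not anything from Lanier's book; the jar size, class size, and noise model are all invented for illustration.]

import random

random.seed(7)

TRUE_COUNT = 1150   # hypothetical number of beans in the jar
N_GUESSERS = 250    # hypothetical class size

# Each guesser is independently noisy: a personal bias drawn around 1.0
# plus some additive error. Independence is the key assumption.
guesses = [
    TRUE_COUNT * random.uniform(0.6, 1.4) + random.gauss(0, 120)
    for _ in range(N_GUESSERS)
]

average = sum(guesses) / len(guesses)
print(f"true count: {TRUE_COUNT}, crowd average: {average:.0f}")
# Individual guesses range widely, but the independent errors largely
# cancel, so the average usually lands close to the true count.

If the guessers copy one another instead of estimating independently (the social-feed scenario in the excerpt), their errors correlate and the averaging trick stops working.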

[Note: BUMMER = Behaviors of Users Modified, and Made Into an Empire for Rent; BUMMER Platforms = Facebook, Google, etc.]

Customized feeds become optimized to “engage” each user, often with emotionally potent cues, leading to addiction. People don’t realize how they are being manipulated. The default purpose of manipulation is to get people more and more glued in, and to get them to spend more and more time in the system. But other purposes for manipulation are also tested. For instance, if you’re reading on a device, your reading behaviors will be correlated with those of multitudes of other people. If someone who has a reading pattern similar to yours bought something after it was pitched in a particular way, then the odds become higher that you will get the same pitch. You might be targeted before an election with weird posts that have proven to bring out the inner cynic in people who are similar to you, in order to reduce the chances that you’ll vote.

BUMMER platforms have proudly reported on how they’ve experimented with making people sad, changing voter turnout, and reinforcing brand loyalty. Indeed, these are some of the best-known examples of research that were revealed in the formative days of BUMMER.

The digital network approach to behavior modification flattens all these examples, all these different slices of life, into one slice. From the point of view of the algorithm, emotions, happiness, and brand loyalty are just different, but similar, signals to optimize.

— Jaron Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now

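[Note: the pitch-targeting Lanier describes above works roughly like nearest-neighbor collaborative filtering: find a user whose behavior resembles yours and whose response to a pitch is already known, then show you the same pitch. This is a minimal sketch under that assumption, not actual platform code; the users, behavior vectors, and cosine-similarity choice are all illustrative.]

import math

def cosine(a, b):
    # Cosine similarity between two behavior vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical per-topic reading minutes for each user.
profiles = {
    "you":    [30, 0, 12, 5],
    "user_b": [28, 1, 10, 6],   # reads much like "you"
    "user_c": [0, 40, 2, 25],   # reads very differently
}

# Pitch styles already known to have worked on other users.
winning_pitch = {"user_b": "scarcity", "user_c": "social proof"}

# Target "you" with whatever worked on the most similar known user.
most_similar = max(winning_pitch,
                   key=lambda u: cosine(profiles["you"], profiles[u]))
print(most_similar, "->", winning_pitch[most_similar])  # user_b -> scarcity

The same machinery optimizes for any measurable response, which is the "flattening" the excerpt describes: a purchase, a mood shift, or a skipped vote are all just signals.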

Forty percent of the people in this country still don’t use a seat belt, which I find simply amazing because it costs nothing to buckle up and clearly has the potential to save you from exiting through the windshield like Superman. (Vermont, which is one of the few states to keep careful track of these things, reported that in the first ten months of 1998, eighty-one people were killed on the state’s roads, and 76 percent of those people were not wearing seat belts.) Even more remarkably, since a spate of recent newspaper reports about young children being killed by airbags in minor crashes, people have been rushing to get their airbags disconnected. Never mind that in every instance the children were killed because they were sitting in the front seat, where they should not have been in the first place, and in nearly all cases weren’t wearing seat belts. Airbags save thousands of lives, yet many people are having them disabled on the bizarre assumption that they present a danger.

Much the same sort of statistical illogic applies to guns. Forty percent of Americans keep guns in their homes, typically in a drawer beside the bed. The odds that one of those guns will ever be used to shoot a criminal are comfortably under one in a million. The odds that it will be used to shoot a member of the household, generally a child fooling around, are at least twenty times that figure. Yet over 100 million people resolutely ignore this fact, even sometimes threaten to pop you one themselves if you make too much noise about it.

Nothing, however, better captures the manifest irrationality of people toward risks than one of the liveliest issues of recent years: passive smoking. Four years ago, the Environmental Protection Agency released a report concluding that people who are over thirty-five and don’t smoke but are regularly exposed to the smoke of others stand a 1 in 30,000 risk of contracting lung cancer in a given year. The response was immediate and electrifying. All over the country smoking was banned at work and in restaurants, shopping malls, and other public places.

What was overlooked in all this was how microscopically small the risk from passive smoking actually is. A rate of 1 in 30,000 sounds reasonably severe, but it doesn’t actually amount to much. Eating one pork chop a week is statistically more likely to give you cancer than sitting routinely in a roomful of smokers. So, too, is consuming a carrot every seven days, a glass of orange juice twice a month, or a head of lettuce every two years. You are five times more likely to contract lung cancer from your pet parakeet than you are from secondary smoke.

Now I am all for banning smoking on the grounds that it is dirty and offensive, unhealthy for the user, and leaves unsightly burns in the carpet. All I am saying is that it seems a trifle odd to ban it on grounds of public safety when you are happy to let any old fool own a gun or drive around unbuckled.

— Bill Bryson, I’m a Stranger Here Myself

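[Note: quick arithmetic on Bryson’s 1-in-30,000 figure, to make his "doesn’t actually amount to much" point concrete; the 40-year exposure horizon below is my own illustrative assumption, not his.]

annual_risk = 1 / 30_000
print(f"per year: {annual_risk:.5%}")         # ~0.00333% per year

# Even compounded over 40 years of exposure, the cumulative
# risk stays near a tenth of one percent.
over_40_years = 1 - (1 - annual_risk) ** 40
print(f"over 40 years: {over_40_years:.3%}")  # ~0.133%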

In her book Release 2.0: A Design for Living in the Digital Age, Esther Dyson tries to assure those who worry too much about the new electronic world that human nature will stay the same. Of course. If we mean by “human nature” our genetic structure or biological needs or fundamental emotions, no one has argued that technology will alter human nature (at least not by much). But human nature is not the issue. What is at issue are the changes that might occur in our psychic habits, our social relations, and, most certainly, our political institutions, especially electoral politics. Nothing is more obvious than that a new technology changes the structure of discourse. It does so by encouraging certain uses of the intellect, by favoring certain definitions of intelligence, and by demanding a certain kind of content. Ronald Reagan, for example, could not have been president were it not for the bias of television. This is a man who rarely spoke precisely and never eloquently (except perhaps when reading a speech written by someone else). And yet he was called The Great Communicator. Why? Because he was magic on television. His televised image projected a sense of authenticity, intimacy, and caring. It did not much matter if citizens agreed with what he said or understood what he said. This does not in itself suggest that he shouldn’t have been president or that he did his job poorly. It is to say that television gives power to some while it deprives others. It is not human nature we worry about here but rather what part of our humanness will be nurtured by technology. I have often wondered how Abraham Lincoln would have fared on television. Because of the invention of photography in the 1840s, he was the first president to be the subject of continuous comment about his looks (ugly and ungainly, many said). Would it be too much to say that Americans must be eternally grateful for the absence of television when Lincoln made his run for the presidency? Or perhaps we might say that had television existed, no such person as Lincoln could have become a serious candidate for president.

— Neil Postman, Building a Bridge to the 18th Century: How the Past Can Improve Our Future


In the secular view, suffering is never seen as a meaningful part of life but only as an interruption. With that understanding, there are only two things to do when pain and suffering occur. The first is to manage and lessen the pain. And so over the past two generations, most professional services and resources offered to sufferers have moved from talking about affliction to discussing stress. They no longer give people ways to endure adversity with patience but instead use a vocabulary drawn from business, psychology, and medicine to enable them to manage, reduce, and cope with stress, strain, or trauma. Sufferers are counseled to avoid negative thoughts and to buffer themselves with time off, exercise, and supportive relationships. All the focus is on controlling your responses.

The second way to handle suffering in this framework is to look for the cause of the pain and eliminate it. Other cultures see suffering as an inevitable part of the fabric of life because of unseen forces, such as the illusory nature of life or the conflict between good and evil. But our modern culture does not believe in unseen spiritual forces. Suffering always has a material cause and therefore it can in theory be “fixed.” Suffering is often caused by unjust economic and social conditions, bad public policies, broken family patterns, or simply villainous evil parties. The proper response to this is outrage, confrontation of the offending parties, and action to change the conditions. (This is not uncalled for, by the way. The Bible has a good deal to say about rendering justice to the oppressed.)

Older cultures sought ways to be edified by their sufferings by looking inside, but Western people are often simply outraged by their suffering—and they seek to change things outside so that the suffering never happens again. No one has put the difference between traditional and modern culture more succinctly than C. S. Lewis, who wrote: “For the wise men of old the cardinal problem had been how to conform the soul to reality, and the solution had been knowledge, self-discipline, and virtue. For . . . [modernity] the problem is how to subdue reality to the wishes of men: the solution is a technique. . . .” Philosopher Charles Taylor, in his magisterial book A Secular Age, recounts how Western society made what he calls “the anthropocentric turn,” the rise in the secular view. After this turn, Taylor says the “sense of God’s ordering presence begins to fade. The sense begins to arise that we can sustain the order [of the world] on our own.” As a result, Western society’s “highest goal is to . . . prevent suffering.”

In Western culture, then, sufferers are not told that their primary work is any internal adjustment, learning, or growth. As Shweder points out, not only is moral responsibility virtually never assigned to sufferers but to even hint at it is considered “blaming the victim,” one of the main heresies within our Western society. The responses to suffering, then, are always provided by experts, whether pain management, psychological or medical treatment, or changes in law or public policy.

— Timothy Keller, Walking with God through Pain and Suffering 


Shweder says that under the metaphor of accident or chance, “suffering is to be treated by the intervention of . . . agents who possess expert skills of some kind, relevant to treating the problem.” Traditional cultures believe that the main responsibility in dark times belongs to the sufferers themselves. The things that need to be done are forms of internal “soul work”—learning patience, wisdom, and faithfulness. Contemporary culture, however, does not see suffering as an opportunity or test—and certainly never as a punishment. Because sufferers are victims of the impersonal universe, sufferers are referred to experts—whether medical, psychological, social, or civil—whose job is the alleviation of the pain by the removal of as many stressors as possible.

But this move—making suffering the domain of experts—has led to great confusion in our society, because different guilds of experts differ markedly on what they think sufferers should do. As both a trained psychotherapist and an anthropologist, James Davies is in a good position to see this. He writes, “During the twentieth century most people living in contemporary society have become increasingly confused about why they suffer emotionally.” He then lists “biomedical psychiatry, academic psychiatry, genetics, modern economics” and says, “As each tradition was based on its own distinctive assumptions and pursued its own goals via its own methods, each largely favored reducing human suffering to one predominant cause (e.g., biology, faulty cognition, unsatisfied self-interest).” As the saying goes, if you are an expert in hammers, every problem looks like a nail. This has led to understandable perplexity. The secular model puts sufferers in the hands of experts, but the specialization and reductionism of the different kinds of experts leaves people bewildered.

Davies’s findings support Shweder’s analysis. He explains how the secular model encourages psychotherapists to “decontextualize” suffering, not seeing it, as older cultures have, as an integral part of a person’s life story. Davies refers to a BBC interview with Dr. Robert Spitzer in 2007. Spitzer is a psychiatrist who headed the taskforce that in 1980 wrote the DSM-III (third edition of the Diagnostic and Statistical Manual of Mental Disorders) of the American Psychiatric Association. The DSM-III sought to develop more uniformity of psychiatric diagnoses. When interviewed twenty-five years later by the BBC, Spitzer admitted that, in hindsight, he believed they had wrongly labeled many normal human experiences of grief, sorrow, and anxiety as mental disorders. When the interviewer asked: “So you have effectively medicalized much ordinary human sadness?” Spitzer responded, “I think we have to some extent. . . . How serious a problem it is, is not known . . . twenty percent, thirty percent . . . but that is a considerable amount.”

— Timothy Keller, Walking with God through Pain and Suffering

