Showing articles with label Personality.

Friday
What predicts a happy life? As I explained long ago in The Pursuit of Happiness, some things I might have guessed would matter—one’s age, gender, or race, for example—actually matter little. Money helps to a point: better to be able to afford life’s necessities and to feel control over one’s life than not. Yet ever-increasing wealth provides diminishing well-being returns. Moreover, the last half-century’s remarkable growth in average real income and purchases (albeit with rising inequality) has left people a bit less happy.

What does matter—what best predicts whether people report being “very happy”—is close, supportive relationships. We are, as Aristotle recognized, social animals. Our ancestral history has destined us to flourish with others. For our hunter-gatherer forebears, six hands were better than two. We therefore have what today’s social psychologists call a “need to belong.” When supported by intimate friendships or a committed marriage, we are much likelier to declare ourselves “very happy.” When socially deprived—when exiled, ostracized, bereaved, or imprisoned in solitary confinement—we feel lonely and adrift.

Social support matters. The number of our close friends predicts happiness. “Happiness seems made to be shared,” noted the French dramatist Pierre Corneille. So it seems from answers to a question asked of Americans by the National Opinion Research Center: “Looking over the last six months, who are the people with whom you discussed matters important to you?” Compared with those who could name no such intimate, those who named five or more such friends were 60 percent more likely to feel “very happy.”

And then a curious thing happened on the way into the twenty-first century. Our close face-to-face relationships have waned. We’re marrying later, and less often. We’re not just more often Bowling Alone, to use political scientist Robert Putnam’s famous metaphor, but also spending less time socializing with others. The American Time Use Survey reveals that we are spending fewer minutes per day hanging out together. (To replace the diminished engagement, some have created interactive online AI friends.)

The decrease in face-to-face social connection is most striking among teens and young adults. As my Social Psychology text co-author Jean Twenge has amply documented, today’s teens are dating less, partying less, and being with friends less. Much less. They also have fewer friends. Their decreasing time with friends is not, as some have speculated, because they are working more or doing more homework. If anything, they’re employed less and doing less homework.

But, you say, being alone needn’t mean being lonely. Sometimes we savor solitude. Moreover, texting and social media posting enable social connections. Nevertheless, especially among teens and young adults in Western countries, depression, anxiety, suicidal ideation, and loneliness have increased concurrently with the increasing homebound solitude, and mainly for those spending long daily hours staring at social media screens rather than engaging with people in person, face-to-face. In one of her informative Substack essays, Jean Twenge recently displayed Monitoring the Future survey responses of U.S. 13- to 18-year-olds, showing the percentage reporting depressive symptoms (with data from the depressing Covid years excluded). Our screens suck time not only from sleeping, reading, and schooling, but also from relationships.

And consider other societal sources of our social malnutrition:

At-home remote work.
Many people love the convenience and lessened travel time and expense, but the convenience comes at the price of fewer connections with colleagues in the mailroom, over the coffee pot, and in office meetings.

Take-out food, sometimes with contactless delivery.
We love the convenience of take-out foods, albeit at the cost of less time leisurely dining out with friends. In the UK, the result has been a long-term decline in the number of people in pubs and nightclubs.

Online shopping.
We love the efficiency of one-click purchases and home delivery, even as it puts out of business some shops where we once mingled with others and had chance conversations.

Decreased attendance at churches, museums, and school sports.
Fewer folks are joining others at worship places, museum galleries, and high school and small college sporting events. During the 1996–1997 season, the three NCAA Division III schools with the highest men’s basketball attendance averaged 2,467 fans per game; last year the figure was just 1,397. I confess that I love being able to watch my school’s livestreamed out-of-town basketball games. But that convenience means that I’m less likely to enjoy being with fellow fans, traveling together to cheer the team on.

The drains on our in-person, empathy-enabling relationships seem baked into modern life. Yet we are not helpless. We can reinvigorate the priority we give to close relationships. We can put down our phones and give conversational partners our focused attention. We can take the initiative to dine more often with friends, meet more often with colleagues, and exchange confidences more often with family members. We can stick our heads into coworkers’ workspaces. We can video-call relatives. We can establish sit-down family mealtimes. We can initiate micro-friendships—pleasing brief relationships with our baristas, seatmates, and ride-share drivers.

Today’s digital world enriches our lives—but especially so when we retain a central place for face-to-face active listening and engagement. Sharing our lives in person with those who love and support us has two effects, observed the seventeenth-century sage Francis Bacon: “It redoubleth joys, and cutteth griefs in half.”

David Myers, a Hope College social psychologist, authors psychology textbooks and trade books, including his recent essay collection, How Do We Know Ourselves? Curiosities and Marvels of the Human Mind.
Labels: Emotion, Personality, Sensation and Perception, Social Psychology

10-30-2020
12:37 PM
I see you. My psychic powers enable me, from a great distance, to peer into your heart and to sense your unease. Regardless of your political leanings, you understand the upcoming U.S. election to be momentous, world-changing, the most important of your lifetime. Part of you is hopeful, but a larger part of you is feeling tense. Anxious. Fearing a future that would follow the outcome you dread.

Hungering for indications of the likely outcome, you read the latest commentary and devour the latest polls. You may even glean insight from the betting markets (here and here), which offer “the wisdom of the crowd.” They are akin to stock markets, in which people place bets on future stock values, with the current market value—the midpoint between those expecting a stock to rise and those expecting it to fall—representing the distillation of all available information and insight. As stock market booms and busts remind us, the crowd sometimes displays an irrational exuberance or despair. Yet, as Princeton economist Burton Malkiel has repeatedly demonstrated, no individual stock picker (or mutual fund) has had the smarts to consistently outguess the efficient marketplace.

You may also, if you are a political geek, have welcomed clues to the election outcome from prediction models (here and here) that combine historical information, demographics, and poll results to forecast the result. But this year, the betting markets and the prediction models differ sharply. The betting markets see a 34 percent chance of a Trump victory, while the prediction models see but a 5 to 10 percent chance. So whom should we believe?

Skeptics scoff that the poll-influenced prediction models erred in 2016. FiveThirtyEight’s final election forecast gave Donald Trump only a 28 percent chance of winning. So, was it wrong? Consider a simple prediction model that predicted a baseball player’s chance of a hit based on the player’s batting average. If a .280 hitter came to the plate and got a hit, would we discount our model? Of course not, because we understand the model’s prediction that sometimes (28 percent of the time, in this case) the less likely outcome will happen. (If it never does, the model errs.)

But why do the current betting markets diverge from the prediction models? FiveThirtyEight modeler Nate Silver has an idea: the Dunning-Kruger effect, which, as psychology students know, is the repeated finding that incompetence tends not to recognize itself. Others noted that the presidential betting markets, unlike the stock markets, are drawing on limited (once every four years) information, with people betting only small amounts on their hunches and without the sophisticated appraisal that informs stock investing.

And what are their hunches? Surely these are informed by the false consensus effect—our tendency to overestimate the extent to which others share our views. Thus, in the University of Michigan’s July Survey of Consumers, 83 percent of Democrats and 84 percent of Republicans predicted that voters would elect their party’s presidential candidate. Ergo, bettors are surely, to some extent, drawing on their own preferences, which—thanks to the false consensus effect—inform their predictions. What we are, we see in others.

So, if I were a betting person, I would wager based on the prediction models. Usually, there is wisdom to the crowd. But sometimes . . . we shall soon see . . . the crowd is led astray by the whispers of its own inner voices.

-----

P.S. At 10:30 a.m.
on election day, the Economist model projects a 78 percent chance of a Biden Florida victory, FiveThirtyEight.com projects a Biden Florida victory with a 2.5 percent vote margin, and the electionbettingodds.com betting-market average estimates a 62 percent chance of a Trump Florida victory. Who’s right: the models or the bettors? Stay tuned!

P.P.S. on November 4: Mea culpa. I was wrong. Although the models—like weather forecasts estimating the percent chance of rain—allow for unlikely possibilities, the wisdom of the betting crowd won this round, both in Florida and in foreseeing a closer-than-expected election.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)
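The batting-average analogy is easy to make concrete. Below is a minimal Python sketch (mine, not from the post) that treats FiveThirtyEight’s 28 percent as a forecast probability and simulates many repeated trials; the trial count and variable names are illustrative assumptions.

```python
import random

# A well-calibrated 28% forecast is not "wrong" when the unlikely outcome
# occurs; that outcome should occur in roughly 28 of every 100 comparable cases.
random.seed(42)

P_UPSET = 0.28      # forecast probability of the less likely outcome
N_TRIALS = 100_000  # hypothetical repeated elections (or at-bats)

upsets = sum(random.random() < P_UPSET for _ in range(N_TRIALS))
print(f"Less likely outcome occurred in {upsets / N_TRIALS:.1%} of trials")
# Prints roughly 28%, just as a .280 hitter gets a hit about 28% of the time.
```

If the “unlikely” outcome never happened across many such forecasts, that, and only that, would indict the model.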
Labels: Current Events, Personality, Teaching and Learning Best Practices

08-03-2020
09:01 AM
“As we pull down controversial statues and reassess historical figures,” let’s also examine our own moral blind spots, urges Nicholas Kristof. Although our moral failings may not be on the horrific scale of those who enslaved their fellow humans, we likely still have what Kristof calls “moral myopia.” Kristof suggests three possible contenders for such blind spots: the animal cruelty of factory farming, indifference to suffering in impoverished countries, and climate change. He anticipates that a century from now, future generations may judge our actions in these areas as “bewilderingly immoral.”

Many of us can already look back on events in our own lives with embarrassment. I recall reveling, with other Pacific Northwesterners, 55 years ago, in the first killer whale captures. Today we understand those captures as a brutal separation of orcas from their families and a contribution to the endangered status of our region’s beloved 72 Southern Resident orcas. And might morally enlightened future people want to remove my name from something for attitudes or actions I have more recently embraced—perhaps for eating the flesh of factory-farmed animals, or for flying on climate-destroying flights? Perhaps even for attitudes and behaviors I am now too short-sighted to imagine as problematic to my descendants?

When judging the speck in someone else’s eye, do I fail to notice what is in my own? An oft-demonstrated truth is that most of us have a great reputation with ourselves and therefore may miss the large specks in our own lives. Psychologists call this the self-serving bias. We accept more responsibility for our good deeds than for our bad. And we tend to see ourselves as better than average—as, for example, better-than-average drivers, voters, and employees. The better-than-average phenomenon extends to people’s feelings of moral superiority:

Virtues. In the Netherlands, most high school students have rated themselves as more honest, persistent, original, friendly, and reliable than the average high school student.

Prosocial behavior. Most people report that they are more likely than others to give to a charity, donate blood, and give up their bus seat to a pregnant woman.

Ethics. Most businesspeople perceive themselves as more ethical than the average businessperson.

Morals and values. When asked in a national survey, “How would you rate your own morals and values on a scale from 1 to 100 (100 being perfect)?” 50 percent rated themselves at 90 or above.

This self-serving bias can lead us to view ourselves as morally superior to others, including our ancestors. We presume that, had we stood in their shoes, we would have behaved differently. We are like the people who—when told about experiments in which people have conformed to falsehoods, followed orders to administer painful shocks, or failed to help someone—predict that they would have acted more truthfully and courageously. But psychology’s experiments have indicated otherwise.

Princeton legal scholar Robert George recently tweeted that he sometimes asks students “what their position on slavery would have been had they been White and living in the South before abolition. Guess what? They all would have been abolitionists! They all would have bravely spoken out against slavery, and worked tirelessly against it.” But this is “nonsense,” he adds. Had we been White Southerners, embedded in that time and culture’s systemic racism, most of us would likely have been, to a greater or lesser extent, complicit.
He challenges those who think they would have been the exception to tell him how they have, in their current lives, done something similarly unpopular with their peers, causing them to be abandoned by friends and “loathed and ridiculed by powerful, influential individuals and institutions.”

Of course, a brave minority in the South did join the abolitionist cause and enable the Underground Railroad. Under Hitler, a few brave souls did protest and suffer, including Pastor Martin Niemöller, who, after seven years in Nazi concentration camps, famously spoke for many: “First they came for the socialists, and I did not speak out—because I was not a socialist. Then they came for the trade unionists, and I did not speak out—because I was not a trade unionist. Then they came for the Jews, and I did not speak out—because I was not a Jew. Then they came for me—and there was no one left to speak for me.”

But such heroes are heroes because they are the exception. Experiments (here and here) show that most people err when confidently predicting that, on witnessing a sexist or racist slur, they would intervene where others have not. T. S. Eliot anticipated as much: “Between the idea and the reality . . . Falls the Shadow.”

So, should we and can we advocate for a more just world while also being mindful that we may similarly be judged by our descendants? As Steven Pinker documents in Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, we have made moral progress. Wars, genocide, murders, blatant racism, homophobia, and sexism, as well as illiteracy, ignorance, and lethal diseases, have all, over time, been on the decline. So, amid today’s hatreds and chaos, there is hope for continued progress. Perhaps the ancient prophetic admonition can be our guide: Do justice. Love kindness. Walk humbly.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)
Labels: Personality

06-13-2019
06:35 AM
You surely know why you chose your town, your partner, and your vocation—all for good reasons, no doubt. But might other, unknown reasons—operating below the level of your conscious awareness—also have nudged your choices? Such is the implication of some clever studies of implicit egotism—an automatic tendency to like things we associate with ourselves. For example, we better like a politician or stranger whose face has been morphed with some features of our own (see here and here).

I see you yawning: “You needed research to know that we love ourselves and things that resemble us?” The surprise—astonishment, really—comes with the subtle ways in which this phenomenon has been documented. Consider:

The name–letter effect. People of varied nationalities, languages, and ages prefer the letters that appear in their own name. People also tend to marry someone whose first or last name resembles their own.

The birthdate–number effect. People likewise prefer the numbers that appear in their birthdate. For example, people tend to be attracted to people whose laboratory participant number resembles their own birth date.

The name–residence effect. Philadelphia, having many more people than Jacksonville, has also had (no surprise) 2.2 times more men named Jack . . . but also 10.4 times more named Philip (see the sketch at the end of this post). Ditto Virginia Beach, which has a disproportionate number of women named Virginia, and St. Louis, which, compared to the national average, has 49 percent more men named Louis. Likewise, folks named Park, Hill, Beach, Rock, or Lake are disproportionately likely to live in cities (for example, Park City) that include their names.

If that last finding—offered by implicit egotism researchers Brett Pelham, Matthew Mirenberg, and John Jones—doesn’t surprise you, consider an even weirder phenomenon they uncovered: people seem to gravitate to careers identified with their names. In the United States, Dennis, Jerry, and Walter have been equally popular names. But dentists have twice as often been named Dennis as Jerry or Walter, and 2.5 times more often been named Denise than the equally popular Beverly or Tammy. Among geoscientists (geologists, geophysicists, and geochemists), people named George and Geoffrey are similarly overrepresented. The phenomenon extends to surname–occupation matching: in 1940 U.S. Census data, people named Baker, Barber, Butcher, and Butler were all 40 percent more likely than expected to work in occupations matching their names.

Ah, but do Pelham and colleagues have cause and effect reversed? For example, aren’t towns often named after people whose descendants stick around? Are people in Virginia more likely to name girls with the state name? Are Georgians more likely to christen their babies Georgia or George? Wasn’t the long-ago village baker—thus so named—likely to have descendants carrying on the ancestral work? Likely so, grants Pelham. But could that, he asks, explain why states have an excess of people sharing a last-name similarity? California, for example, has an excess of people whose names begin with Cali (as in Califano). Moreover, he reports, people are more likely to move to states and cities with name resemblances—people named Virginia moving to Virginia, for example.

If the Pelham team is right to think that implicit egotism, though modest, is nonetheless a real unconscious influence on our preferences, might that explain why, with long-ago job offers from three states, I felt drawn to Michigan? And why it was Suzie who sold seashells by the seashore?
(For David Myers’ other essays on psychological science and everyday life, including a 2016 essay on much of this implicit egotism research, visit TalkPsych.com.)
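How might a figure like “10.4 times more men named Philip” be computed? Here is a minimal observed-versus-expected sketch in Python; the counts are invented placeholders, not the real data behind the Pelham, Mirenberg, and Jones analyses.

```python
# Invented counts for illustration only; these are not the real data behind
# the Pelham, Mirenberg, and Jones analyses.
city_population = {"Philadelphia": 1_500_000, "Jacksonville": 750_000}
name_counts = {
    "Jack":   {"Philadelphia": 6_600, "Jacksonville": 3_000},
    "Philip": {"Philadelphia": 5_200, "Jacksonville": 500},
}

total_population = sum(city_population.values())

for name, counts in name_counts.items():
    bearers = sum(counts.values())  # total people with this name, both cities
    for city, observed in counts.items():
        # Expected count if the name were spread in proportion to city population
        expected = bearers * city_population[city] / total_population
        print(f"{name} in {city}: observed/expected = {observed / expected:.2f}")
```

A ratio near 1 means a name is no more common in a city than its population share alone predicts; a ratio well above 1 is the kind of over-representation the implicit egotism studies report.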
Labels: Personality

10-04-2018
06:59 AM
Nearly two-thirds of Americans, reports a recent PLOS One article, agree that “I am more intelligent than the average person.” This self-serving bias—on which I have been reporting for four decades (starting here)—is one of psychology’s most robust and reliable phenomena. Indeed, on most subjective, socially desirable dimensions, most of us see ourselves as better than average . . . as smarter, more ethical, more vocationally competent, more charitable, more unprejudiced, friendlier, healthier, and more likely to outlive our peers—which calls to mind Freud’s joke about the husband who told his wife, “If one of us dies, I shall move to Paris.”

My own long-ago interest in self-serving bias was triggered by noticing a result buried in a College Board survey of 829,000 high school seniors. In rating themselves on their “ability to get along with others,” 0 percent viewed themselves as below average. But a full 85 percent saw themselves as better than average: 60 percent in the top 10 percent, and 25 percent in the top 1 percent. As Shelley Taylor wrote in Positive Illusions, “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.” Dave Barry recognized the phenomenon: “The one thing that unites all human beings, regardless of age, gender, religion, economic status, or ethnic background, is that deep down inside, we all believe that we are above average drivers.”

Self-serving bias also takes a second form—our tendency to accept more responsibility for our successes than our failures, for our victories than our defeats, and for our good deeds than our bad. In experiments, people readily attribute their presumed successes to their ability and effort, and their failures to bad luck or an impossible task. A Scrabble win reflects our verbal dexterity. A loss? Our bad luck in drawing a Q but no U.

Perceiving ourselves, our actions, and our groups favorably does much good. It protects us against depression, buffers stress, and feeds our hopes. Yet psychological science joins literature and religion in reminding us of the perils of pride. Hubris often goes before a fall. Self-serving perceptions and self-justifying explanations breed marital conflict, bargaining impasses, racism, sexism, nationalism, and war.

Being mindful of self-serving bias needn’t lead to false modesty—for example, smart people thinking they are dim-witted. But it can encourage a humility that recognizes our own virtues and abilities while equally acknowledging those of our neighbors. True humility leaves us free to embrace our special talents and similarly to celebrate those of others.

(For David Myers’ other weekly essays on psychological science and everyday life, visit TalkPsych.com.)
Labels: Personality

10-04-2016
07:13 AM
Originally posted on July 7, 2016.

In authoring textbooks (and this blog) I seek to steer clear of overtly partisan politics. That’s out of respect for my readers’ diverse views, and also because my calling is to report on psychological science and its application to everyday life. But sometimes psychology speaks to politics.

Recently, more than 750 psychotherapists have signed “A Public Manifesto: Citizen Therapists Against Trumpism.” Its author, University of Minnesota professor William Doherty, emphasizes that the manifesto does not seek to diagnose Trump the person. Rather, it assesses Trumpist ideology, which it sees as “an emerging form of American fascism” marked by fear, scapegoating, and exaggerated masculinity.

An alternative statement, drafted by public intellectual David Blankenhorn of the bipartisan “Better Angels” initiative (and signed by 22 of us), offers “A Letter to Trump Supporters”—some arguments for rethinking support of Donald Trump. Social psychologists will recognize this as an effort at “central route” persuasion (offering reasons for rethinking one’s position). But in this presidential season, are rational arguments or emotional appeals more likely to sway voters—or is it some combination of both? What do you think?
Labels: Personality, Social Psychology

07-19-2016
01:01 PM
Originally posted on April 10, 2014.

A footnote to the name–vocation analyses: Whom would you rather hire for a managerial (rather than employee) role—John Knight or George Cook? Jill Prince or Judy Shepherd? David King or Donald Farmer? Helen Duke or Hazel Baker? Raphael Silberzahn and Eric Luis Uhlmann studied nearly a quarter million German names corresponding to high- and lower-status occupations, such as König (king) and Koch (cook). Those with names linked to high-status occupations were modestly more often appointed to high-status roles. Silberzahn and Uhlmann speculate that the name association may have made those with high-status names seem more worthy. As former U.S. President Jimmy Carter famously said, “Life isn’t fair.”
Labels: Personality

07-19-2016
07:51 AM
Originally posted on November 25, 2014.

The November APS Observer is out with an essay by Nathan, “Why Self-Control and Grit Matter—and Why It Pays to Know the Difference.” It describes Angela Duckworth’s and James Gross’s research on laser-focused achievement drive (grit) and on self-control over distracting temptations . . . and how to bring these concepts into the classroom. In the same issue, I reflect on “The Psychology of Extremism,” describing the social psychological roots of extreme animosities and terrorist acts, including Michael Hogg’s work on how people’s uncertainties about their world and their place in it can feed a strong (even extreme) group identity.
Labels: Personality, Social Psychology

07-18-2016
01:42 PM
Originally posted on March 10, 2015.

Nathan and David’s monthly synopses of important new findings reported in Current Directions in Psychological Science continue, and include their teaching ideas.

In the February APS Observer, Nathan shines a light on “dark personalities.” “Some people have hidden lusts or greed,” he notes, “whereas others embezzle millions. Understanding the science of dark personality helps us avoid labeling people as simply good or bad. By shining a light on the ingredients of a dark personality, we can learn who we ought to fear and when to fear them.” In the same issue, David summarizes the emerging field of health neuroscience and suggests ways to help students think about brain–body interaction.

In the upcoming March issue, Nathan explains “When Two Emotions Are Better than One” and suggests how to teach students the importance of emotional differentiation. Also in the March issue, David identifies ways in which “Psychological Science Meets Religious Faith”—a topic of increasing interest in psychology.
Labels: Emotion, Personality

07-18-2016
01:27 PM
Originally posted on April 2, 2015.

Facebook, Google, and Twitter, among others, are enabling psychologists to mine giant data sets that allow mega-scale naturalistic observations of human behavior. The recent Society for Personality and Social Psychology convention offered several such “big data” findings, including these (some also recently published):

“Computer-based personality judgments are more accurate than those of friends, spouses, or family.” That’s how Michal Kosinski, Youyou Wu, and David Stillwell summed up their research on the digital trail left by 86,220 people’s Facebook “likes.” As a predictor of “Big Five” personality test scores, the computer data were significantly more accurate than friends’ and family members’ judgments. (Such research is enabled by the millions of people who have responded to tests via Stillwell’s myPersonality app and who have also donated their Facebook information, with guarantees of anonymity.)

Another study, using millions of posts from 69,792 Facebook users, found that people who score high on neuroticism tests use more words like “sad,” “fear,” and “pain.” This hints at the possibility of using social media language analysis to identify people at risk for disorder or even suicide.

Researchers are also exploring smartphones as data-gathering devices. Jason Rentfrow (University of Cambridge) offers an app for monitoring emotions (illustrated here) and proposes devices that can sense human behavior and deliver interventions. In such ways, it is becoming possible to gather massive data, to sample people’s experiences moment-to-moment in particular contexts, and to offer them helpful feedback and guidance.

Amid the excitement over today’s big data, psychologist Gary Marcus offers a word of caution: “Big Data is brilliant at detecting correlation . . . But correlation never was causation and never will be . . . If we have good hypotheses, we can test them with Big Data, but Big Data shouldn’t be our first port of call; it should be where we go once we know what we’re looking for.”
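The language-analysis finding is easy to picture in code. The Python sketch below is purely illustrative (the word list, posts, and scores are invented, and the actual studies drew on far richer language dictionaries and the myPersonality data described above): compute each user’s rate of negative-emotion words, then correlate it with a neuroticism score.

```python
from statistics import correlation  # available in Python 3.10+

# Invented toy data: (a user's posts joined into one string, neuroticism score)
users = [
    ("feeling sad and anxious again, everything hurts", 0.9),
    ("great hike today with friends, beautiful weather", 0.2),
    ("so much fear about the exam, pain in my stomach", 0.8),
    ("made pasta, watched a movie, pretty relaxed evening", 0.3),
]

NEGATIVE_WORDS = {"sad", "fear", "pain", "anxious", "hurts", "worried"}

def negative_word_rate(text: str) -> float:
    """Share of a user's words that appear in the negative-emotion list."""
    words = text.lower().split()
    return sum(w.strip(",.") in NEGATIVE_WORDS for w in words) / len(words)

rates = [negative_word_rate(text) for text, _ in users]
scores = [score for _, score in users]
print(f"r(negative-word rate, neuroticism) = {correlation(rates, scores):.2f}")
```

Even here, Marcus’s caution applies: such a correlation, however strong, says nothing by itself about what causes what.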
Labels: Emotion, Personality, Research Methods and Statistics, Social Psychology