Talk Psych Blog - Page 2

Macmillan Employee
08-18-2022
10:58 AM
As we approach the start of a new semester, psychology teachers can use this 5-minute animation written and narrated by David Myers to help students effectively learn and remember course material: https://t.co/24UIFUnWFy

Author
07-21-2022
11:07 AM
“The best time to plant a tree was 20 years ago. The second-best time is now.” ~ Anonymous proverb Character education’s greatest task is instilling a mark of maturity: the willingness to delay gratification. In many studies, those who learn to restrain their impulses—by electing larger-later rather than smaller-now rewards—have gone on to become more socially responsible, academically successful, and vocationally productive. The ability to delay gratification, to live with one eye on the future, also helps protect people from the ravages of gambling, delinquency, and substance abuse. In one of psychology’s famous experiments, Walter Mischel gave 4-year-olds a choice between one marshmallow now, or two marshmallows a few minutes later. Those who chose two later marshmallows went on to have higher college graduation rates and incomes, and fewer addiction problems. Although a recent replication found a more modest effect, the bottom line remains: Life successes grow from the ability to resist small pleasures now in favor of greater pleasures later. Marshmallows—and much more—come to those who wait. The marshmallow choice parallels a much bigger societal choice: Should we prioritize today, with policies that keep energy prices and taxes low? Or should we prioritize the future, by investing now to spare us and our descendants the costs of climate change destruction? “Inflation is absolutely killing many, many people,” said U.S. Senator Joe Manchin, in explaining his wariness of raising taxes to fund climate mitigation. Manchin spoke for 50 fellow senators in prioritizing the present. When asked to pay more now to incentivize electric vehicles and fund clean energy, their answer, on behalf of many of their constituents, is no. But might the cost of inaction be greater? The White House Office of Management and Budget projects an eventual $2 trillion annual federal budget cost of unchecked climate change. If, as predicted, climate warming increases extreme weather disasters, if tax revenues shrink with the economy’s anticipated contraction, and if infrastructure and ecosystem costs soar, then are we being penny-wise and pound-foolish? With the worst yet to come, weather and climate disasters have already cost the U.S. a trillion dollars over the past decade, with the total rising year by year. [Image: nattrass/E+/Getty Images] The insurance giant Swiss Re also foresees a $23 trillion global economy cost by 2050 if governments do not act now. The Big 4 accounting firm Deloitte is even more apprehensive, projecting a $178 trillion global cost by 2070. What do you think: Are our politicians—by seizing today’s single economic marshmallow—displaying a mark of immaturity: the inability to delay gratification for tomorrow’s greater good? A related phenomenon, temporal discounting, also steers their political judgments. Even mature adults tend to discount the future by valuing today’s rewards—by preferring, say, a dollar today over $1.10 in a month. Financial advisors therefore plead with their clients to do what people are not disposed to do . . . to think long-term—to capitalize on the power of compounding by investing in their future. Alas, most of us—and our governments—are financially nearsighted. We prioritize present circumstances over our, and our children’s, future. And so we defend our current lifestyle by opposing increased gas taxes and clean-energy mandates. The western U.S.
may be drying up, sea water creeping into Miami streets, and glaciers and polar ice retreating, but even so, only “1 percent of voters in a recent New York Times/Siena College poll named climate change as the most important issue facing the country, far behind worries about inflation and the economy.” The best time to plant a tree, or to have invested in climate protection, was 20 years ago. The worst time is 20 years from now, when severe climate destruction will be staring us in the eye. As we weigh our present against our future, psychological science reminds our political representatives, and all of us, of a profoundly important lesson: Immediate gratification makes today easy, but tomorrow hard. Delayed gratification makes today harder, but tomorrow easier. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
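The dollar-today-versus-$1.10-in-a-month choice and the "power of compounding" can be put into numbers with a minimal Python sketch. The 7 percent long-run real return below is an illustrative assumption, not a figure from the essay:

```python
# Illustrative sketch of temporal discounting and compounding; numbers are hypothetical.

def implied_annual_rate(now: float, later: float, months: float) -> float:
    """Annualized return a person rejects by taking 'now' instead of 'later'."""
    monthly = (later / now) ** (1 / months) - 1
    return (1 + monthly) ** 12 - 1

def future_value(principal: float, annual_rate: float, years: int) -> float:
    """Compound growth of an amount invested today."""
    return principal * (1 + annual_rate) ** years

# Preferring $1.00 today over $1.10 in a month means passing up roughly this annual return:
print(f"Implied annual rate rejected: {implied_annual_rate(1.00, 1.10, 1):.0%}")

# The long-run payoff of waiting: $1,000 invested at an assumed 7% real return.
for years in (10, 20, 40):
    print(f"$1,000 after {years} years: ${future_value(1000, 0.07, years):,.0f}")
```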

Author
07-05-2022
07:49 AM
With 45,000+ annual U.S. firearm deaths (123 per day, from homicide, suicide, and accidents), America has a gun problem, of which the recent Buffalo, Uvalde, and Highland Park mass killings are horrific examples. In response, we often hear that the problem is not America’s 400 million guns but its people. “We have a serious problem involving families, involving drugs, involving mental health in this country,” asserted Colorado congressional representative Ken Buck. “We have become a less safe society generally. Blaming the gun for what’s happening in America is small-minded.” To protect against mass killings by 18-year-olds, as in Buffalo and Uvalde, we are told that we don’t need to match the minimum age for assault rifle purchase, still 18 after the new gun safety act, with the age for beer purchase, 21. We don’t need to train and license gun owners as we do drivers. We don’t need safe-storage laws or restrictions on large-capacity magazines. Instead, we need to fix the problem perceived by Texas Governor Greg Abbott and National Rifle Association chief executive Wayne LaPierre: evil people. To solve the gun violence problem, we need better people, enabled by commendable social changes: more married fathers, less pornography, fewer violent video games. And most importantly, we’re told, we need to deal with mass killers’ mental sickness. “People with mental illness are getting guns and committing these mass shootings,” observed former U.S. Speaker of the House Paul Ryan. While president, Donald Trump agreed: “When you have some person like this, you can bring them into a mental institution.” Mass killers, he later added, are “mentally ill monsters.” However, reality intrudes. As I documented in an earlier essay, most mentally ill people are nonviolent, most violent criminals and mass shooters have not been diagnosed as mentally ill, and rare events such as mass shootings are almost impossible to predict. As much as we psychologists might appreciate the ostensible high regard, today’s psychological science lacks the presumed powers of discernment. If mental-health assessments cannot predict individual would-be killers, three other factors (in addition to short-term anger and alcohol effects) do offer some predictive power: Demographics. As the recent massacres illustrate, most violent acts are committed by young males. The late psychologist David Lykken made the point memorably: “We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28.” Past behavior. It’s one of psychology’s maxims: The best predictor of future behavior is past behavior. The best predictor of future GPA is past GPA. The best predictor of future employee success is past employee success. The best predictor of future class attendance or smoking or exercise is, yes, the recent past. Likewise, recent violent acts are a predictor of violent acts. Guns. Compared to Canada, the United States has 3.5 times the number of guns per person and 8.2 times the gun homicide rate. Compared to England, the U.S. has 26 times as many guns per person—and 103 times the gun homicide rate. To check U.S. state variations, I plotted each state’s gun-in-home rate with its gun homicide rate. As you can see, the correlation is strongly positive, ranging from (in the lower left) Massachusetts, where 15 percent of homes have a gun, to Alaska, where 65 percent of homes have a gun—and where the homicide rate is 7 times greater. 
Of these three predictor variables, gun policy is one that, without constraining hunters’ rights, society can manage with some success: When nations restrict gun access, the result has been fewer guns in civilian hands, which enables fewer impulsive gun uses and fewer planned mass shootings. Many people nevertheless believe that, as Senator Ted Cruz surmised after the Uvalde shooting, “What stops armed bad guys is armed good guys.” Never mind that in one analysis of 433 active shooter attacks on multiple people, armed lay citizens took out active shooters in only 12 instances. Many more—a fourth of such attacks—ended in a shooter suicide. Moreover, if the answer to bad guys with guns is to equip more good guys with guns, then why are states with more armed good guys more homicidal? What explains the state-by-state guns/homicide correlation? Are the more murderous Alaskans (and Alabamians and Louisianans) really more “evil” or “mentally ill”? Or is human nature essentially the same across the states, with the pertinent difference being the weapons? (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
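For readers curious what the described state-level analysis involves, here is a minimal sketch of the correlation computation. The five "states" below use placeholder ownership and homicide values for illustration only; they are not the data behind Myers' plot:

```python
# Sketch of a state-level correlation between household gun ownership and gun homicide.
# All values below are hypothetical placeholders, not real state data.
from statistics import correlation  # Python 3.10+

gun_in_home_pct   = [15, 30, 45, 55, 65]       # hypothetical % of homes with a gun
gun_homicide_rate = [1.5, 3.0, 4.5, 5.5, 8.0]  # hypothetical homicides per 100,000

r = correlation(gun_in_home_pct, gun_homicide_rate)
print(f"Pearson r across these illustrative 'states': {r:.2f}")
```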

Author
06-02-2022
12:19 PM
“Donald Trump . . . unleashed something, that is so much bigger than he is now or ever will be: He pushed the limits of acceptability, hostility, aggression and legality beyond where other politicians dared push them.” ~ Charles Blow, 2022 This is a venomous time. From 2015 to 2019, FBI-reported U.S. hate crimes increased 25 percent. Social psychologists have therefore wondered: Is this increase, and the accompanying resurgence of white nationalism, fed by Donald Trump’s rhetoric—his saying, for example, that the torch-carrying Charlottesville white nationalists included some “very fine people”? Or did the president’s tweets and speeches merely reflect existing prejudices, which, in a few disturbed individuals, fed hateful acts? Simply put: Do political leaders’ words merely voice common prejudices and grievances? Or do they also amplify them? They do both. Leaders play to their audiences. And when prominent leaders voice prejudices, it then becomes more socially acceptable for their followers to do likewise. When disdain for those of another race or religion or sexual orientation becomes socially acceptable, insults or violence may ensue. Social norms mutate, and norms matter. A new Nature Human Behaviour report of 13 studies of 10,000+ people documents the norm-changing influence of President Trump’s rhetoric. During his presidency, “Explicit racial and religious prejudice significantly increased amongst Trump’s supporters,” say the report’s authors, social psychologists Benjamin Ruisch (University of Kent) and Melissa Ferguson (Yale University). Some of their studies followed samples of Americans from 2014 to 2017, ascertaining their attitudes toward Muslims (whether they agreed, for example, that “Islam is quite primitive”). As seen on this 7-point scale of anti-Muslim prejudice, Trump supporters’ anti-Muslim sentiments significantly increased. But what about those for whom Donald Trump was not a positive role model? Would they become less imitative? Might they be like those observed in one study of jaywalking, in which pedestrians became less likely to jaywalk after observing someone they didn’t admire (a seeming low-status person) doing so? Indeed, Trump opponents exhibited decreased Muslim prejudice over time. Moreover, Ruisch and Ferguson found, “Trump support remained a robust predictor of increases in [anti-Muslim] prejudice” even after controlling for 39 other variables, such as income, age, gender, and education. Trump support also predicted increases in other forms of prejudice, such as racial animus (“Generally, Blacks are not as smart as Whites are”) and anti-immigrant attitudes. Pew national surveys similarly find that the attitude gap between those voting for and against Trump widened from 2016 to 2020 (see the 2020 data below). The 57 percent of Democrat voters who, in 2016, agreed that it’s more difficult to be a Black American than a White American increased to 74 percent in 2020, illustrating the general historical trend toward more egalitarian attitudes. But no such positive shift occurred among Trump supporters, whose agreement with the concept of Black disadvantage actually declined slightly, from 11 percent in 2016 to 9 percent in 2020. Ruisch and Ferguson see shifting social norms at work. “Trump supporters perceive that it has become more acceptable to express prejudice since Trump’s election . . . 
and the perception that prejudice is more acceptable predicts greater personal prejudice among Trump supporters.” Their conclusion aligns with the earlier finding of a political science research team—“that counties that had hosted a 2016 Trump campaign rally saw a 226 percent increase in reported hate crimes over comparable counties that did not host such a rally.” The happier news is that political leaders’ speech can work for better as well as for worse. In one month during 2021, a Stanford research team randomly assigned 1014 counties with low vaccination rates to receive 27-second YouTube ads (via TV, website, or app). Each featured Donald Trump’s expressed support for Covid vaccinations. After a Fox News anchor’s introduction, shown below, clips featured the Trumps getting the vaccine, with Ivanka saying, “Today I got the shot. I hope you do too” and Donald explaining “I would recommend it, and I would recommend it to a lot of people that don’t want to get it.” At a cost of about $100,000, the ad was viewed 11.6 million times by 6 million people. When compared with 1018 control counties, the experimental treatment counties experienced an additional 104,036 vaccinations, for an additional cost of less than $1 each. The simple lesson: Under the influence of powerful leaders, social norms and behaviors can change. And social norms matter—sometimes for worse, but sometimes for better. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
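As a small arithmetic aside, the "less than $1 each" figure follows directly from the two numbers reported above:

```python
# Quick check of the cost-effectiveness claim, using the figures from the essay.
ad_cost = 100_000             # approximate total campaign cost
extra_vaccinations = 104_036  # additional vaccinations in the treatment counties
print(f"Cost per additional vaccination: ${ad_cost / extra_vaccinations:.2f}")  # about $0.96
```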

Author
05-10-2022
07:44 AM
“All effective propaganda must be limited to a very few points and must harp on these in slogans.” ~ Adolf Hitler, Mein Kampf, 1926 Among psychology’s most reliable phenomena is the power of mere repetition. It comes in two forms, each replicated by psychological research many times in many ways. Repetition Fosters Fondness We humans are naturally disposed to prefer what’s familiar (and usually safe) and to be wary of what’s unfamiliar (and possibly dangerous). Social psychologists led by the late Robert Zajonc exposed the power and range of this mere exposure effect. In study after study, the more frequently they showed people unfamiliar nonsense syllables, Chinese characters, geometric figures, musical selections, artwork, or faces, the better they liked them. Repetition breeds liking. Mere exposure also warms our relationships. As strangers interact, they tend to increasingly like each other and to stop noticing initially perceived imperfections or differences. Those who come to know LGBTQ folks almost inevitably come to accept and like them. By three months, infants in same-race families come to prefer photos of people of their own (familiar) race. We even prefer our own familiar face—our mirror-image face we see while brushing our teeth, over our actual face we see in photos. Advertisers understand repetition’s power. With repetitions of an ad, shoppers begin to prefer the familiar product even if not remembering the ad. Indeed, the familiarity-feeds-fondness effect can occur without our awareness. In one clever experiment, research participants focused on repeated words piped into one earpiece while an experimenter simultaneously fed a novel tune into the other ear. Later, they could not recognize the unattended-to tune—yet preferred it over other unpresented tunes. Even amnesia patients, who cannot recall which faces they have been shown, will prefer faces they’ve repeatedly observed. Repetition Breeds Belief As mere exposure boosts liking, so mere repetition moves minds. In experiments, repetition makes statements such as “Othello was the last opera of Verdi” seem truer. After hearing something over and over, even a made-up smear of a political opponent becomes more believable. Adolf Hitler understood this illusory truth effect. So did author George Orwell. In his world of Nineteen Eighty-Four, the population was controlled by the mere repetition of slogans: “Freedom is slavery.” “Ignorance is strength.” “War is peace.” And so does Vladimir Putin, whose controlled, continuous, and repetitive propaganda has been persuasive to so many Russians. Barack Obama understood the power of repetition: “If they just repeat attacks enough and outright lies over and over again . . . people start believing it.” So did Donald Trump: “If you say it enough and keep saying it, they’ll start to believe you.” And he did so with just the intended effect. What explains repetition’s persuasive power? Familiar sayings (whether true or false) become easier to process and to remember. This processing fluency and memory availability can make assertions feel true. The result: Repeated untruths such as “taking vitamin C prevents colds” or “childhood vaccines cause autism” may become hard-to-erase mental bugs. But can mere repetition lead people to believe bizarre claims—that a presidential election was stolen, that climate change is a hoax, that the Sandy Hook school massacre was a scam to promote gun control? Alas, yes. Experiments have shown that repetition breeds belief even when people should know better. 
After repetition, “The Atlantic Ocean is the largest ocean on Earth” just feels somewhat truer. Even crazy claims can seem truer when repeated. That’s the conclusion of a new truth-by-repetition experiment. At Belgium’s Catholic University of Louvain, Doris Lacassagne and her colleagues found that, with enough repetition, highly implausible statements such as “Elephants run faster than cheetahs” seem somewhat less likely to be false. Less extreme but still implausible statements, such as “A monsoon is caused by an earthquake” were especially vulnerable to the truth-by-repetition effect. For those concerned about the spread of oft-repeated conspiracy theories, the study also offered some better news. Lacassagne found that barely more than half of her 232 U.S.-based participants shifted toward believing the repeated untruths. The rest knew better, or even shifted to greater incredulity. At the end of his life, Republican Senator John McCain lamented “the growing inability, and even unwillingness, to separate truth from lies.” For psychology educators like me and some of you, the greatest mission is teaching critical thinking that helps students winnow the wheat of truth from the chaff of misinformation. Evidence matters. So we teach our students, “Don’t believe everything you hear.” And, after hearing it, “Don’t believe everything you think!” (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Author
03-24-2022
01:01 PM
“There are trivial truths and great truths,” the physicist Niels Bohr reportedly said.[1] “The opposite of a trivial truth is plainly false. The opposite of a great truth is also true.” Light is a particle. And light is a wave. Psychology, too, embraces paradoxical truths. Some are more complementary than contradictory: Attitudes influence behavior, and attitudes follow behavior. Self-esteem pays dividends, and self-serving bias is perilous. Memories and attitudes are explicit, and they are implicit. We are the creatures of our social worlds, and we are the creators of our social worlds. Brain makes mind, and mind controls brain. To strengthen attitudes, use persuasion, and to strengthen attitudes, challenge them. Religion “makes prejudice, and it unmakes prejudice” (Gordon Allport). “When I accept myself just as I am, then I can change” (Carl Rogers). Psychology also offers puzzling paradoxical concepts. Paradoxical sleep (aka REM sleep) is so-called because the muscles become near-paralyzed while the body is internally aroused. The immigrant paradox refers to immigrant U.S. children exhibiting better mental health than native-born children. And the paradox of choice describes how the modern world’s excessive options produce diminished satisfaction. Even more puzzling are seemingly contradictory findings from different levels of analysis. First, consider: Who in the U.S. is more likely to vote Republican—those with lots of money or those with little? Who is happier—liberals or conservatives? Who does more Google searches for “sex”—religious or nonreligious people? Who scores highest on having meaning in life—those who have wealth or those who don’t? Who is happiest and healthiest—actively religious or nonreligious people? As I have documented, in each case the answer depends on whether we compare places or individuals: Politics. Low-income states and high-income individuals have voted Republican in recent U.S. presidential elections. Happy welfare states and unhappy liberals. Liberal countries and conservative individuals manifest greater well-being. Google “sex” searches. Highly religious states, and less religious individuals, do more Google “sex” searching. Meaning in life. Self-reported meaning in life is greatest in poor countries, and among rich individuals. Religious engagement correlates negatively with well-being across aggregate levels (when comparing more vs. less religious countries or American states), yet positively across individuals. Said simply, actively religious individuals and nonreligious places are generally flourishing. As sociologist W. S. Robinson long ago appreciated, “An ecological [aggregate-level] correlation is almost certainly not equal to its individual correlation.” Thus, for example, if you want to make religion look good, cite individual data. If you want to make it look bad, cite aggregate data. In response to this paradoxical finding, Nobel laureate economist Angus Deaton and psychologist Arthur Stone wondered: “Why might there be this sharp contradiction between religious people being happy and healthy, and religious places being anything but?”[2] To this list of psychological science paradoxes, we can add one more: the gender-equality paradox—the curious finding of greater gender differences in more gender-equal societies. You read that right. 
Several research teams have reported that across several phenomena, including the proportion of women pursuing degrees in STEM (science, technology, engineering, and math) fields, gender differences are greater in societies with more political and economic gender equality. In the February 2022 issue of Psychological Science, University of Michigan researcher Allon Vishkin describes “the myriad findings” that societies with lower male-superior ideology and educational policy “display larger gender differences.” This appears, he reports, not only in STEM fields of study, but also in values and preferences, personality traits, depression rates, and moral judgments. Moreover, his analysis of 803,485 chess players in 160 countries reveals that 90 percent of chess players are men; yet “women participate more often in countries with less gender equality.” Go figure. Vishkin reckons that underlying the paradox is another curious phenomenon: Gender-unequal societies have more younger players, and there’s greater gender equality in chess among younger people. Paradoxical findings energize psychological scientists, as we sleuth their explanation. They also remind us of Bohr’s lesson. Sometimes the seeming opposite of a truth is another truth. Reality is often best described by complementary principles: mind emerges from brain, and mind controls brain. Both are true, yet either, by itself, is a half-truth. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.) [1] I first read this unsourced quote in a 1973 article by social psychologist William McGuire. Niels Bohr’s son, Hans Bohr, in his biography of his father, reports that Niels Bohr discerned “two sorts of truths, profound truths recognized by the fact that the opposite is also a profound truth, in contrast to trivialities where opposites are obviously absurd.” [2] For fellow researchers: The paradox is partially resolved by removing income as a confounding factor. Less religious places also tend to be affluent places (think Denmark and Oregon). More religious places tend to be poorer places (think Pakistan and Alabama). Thus, when we compare less versus more religious places, we also are comparing richer versus poorer places. And as Ed Diener, Louis Tay, and I observed from Gallup World Poll data, controlling for objective life circumstances, such as income, eliminates or even slightly reverses the negative religiosity/well-being correlation across countries.
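Robinson's point that an aggregate-level correlation need not match the individual-level correlation can be made concrete with a toy simulation. The sketch below uses made-up numbers in which, within every country, more religious individuals report higher well-being, while poorer countries are both more religious and lower in well-being (the income confound noted in footnote [2]). It illustrates the statistical pattern, not the actual survey data:

```python
# Toy simulation of the aggregate-vs-individual paradox; all numbers are invented.
import random
from statistics import correlation, mean

random.seed(0)
country_rel_means, country_wb_means, within_country_rs = [], [], []

for wealth in [1, 2, 3, 4, 5]:                 # hypothetical national wealth levels
    base_rel = 6 - wealth                      # poorer countries are more religious
    rel = [base_rel + random.gauss(0, 1) for _ in range(2000)]
    wb = [2 * wealth + 0.5 * r + random.gauss(0, 1) for r in rel]  # within a country, religiosity helps
    within_country_rs.append(correlation(rel, wb))
    country_rel_means.append(mean(rel))
    country_wb_means.append(mean(wb))

print("Mean within-country (individual) r:", round(mean(within_country_rs), 2))   # clearly positive
print("Across-country (aggregate) r:", round(correlation(country_rel_means, country_wb_means), 2))  # strongly negative
```

The same individuals produce a positive correlation when compared within countries and a negative one when countries are compared as wholes, which is exactly the ecological-fallacy trap the essay warns against.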

Author
03-03-2022
02:54 PM
If you live in one of 30 U.S. states with recently legalized sports betting, you’ve surely noticed: Your online media and television have been awash with sports betting ads from DraftKings, FanDuel, Caesars Sportsbook, and more. For this, we can thank the 2018 U.S. Supreme Court’s overturning of the sports betting ban, which also led the NFL in 2021 to allow sports betting ads even during its broadcasts and live-streams. With the deluge of ads, which sometimes offer new customers free money to lure initial bets, the gaming industry hopes to hook new bettors and expand its customer base from the current 50 million or so Americans who gamble on sports. For most, the few dollars wagered may be nothing more than a bit of exciting fun. But for some—those who develop a gambling disorder—the betting becomes compulsive and debilitating, as gamblers crave the excitement, seek to redeem their losses, and lie to hide their behavior. Family finances suffer. Bankruptcies happen. Divorces result. And with the sports betting floodgates now opened, problem gambling is increasing. “The National Problem Gambling helpline is receiving an average of more than 22,500 calls a month this year,” reports the Wall Street Journal, “up from a monthly average of about 14,800 last year.” [Image: Pgiam/E+/Getty Images] It’s no secret that, over time, the house wins and gamblers nearly always lose. So how does the gambling industry manage to suck nearly a quarter-trillion dollars annually from U.S. pockets? Are state lotteries, like Britain’s National Lottery, merely (as one of my sons mused) “a tax on the statistically ignorant”? (My state’s lottery pays out as winnings only 61 cents of each dollar bet.) To remind folks of the power of psychological dynamics, and to prepare them to think critically about the allure of gambling inducements, we can ask: What psychological principles does the gambling industry exploit? Consider these: Partially (intermittently) reinforced behavior becomes resistant to extinction. Pigeons that have been reinforced unpredictably—on a “variable ratio” schedule—may continue pecking thousands more times without further reward. Like fly fishing, slot machines and sports gambling reward people occasionally and unpredictably. So hope springs eternal. The judgment-altering power of the availability heuristic. As Nobel laureate psychologist Daniel Kahneman has shown, people tend to estimate the commonality of various events based on their mental availability—how readily instances come to mind. Casinos have the idea. They broadcast infrequent wins with flashing lights, while keeping the far more common losses invisible. Likewise, gamblers, like stock day-traders, may live to remember and tell of their memorable wins, while forgetting their more mundane losses. Illusory correlations feed an illusion of control. People too readily believe that they can predict or control chance events. When choosing their own lottery number (rather than being assigned one), people demand much more money when invited to sell their ticket. If assigned to throw the dice or spin the wheel themselves, their confidence increases. Dice players also tend to throw hard if wanting high numbers, and soft for low numbers. When winning, they attribute outcomes to their skill, while losses become “near misses.” Losing sports gamblers may rationalize that their bet was actually right, except for a referee’s bad call or a freakish ball bounce. Difficulty delaying gratification.
Those who from childhood onward have learned to delay gratification—who choose two marshmallows later over one now (as in the famous “marshmallow test” experiment)—become more academically successful and ultimately productive. They are also less likely to smoke, to commit delinquent acts, and to gamble—each of which offers immediate reward, even if at the cost of diminished long-term health and well-being. The gaming industry seeks present-minded rather than future-minded folks. They aim to hook those who will elect that figurative single marshmallow satisfaction of today’s desire over the likelihood of a greater deferred reward. Credible, attractive communicators exploit “peripheral route persuasion.” Endorsements by beautiful, famous, or trusted people can add to the allure. As former gaming industry marketing executive Jack O’Donnell notes, the sports gambling industry harnesses sports celebrity power when paying former all-star receiver Jerry Rice to dump Gatorade on a winning DraftKings bettor, when trusted sportscaster Brent Musburger encourages placing a bet, and when legendary quarterback and former Super Bowl MVP Drew Brees admonishes people to live your “Bet Life.” Each of these psychological dynamics has its own power. When combined, they help us understand the gaming industry’s lure, and, for some, its tragic addictive force. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
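To see how a variable-ratio payout and a 61-cent payout rate play out over many bets, here is a small simulation. The 61 percent payout figure comes from the essay; the 1-in-100 win probability and the simulated bettor are illustrative assumptions:

```python
# Sketch: why "the house wins" over time. The 61% payout rate is from the essay;
# the win probability and bettor are hypothetical.
import random
random.seed(1)

PAYOUT_RATE = 0.61             # the lottery returns 61 cents of each dollar bet
WIN_PROB = 0.01                # assumed rare, unpredictable wins (variable-ratio-like)
PRIZE = PAYOUT_RATE / WIN_PROB # prize sized so the long-run payout rate is 61%

expected_per_dollar = WIN_PROB * PRIZE - 1
print(f"Expected result per $1 bet: {expected_per_dollar:+.2f}")   # -$0.39

# Simulate 10,000 one-dollar bets: a few vivid, memorable wins, a steady overall loss.
bankroll, wins = 0.0, 0
for _ in range(10_000):
    bankroll -= 1
    if random.random() < WIN_PROB:
        bankroll += PRIZE
        wins += 1
print(f"After 10,000 bets: {wins} memorable wins, net {bankroll:+,.0f} dollars")
```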

Author
02-09-2022
12:41 PM
As members of one species, we humans share both a common biology (cut us and we bleed) and common behaviors (we similarly sense the world around us, use language, and produce and protect children). Our human genes help explain our kinship—our shared human nature. And they contribute to our individual diversity: Some people, compared with others, are taller, smarter, skinnier, healthier, more temperamental, shyer, more athletic . . . the list goes on and on. Across generations, your ancestors shuffled their gene decks, leading to the hand that—with no credit or blame due you—you were dealt. If you are reading this, it’s likely that genetic luck contributed to your having above average intelligence. Others, dealt a different hand and a different life, would struggle to understand these words. [Image: Andrew Brookes/Image Source/Getty Images] Individual variation is big. Individuals vary much, much more within groups (say, comparing Danes with Danes or Kenyans with Kenyans) than between groups (as when comparing Danes and Kenyans). Yet there are also group differences. Given this reality (some groups struggle more in school), does behavior genetic science validate ethnocentric beliefs and counter efforts to create a just and equal society? In The Genetic Lottery: Why DNA Matters for Social Equality, University of Texas behavior geneticist Kathryn Paige Harden answers an emphatic no. She documents the power of genes, but also makes the case for an egalitarian culture in which everyone thrives. Among her conclusions are these: We all are family. Going back far enough in time to somewhere between 5000 and 2000 B.C., we reach a remarkable point where “everyone alive then, if they left any descendants at all, was a common ancestor of everyone alive now.” We are all kin beneath the skin. We’re each highly improbable people. “Each pair of parents could produce over 70 trillion genetically unique offspring.” If you like yourself, count yourself fortunate. Most genes have tiny effects. Ignore talk of a single “gay gene” or “smart gene.” The human traits we care about, including our personality, mental health, intelligence, longevity, and sexual orientation “are influenced by many (very, very, very many) genetic variants, each of which contributes only a tiny drop of water to the swimming pool of genes that make a difference.” Individual genes’ tiny effects may nevertheless add up to big effects. Today’s Genome Wide Association Studies (GWAS) measure millions of genome elements and correlate each with an observed trait (phenotype). The resulting minuscule correlations from thousands of genetic variants often “add up to meaningful differences between people.” Among the White American high school students in one large study, only 11 percent of those who had the lowest GWAS polygenic index score predicting school success later graduated from college, as did 55 percent of those who had the highest score. “That kind of gap—a fivefold increase in the rate of college graduation—is anything but trivial.” Twin studies confirm big genetic effects. “After fifty years and more than 1 million twins, the overwhelming conclusion is that when people inherit different genes, their lives turn out differently.” Parent-child correlations come with no causal arrows. If the children of well-spoken parents who read to them have larger vocabularies, the correlation could be environmental, or genetic, or some interactive combination of the two. Beware the ecological fallacy (jumping from one data level to another).
Genetic contributions to individual differences within groups (such as among White American high school students) provide zero evidence of genetic differences between groups. Genetic science does not explain social inequalities. Harden quotes sociologist Christopher Jencks’ illustration of a genetically influenced trait eliciting an environmentally caused outcome: “If, for example, a nation refuses to send children with red hair to school, the genes that cause red hair can be said to lower reading scores.” Harden also quotes social scientist Ben Domingue: “Genetics are a useful mechanism for understanding why people from relatively similar backgrounds end up different. . . . But genetics is a poor tool for understanding why people from manifestly different starting points don’t end up the same.” Many progressives affirm some genetic influences on individual traits. For example, unlike some conservatives who may see sexual orientation as a moral choice, progressives more often understand sexual orientation as a genetically influenced natural disposition. Differences ≠ deficits. “The problem to be fixed is society’s recalcitrant unwillingness to arrange itself in ways that allow everyone, regardless of which genetic variants they inherit, to participate fully in the social and economic life of [their] country.” An example: For neurodiverse individuals, the question is how to design environments that match their skills. Behavior genetics should be anti-eugenic. Advocates of eugenics have implied that traits are fixed due to genetic influences, and may therefore deny the value of social interventions. Alternatively, some genome-blind advocates shun behavior genetics science that could inform both our self-understanding and public policy. Harden advocates a third option, an anti-eugenic perspective that, she says, would reduce the waste of time and resources on well-meaning but ineffective programs. For example, by controlling for genetic differences with a GWAS measure, researchers can more accurately confirm the actual benefits of an environmental intervention such as an educational initiative or income support. Anti-eugenics also, she contends, uses genetic information to improve lives, not classify people, uses genetic information to promote equity, not exclusion, doesn’t mistake being lucky in a Western capitalist society for being “good” or deserving, and considers what policies people would favor if they didn’t know who they would be. Harden’s bottom line: Acknowledging the realities of human diversity, and discerning the powers and limits of various environmental interventions, can enhance our quest for a just and fair society. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
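Harden's point that many "tiny drops" can fill a meaningful pool is easy to demonstrate with simulated data. In the toy sketch below (my illustration, not an analysis from the book), each variant's correlation with the trait is near zero, yet the summed polygenic index predicts the trait far better:

```python
# Toy illustration of a polygenic index: thousands of tiny effects summing to a usable
# predictor. Effect sizes and noise levels are arbitrary assumptions.
import random
from statistics import correlation

random.seed(0)
N_PEOPLE, N_VARIANTS = 2000, 1000
EFFECT = 0.03                          # assumed tiny per-variant effect

genomes = [[random.randint(0, 2) for _ in range(N_VARIANTS)] for _ in range(N_PEOPLE)]
trait = [sum(EFFECT * g for g in person) + random.gauss(0, 3) for person in genomes]

single_variant_r = correlation([p[0] for p in genomes], trait)   # one variant: near zero
polygenic_index = [EFFECT * sum(person) for person in genomes]
polygenic_r = correlation(polygenic_index, trait)                # whole index: much larger

print(f"One variant alone: r = {single_variant_r:.3f}")
print(f"Polygenic index:   r = {polygenic_r:.3f}")
```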

Author
01-19-2022
12:46 PM
“Listen, we all hoped and prayed the vaccines would be 100 percent effective, 100 percent safe, but they’re not. We now know that fully vaccinated individuals can catch Covid, they can transmit Covid. So what’s the point?” ~U.S. Senator Ron Johnson, Fox News, December 27, 2021 “I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them.” ~Baruch Spinoza, Political Treatise, 1677 Many are aghast at the irony: Unvaccinated, unmasked Americans remain much less afraid of the Covid virus than are their vaccinated, masked friends and family. This, despite two compelling and well-publicized facts: Covid has been massively destructive. With 5.4 million confirmed Covid deaths worldwide (and some 19 million excess Covid-era deaths), Covid is a great enemy. In the U.S., the 840,000 confirmed deaths considerably exceed those from all of its horrific 20th-century wars. The Covid vaccines are safe and dramatically effective. The experience of 4.5+ billion people worldwide who have received a Covid vaccine assures us that they entail no significant risks of sickness, infertility, or miscarriage. Moreover, as the CDC illustrates, fully vaccinated and boosted Americans this past fall had a 10 times lower risk of testing positive for Covid and a 20 times lower risk of dying from it. Given Covid’s virulence, why wouldn’t any reasonable person welcome the vaccine and other non-constraining health-protective measures? How can a U.S. senator scorn protection that is 90+ percent effective? Does he also shun less-than-100%-effective seat belts, birth control, tooth brushing, and the seasonal flu vaccine that his doctor surely recommends? To say bluntly what so many are wondering: Has Covid become a pandemic of the stupid? Lest we presume so, psychological science has repeatedly illuminated how even smart people can make not-so-smart judgments. As Daniel Kahneman and others have demonstrated, intelligent people often make dumb decisions. Researcher Keith Stanovich explains: Some biases—such as our favoring evidence that confirms our preexisting views—have “very little relation to intelligence.” So, if we’re not to regard the resilient anti-vax minority as stupid, what gives? If, with Spinoza, we wish not to ridicule but to understand, several psychological dynamics can shine light. Had we all, like Rip Van Winkle, awakened to the clear evidence of Covid’s virulence and the vaccine efficacy, surely we would have more unanimously accepted these stark realities. Alas, today’s science-scorning American subculture seeded skepticism about Covid before the horror was fully upon us. Vaccine suspicion was then sustained by several social psychological phenomena that we all experience. Once people’s initial views were formed, confirmation bias inclined them to seek and welcome belief-confirming information. Motivated reasoning bent their thinking toward justifying what they had come to believe. Aided by social and broadcast media, group polarization further amplified and fortified the shared views of the like-minded. Misinformation begat more misinformation. Moreover, a powerful fourth phenomenon was at work: belief perseverance. Researchers Craig Anderson, Mark Lepper, and Lee Ross explored how people, after forming and explaining beliefs, resist changing their minds. In two of social psychology’s great but lesser known experiments, they planted an idea in Stanford undergraduates’ minds. Then they discovered how difficult it was to discredit the idea, once rooted. 
Their procedure was simple. Each study first implanted a belief, either by proclaiming it to be true or by offering anecdotal support. One experiment invited students to consider whether people who take risks make good or bad firefighters. Half looked at cases about a risk-prone person who was successful at firefighting and a cautious person who was not. The other half considered cases suggesting that a risk-prone person was less successful at firefighting. Unsurprisingly, the students came to believe what their case anecdotes suggested. Then the researchers asked all the students to explain their conclusion. Those who had decided that risk-takers make better firefighters explained, for instance, that risk-takers are brave. Those who had decided the opposite explained that cautious people have fewer accidents. Lastly, Anderson and his colleagues exposed the ruse. They let students in on the truth: The cases were fake news. They were made up for the experiment, with other study participants receiving the opposite information. With the truth now known, did the students’ minds return to their pre-experiment state? Hardly. After the fake information was discredited, the participants’ self-generated explanations sustained their newly formed beliefs that risk-taking people really do make better (or worse) firefighters. So, beliefs, once having “grown legs,” will often survive discrediting. As the researchers concluded, “People often cling to their beliefs to a considerably greater extent than is logically or normatively warranted.” In another clever Stanford experiment, Charles Lord and colleagues engaged students with opposing views of capital punishment. Each side viewed two supposed research findings, one supporting and the other contesting the idea that the death penalty deters crime. So, given the same mixed information, did their views later converge? To the contrary, each side was impressed with the evidence supporting their view and disputed the challenging evidence. The net result: Their disagreement increased. Rather than using evidence to form conclusions, they used their conclusions to assess evidence. And so it has gone in other studies, when people selectively welcomed belief-supportive evidence about same-sex marriage, climate change, and politics. Ideas persist. Beliefs persevere. The belief-perseverance findings reprise the classic When Prophecy Fails study led by cognitive dissonance theorist Leon Festinger. Festinger and his team infiltrated a religious cult whose members had left behind jobs, possessions, and family as they gathered to await the world’s end on December 21, 1954, and their rescue via flying saucer. When the prophecy failed, did the cult members abandon their beliefs as utterly without merit? They did not, and instead agreed with their leader’s assertion that their faithfulness “had spread so much light that God had saved the world from destruction.” These experiments are provocative. They indicate that the more we examine our theories and beliefs and explain how and why they might be true, the more closed we become to challenging information. When we consider and explain why a favorite stock might rise in value, why we prefer a particular political candidate, or why we distrust vaccinations, our suppositions become more resilient. Having formed and repeatedly explained our beliefs, we may become prisoners of our own ideas. Thus, it takes more compelling arguments to change a belief than it does to create it. 
Republican representative Adam Kinzinger understands: “I’ve gotten to wonder if there is actually any evidence that would ever change certain people’s minds.” Moreover, the phenomenon cuts both ways, and surely affects the still-fearful vaccinated and boosted people who have hardly adjusted their long-ago Covid fears to the post-vaccine, Omicron new world. The only known remedy is to “consider the opposite”—to imagine and explain a different result. But unless blessed with better-than-average intellectual humility, as exhibited by most who accept vaccine science, we seldom do so. Yet there is good news. If employers mandate either becoming vaccinated or getting tested regularly, many employees will choose vaccination. As laboratory studies remind us, and as real-world studies of desegregation and seat belt mandates confirm, our attitudes will then follow our actions. Behaving will become believing. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
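A small arithmetic aside on the CDC figures cited at the top of this post: "times lower risk" and "percent effective" are two framings of the same ratio, which is why a 10-fold risk reduction corresponds to the "90+ percent effective" protection the essay describes:

```python
# Translate the CDC relative-risk figures cited above into "percent effective"
# (effectiveness = 1 - risk ratio). A quick check, not new data.
for outcome, times_lower in [("testing positive", 10), ("dying of Covid", 20)]:
    effectiveness = 1 - 1 / times_lower
    print(f"{times_lower}x lower risk of {outcome} -> {effectiveness:.0%} effective")
```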

Author
01-07-2022
08:05 AM
In Exploring Psychology, 12th Edition, Nathan DeWall and I report that autism spectrum disorder (ASD) “is now diagnosed in 1 in 38 children in South Korea, 1 in 54 in the United States, 1 in 66 in Canada….” Check that. A new CDC report on 2018 data raises the continually increasing U.S. proportion to 1 in 44 (23 per 1,000 among 8-year-olds in eleven representative communities followed by the Autism and Developmental Disabilities Monitoring Network). The report also confirms that ASD diagnoses are four times more common among boys than girls. Psychologist Simon Baron-Cohen believes the gender imbalance is because boys tend to be “systemizers”: They more often understand things according to rules or laws, as in mathematical and mechanical systems. Girls, he contends, tend to be “empathizers”: They excel at reading facial expressions and gestures. And what racial/ethnic group do you suppose has the highest rate of ASD diagnoses? The answer: There are no discernible differences (nor across socioeconomic groups). In 2018, ASD was diagnosed equally often among all racial/ethnic groups. A final fact to ponder: 4-year-olds, the CDC reports, were “50 percent more likely to receive an autism or special education classification” than were 8-year-olds. So what do you think? Is the increasing ASD diagnosis rate—over time and of 4-year-olds—a welcome trend? Is Karen Remley, director of the CDC’s National Center on Birth Defects and Developmental Disabilities, right to regard this “progress in early identification” as “good news because the earlier that children are identified with autism the sooner they can be connected to services and support”? Or does the increased labeling of children become a self-fulfilling prophecy that assigns children to a category that includes some social deficiency, and then treats them differently as a result? And does the trend reflect some relabeling of children’s disorders, as reflected in the decreasing diagnoses of “cognitive disorder” and “learning disability”? (The popularity of different psychiatric labels does exhibit cultural variation across time and place.) In this COVID-19 era of anti-vax fears, this much we know for sure: One thing that does not contribute to rising ASD diagnoses is childhood vaccinations. Children receive a measles/mumps/rubella (MMR) vaccination in early childhood, about the time ASD symptoms first get noticed—so some parents naturally presumed the vaccination caused the ensuing ASD. Yet, despite a fraudulent 1998 study claiming otherwise, vaccinations actually have no relationship to the disorder. In one study that followed nearly 700,000 Danish children, those receiving the measles/mumps/rubella vaccine were slightly less likely to later be among the 6517 ASD-diagnosed children. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Author
11-17-2021
10:07 AM
Consider: If blessed with income and wealth beyond your needs, what would you do with it? If you decided to give away your excess wealth, would you focus it on present needs (by distributing it immediately) or future needs (by accumulating wealth for later distribution)? Consider two sample answers to the second question, both drawn from real life: Give for immediate use. In Bremerton, Washington, a generous anonymous donor is giving away $250,000 grants to local, national, and international nonprofits with a condition: They must spend it in the next 2 years. The seven recipients to date are gladly doing so by hiring new staff, giving scholarships, feeding people, and so forth. There’s no time like the present. Give today, but maximize future impact. The John Templeton Foundation, which I have served as a trustee, had a future-minded benefactor who grew his wealth by living simply and investing half his income from his earliest working years. Thanks to his investment success and the exponential mathematics of investment compounding, he was able, by his death at age 95, to endow the foundation, which today has nearly $4 billion in assets. Like all U.S. foundations, the foundation has a mandated 5 percent annual payout rate—meaning that they give today but with eyes also on more distant horizons. So, would you advise prioritizing the present (as the Bremerton donor has) or also focusing on the future (as most foundations do)? [Image: Dimitri Otis/Stone/Getty Images] The Initiative to Accelerate Charitable Giving would appreciate the Bremerton donor. As the world recovers from Covid and strives for racial justice, the Initiative perceives that “demands for services from charities are greater than ever.” So, it argues, foundations should increase their giving now. The Patriotic Millionaires, co-led by Abigail Disney, have proposed doubling, for three years, the required foundation payout from 5 to 10 percent. The Accelerating Charitable Efforts Act, co-sponsored by Senators Angus King (I-ME) and Chuck Grassley (R-IA), would incentivize a 7 percent foundation payout rate (by waiving the 1.39 percent investment income tax for any year in which payout tops 7 percent of assets). Do you agree with this strategy—is now the time to give? Should we take care of our time, and leave it to future people to take care of theirs? If so, consider: Prioritizing the present will likely diminish a foundation’s future effectiveness. Given that asset-balanced foundation endowments have tended to earn less than 7 percent on their total investments,[1] even a 7 percent payout mandate would, over time, likely shrink a foundation’s assets and giving capacity. Assuming a continuation of long-term stock market performance, the Templeton Foundation calculates that its 50-year total giving would be almost double under a 5 percent payout (nearly $20 billion) vs. a 10 percent payout (less than $12 billion). Given both current and future human needs, would you still support a mandate that foundations distribute more of their assets now? Are today’s crises likely greater than tomorrow’s? The present versus future ethical dilemma brings to mind three related psychological principles: Temporal discounting. Humans often value immediate rewards over larger future rewards—a dollar today over $1.10 in a month. The phenomenon is familiar to financial advisors who plead with clients to value their future, and to harness the magic of compounding by investing today.
Being financially nearsighted, our governments also tend to spend public monies on our present needs rather than our and our descendants’ future needs. Some of this present-focus reflects our commendable capacity for empathy—our hearts responding to present needs that we see and feel. But temporal discounting is also manifest in today’s consumers who oppose carbon taxes and clean energy mandates lest their lifestyle be restrained for the sake of humanity’s future. Temporal discounting undermines sustainability. Self-control: The ability to delay gratification. We aim to teach our children self-control—to control their impulses and to delay short-term gratification for bigger longer-term rewards. Better (in Walter Mischel’s classic experiment) two future marshmallows than one now. Such self-control predicts better school performance, better health, and higher income. Personal time perspective: Past, present, or future. In a 6-minute TED talk, Phil Zimbardo compared people with past, present, or future orientations—those who focus on their memories of what was, on their present situation, or on what will be. Although the good life is a mix of each, a future orientation bodes well for adolescents. Living with one eye on the future enables bigger future rewards and minimizes the risk of school drop-out, problem gambling, smoking, and delinquency. Patience pays. So, mindful of both today’s and tomorrow’s needs, would you favor or oppose the proposals to increase foundation payout requirements? Of course, you say, both the present and the future matter. Indeed. But to what extent should we prioritize poverty relief (or scholarships or art galleries) today versus in the future? Who matters more—us and our people, or our great grandchildren and their compatriots? Or do we and our descendants all matter equally? (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.) [1] Century-long U.S. stock returns have averaged near 10 percent, or about 7 percent when inflation-adjusted. But most foundations also have other assets that have experienced a lower rate of return—in cash, bonds, and, for example, emerging markets.
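The Templeton 50-year comparison can be approximated with a few lines of arithmetic. In the sketch below, the roughly $4 billion starting endowment comes from the essay; the 7 percent annual return and the pay-after-growth timing are my assumptions, so the totals are illustrative rather than the foundation's own projections:

```python
# Sketch of cumulative 50-year giving under different mandated payout rates.
# Starting assets are from the essay; the return and timing conventions are assumptions.
def cumulative_giving(assets_bn: float, annual_return: float, payout: float, years: int) -> float:
    total = 0.0
    for _ in range(years):
        assets_bn *= 1 + annual_return   # investment growth for the year
        grant = assets_bn * payout       # this year's required distribution
        total += grant
        assets_bn -= grant
    return total

for payout in (0.05, 0.07, 0.10):
    total = cumulative_giving(assets_bn=4.0, annual_return=0.07, payout=payout, years=50)
    print(f"{payout:.0%} payout: roughly ${total:.0f} billion given over 50 years")
```

Under these assumptions the lower payout rate, by letting the endowment keep compounding, yields the larger cumulative giving, which is the pattern the essay reports.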

Author
10-28-2021
07:59 AM
Steven Pinker’s books—How the Mind Works, The Blank Slate, The Better Angels of Our Nature, Enlightenment Now, and his latest, Rationality—offer a consistent and important message: Smart, critical thinking attends less to anecdotes that tug at the heart than to realities revealed by representative data. Year after year, 7 in 10 Americans, after reading the news, tell Gallup there has been more crime than in the prior year. In Better Angels, Pinker documents the reality: a long-term crime decline, along with other subsiding forms of violence, including wars and genocide. Enlightenment Now details other ways—from the environment, to life expectancy, to human rights, to literacy, to quality of life—in which, contrary to our news-fed sense of doom, the world actually is getting better. The same thinking-with-data theme pervades Rationality: What It Is, Why It Seems Scarce, Why It Matters. For my money, Chapter 4 (“Probability and Randomness”) alone is worth the book’s price of admission. It’s a chapter I wish I could assign to every AP and college introductory psychology student. Here, according to Pinker, are some noteworthy outcomes of our flawed thinking: Statistical illiteracy. Our tendency to judge the likelihood of events by the ease with which examples come to mind—the availability heuristic—leads us to think folks are more often killed by tornados than by 80-times-deadlier asthma; to believe that America’s immigrant population is 28 percent (rather than 12 percent); to guess that 24 percent of Americans are gay (rather than 4.5 percent). And how many unarmed Americans of all races would you guess are killed by police in an average year? Sixty-five, reports Pinker (from reported 2015–2019 FBI data). Unwise public spending. In 2019, after a Cape Cod surfer became Massachusetts’ first shark fatality in more than eight decades, towns equipped their beaches with scary billboard warnings and blood hemorrhage-control kits, and looked into “towers, drones, planes, balloons, sonar, acoustic buoys, and electromagnetic and odorant repellants” . . . while not investing in reducing car accident deaths at a fraction of the cost, with improved signage, barriers, and law enforcement. Mitigating climate change. Compared with deaths caused by mining accidents, lung disease, dam failures, gas explosions, and fouled air, modern nuclear power, despite its vivid few failures, “is the safest form of energy”—and emits no greenhouse gases. Exaggerated fears of terrorists. Although terrorists annually kill fewer people than are killed by lightning, bee stings, or bathtub drowning, we have engaged in massive anti-terrorist spending and launched wars that have killed hundreds of thousands. Amplified dread of school shootings. “Rampage killings in American schools claim around 35 victims a year, compared with about 16,000 routine police-blotter homicides,” Pinker tells us. In response, “schools have implemented billions of dollars of dubious safety measures . . . while traumatizing children with terrifying active-shooter drills.” “The press is an availability machine,” Pinker observes. “It serves up anecdotes that feed our impression of what’s common in a way that is guaranteed to mislead.” By contrast, unreported good news typically consists “of nothing happening, like a boring country at peace.” And progress—such as 137,000 people escaping extreme poverty each day—creeps up silently, “transforming the world by stealth. . . . There was never a Thursday in October in which it suddenly happened.
So one of the greatest developments in human history—a billion and a quarter people escaping squalor [in the last 25 years]—has gone unnoticed.” This latest offering from one of psychology’s public intellectuals joins kindred-spirited data-based perspectives by Hans Rosling (Factfulness: Ten Reasons We’re Wrong About the World—and Why Things are Better Than You Think), Max Roser (ourworldindata.org), and William MacAskill (Doing Good Better), as well as my Intuition: Its Powers and Perils. Together, they help us all to think smarter by advocating reality-based, statistically literate, rational decisions that can help us spend and give more wisely, and sustain a flourishing world. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Author
10-01-2021
08:02 AM
Consider the great COVID irony: As Surgeon General Vivek Murthy noted recently, “Vaccinated people may overestimate their peril, just as unvaccinated people may underestimate it.” Murthy could omit “may,” for we now have a string of national surveys (here, here, here, and here) showing that unvaccinated folks are much less likely to fear the virus. Moreover, those who are unvaccinated—and thus vastly more at risk of contracting and transmitting the virus—are also much less likely to protect themselves and others by wearing a mask. (If you see someone wearing a mask in your grocery store, they’re probably vaccinated.)

Unvaccinated people’s discounting the threat, distrusting science, and prioritizing their rights to be unvaccinated and unmasked provide us social psychologists with a gigantic case study of unrealistic optimism, motivated reasoning, and group polarization. But looking forward, we can offer a prediction: As vaccine mandates increase—inducing more people to accept vaccination rather than be excluded from events or flights or bothered with weekly testing—attitudes will follow behavior.

As every student of psychological science knows, two-way traffic flows between our attitudes and our behavior. We will often stand up for what we believe. But we also come to believe in what we stand up for. When people are induced to play a new role—perhaps during their first days in the military or on a new job—their initial play-acting soon feels natural as the new actions become internalized. When, in experiments, people are induced to support something about which they have doubts, they often come to accept their own words. And in the laboratory, as in life, hurtful acts toward another foster disparagement, while helpful acts foster liking. In short, we not only can think ourselves into action but also act ourselves into a new way of thinking. Behaving becomes believing.

The attitudes-follow-behavior phenomenon is strongest in situations where we feel some responsibility for our action, and thus some need to explain it to ourselves—resolving any dissonance between our prior thinking and our new behavior. But the federal mandate—get vaccinated or face weekly testing—does (smartly) preserve some choice.

What is more, we have ample historical evidence of mandates swaying public opinion. In the years following the 1954 Supreme Court school desegregation decision and the 1964 Civil Rights Act, White Americans—despite initial resistance—expressed steadily diminishing prejudice. Some resisted, and hate lingers. Yet as national anti-discrimination laws prompted Americans in different regions to act more alike, they also began to think more alike.

Seat belt mandates, which at first evoked an angry defense of personal liberty, provide another example of attitudes following actions. Here in Michigan, the state representative who introduced the state’s 1982 seat belt law received hate mail, some comparing him to Hitler—despite abundant evidence that, like today’s vaccines, seat belts save lives. But time rolled on, and so did seat belt acceptance, with Michiganders’ approval of the law rising to 85 percent by the end of 1985 and usage rising from 20 percent in 1984 to 93 percent by 2014. Ditto other government policies, such as Social Security and Medicare—once contentious, now cherished.

So amid the rampant misinformation there is good news: Mandates can work. They can get people to protect themselves and others, as have nearly all United Airlines employees and New York health care workers.
And after doing so, people will tend to embrace the way things are. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Author
08-27-2021
10:08 AM
In the United States, anti-Asian prejudice has resurged, with nearly 6,600 hate incidents against Asian Americans and Pacific Islanders (AAPIs) recorded during the COVID-19 pandemic’s first year. Reports of harassment, vandalism, and brutal attacks on Asian Americans, from the very young to the very old, have made one-third of Asian Americans fear for their safety. An Asian American friend last week recounted his fearful father getting a gun for self-protection. Women and girls reportedly endured two-thirds of these hostilities, including the horrific March 2021 Atlanta shooting spree that claimed the lives of eight people—six of them Asian American women. One in four AAPI-owned small businesses has also faced pandemic-related anti-Asian words or acts.

Today’s bigotry extends a long history, beginning with the 20th century’s ground zero event of anti-Asian prejudice: the World War II exclusion of 120,000 West Coast people of Japanese descent. As I have explained here and here, it was on my home island—Bainbridge Island, near Seattle—that the relocation and incarceration of Japanese Americans began. With only six days’ notice, Exclusion Order #1 mandated the evacuation of the island’s 276 Japanese American residents, each allowed to bring only what they could carry. (One government concern: The island’s south end overlooked a narrow passage to a naval shipyard and submarine base.)

Today, that March 30, 1942 ferry departure point is the site of a national “Japanese American Exclusion Memorial.” On each visit home to Bainbridge, I return to the memorial, where a 276-foot wall with wood sculptures tells parts of the story.

As my insurance agent father would later recall, it was a devastating day as the islanders bid farewell to their neighbors. His autobiography recalls the sadness, and also the lingering discrimination:

“We had many Japanese friends and it was heartbreaking for us when the war started and the Japanese people on Bainbridge Island were ordered into concentration camps. . . . Most were educated here on the island [and] it was hard to believe that they were not as loyal Americans as we. I did all I could to keep the insurance on their homes in force. . . . The insurance companies that I represented were a bit prejudiced against insuring the Japanese people, particularly for liability insurance, for fear that lawsuits would be brought against them and the juries, being Caucasian, might be prejudiced in their jury awards. [One post-war returnee, wanting insurance on a car] named his four brothers and recited how all five of them had been in one or another of the American armed forces and had served in Italy, France, Germany and Japan. [When I showed the letter to the insurance manager] she said G___ D___ you, Ken Myers, for bringing me this letter. How can I say, ‘No’? So she wrote the first policy on a Japanese American after the war.”

In contrast to the media-fomented bigotry that greeted other West Coast internees on their post-war return home, the Bainbridge internees were welcomed back by most. On my recent visit to the memorial, I chanced to meet internee Lilly Kodama recalling her experience, as a seven-year-old, of being abruptly taken from her world. She presumed she was going with her family on a shopping trip, and was surprised to find her cousins and neighbors on the dock. But Kodama also spoke of the support of fellow islanders, including my father and, especially, Walt and Millie Woodward, the heroic local newspaper owners who challenged the exclusion and then published news from the camps.

The Bainbridge Island contrast illustrates what social psychologists have often reported: Social contact, especially between parties of equal status, restrains prejudice. In minimal-contact California, people of European descent and people of Japanese descent lived separately. Few people bid the departing internees goodbye. On their return, “No Japs Here” signs greeted them. Minimal contact enabled maximal prejudice. On high-contact Bainbridge, islanders intermingled as school classmates (as illustrated in this 1935 elementary school picture). Their homes, strawberry farms, and businesses were dispersed. In their absence, thirteen empty chairs were placed on stage at a high school graduation so all would remember who was missing. Internees returning after the war were greeted with food and assistance. Cooperative contact enabled minimal prejudice.

Lincoln Elementary School photo courtesy of the Bainbridge Island Historical Museum

This real-life social experiment has been replicated in our own time: People in states with the least immigrant contact express the most anti-immigrant antipathy, while those who know immigrants as neighbors, classmates, or fellow workers more often profess a welcoming attitude.

Amid the anti-Asian prejudice of 2021, how can we replace incidents of closed fists and tight jaws with open arms? We can seek and facilitate intergroup contacts. We can travel and experience other cultures. We can welcome diversity into our communities and workplaces. We can challenge slurs. We can educate ourselves and others about our culture’s history. In such ways we can affirm the memorial’s closing words.

P.S. Frank Kitamoto, after being taken from the island as a 2-year-old and later founding the Bainbridge Island Japanese American Community, recalled fellow islanders’ support in this 2-minute video.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com; follow him on Twitter: @DavidGMyers.)

Author
08-09-2021
08:43 AM
Friends matter. If psychological science has proven anything, it’s that feeling liked, supported, and encouraged by close friends and family fosters health and happiness. Having friends to confide in calms us, enables better sleep, reduces blood pressure, and even boosts immune functioning. Compared with socially isolated or lonely folks, those who are socially connected are at less risk of premature death. As the writer of Ecclesiastes surmised, “Woe to one who is alone and falls and does not have another to help.”

But how many friends are enough? And for how many meaningful relationships do we have time and energy? Robin Dunbar, a recently retired Oxford evolutionary psychologist, offers an answer: about 150. He first derived that number—“Dunbar’s number”—by noting, in primates, the relationship between neocortex size and group size: the bigger the neocortex, the bigger the group. Extrapolating from his studies of non-human primates, he predicted that a manageable human group size would be around 150.

Many evolutionists and animal behavior observers find Dunbar’s number amusing but, at best, simplistic. They note, for example, that primate group sizes are also influenced by diet and predators (see here and here). As one primate-culture expert told me, “We humans are too complicated to expect these simple numerical approaches to work.”

In response, Dunbar—who seems not to suffer his critics gladly—vigorously defends his number. In his new book Friends (now available in the U.S. on Kindle in advance of a January 2022 hardcover), Dunbar itemizes examples of 100- to 250-person human groups, including Neolithic, medieval, and 18th-century villages; Hutterite communities; hunter-gatherer communities; Indigenous communities from Inuit to Aboriginal; military companies; wedding invitees; and Christmas-card networks. “Every study we have looked at,” he emphasizes, “has consistently suggested that people vary in the number of friends they have, and that the range of variation is typically between 100 and 250 individuals.”

But surely, you say, that number (which includes both family and nonfamily friends) varies across individuals and life circumstances. Indeed, notes Dunbar, it varies with:

- Age. The number of our meaningful relationships forms an inverted U-shaped curve across the lifespan. It starts at birth with one or two, and rises in the late teens until plateauing in our 30s at about 150. After the late 60s or early 70s, it “starts to plummet.... We start life with one or two close carers and, if we live long enough, we end life that way too.”
- Personality. Extraverts are (no surprise) social butterflies, with lots of friends. Introverts accumulate fewer friends, but invest in them more intensely.
- Family size. If you live surrounded by a large clan, you likely have fewer nonfamily friends than someone from a small or distant family. Have a baby, and—with less time for other relationships—your friendship circle may contract for a time. (Perhaps you have felt a diminished connection with friends after they had a baby or fell intensely in love?)

Dunbar also describes people’s friendship layers. On average, he reports, people have about 5 intimate, shoulder-to-cry-on friends—people they’re in touch with “at least once a week and feel close to.” Including these, they have, in sum, 15 close friends whom they’re in contact with at least monthly. The 50-friend circle incorporates our “party friends”—those we are in contact with at least once every six months. And the 150 totality incorporates those we’re in touch with at least annually—“what you might call the wedding/bar mitzvah/funeral group—the people that would turn up to your once-in-a-lifetime events.”

Dunbar’s layers: Our friendship circles are of increasing size and decreasing investment/intensity (with each circle including the numbers in its inner circles). As on Facebook, “friends” include family.

Our friendship circles vary in the time and concern we devote to them, says Dunbar. From studying people’s time diaries and friendship ratings, and from analyzing big data on phone texting and calls, he found that we devote about 40 percent of our total social time to the 5 people in the innermost circle, and a further 20 percent to the additional 10 people in the 15-person “close friends” circle. Think about it: 60 percent of our total social effort is devoted to just 15 people. The remaining ~130 have to make do with what’s left over.

Do Dunbar’s numbers resonate with your experience? Do you have a small inner core of friends (including family) who would drop anything to support you, and vice versa? Are these supplemented by a somewhat larger group of close-friend social companions? And do you have further-out layers of good friends and meaningful acquaintances whom you would welcome to significant life events?

Even if, as critics charge, Dunbar’s numbers are too exact, two conclusions seem apt. First, as Aristotle long ago recognized, we humans are social animals. We flourish and find protection and joy in relationships. Second, close relationships are psychologically expensive. Life requires us to prioritize, allocating our limited time and mental energy among our relationships. Friendships feed our lives—but, as with food, we’ve only got room for so much.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com; follow him on Twitter: @DavidGMyers.)
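For readers who like to see the arithmetic behind those percentages, here is a minimal sketch. The layer sizes (5, 15, 50, 150) and time shares (40 percent, 20 percent) are simply the figures reported above; everything else in the snippet is illustrative, not from Dunbar or Myers.

```python
# A rough sketch of Dunbar's cumulative friendship layers and the reported
# split of social time. Layer sizes and shares come from the post; the rest
# is illustrative.

# Each layer includes the people counted in the inner layers.
layers = {"intimate": 5, "close": 15, "party": 50, "meaningful": 150}

share_innermost_5 = 0.40   # ~40% of social time goes to the closest 5 people
share_next_10 = 0.20       # a further ~20% goes to the next 10 people

share_closest_15 = share_innermost_5 + share_next_10        # ~60% to just 15 people
people_outside_15 = layers["meaningful"] - layers["close"]  # everyone else in the 150
share_left_over = 1 - share_closest_15                      # what remains for them

print(f"Closest 15 people receive ~{share_closest_15:.0%} of social time")
print(f"The other ~{people_outside_15} people share ~{share_left_over:.0%}")
```

By this arithmetic the leftover group works out to roughly 135 people; the post rounds it to "~130," but the point is the same: most of our social effort goes to a small inner core.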