Talk Psych Blog - Page 3
david_myers · Author · 11-01-2022 02:04 PM
Reading my discipline’s discoveries leaves me sometimes surprised and frequently fascinated by our mind and its actions. In hopes of sharing those fascinations with the wider world, I’ve authored How Do We Know Ourselves? Curiosities and Marvels of the Human Mind, which I’m pleased to announce is published today by Farrar, Straus and Giroux. Its 40 bite-sized essays shine the light of psychological science on our everyday lives. I take the liberty of sharing this with you, dear readers of this wee blog, partly because the book is also a fund-raiser for the teaching of high school psychology. (All author royalties are pledged to support psychology teaching—half to the American Psychological Foundation to support Teachers of Psychology in Secondary Schools, and half to the Association for Psychological Science Fund for the Teaching and Public Understanding of Psychological Science.) My hope is that some of you—or perhaps some of your students (a Christmas gift idea for their parents?)—might enjoy these brief and playful musings half as much as I enjoyed creating them. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
david_myers · Author · 10-12-2022 09:19 AM
"I could stand in the middle of Fifth Avenue and shoot somebody, and I wouldn't lose any voters, OK?" ~ Donald J. Trump, January 23, 2016 The conservative sage and former George W. Bush speech writer Peter Wehner is aghast at what his U.S. Republican Party has come to accept: Republican officials showed fealty to Trump despite his ceaseless lying and dehumanizing rhetoric, his misogyny and appeals to racism, his bullying and conspiracy theories. No matter the offense, Republicans always found a way to look the other way, to rationalize their support for him, to shift their focus to their progressive enemies. As Trump got worse, so did they. Indeed, in the wake of misappropriated top-secret documents, civil suits over alleged business frauds, and the revelations of the January 6 House Select Committee, Donald Trump’s aggregated polling data approval average increased from late July’s 37 percent to today’s 41 percent, a virtual tie with President Biden. Democrats join Wehner in being incredulous at Trump’s resilient approval rating, even as MAGA Republicans are similarly incredulous at Biden’s public approval. In politics as in love, we are often amazed at what others have chosen. Psychological science offers some explanations for why people might be drawn, almost cult-like, to charismatic autocratic leaders on the right or left. Perceived threats and frustrations fuel hostilities. Punitive, intolerant attitudes, which form the core of authoritarian inclinations, surface during times of change and economic frustration. During recessions, anti-Black prejudice has increased. In countries worldwide, low income years and low income people manifest most anti-immigrant prejudice. In the Netherlands and Britain, times of economic or terrorist threat have been times of increased support for right-wing authoritarians and anti-immigrant policies. In the U.S., MAGA support rides high among those with less than a college education living amid high income inequality. The illusory truth effect: Mere repetition feeds belief. In experiments, repetition has a strange power. It makes statements such as ““A galactic year takes 2500 terrestrial years”” seem truer. Hear a made-up smear of a political opponent over and over and it becomes more believable. Adolf Hitler, George Orwell, and Vladimir Putin all have understood the persuasive power of repetitive propaganda. So have Barack Obama (“If they just repeat attacks enough and outright lies over and over again . . . people start believing it”) and Donald Trump (“If you say it enough and keep saying it, they’ll start to believe you”). Conflicts feed social identities. We are social animals. Our ancestral history prepares us to protect ourselves in groups, to cheer for our groups, even to kill or die for our groups. When encountering strangers, we’re primed to make a quick judgment: friend or foe?—and to be less wary of those who look and sound like us. Conflicts—from sporting events to elections to wars—strengthen our social identity: our sense of who we are and who they are. In the U.S., White nationalist rallies serve to solidify and sustain aggrieved identities. Still, I hear you asking: Why do people, once persuaded, persist in supporting people they formerly would have shunned, given shocking new revelations? 
In just-published research, Duke University psychologists Brenda Yang, Alexandria Stone, and Elizabeth Marsh repeatedly observed a curious “asymmetry in belief revision”: People will more often come to believe a claim they once thought false than to unbelieve something they once thought true. The Duke experiments focused on relative trivia, such as whether Michelangelo’s statue of David is located in Venice. But consider two real life examples of people’s reluctance to unbelieve. Sustained Iraq War support. The rationale for the 2003 U.S. war against Iraq was that its leader, Saddam Hussein, was accumulating weapons of mass destruction. At the war’s beginning, Gallup reported that only 38 percent of Americans said the war was justified if there were no such weapons. Believing such would be found, 4 in 5 people supported the war. When no WMDs were found, did Americans then unbelieve in the war? Hardly. Fifty-eight percent still supported the war even if there were no such weapons (with new rationales, such as the supposed liberation of oppressed Iraqi people). Sustained Trump support. In 2011, the Public Religion Research Institute asked U.S. voters if “an elected official who commits an immoral act in their personal life can still behave ethically and fulfill their duties in their public and professional life.” Only 3 in 10 White evangelical Protestants concurred that politicians’ personal lives have no bearing on their public roles. But by July of 2017, after supporting Donald Trump, 7 in 10 White evangelicals were willing to separate the public and personal. It was a “head-spinning reversal,” said the PRRI CEO. Moreover, despite tales of Trump’s sexual infidelity, dishonesty, and other broken Ten Commandments, White evangelicals’ support of Trump continues. Once someone or something is embraced, unbelieving—letting go—is hard. In Stanley Milgram’s famed obedience experiments, people capitulated in small steps—first apparently delivering a mild 15 volts, then gradually delivering stronger and stronger supposed electrical shocks—after progressively owning and justifying their actions. Each repugnant act made the next easier, and also made the commitment more resilient. “With each moral compromise,” observes Peter Wehner, “the next one—a worse one—becomes easier to accept.” In small steps, conscience mutates. Cognitive dissonance subsides as people rationalize their commitment. Confirmation bias sustains belief as people selectively engage kindred views. Fact-free chatter within one’s echo chamber feeds group polarization. And so, after believing in a would-be autocrat—after feeling left behind, after hearing repeated lies, and after embracing a political identity—it becomes hard, so hard, to unbelieve. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
david_myers · Author · 08-24-2022 09:43 AM
Language evolves. The fourteenth-century English poet Geoffrey Chaucer, who lived but 25 generations ago, would struggle to communicate with us.

Word meanings change, too. Yesterday's evangelicals who led the abolition of Britain's slave trade would wince at a particular subset of today's American evangelicals—Trump acolytes who are infrequent worshippers, and for whom "evangelical" has become more of a cultural than a religious identity. In political discourse, a "socialist" becomes not just a person who favors public-owned industries, but someone who advocates reduced inequality and a stronger social safety net. And "critical race theory," a once obscure idea in law journals that has been nonexistent in social psychology's prejudice chapters and books, has been distorted and inflated as a way to malign efforts to ameliorate racism and teach real history.

Similar concept creep has happened in the mental health world, observes University of Melbourne psychologist Nick Haslam. Concepts with a precise meaning have expanded to capture other or less extreme phenomena. Examples:

"Addiction" (compulsive substance use) has expanded to include money-depleting gambling, sexual fixations, time-draining gaming, and even excessive shopping and social media use, as in: "I'm addicted to my phone." Thus, between 1970 and 2020, the proportion of academic psychology abstracts mentioning "addiction" increased sixfold.

"Abuse" still refers to intentional physical harm or inappropriate sexual contact, but in everyday use may now include neglectful omissions and painful mistreatments: hurtful teasing, distressing affronts, or overwrought parents screaming at their children. Accompanying this semantic inflation, the proportion of post-1970 psychology abstracts mentioning "abuse" has multiplied seven times over.

"Trauma" initially referred to physical injury (as in traumatic brain injury), then expanded to encompass horrific emotional traumas (rape, natural disaster, wartime combat, torture), and now has been inflated to include stressful life experiences within the range of normal human experience—job loss, serious illness, and relationship breakups—and even, reports Harvard psychologist Richard McNally, wisdom tooth extraction, enduring obnoxious jokes, and the normal birthing of a healthy child. So, no surprise, over the past half century the proportion of psychology abstracts mentioning "trauma" has increased tenfold.

Haslam offers other concept-creep examples, such as broadening the "prejudice" of yesterday's bigotry to include today's subtler but persistent "implicit biases" and "microaggressions." And we could extend his list. ADD, ADHD, autism spectrum disorder, and the DSM-5's new "prolonged grief disorder" all refer to genuine pathologies that have been broadened to include many more people. At least some of yesterday's normally rambunctious boys, easily distracted adults, socially awkward people, and understandably bereaved parents or spouses are now assigned a psychiatric label and offered mental health or drug therapies. This psychologization or psychiatrization of human problems serves to expand the mental health and pharmacology industries, entailing both benefits and costs.

Concept creep does have benefits. It represents an expansion of our circle of moral concern. As vicious violence, appalling adversity, and blatant bigotry have subsided in Western cultures, albeit with horrendous exceptions, we have become more sensitive to lesser but real harms—to upsetting maltreatment, dysfunctional compulsions, and toxic systemic biases. Progressive and empathic people, being sensitive to harm-doing, mostly welcome the expanded concepts of harm and victimization.

But concept creep, Haslam argues, also risks casting more and more people as vulnerable and fragile—as, for example, helpless trauma victims rather than as potentially resilient creatures. "I am beginning to think that our world is in a constant state of trauma," writes one psychotherapist/columnist. "Living with trauma, PTSD, unregulated depression and anxiety is almost the norm these days." As is common in many professions, mental health workers may sometimes overreach to broaden their reach and client base: "Your hurt was an abuse, and you need me to help you heal."

Concept creep also risks trivializing big harms, Haslam notes, by conflating them with lesser harms: "If everyday sadness becomes 'depression' and everyday stressors become 'traumas' then those ideas lose their semantic punch." If more and more pet-loving people seek to travel with their "emotional support" companions, the end result may be restricted access for those for whom companion animals serve a vital function.

"Many traumas do indeed have severe and lasting effects that must not be minimized," Haslam and co-author Melanie McGrath emphasize. "However, as the concept of trauma stretches to encompass less extreme experiences, the tendency to interpret marginal or ambiguous events as traumas is apt to promote hopelessness, submission, and passivity in response to challenges that might be overcome better if placed in a different interpretive frame."

The bottom line: Addiction, abuse, and trauma are genuine sources of human suffering. But where should we draw the line in defining and treating them?

(For David Myers' other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
katherine_nurre · Macmillan Employee · 08-18-2022 10:58 AM
As we approach the start of a new semester, psychology teachers can use this 5-minute animation written and narrated by David Myers to help students effectively learn and remember course material: https://t.co/24UIFUnWFy
david_myers · Author · 07-21-2022 11:07 AM
“The best time to plant a tree was 20 years ago. The second-best time is now.” ~ Anonymous proverb Character education’s greatest task is instilling a mark of maturity: the willingness to delay gratification. In many studies, those who learn to restrain their impulses—by electing larger-later rather than smaller-now rewards—have gone on to become more socially responsible, academically successful, and vocationally productive. The ability to delay gratification, to live with one eye on the future, also helps protect people from the ravages of gambling, delinquency, and substance abuse. In one of psychology’s famous experiments, Walter Mischel gave 4-year-olds a choice between one marshmallow now, or two marshmallows a few minutes later. Those who chose two later marshmallows went on to have higher college graduation rates and incomes, and fewer addiction problems. Although a recent replication found a more modest effect, the bottom line remains: Life successes grow from the ability to resist small pleasures now in favor of greater pleasures later. Marshmallows—and much more—come to those who wait. The marshmallow choice parallels a much bigger societal choice: Should we prioritize today, with policies that keep energy prices and taxes low? Or should we prioritize the future, by investing now to spare us and our descendants the costs of climate change destruction? “Inflation is absolutely killing many, many people,” said U.S. Senator Joe Manchin, in explaining his wariness of raising taxes to fund climate mitigation. Manchin spoke for 50 fellow senators in prioritizing the present. When asked to pay more now to incentivize electric vehicles and fund clean energy, their answer, on behalf of many of their constituents, is no. But might the cost of inaction be greater? The White House Office of Management and Budget projects an eventual $2 trillion annual federal budget cost of unchecked climate change. If, as predicted, climate warming increases extreme weather disasters, if tax revenues shrink with the economy’s anticipated contraction, and if infrastructure and ecosystem costs soar, then are we being penny-wise and pound-foolish? With the worst yet to come, weather and climate disasters have already cost the U.S. a trillion dollars over the past decade, with the total rising year by year. nattrass /E+/Getty Images The insurance giant Swiss Re also foresees a $23 trillion global economy cost by 2050 if governments do not act now. The Big 4 accounting firm Deloitte is even more apprehensive, projecting a $178 trillion global cost by 2070. What do you think: Are our politicians—by seizing today’s single economic marshmallow—displaying a mark of immaturity: the inability to delay gratification for tomorrow’s greater good? A related phenomenon, temporal discounting, also steers their political judgments. Even mature adults tend to discount the future by valuing today’s rewards—by preferring, say, a dollar today over $1.10 in a month. Financial advisors therefore plead with their clients to do what people are not disposed to do . . . to think long-term—to capitalize on the power of compounding by investing in their future. Alas, most of us—and our governments—are financially nearsighted. We prioritize present circumstances over our, and our children’s, future. And so we defend our current lifestyle by opposing increased gas taxes and clean-energy mandates. The western U.S. 
may be drying up, sea water creeping into Miami streets, and glaciers and polar ice retreating, but even so, only “1 percent of voters in a recent New York Times/Siena College poll named climate change as the most important issue facing the country, far behind worries about inflation and the economy.” The best time to plant a tree, or to have invested in climate protection, was 20 years ago. The worst time is 20 years from now, when severe climate destruction will be staring us in the eye. As we weigh our present against our future, psychological science reminds our political representatives, and all of us, of a profoundly important lesson: Immediate gratification makes today easy, but tomorrow hard. Delayed gratification makes today harder, but tomorrow easier. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
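A side note for quantitatively minded readers: the temporal-discounting example above is, at bottom, compound-interest arithmetic. Here is a minimal Python sketch (my illustration, not from the post). Preferring $1.00 today over $1.10 in a month implies roughly a 10 percent monthly discount rate, which annualizes to over 200 percent, far steeper than any plausible investment return.

```python
# A minimal sketch (illustration only, not from the post) of temporal
# discounting vs. compounding.

def implied_annual_rate(now: float, later: float, months: float) -> float:
    """Annualized discount rate implied by judging `later` dollars in
    `months` months to be worth no more than `now` dollars today."""
    monthly = (later / now) ** (1 / months) - 1
    return (1 + monthly) ** 12 - 1

def compound(principal: float, annual_rate: float, years: int) -> float:
    """Future value of `principal` compounding once a year."""
    return principal * (1 + annual_rate) ** years

if __name__ == "__main__":
    # The post's example: $1.00 today vs. $1.10 in one month.
    rate = implied_annual_rate(1.00, 1.10, 1)
    print(f"Implied annual discount rate: {rate:.0%}")  # ~214%

    # What financial advisors plead for: long-term compounding.
    # Hypothetical: $1,000 at a 7% annual return, left alone for 30 years.
    print(f"$1,000 after 30 years at 7%: ${compound(1000, 0.07, 30):,.0f}")  # ~$7,612
```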
david_myers · Author · 07-05-2022 07:49 AM
With 45,000+ annual U.S. firearm deaths (123 per day, from homicide, suicide, and accidents), America has a gun problem, of which the recent Buffalo, Uvalde, and Highland Park mass killings are horrific examples.

In response, we often hear that the problem is not America's 400 million guns but its people. "We have a serious problem involving families, involving drugs, involving mental health in this country," asserted Colorado congressional representative Ken Buck. "We have become a less safe society generally. Blaming the gun for what's happening in America is small-minded."

To protect against mass killings by 18-year-olds, as in Buffalo and Uvalde, we are told that we don't need to match the minimum age for assault rifle purchase (still 18 after the new gun safety act) with the age for beer purchase (21). We don't need to train and license gun owners as we do drivers. We don't need safe-storage laws or restrictions on large-capacity magazines. Instead, we need to fix the problem perceived by Texas Governor Greg Abbott and National Rifle Association chief executive Wayne LaPierre: evil people. To solve the gun violence problem, we need better people, enabled by commendable social changes: more married fathers, less pornography, fewer violent video games.

And most importantly, we're told, we need to deal with mass killers' mental sickness. "People with mental illness are getting guns and committing these mass shootings," observed former U.S. Speaker of the House Paul Ryan. While president, Donald Trump agreed: "When you have some person like this, you can bring them into a mental institution." Mass killers, he later added, are "mentally ill monsters."

However, reality intrudes. As I documented in an earlier essay, most mentally ill people are nonviolent, most violent criminals and mass shooters have not been diagnosed as mentally ill, and rare events such as mass shootings are almost impossible to predict. As much as we psychologists might appreciate the ostensible high regard, today's psychological science lacks the presumed powers of discernment.

If mental-health assessments cannot predict individual would-be killers, three other factors (in addition to short-term anger and alcohol effects) do offer some predictive power:

Demographics. As the recent massacres illustrate, most violent acts are committed by young males. The late psychologist David Lykken made the point memorably: "We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28."

Past behavior. It's one of psychology's maxims: The best predictor of future behavior is past behavior. The best predictor of future GPA is past GPA. The best predictor of future employee success is past employee success. The best predictor of future class attendance or smoking or exercise is, yes, the recent past. Likewise, recent violent acts are a predictor of future violent acts.

Guns. Compared to Canada, the United States has 3.5 times the number of guns per person and 8.2 times the gun homicide rate. Compared to England, the U.S. has 26 times as many guns per person—and 103 times the gun homicide rate. To check U.S. state variations, I plotted each state's gun-in-home rate with its gun homicide rate (a short correlation sketch follows this post). The correlation is strongly positive, ranging from Massachusetts, where 15 percent of homes have a gun, to Alaska, where 65 percent of homes have a gun—and where the homicide rate is 7 times greater.

Of these three predictor variables, gun policy is the one that, without constraining hunters' rights, society can manage with some success: When nations restrict gun access, the result has been fewer guns in civilian hands, which enables fewer impulsive gun uses and fewer planned mass shootings.

Many people nevertheless believe that, as Senator Ted Cruz surmised after the Uvalde shooting, "What stops armed bad guys is armed good guys." Never mind that in one analysis of 433 active shooter attacks on multiple people, armed lay citizens took out active shooters in only 12 instances. Many more—a fourth of such attacks—ended in a shooter suicide. Moreover, if the answer to bad guys with guns is to equip more good guys with guns, then why are states with more armed good guys more homicidal?

What explains the state-by-state guns/homicide correlation? Are the more murderous Alaskans (and Alabamians and Louisianans) really more "evil" or "mentally ill"? Or is human nature essentially the same across the states, with the pertinent difference being the weapons?

(For David Myers' other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
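For the statistically curious, the state-by-state claim above reduces to a Pearson correlation across pairs of (gun-ownership rate, gun-homicide rate). A minimal sketch, assuming Python 3.10+ for statistics.correlation and using a few illustrative, hypothetical state values rather than the author's dataset:

```python
# A minimal sketch (hypothetical values, not the author's dataset) of the
# state-level analysis described in the post: correlate gun-ownership rates
# with gun-homicide rates across states and report Pearson's r.

from statistics import correlation  # Python 3.10+

# (state, % homes with a gun, gun homicides per 100k), loosely echoing the
# post's Massachusetts-to-Alaska range.
states = [
    ("Massachusetts", 15, 1.0),
    ("New Jersey",    16, 1.8),
    ("Illinois",      27, 4.0),
    ("Texas",         44, 4.4),
    ("Louisiana",     53, 8.1),
    ("Alaska",        65, 7.0),
]

ownership = [row[1] for row in states]
homicide  = [row[2] for row in states]

r = correlation(ownership, homicide)
print(f"Pearson r across {len(states)} states: {r:.2f}")  # strongly positive
```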
david_myers · Author · 06-02-2022 12:19 PM
“Donald Trump . . . unleashed something, that is so much bigger than he is now or ever will be: He pushed the limits of acceptability, hostility, aggression and legality beyond where other politicians dared push them.” ~ Charles Blow, 2022 This is a venomous time. From 2015 to 2019, FBI-reported U.S. hate crimes increased 25 percent. Social psychologists have therefore wondered: Is this increase, and the accompanying resurgence of white nationalism, fed by Donald Trump’s rhetoric—his saying, for example, that the torch-carrying Charlottesville white nationalists included some “very fine people”? Or did the president’s tweets and speeches merely reflect existing prejudices, which, in a few disturbed individuals, fed hateful acts? Simply put: Do political leaders’ words merely voice common prejudices and grievances? Or do they also amplify them? They do both. Leaders play to their audiences. And when prominent leaders voice prejudices, it then becomes more socially acceptable for their followers to do likewise. When disdain for those of another race or religion or sexual orientation becomes socially acceptable, insults or violence may ensue. Social norms mutate, and norms matter. A new Nature Human Behaviour report of 13 studies of 10,000+ people documents the norm-changing influence of President Trump’s rhetoric. During his presidency, “Explicit racial and religious prejudice significantly increased amongst Trump’s supporters,” say the report’s authors, social psychologists Benjamin Ruisch (University of Kent) and Melissa Ferguson (Yale University). Some of their studies followed samples of Americans from 2014 to 2017, ascertaining their attitudes toward Muslims (whether they agreed, for example, that “Islam is quite primitive”). As seen on this 7-point scale of anti-Muslim prejudice, Trump supporters’ anti-Muslim sentiments significantly increased. But what about those for whom Donald Trump was not a positive role model? Would they become less imitative? Might they be like those observed in one study of jaywalking, in which pedestrians became less likely to jaywalk after observing someone they didn’t admire (a seeming low-status person) doing so? Indeed, Trump opponents exhibited decreased Muslim prejudice over time. Moreover, Ruisch and Ferguson found, “Trump support remained a robust predictor of increases in [anti-Muslim] prejudice” even after controlling for 39 other variables, such as income, age, gender, and education. Trump support also predicted increases in other forms of prejudice, such as racial animus (“Generally, Blacks are not as smart as Whites are”) and anti-immigrant attitudes. Pew national surveys similarly find that the attitude gap between those voting for and against Trump widened from 2016 to 2020 (see the 2020 data below). The 57 percent of Democrat voters who, in 2016, agreed that it’s more difficult to be a Black American than a White American increased to 74 percent in 2020, illustrating the general historical trend toward more egalitarian attitudes. But no such positive shift occurred among Trump supporters, whose agreement with the concept of Black disadvantage actually declined slightly, from 11 percent in 2016 to 9 percent in 2020. Ruisch and Ferguson see shifting social norms at work. “Trump supporters perceive that it has become more acceptable to express prejudice since Trump’s election . . . 
and the perception that prejudice is more acceptable predicts greater personal prejudice among Trump supporters.” Their conclusion aligns with the earlier finding of a political science research team—“that counties that had hosted a 2016 Trump campaign rally saw a 226 percent increase in reported hate crimes over comparable counties that did not host such a rally.” The happier news is that political leaders’ speech can work for better as well as for worse. In one month during 2021, a Stanford research team randomly assigned 1014 counties with low vaccination rates to receive 27-second YouTube ads (via TV, website, or app). Each featured Donald Trump’s expressed support for Covid vaccinations. After a Fox News anchor’s introduction, shown below, clips featured the Trumps getting the vaccine, with Ivanka saying, “Today I got the shot. I hope you do too” and Donald explaining “I would recommend it, and I would recommend it to a lot of people that don’t want to get it.” At a cost of about $100,000, the add was viewed 11.6 million times by 6 million people. When compared with 1018 control counties, the experimental treatment counties experienced an additional 104,036 vaccinations, for an additional cost of less than $1 each. The simple lesson: Under the influence of powerful leaders, social norms and behaviors can change. And social norms matter—sometimes for worse, but sometimes for better. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
david_myers · Author · 05-10-2022 07:44 AM
“All effective propaganda must be limited to a very few points and must harp on these in slogans.” ~ Adolf Hitler, Mein Kampf, 1926 Among psychology’s most reliable phenomena is the power of mere repetition. It comes in two forms, each replicated by psychological research many times in many ways. Repetition Fosters Fondness We humans are naturally disposed to prefer what’s familiar (and usually safe) and to be wary of what’s unfamiliar (and possibly dangerous). Social psychologists led by the late Robert Zajonc exposed the power and range of this mere exposure effect. In study after study, the more frequently they showed people unfamiliar nonsense syllables, Chinese characters, geometric figures, musical selections, artwork, or faces, the better they liked them. Repetition breeds liking. Mere exposure also warms our relationships. As strangers interact, they tend to increasingly like each other and to stop noticing initially perceived imperfections or differences. Those who come to know LGBTQ folks almost inevitably come to accept and like them. By three months, infants in same-race families come to prefer photos of people of their own (familiar) race. We even prefer our own familiar face—our mirror-image face we see while brushing our teeth, over our actual face we see in photos. Advertisers understand repetition’s power. With repetitions of an ad, shoppers begin to prefer the familiar product even if not remembering the ad. Indeed, the familiarity-feeds-fondness effect can occur without our awareness. In one clever experiment, research participants focused on repeated words piped into one earpiece while an experimenter simultaneously fed a novel tune into the other ear. Later, they could not recognize the unattended-to tune—yet preferred it over other unpresented tunes. Even amnesia patients, who cannot recall which faces they have been shown, will prefer faces they’ve repeatedly observed. Repetition Breeds Belief As mere exposure boosts liking, so mere repetition moves minds. In experiments, repetition makes statements such as “Othello was the last opera of Verdi” seem truer. After hearing something over and over, even a made-up smear of a political opponent becomes more believable. Adolf Hitler understood this illusory truth effect. So did author George Orwell. In his world of Nineteen Eighty-Four, the population was controlled by the mere repetition of slogans: “Freedom is slavery.” “Ignorance is strength.” “War is peace.” And so does Vladimir Putin, whose controlled, continuous, and repetitive propaganda has been persuasive to so many Russians. Barack Obama understood the power of repetition: “If they just repeat attacks enough and outright lies over and over again . . . people start believing it.” So did Donald Trump: “If you say it enough and keep saying it, they’ll start to believe you.” And he did so with just the intended effect. What explains repetition’s persuasive power? Familiar sayings (whether true or false) become easier to process and to remember. This processing fluency and memory availability can make assertions feel true. The result: Repeated untruths such as “taking vitamin C prevents colds” or “childhood vaccines cause autism” may become hard-to-erase mental bugs. But can mere repetition lead people to believe bizarre claims—that a presidential election was stolen, that climate change is a hoax, that the Sandy Hook school massacre was a scam to promote gun control? Alas, yes. Experiments have shown that repetition breeds belief even when people should know better. 
After repetition, “The Atlantic Ocean is the largest ocean on Earth” just feels somewhat truer. Even crazy claims can seem truer when repeated. That’s the conclusion of a new truth-by-repetition experiment. At Belgium’s Catholic University of Louvain, Doris Lacassagne and her colleagues found that, with enough repetition, highly implausible statements such as “Elephants run faster than cheetahs” seem somewhat less likely to be false. Less extreme but still implausible statements, such as “A monsoon is caused by an earthquake” were especially vulnerable to the truth-by-repetition effect. For those concerned about the spread of oft-repeated conspiracy theories, the study also offered some better news. Lacassagne found that barely more than half of her 232 U.S.-based participants shifted toward believing the repeated untruths. The rest knew better, or even shifted to greater incredulity. At the end of his life, Republican Senator John McCain lamented “the growing inability, and even unwillingness, to separate truth from lies.” For psychology educators like me and some of you, the greatest mission is teaching critical thinking that helps students winnow the wheat of truth from the chaff of misinformation. Evidence matters. So we teach our students, “Don’t believe everything you hear.” And, after hearing it, “Don’t believe everything you think!” (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
david_myers · Author · 03-24-2022 01:01 PM
“There are trivial truths and great truths,” the physicist Niels Bohr reportedly said.[1] “The opposite of a trivial truth is plainly false. The opposite of a great truth is also true.” Light is a particle. And light is a wave. Psychology, too, embraces paradoxical truths. Some are more complementary than contradictory: Attitudes influence behavior, and attitudes follow behavior. Self-esteem pays dividends, and self-serving bias is perilous. Memories and attitudes are explicit, and they are implicit. We are the creatures of our social worlds, and we are the creators of our social worlds. Brain makes mind, and mind controls brain. To strengthen attitudes, use persuasion, and to strengthen attitudes, challenge them. Religion “makes prejudice, and it unmakes prejudice” (Gordon Allport). “When I accept myself just as I am, then I can change” (Carl Rogers). Psychology also offers puzzling paradoxical concepts. Paradoxical sleep (aka REM sleep) is so-called because the muscles become near-paralyzed while the body is internally aroused. The immigrant paradox refers to immigrant U.S. children exhibiting better mental health than native-born children. And the paradox of choice describes how the modern world’s excessive options produce diminished satisfaction. Even more puzzling are seemingly contradictory findings from different levels of analysis. First, consider: Who in the U.S. is more likely to vote Republican—those with lots of money or those with little? Who is happier—liberals or conservatives? Who does more Google searches for “sex”—religious or nonreligious people? Who scores highest on having meaning in life—those who have wealth or those who don’t? Who is happiest and healthiest—actively religious or nonreligious people? As I have documented, in each case the answer depends on whether we compare places or individuals: Politics. Low-income states and high-income individuals have voted Republican in recent U.S. presidential elections. Happy welfare states and unhappy liberals. Liberal countries and conservative individuals manifest greater well-being. Google “sex” searches. Highly religious states, and less religious individuals, do more Google “sex” searching. Meaning in life. Self-reported meaning in life is greatest in poor countries, and among rich individuals. Religious engagement correlates negatively with well-being across aggregate levels (when comparing more vs. less religious countries or American states), yet positively across individuals. Said simply, actively religious individuals and nonreligious places are generally flourishing. As sociologist W. S. Robinson long ago appreciated, “An ecological [aggregate-level] correlation is almost certainly not equal to its individual correlation.” Thus, for example, if you want to make religion look good, cite individual data. If you want to make it look bad, cite aggregate data. In response to this paradoxical finding, Nobel laureate economist Angus Deaton and psychologist Arthur Stone wondered: “Why might there be this sharp contradiction between religious people being happy and healthy, and religious places being anything but?”[2] To this list of psychological science paradoxes, we can add one more: the gender-equality paradox—the curious finding of greater gender differences in more gender-equal societies. You read that right. 
Several research teams have reported that across several phenomena, including the proportion of women pursuing degrees in STEM (science, technology, engineering, and math) fields, gender differences are greater in societies with more political and economic gender equality. In the February, 2022, issue of Psychological Science, University of Michigan researcher Allon Vishkin describes “the myriad findings” that societies with lower male-superior ideology and educational policy “display larger gender differences.” This appears, he reports, not only in STEM fields of study, but also in values and preferences, personality traits, depression rates, and moral judgments. Moreover, his analysis of 803,485 chess players in 160 countries reveals that 90 percent of chess players are men; yet “women participate more often in countries with less gender equality.” Go figure. Vishkin reckons that underlying the paradox is another curious phenomenon: Gender unequal societies have more younger players, and there’s greater gender equality in chess among younger people. Paradoxical findings energize psychological scientists, as we sleuth their explanation. They also remind us of Bohr’s lesson. Sometimes the seeming opposite of a truth is another truth. Reality is often best described by complementary principles: mind emerges from brain, and mind controls brain. Both are true, yet either, by itself, is a half-truth. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.) [1] I first read this unsourced quote in a 1973 article by social psychologist William McGuire. Neils Bohr’s son, Hans Bohr, in his biography of his father, reports that Neils Bohr discerned “two sorts of truths, profound truths recognized by the fact that the opposite is also a profound truth, in contrast to trivialities where opposites are obviously absurd.” [2] For fellow researchers: The paradox is partially resolved by removing income as a confounding factor. Less religious places also tend to be affluent places (think Denmark and Oregon). More religious places tend to be poorer places (think Pakistan and Alabama). Thus, when we compare less versus more religious places, we also are comparing richer versus poorer places. And as Ed Diener, Louis Tay, and I observed from Gallup World Poll data, controlling for objective life circumstances, such as income, eliminates or even slightly reverses the negative religiosity/well-being correlation across countries.
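As promised above, here is a minimal simulation (my own toy numbers, not Robinson's or Myers' data) of how every within-country correlation can be positive while the between-country, ecological correlation is strongly negative:

```python
# A toy demonstration of Robinson's point: individual-level and aggregate-level
# correlations need not agree. Synthetic "religiosity" and "well-being" data.

import random
from statistics import correlation, mean  # correlation: Python 3.10+

random.seed(0)

country_means, within_rs = [], []

# Five synthetic countries: baseline well-being FALLS as a country's average
# religiosity rises (religious countries here are poorer, echoing footnote [2])...
for baseline, mean_relig in [(8, 1), (7, 2), (6, 3), (5, 4), (4, 5)]:
    xs, ys = [], []
    for _ in range(200):
        x = mean_relig + random.gauss(0, 1)
        # ...but WITHIN each country, more religious individuals are happier.
        y = baseline + 0.5 * (x - mean_relig) + random.gauss(0, 1)
        xs.append(x)
        ys.append(y)
    within_rs.append(correlation(xs, ys))
    country_means.append((mean(xs), mean(ys)))

r_between = correlation([m[0] for m in country_means],
                        [m[1] for m in country_means])
print("within-country r:", [f"{r:+.2f}" for r in within_rs])  # all positive
print(f"between-country r: {r_between:+.2f}")                 # strongly negative
```

Cite the individual-level numbers and religion looks good; cite the aggregate numbers and it looks bad, exactly as the post says.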
david_myers · Author · 03-03-2022 02:54 PM
If you live in one of the 30 U.S. states with recently legalized sports betting, you've surely noticed: Your online media and television have been awash with sports betting ads from DraftKings, FanDuel, Caesars Sportsbook, and more. For this, we can thank the 2018 U.S. Supreme Court's overturning of the sports betting ban, which also led the NFL in 2021 to allow sports betting ads even during its broadcasts and live-streams. With the deluge of ads, which sometimes offer new customers free money to lure initial bets, the gaming industry hopes to hook new bettors and expand its customer base beyond the current 50 million or so Americans who gamble on sports.

For most, the few dollars wagered may be nothing more than a bit of exciting fun. But for some—those who develop a gambling disorder—the betting becomes compulsive and debilitating, as gamblers crave the excitement, seek to redeem their losses, and lie to hide their behavior. Family finances suffer. Bankruptcies happen. Divorces result. And with the sports betting floodgates now opened, problem gambling is increasing. "The National Problem Gambling helpline is receiving an average of more than 22,500 calls a month this year," reports the Wall Street Journal, "up from a monthly average of about 14,800 last year."

It's no secret that, over time, the house wins and gamblers nearly always lose. So how does the gambling industry manage to suck nearly a quarter-trillion dollars annually from U.S. pockets? Are state lotteries, like Britain's National Lottery, merely (as one of my sons mused) "a tax on the statistically ignorant"? (My state's lottery pays out as winnings only 61 cents of each dollar bet—an expected loss sketched after this post.)

To remind folks of the power of psychological dynamics, and to prepare them to think critically about the allure of gambling inducements, we can ask: What psychological principles does the gambling industry exploit? Consider these:

Partially (intermittently) reinforced behavior becomes resistant to extinction. Pigeons that have been reinforced unpredictably—on a "variable-ratio" schedule—may continue pecking thousands more times without further reward. Like fly fishing, slot machines and sports gambling reward people occasionally and unpredictably. So hope springs eternal.

The judgment-altering power of the availability heuristic. As Nobel laureate psychologist Daniel Kahneman has shown, people tend to estimate the commonality of various events based on their mental availability—how readily instances come to mind. Casinos get the idea: They broadcast infrequent wins with flashing lights, while keeping the far more common losses invisible. Likewise, gamblers, like stock day-traders, may live to remember and tell of their memorable wins, while forgetting their more mundane losses.

Illusory correlations feed an illusion of control. People too readily believe that they can predict or control chance events. When choosing their own lottery number (rather than being assigned one), people demand much more money when invited to sell their ticket. If assigned to throw the dice or spin the wheel themselves, their confidence increases. Dice players also tend to throw hard when wanting high numbers, and soft for low numbers. When winning, they attribute outcomes to their skill, while losses become "near misses." Losing sports gamblers may rationalize that their bet was actually right, except for a referee's bad call or a freakish ball bounce.

Difficulty delaying gratification. Those who from childhood onward have learned to delay gratification—who choose two marshmallows later over one now (as in the famous "marshmallow test" experiment)—become more academically successful and ultimately productive. They are also less likely to smoke, to commit delinquent acts, and to gamble—each of which offers immediate reward, even if at the cost of diminished long-term health and well-being. The gaming industry seeks present-minded rather than future-minded folks. It aims to hook those who will elect that figurative single marshmallow, the satisfaction of today's desire, over the likelihood of a greater deferred reward.

Credible, attractive communicators exploit "peripheral route persuasion." Endorsements by beautiful, famous, or trusted people can add to the allure. As former gaming industry marketing executive Jack O'Donnell notes, the sports gambling industry harnesses sports celebrity power when paying former all-star receiver Jerry Rice to dump Gatorade on a winning DraftKings bettor, when trusted sportscaster Brent Musburger encourages placing a bet, and when legendary quarterback and former Super Bowl MVP Drew Brees admonishes people to live your "Bet Life."

Each of these psychological dynamics has its own power. When combined, they help us understand the gaming industry's lure, and, for some, its tragic addictive force.

(For David Myers' other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
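As flagged above, the lottery's arithmetic is stark. In this minimal Python sketch, the 61-cent payout rate comes from the post; the variable-ratio win probability and prize are hypothetical numbers I chose to match that payout rate:

```python
# Expected value of a lottery ticket, plus a variable-ratio simulation.

import random

random.seed(1)

PAYOUT_RATE = 0.61  # from the post: the state lottery returns 61 cents per dollar bet

# Expected value of a $1 ticket: lose 39 cents on average.
print(f"Expected loss per $1 ticket: ${1 - PAYOUT_RATE:.2f}")

# A variable-ratio schedule with the same long-run return: each $1 bet wins
# with probability 1/20 and pays a hypothetical $12.20 prize (1/20 * 12.20 = 0.61).
WIN_PROB, PRIZE = 1 / 20, 12.20
bankroll = 500.0
for _ in range(1000):
    bankroll -= 1.0                   # place a $1 bet
    if random.random() < WIN_PROB:
        bankroll += PRIZE             # occasional, unpredictable wins keep hope alive
print(f"Bankroll after 1,000 $1 bets (started at $500): ${bankroll:.2f}")
# Expected finish: 500 - 1000 * 0.39 = $110. The intermittent wins change the
# feel of the ride, not the destination.
```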
david_myers · Author · 02-09-2022 12:41 PM
As members of one species, we humans share both a common biology (cut us and we bleed) and common behaviors (we similarly sense the world around us, use language, and produce and protect children). Our human genes help explain our kinship—our shared human nature. And they contribute to our individual diversity: Some people, compared with others, are taller, smarter, skinnier, healthier, more temperamental, shyer, more athletic . . . the list goes on and on. Across generations, your ancestors shuffled their gene decks, leading to the hand that—with no credit or blame due you—you were dealt. If you are reading this, it's likely that genetic luck contributed to your having above-average intelligence. Others, dealt a different hand and a different life, would struggle to understand these words.

Individual variation is big. Individuals vary much, much more within groups (say, comparing Danes with Danes or Kenyans with Kenyans) than between groups (as when comparing Danes and Kenyans). Yet there are also group differences. Given this reality (some groups struggle more in school), does behavior genetic science validate ethnocentric beliefs and counter efforts to create a just and equal society?

In The Genetic Lottery: Why DNA Matters for Social Equality, University of Texas behavior geneticist Kathryn Paige Harden answers an emphatic no. She documents the power of genes, but also makes the case for an egalitarian culture in which everyone thrives. Among her conclusions are these:

We all are family. Going back far enough in time, to somewhere between 5000 and 2000 B.C., we reach a remarkable point where "everyone alive then, if they left any descendants at all, was a common ancestor of everyone alive now." We are all kin beneath the skin.

We're each highly improbable people. "Each pair of parents could produce over 70 trillion genetically unique offspring." If you like yourself, count yourself fortunate.

Most genes have tiny effects. Ignore talk of a single "gay gene" or "smart gene." The human traits we care about, including our personality, mental health, intelligence, longevity, and sexual orientation, "are influenced by many (very, very, very many) genetic variants, each of which contributes only a tiny drop of water to the swimming pool of genes that make a difference."

Individual genes' tiny effects may nevertheless add up to big effects. Today's Genome Wide Association Studies (GWAS) measure millions of genome elements and correlate each with an observed trait (phenotype). The resulting minuscule correlations from thousands of genetic variants often "add up to meaningful differences between people" (a toy polygenic-score sketch follows this post). Among the White American high school students in one large study, only 11 percent of those who had the lowest GWAS polygenic index score predicting school success later graduated from college, as did 55 percent of those who had the highest score. "That kind of gap—a fivefold increase in the rate of college graduation—is anything but trivial."

Twin studies confirm big genetic effects. "After fifty years and more than 1 million twins, the overwhelming conclusion is that when people inherit different genes, their lives turn out differently."

Parent-child correlations come with no causal arrows. If the children of well-spoken parents who read to them have larger vocabularies, the correlation could be environmental, or genetic, or some interactive combination of the two.

Beware the ecological fallacy (jumping from one data level to another). Genetic contributions to individual differences within groups (such as among White American high school students) provide zero evidence of genetic differences between groups.

Genetic science does not explain social inequalities. Harden quotes sociologist Christopher Jencks' illustration of a genetically influenced trait eliciting an environmentally caused outcome: "If, for example, a nation refuses to send children with red hair to school, the genes that cause red hair can be said to lower reading scores." Harden also quotes social scientist Ben Domingue: "Genetics are a useful mechanism for understanding why people from relatively similar backgrounds end up different. . . . But genetics is a poor tool for understanding why people from manifestly different starting points don't end up the same."

Many progressives affirm some genetic influences on individual traits. For example, unlike some conservatives who may see sexual orientation as a moral choice, progressives more often understand sexual orientation as a genetically influenced natural disposition.

Differences ≠ deficits. "The problem to be fixed is society's recalcitrant unwillingness to arrange itself in ways that allow everyone, regardless of which genetic variants they inherit, to participate fully in the social and economic life of [their] country." An example: For neurodiverse individuals, the question is how to design environments that match their skills.

Behavior genetics should be anti-eugenic. Advocates of eugenics have implied that traits are fixed due to genetic influences, and may therefore deny the value of social interventions. Alternatively, some genome-blind advocates shun behavior genetics science that could inform both our self-understanding and public policy. Harden advocates a third option, an anti-eugenic perspective that, she says, would reduce the waste of time and resources on well-meaning but ineffective programs. For example, by controlling for genetic differences with a GWAS measure, researchers can more accurately confirm the actual benefits of an environmental intervention such as an educational initiative or income support. Anti-eugenics also, she contends, uses genetic information to improve lives, not classify people; uses genetic information to promote equity, not exclusion; doesn't mistake being lucky in a Western capitalist society for being "good" or deserving; and considers what policies people would favor if they didn't know who they would be.

Harden's bottom line: Acknowledging the realities of human diversity, and discerning the powers and limits of various environmental interventions, can enhance our quest for a just and fair society.

(For David Myers' other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
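To make the tiny-drops-add-up idea concrete, here is a toy sketch (entirely made-up weights and genotypes, not a real GWAS) of how a polygenic index sums thousands of minuscule per-variant effects into one score:

```python
# A toy polygenic index (made-up numbers, not a real GWAS): many tiny
# per-variant effects summed into a single score.

import random

random.seed(2)

N_VARIANTS = 10_000
# Per-variant effect sizes are tiny: drawn here from N(0, 0.001).
weights = [random.gauss(0, 0.001) for _ in range(N_VARIANTS)]

def polygenic_score(genotype: list[int]) -> float:
    """Weighted sum of allele counts (0, 1, or 2 copies per variant)."""
    return sum(w * g for w, g in zip(weights, genotype))

person = [random.choice([0, 1, 2]) for _ in range(N_VARIANTS)]
print(f"Largest single-variant contribution: {max(abs(w) * 2 for w in weights):.4f}")
print(f"Total polygenic score: {polygenic_score(person):+.3f}")
# No single variant matters much, but across people the summed scores spread
# out enough to predict outcome differences, as in the college-graduation gap.
```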
david_myers · Author · 01-19-2022 12:46 PM
“Listen, we all hoped and prayed the vaccines would be 100 percent effective, 100 percent safe, but they’re not. We now know that fully vaccinated individuals can catch Covid, they can transmit Covid. So what’s the point?” ~U.S. Senator Ron Johnson, Fox News, December 27, 2021 “I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them.” ~Baruch Spinoza, Political Treatise, 1677 Many are aghast at the irony: Unvaccinated, unmasked Americans remain much less afraid of the Covid virus than are their vaccinated, masked friends and family. This, despite two compelling and well-publicized facts: Covid has been massively destructive. With 5.4 million confirmed Covid deaths worldwide (and some 19 million excess Covid-era deaths), Covid is a great enemy. In the U.S., the 840,000 confirmed deaths considerably exceed those from all of its horrific 20th-century wars. The Covid vaccines are safe and dramatically effective. The experience of 4.5+ billion people worldwide who have received a Covid vaccine assures us that they entail no significant risks of sickness, infertility, or miscarriage. Moreover, as the CDC illustrates, fully vaccinated and boosted Americans this past fall had a 10 times lower risk of testing positive for Covid and a 20 times lower risk of dying from it. Given Covid’s virulence, why wouldn’t any reasonable person welcome the vaccine and other non-constraining health-protective measures? How can a U.S. senator scorn protection that is 90+ percent effective? Does he also shun less-than-100%-effective seat belts, birth control, tooth brushing, and the seasonal flu vaccine that his doctor surely recommends? To say bluntly what so many are wondering: Has Covid become a pandemic of the stupid? Lest we presume so, psychological science has repeatedly illuminated how even smart people can make not-so-smart judgments. As Daniel Kahneman and others have demonstrated, intelligent people often make dumb decisions. Researcher Keith Stanovich explains: Some biases—such as our favoring evidence that confirms our preexisting views—have “very little relation to intelligence.” So, if we’re not to regard the resilient anti-vax minority as stupid, what gives? If, with Spinoza, we wish not to ridicule but to understand, several psychological dynamics can shine light. Had we all, like Rip Van Winkle, awakened to the clear evidence of Covid’s virulence and the vaccine efficacy, surely we would have more unanimously accepted these stark realities. Alas, today’s science-scorning American subculture seeded skepticism about Covid before the horror was fully upon us. Vaccine suspicion was then sustained by several social psychological phenomena that we all experience. Once people’s initial views were formed, confirmation bias inclined them to seek and welcome belief-confirming information. Motivated reasoning bent their thinking toward justifying what they had come to believe. Aided by social and broadcast media, group polarization further amplified and fortified the shared views of the like-minded. Misinformation begat more misinformation. Moreover, a powerful fourth phenomenon was at work: belief perseverance. Researchers Craig Anderson, Mark Lepper, and Lee Ross explored how people, after forming and explaining beliefs, resist changing their minds. In two of social psychology’s great but lesser known experiments, they planted an idea in Stanford undergraduates’ minds. Then they discovered how difficult it was to discredit the idea, once rooted. 
Their procedure was simple. Each study first implanted a belief, either by proclaiming it to be true or by offering anecdotal support. One experiment invited students to consider whether people who take risks make good or bad firefighters. Half looked at cases about a risk-prone person who was successful at firefighting and a cautious person who was not. The other half considered cases suggesting that a risk-prone person was less successful at firefighting. Unsurprisingly, the students came to believe what their case anecdotes suggested. Then the researchers asked all the students to explain their conclusion. Those who had decided that risk-takers make better firefighters explained, for instance, that risk-takers are brave. Those who had decided the opposite explained that cautious people have fewer accidents. Lastly, Anderson and his colleagues exposed the ruse. They let students in on the truth: The cases were fake news. They were made up for the experiment, with other study participants receiving the opposite information. With the truth now known, did the students’ minds return to their pre-experiment state? Hardly. After the fake information was discredited, the participants’ self-generated explanations sustained their newly formed beliefs that risk-taking people really do make better (or worse) firefighters. So, beliefs, once having “grown legs,” will often survive discrediting. As the researchers concluded, “People often cling to their beliefs to a considerably greater extent than is logically or normatively warranted.” In another clever Stanford experiment, Charles Lord and colleagues engaged students with opposing views of capital punishment. Each side viewed two supposed research findings, one supporting and the other contesting the idea that the death penalty deters crime. So, given the same mixed information, did their views later converge? To the contrary, each side was impressed with the evidence supporting their view and disputed the challenging evidence. The net result: Their disagreement increased. Rather than using evidence to form conclusions, they used their conclusions to assess evidence. And so it has gone in other studies, when people selectively welcomed belief-supportive evidence about same-sex marriage, climate change, and politics. Ideas persist. Beliefs persevere. The belief-perseverance findings reprise the classic When Prophecy Fails study led by cognitive dissonance theorist Leon Festinger. Festinger and his team infiltrated a religious cult whose members had left behind jobs, possessions, and family as they gathered to await the world’s end on December 21, 1954, and their rescue via flying saucer. When the prophecy failed, did the cult members abandon their beliefs as utterly without merit? They did not, and instead agreed with their leader’s assertion that their faithfulness “had spread so much light that God had saved the world from destruction.” These experiments are provocative. They indicate that the more we examine our theories and beliefs and explain how and why they might be true, the more closed we become to challenging information. When we consider and explain why a favorite stock might rise in value, why we prefer a particular political candidate, or why we distrust vaccinations, our suppositions become more resilient. Having formed and repeatedly explained our beliefs, we may become prisoners of our own ideas. Thus, it takes more compelling arguments to change a belief than it does to create it. 
Republican Representative Adam Kinzinger understands: “I’ve gotten to wonder if there is actually any evidence that would ever change certain people’s minds.”

Moreover, the phenomenon cuts both ways. It surely also affects the still-fearful vaccinated and boosted people who have hardly adjusted their long-ago Covid fears to the new post-vaccine, Omicron world. The only known remedy is to “consider the opposite”—to imagine and explain a different result. But unless blessed with better-than-average intellectual humility, as exhibited by most who accept vaccine science, we seldom do so.

Yet there is good news. If employers mandate either becoming vaccinated or getting tested regularly, many employees will choose vaccination. As laboratory studies remind us, and as real-world studies of desegregation and seat-belt mandates confirm, our attitudes will then follow our actions. Behaving will become believing.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
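A note on the post’s two framings of the same fact: a vaccine that is “90+ percent effective” and the CDC’s “10 times lower risk.” Here is a minimal Python sketch of the arithmetic linking them. The function name and the sample effectiveness values are illustrative assumptions, not CDC figures.

```python
# Sketch: how "percent effective" maps onto an "N times lower risk" claim.
# Effectiveness = 1 - (risk_vaccinated / risk_unvaccinated), so 90 percent
# effectiveness leaves one-tenth the risk: a 10-times-lower risk.
# (Illustrative arithmetic only; real CDC figures come from surveillance data.)

def times_lower_risk(effectiveness: float) -> float:
    """Convert effectiveness (a 0-1 fraction) to a 'times lower risk' multiplier."""
    return 1 / (1 - effectiveness)

for eff in (0.90, 0.95):
    print(f"{eff:.0%} effective -> {times_lower_risk(eff):.0f}x lower risk")
# Output: 90% effective -> 10x lower risk; 95% effective -> 20x lower risk
```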
david_myers
Author
01-07-2022
08:05 AM
In Exploring Psychology, 12th Edition, Nathan DeWall and I report that autism spectrum disorder (ASD) “is now diagnosed in 1 in 38 children in South Korea, 1 in 54 in the United States, 1 in 66 in Canada….”

Check that. A new CDC report on 2018 data raises the continually increasing U.S. proportion to 1 in 44 (23 per 1,000 among 8-year-olds in eleven representative communities followed by the Autism and Developmental Disabilities Monitoring Network). The report also confirms that ASD diagnoses are four times more common among boys than girls. Psychologist Simon Baron-Cohen believes the gender imbalance is because boys tend to be “systemizers”: They more often understand things according to rules or laws, as in mathematical and mechanical systems. Girls, he contends, tend to be “empathizers”: They excel at reading facial expressions and gestures.

And what racial/ethnic group do you suppose has the highest rate of ASD diagnoses? The answer: There are no discernible differences (nor across socioeconomic groups). In 2018, ASD was diagnosed equally often among all racial/ethnic groups.

A final fact to ponder: 4-year-olds, the CDC reports, were “50 percent more likely to receive an autism or special education classification” than were 8-year-olds.

So what do you think? Is the increasing ASD diagnosis rate—over time and of 4-year-olds—a welcome trend? Is Karen Remley, director of the CDC’s National Center on Birth Defects and Developmental Disabilities, right to regard this “progress in early identification” as “good news because the earlier that children are identified with autism the sooner they can be connected to services and support”? Or does the increased labeling of children become a self-fulfilling prophecy that assigns children to a category implying some social deficiency, and then treats them differently as a result? And does the trend reflect some relabeling of children’s disorders, as reflected in the decreasing diagnoses of “cognitive disorder” and “learning disability”? (The popularity of different psychiatric labels does exhibit cultural variation across time and place.)

In this COVID-19 era of anti-vax fears, this much we know for sure: One thing that does not contribute to rising ASD diagnoses is childhood vaccinations. Children receive a measles/mumps/rubella (MMR) vaccination in early childhood, about the time ASD symptoms first get noticed—so some parents naturally presumed the vaccination caused the ensuing ASD. Yet, despite a fraudulent 1998 study claiming otherwise, vaccinations actually have no relationship to the disorder. In one study that followed nearly 700,000 Danish children, those receiving the MMR vaccine were slightly less likely to later be among the 6,517 ASD-diagnosed children.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
david_myers
Author
11-17-2021
10:07 AM
Consider: If blessed with income and wealth beyond your needs, what would you do with it? If you decided to give away your excess wealth, would you focus it on present needs (by distributing it immediately) or future needs (by accumulating wealth for later distribution)? Consider two sample answers to the second question, both drawn from real life:

Give for immediate use. In Bremerton, Washington, a generous anonymous donor is giving away $250,000 grants to local, national, and international nonprofits with a condition: They must spend it in the next 2 years. The seven recipients to date are gladly doing so by hiring new staff, giving scholarships, feeding people, and so forth. There’s no time like the present.

Give today, but maximize future impact. The John Templeton Foundation, which I have served as a trustee, had a future-minded benefactor who grew his wealth by living simply and investing half his income from his earliest working years. Thanks to his investment success and the exponential mathematics of compounding, he was able, by his death at age 95, to endow the foundation, which today has nearly $4 billion in assets. Like all U.S. foundations, it operates under a mandated 5 percent minimum annual payout—meaning that it gives today but with eyes also on more distant horizons.

So, would you advise prioritizing the present (as the Bremerton donor has) or also focusing on the future (as most foundations do)?

The Initiative to Accelerate Charitable Giving would appreciate the Bremerton donor. As the world recovers from Covid and strives for racial justice, the Initiative perceives that “demands for services from charities are greater than ever.” So, it argues, foundations should increase their giving now. The Patriotic Millionaires, co-led by Abigail Disney, have proposed doubling, for three years, the required foundation payout from 5 to 10 percent. The Accelerating Charitable Efforts Act, co-sponsored by Senators Angus King (I-ME) and Chuck Grassley (R-IA), would incentivize a 7 percent foundation payout rate (by waiving the 1.39 percent investment income tax for any year in which payout tops 7 percent of assets).

Do you agree with this strategy—is now the time to give? Should we take care of our time, and leave it to future people to take care of theirs?

If so, consider: Prioritizing the present will likely diminish a foundation’s future effectiveness. Given that asset-balanced foundation endowments have tended to earn less than 7 percent on their total investments,[1] even a 7 percent payout mandate would, over time, likely shrink a foundation’s assets and giving capacity. Assuming a continuation of long-term stock market performance, the Templeton Foundation calculates that its 50-year total giving would be almost double under a 5 percent payout (nearly $20 billion) versus a 10 percent payout (less than $12 billion). (A simple simulation of this compounding arithmetic appears at the end of this post.) Given both current and future human needs, would you still support a mandate that foundations distribute more of their assets now? Are today’s crises likely greater than tomorrow’s?

The present-versus-future ethical dilemma brings to mind three related psychological principles:

Temporal discounting. Humans often value immediate rewards over larger future rewards—a dollar today over $1.10 in a month. The phenomenon is familiar to financial advisors who plead with clients to value their future, and to harness the magic of compounding by investing today.
Being financially nearsighted, our governments also tend to spend public monies on our present needs rather than on our own and our descendants’ future needs. Some of this present-focus reflects our commendable capacity for empathy—our hearts responding to present needs that we see and feel. But temporal discounting is also manifest in today’s consumers who oppose carbon taxes and clean-energy mandates lest their lifestyle be restrained for the sake of humanity’s future. Temporal discounting undermines sustainability.

Self-control: The ability to delay gratification. We aim to teach our children self-control—to control their impulses and to delay short-term gratification for bigger longer-term rewards. Better (in Walter Mischel’s classic experiment) two future marshmallows than one now. Such self-control predicts better school performance, better health, and higher income.

Personal time perspective: Past, present, or future. In a 6-minute TED talk, Phil Zimbardo compared people with past, present, or future orientations—those who focus on their memories of what was, on their present situation, or on what will be. Although the good life is a mix of each, a future orientation bodes well for adolescents: Living with one eye on the future enables bigger future rewards and minimizes the risk of school drop-out, problem gambling, smoking, and delinquency. Patience pays.

So, mindful of both today’s and tomorrow’s needs, would you favor or oppose the proposals to increase foundation payout requirements? Of course, you say, both the present and the future matter. Indeed. But to what extent should we prioritize poverty relief (or scholarships or art galleries) today versus in the future? Who matters more—us and our people, or our great-grandchildren and their compatriots? Or do we and our descendants all matter equally?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

[1] Century-long U.S. stock returns have averaged near 10 percent, or about 7 percent when inflation-adjusted. But most foundations also hold other assets—cash, bonds, and, for example, emerging-market holdings—that have earned lower returns.
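To make the post’s payout arithmetic concrete, here is a minimal Python sketch. It is my own illustration, not the Templeton Foundation’s actual model: it assumes a hypothetical $4 billion starting endowment and a steady 7 percent annual return (echoing the footnote’s long-run figure), then compares cumulative 50-year giving under 5, 7, and 10 percent payout rates. The exact totals depend on those assumptions; the direction of the comparison does not.

```python
# Illustrative sketch (assumed figures, not the Templeton Foundation's model):
# compare cumulative 50-year giving under different mandated payout rates.

def simulate(endowment, payout_rate, annual_return=0.07, years=50):
    """Pay out `payout_rate` of assets each year; grow the rest at `annual_return`.
    Returns (total_given, final_endowment)."""
    total_given = 0.0
    for _ in range(years):
        grant = endowment * payout_rate        # this year's giving
        total_given += grant
        endowment = (endowment - grant) * (1 + annual_return)
    return total_given, endowment

for rate in (0.05, 0.07, 0.10):
    given, remaining = simulate(4e9, rate)
    print(f"{rate:.0%} payout: ${given / 1e9:.1f}B given over 50 years, "
          f"${remaining / 1e9:.1f}B endowment left")
```

Under these assumptions, the 5 percent payout produces substantially more total 50-year giving than the 10 percent payout, directionally matching the foundation’s calculation even though the precise dollar figures shift with the assumed return.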
david_myers
Author
10-28-2021
07:59 AM
Steven Pinker’s books—How the Mind Works, The Blank Slate, The Better Angels of Our Nature, Enlightenment Now, and his latest, Rationality—offer a consistent and important message: Smart, critical thinking attends less to anecdotes that tug at the heart than to realities revealed by representative data.

Year after year, 7 in 10 Americans, after reading the news, tell Gallup there has been more crime than in the prior year. In Better Angels, Pinker documents the reality: a long-term crime decline, along with other subsiding forms of violence, including wars and genocide. Enlightenment Now details other ways—from the environment, to life expectancy, to human rights, to literacy, to quality of life—in which, contrary to our news-fed sense of doom, the world actually is getting better.

The same thinking-with-data theme pervades Rationality: What It Is, Why It Seems Scarce, Why It Matters. For my money, Chapter 4 (“Probability and Randomness”) alone is worth the book’s price of admission. It’s a chapter I wish I could assign to every AP and college introductory psychology student. Here, according to Pinker, are some noteworthy outcomes of our flawed thinking:

Statistical illiteracy. Our tendency to judge the likelihood of events by the ease with which examples come to mind—the availability heuristic—leads us to think folks are more often killed by tornados than by 80-times-deadlier asthma; to believe that America’s immigrant population is 28 percent (rather than 12 percent); and to guess that 24 percent of Americans are gay (rather than 4.5 percent). And how many unarmed Americans of all races would you guess are killed by police in an average year? Sixty-five, reports Pinker (from 2015–2019 FBI data).

Unwise public spending. In 2019, after a Cape Cod surfer became Massachusetts’ first shark fatality in more than eight decades, towns equipped their beaches with scary billboard warnings and hemorrhage-control kits, and looked into “towers, drones, planes, balloons, sonar, acoustic buoys, and electromagnetic and odorant repellants” . . . while not investing in reducing car-accident deaths at a fraction of the cost, with improved signage, barriers, and law enforcement.

Resistance to climate-change mitigation. Compared with deaths caused by mining accidents, lung disease, dam failures, gas explosions, and fouled air, modern nuclear power, despite its vivid few failures, “is the safest form of energy”—and emits no greenhouse gases.

Exaggerated fears of terrorists. Although terrorists annually kill fewer people than lightning, bee stings, or bathtub drownings do, we have engaged in massive anti-terrorist spending and launched wars that have killed hundreds of thousands.

Amplified dread of school shootings. “Rampage killings in American schools claim around 35 victims a year, compared with about 16,000 routine police-blotter homicides,” Pinker tells us. In response, “schools have implemented billions of dollars of dubious safety measures . . . while traumatizing children with terrifying active-shooter drills.”

“The press is an availability machine,” Pinker observes. “It serves up anecdotes that feed our impression of what’s common in a way that is guaranteed to mislead.” By contrast, unreported good news typically consists “of nothing happening, like a boring country at peace.” And progress—such as 137,000 people escaping extreme poverty each day—creeps up silently, “transforming the world by stealth. . . . There was never a Thursday in October in which it suddenly happened.
So one of the greatest developments in human history—a billion and a quarter people escaping squalor [in the last 25 years]—has gone unnoticed.”

This latest offering from one of psychology’s public intellectuals joins kindred-spirited, data-based perspectives by Hans Rosling (Factfulness: Ten Reasons We’re Wrong About the World—and Why Things Are Better Than You Think), Max Roser (ourworldindata.org), and William MacAskill (Doing Good Better), as well as my own Intuition: Its Powers and Perils. Together, they help us all think smarter by advocating reality-based, statistically literate, rational decisions that can help us spend and give more wisely and sustain a flourishing world.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)