Talk Psych Blog

Friday
The Washington Post reports that money can buy happiness. To emphasize the joys of wealth, it displays a glamorous couple enjoying a sumptuous private jet meal. “Whoever said money can’t buy happiness isn’t spending it right,” proclaimed a famous Lexus ad.

[Image credit: Extreme-Photographer/E+/Getty Images]

The Post draws from an excellent “adversarial collaboration” (when scientists partner to test their opposing views) by psychologists Matthew Killingsworth and Daniel Kahneman, facilitated by Barbara Mellers. Killingsworth had questioned Kahneman’s report of an “income satiation” effect, with well-being not much increasing above annual incomes of $75,000 (in 2008 dollars, or near $100,000 today). With the exception of an unhappy minority, happiness does, they now agree, continue rising with income. A figure from Killingsworth’s PNAS article (“Experienced well-being rises with income, even above $75,000 per year”) illustrates the point.

But note that, as is typical with economists’ reporting of money-happiness data, the figure’s x-axis presents log income. (Unlike a linear income x-axis, which adds equal dollar increments, a logarithmic scale compacts the spread.) So what if we depict these data with an x-axis of linear dollars (the actual dollars of real people)? We then see what others have found in both U.S. and global surveys: Happiness indeed rises with income, even beyond $100,000, but with a diminishing rate of increased happiness as income rises from high to super high. (See the numerical sketch at the end of this post.)

Multiple studies show the same curvilinear money-happiness relationship when comparing poor with wealthy nations (as illustrated in this report, again scaled with actual, not log, income). Moreover, an international survey of more than 2,000 millionaires from seventeen countries found that, at net worths above $1 million, more wealth is minimally predictive of happiness (though millionaires enjoy more work autonomy and time for active leisure).

And, as Ed Diener and I reported in 2018, economic growth has not improved human morale (and teen girls’ morale has plummeted). In inflation-adjusted dollars, U.S. adults, albeit with greater inequality, are three times richer than 65 years ago, with bigger houses, new technologies, home air conditioning, more cars per person, and more dining out. We have more money and what it buys, but no greater happiness.

Nevertheless, today’s undergraduates (in the annual UCLA American Freshman survey) continue to believe—entering collegians rate this #1 among 20 alternative life objectives—that being “very well off” matters, a lot. It’s the modern American dream: life, liberty, and the purchase of happiness.

For low-income people, money does buy necessities and greater freedom. Money matters. And extreme inequality is socially toxic. But as the above data show, once we have income security and more than enough for life’s essentials, each additional $20,000 of income pays diminishing happiness dividends.

Finally, we need to remember that these are correlational data. If higher-income people are somewhat happier, it may be not only because money matters, but also partly because happiness is conducive to vocational and financial success (a depressed mood is enervating).

What U.S. President Jimmy Carter told Americans in 1979 remains true: “Owning things and consuming things does not satisfy our longing for meaning. We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.” Carter echoed William Cowper’s words from 1782: “Happiness depends, as nature shows, less on exterior things than most suppose.”

Happiness depends less on gratifying our escalating wants than on simply wanting what we have. And it depends more on supportive social connections that satisfy our need to belong, and on embracing a meaning-filled sense of vocation and a spirituality that offers community and hope. Money matters, but it matters less than images of luxury private jet travel might lead us to suppose.

What do you think: Might these facts of life inform our conversations about lifestyle choices, public income distribution policies, and inherited wealth?

(For David’s other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
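A postscript for the quantitatively curious: here is a minimal Python sketch of the log-versus-linear axis point above. It assumes, purely for illustration, that happiness rises by a fixed increment with each doubling of income; the numbers are invented, not Killingsworth’s data.

```python
# Sketch of the axis point: if well-being rises with log(income), it
# plots as a straight line on a log x-axis but shows sharply
# diminishing returns on a linear dollar axis. Numbers are illustrative.
import numpy as np

incomes = np.array([15_000, 30_000, 60_000, 120_000, 240_000, 480_000])
happiness = 2 + 0.5 * np.log2(incomes / 15_000)  # +0.5 per doubling (assumed)

for inc, hap in zip(incomes, happiness):
    print(f"${inc:>7,}: happiness {hap:.1f}")
# Each *doubling* of income buys the same happiness increment, so each
# additional fixed-dollar amount (say, $20,000) buys less and less.
```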

3 weeks ago
At its best, psychological science transparently puts big competing ideas to the test. With varied methods and with replication of noteworthy findings, it winnows truth from the haystack of mere speculation. If the evidence supports an idea, so much the better for it. If the idea collides with a wall of fact, then it gets rejected or revised.

In reality, psychology often falls short of this ideal. A noteworthy finding doesn’t replicate. Confirmation bias drives a researcher to selectively attend to supportive evidence. In rare cases, researchers have stage-managed desired results or even faked data.

Yet the psychological science ideal is achievable. My social psychologist colleagues Jonathan Haidt and Jean Twenge exemplify this ideal as they assemble evidence regarding social media effects on teen mental health, and invite others to critique and supplement their data. “It is amazing how much I have learned, and refined my views, just by asking people to make me smarter,” Haidt has told me.

The stimulus for their work is a troubling social phenomenon: As smartphone and social media use have spread among teens, teen depression and suicidal ideation have soared, especially among girls. The CDC’s 2023 Youth Risk Behavior Survey report illustrates the trend.

Is this simultaneous increase in social media use and teen depression a mere coincidence? Given that other plausible factors, such as economic trends, wars, or domestic violence, seem not to account for the decade-long trend, Haidt and Twenge conjectured a culprit: the shift from face-to-face relationships to screen-based relationships, with in-person time with friends dropping by more than half since 2010. More time online has also displaced sleep and play. And it has increased demoralizing social comparisons. As Cornell University’s Sebastian Deri and his colleagues found across eleven studies, most of us, in the age of selfies, perceive our friends as having more fun: Other folks seem to party more, eat out more, and look happier and prettier.

Even teens not on social media are likely affected, Haidt notes. When friends are interacting online several hours a day, those not similarly engaged can feel left out and isolated.

[Image credit: Halfpoint/iStock/Getty Images]

To assess their presumption of social media harm, and mindful of lingering skepticism, Haidt and Twenge assembled the available evidence from four psychological science methods: correlational, longitudinal, experimental, and quasi-experimental.

Correlational: First, they asked, do daily social media hours correlate with teen mental health? In a recent Substack essay, Haidt notes that 80 percent of 55 studies answered yes. The correlation is modest when summed across genders and all forms of screen time, but becomes telling when, as shown in these UK data, one spotlights girls’ social media exposure.

Longitudinal: Second, they asked, does social media use at Time 1 predict mental health at Time 2? Among 40 longitudinal studies, Haidt reports, in 25 the answer was yes. For example, in a new study, reducing social media use proved “a feasible and effective method of improving body image” among vulnerable young adults.

Experimental: Third, they asked, do experiments that randomly assign participants to social media exposure produce a mental health effect? In 12 of 18 experiments, mostly done with college students and young adults, the answer was, again, yes. Moreover, among the six studies finding no effect, four involved only a brief (a week or less) social media diet. (A toy simulation of this experimental logic appears at the end of this post.)

Quasi-experimental: Finally, they asked, do quasi-experiments find that the timing of social media arrival predicts mental health? Was the rollout of Facebook on a campus or the arrival of high-speed internet in a community followed—at that location—by increased mental health problems? In all six studies, Haidt reports, “when social life moves rapidly online, mental health declines, especially for girls.”

Together, these correlational, longitudinal, experimental, and quasi-experimental findings illustrate how psychological science explores life-relevant questions with multiple methods. Moreover, the diverse findings weave a compelling answer to the social media–teen mental health question. In the words of Haidt’s Substack title: “Social Media is a Major Cause of the Mental Illness Epidemic in Teen Girls. Here’s the Evidence.”

Would you agree with Haidt’s conclusion? If yes, would you also agree with recent bipartisan calls to restrict social media to those over 16? Would doing so be supportive of parents, teens, and schools—much as efforts to restrict teen smoking have effectively dropped teen smoking from nearly 23 percent in 2000 to 2 percent in 2021? Would you concur with researchers who advise parents to keep phones out of teens’ bedrooms at night? If you are a teen, does this research have any implications for your and your friends’ mental health? Should teens begin smartphone use with texting rather than with selfies and social media? Should they intentionally restrain their daily hours online? And if you don’t agree that social media are a “major cause” of teen girls’ increased depression, what would be your alternative explanation?

The importance of these questions, for teens, families, and society, will drive further research and debate. In the meantime, the complementary insights gleaned from these correlational, longitudinal, experimental, and quasi-experimental studies showcase, methinks, psychological science at its best.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
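A postscript for methods-minded readers: the experimental logic sketched above can be illustrated in a few lines of Python. The data below are simulated and the effect size is invented; this is a demonstration of random assignment plus a significance test, not a reanalysis of any study Haidt cites.

```python
# Toy illustration of the experimental method: randomly assign
# participants to reduce social media use or not, then compare
# well-being scores. All data are simulated; the group means and
# spread are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200  # participants per group

control = rng.normal(loc=5.0, scale=1.5, size=n)    # usual use
treatment = rng.normal(loc=5.4, scale=1.5, size=n)  # reduced use

t, p = stats.ttest_ind(treatment, control)
print(f"t = {t:.2f}, p = {p:.4f}")
# Random assignment is what lets an observed group difference be read
# as a causal effect of the manipulation, unlike correlational studies.
```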

02-08-2023
It’s a big lesson of psychological science: Most of us exhibit a self-serving bias. On most subjective, socially desirable dimensions we see ourselves as better than average—as better than 90 percent of drivers, as more moral than most others, as better able to get along with others. When good things happen we accept credit, while we attribute our failures and bad deeds to external factors—to bad breaks or a situation beyond our control.

In kindred research, more than a thousand studies show how a positive thinking bias produces illusory optimism. Unrealistic optimism is evident when, for example, most students perceive themselves as far more likely than their average classmate to attend graduate school, get a high-paying job, and own a nice home, while they see themselves as much less likely to suffer a heart attack, get cancer, or be divorced.

The powers and perils of our self-serving pride and Pollyannaish optimism are a profound truth. But as a saying popularized by physicist Niels Bohr reminds us, “The opposite of a profound truth may well be another profound truth.” So now for the rest of the story.

If you and I tend to overestimate our strengths and our potentials, we also tend to underestimate the good impressions we make when meeting others. In six new studies of more than 2,000 university students, University of Toronto psychologists Norhan Elsaadawy and Erika Carlson found that “most people tend to underestimate how positively they are seen by the individuals they know well and by individuals they have just met.”

That happy finding confirms a 2018 report by Cornell University researcher Erica Boothby and her colleagues. After meeting and interacting with someone, “People systematically underestimated how much their conversation partners liked them and enjoyed their company.” This was true for strangers meeting in the laboratory, first-year undergraduates meeting their dorm mates, and adults getting to know one another in a workshop. After all such get-acquainted conversations, people “are liked more than they know.”

Elsaadawy and Carlson surmise that we underestimate the impressions we make because we, more than our conversation partners, focus on our blunders and imperfections. We’re painfully aware of what others hardly notice—our stumbling words, our nervousness, our talking too much or too little, even our bad hair that day.

[Image credit: fjmoura/DigitalImages/Getty Images]

Our acute self-consciousness of our gaffes, shortcomings, and blemishes was amusingly evident in one of psychology’s elegantly simple and life-relevant experiments. At Cornell University, Thomas Gilovich and his co-researchers asked participants to wear an embarrassing T-shirt (of 1970s crooner Barry Manilow) before completing some questionnaires with several others. Asked afterwards how many of the other participants noticed their unfashionable attire, the average person guessed about half. In reality, only 23 percent noticed. The good news: Others notice our clothes, our blunders, our anxiety, even our bad hair, less than we suppose.

Not only may our blunders and imperfections do us less harm than we suppose, in some situations they may help. In experiments, well-regarded people who commit a pratfall—who stumble, or spill coffee, or make a mistake—may become better liked. Our blunder can evoke empathy, humanize us, and make us seem more relatable.

Perhaps, then, we can fret less about how others regard us. Assuming we are well-intentioned, the odds are that people like us, even more than we suppose. And, though painfully obvious to us, many of our flaws and failings are less obvious to others, or soon forgotten.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)

01-17-2023
Have you ever, amid a group, found yourself caught up in something larger than yourself? Perhaps with other fans at a game? At worship? At a rock concert? With your emotions aroused and your self-awareness diminished, did you feel at one with the surrounding others? Did you experience what we social psychologists call deindividuation?

In group situations, arousal + anonymity can power mob cruelty or vandalism, as in the January 6, 2021, attack on the U.S. Capitol. As one rioter lamented afterward, “I got caught up in the moment.” In other settings, socially produced arousal and loss of self enable moral elevation—a prosocial feeling of warmth and an expansion of self that bonds us to others.

In Collective Effervescence, filmmaker Richard Sergay illustrates how diverse people in polarized cultures can bridge societal rifts and experience spiritual transcendence through singing. His 13-minute video, produced for the Templeton World Charity Foundation, focuses on Koolulam, an Israel-based “social-musical initiative centering around mass singing events.” In one such event, a thousand Jews, Christians, and Muslims gathered at midnight after the final day of Ramadan. Their mission: to learn and sing—in Hebrew, Arabic, and English, and in three-part harmony—Bob Marley’s “One Love.”

[Film image courtesy Richard Sergay and Templeton World Charity Foundation]

Collective effervescence, so named by sociologist Emile Durkheim, engages group members in a self-transcending experience. When experiencing such behavioral and emotional synchrony, people not only feel at one with others, notes psychologist Shira Gabriel and her colleagues, they also, afterward, feel less stressed and depressed, and more purposeful.

As strangers make music together, social psychological phenomena operate. These include not only the diminished self-awareness and lessened restraint that mark deindividuation, but also the reconciling power of realizing a superordinate (shared) goal. Moreover, the collective synchrony of bodies and voices engages embodied cognition—our bodily states influencing our mental states. With bodies and voices collectively forming one choral voice, participants’ emotions converge. Synchronized voices create, at least in the moment, harmonized spirits.

Even synchronous physical movements—such as walking together—facilitate conflict resolution, report psychologists Christine Webb, Maya Rossignac-Milon, and E. Tory Higgins. As walkers coordinate their steps, mutual rapport and empathy increase. The boundary between self and other softens.

In some contexts, the embodied cognition experience is enhanced by combining collective singing and physical movement. “In the Western Isles of Scotland,” Scottish composer John Bell tells me, “there are tunes called ‘waulking-songs.’ These were sung as accompaniments to physical work, particularly spinning, weaving and waulking” (stretching and fluffing tweed fibers), much like the sea shanties used by sailing ship crews to focus and coordinate their efforts.

In sub-Saharan Africa, music likewise connects voice with movement. Singing, Bell observes, is a full-bodied experience. “In South Africa, some of the ‘freedom songs’ now popular in the West, such as ‘We Are Marching in the Light of God,’ were sung by people carrying the coffins of their colleagues killed by the apartheid regime. When you look at videos of demonstrations during these years, there is a clear synchronicity between the song of the people and their physical movement. The one enables the other.” Singing the rhythmic “We Are Marching in the Light of God” (“Siyahamba” in Zulu) begs for movement, as illustrated by this Los Angeles community choir and this African Children’s Choir.

In the American folk tradition, Sacred Harp (a.k.a. “shape note”) music is similarly participant-centered and embodied—with a cappella singers facing one another around a hollow square, and each co-leading with hand motions, as here.

So, psychological science helps explain the collective effervescence created by Koolulam events, and by other collective synchronous behaviors, from group singing to line dancing. When we synchronize our bodies and voices, we harmonize our spirits.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)

12-19-2022
My brain—and yours—is blessed with neuroplasticity. More than other species’ brains, ours can adapt by reorganizing after damage or by building new experience-based pathways. Although neuroplasticity is greatest in childhood, adult brains also change with experience. Master pianists, ballerinas, and jugglers have enlarged brain networks that manifest their acquired knowledge and skills.

Brain plasticity is also at work in those of us who experience a new world of sound, enabled by a cochlear implant (CI). For many, the initial result of CI activation is underwhelming—squeaky high-pitched sounds rather than understood speech—followed by several months of gradually increasing voice comprehension as the brain adapts.

Recalibrating voices. My immediate response to CI activation was happily more rewarding. “What is your middle name?” I heard the audiologist ask my previously deaf ear (reminding me of Alexander Graham Bell’s first telephone words: “Mr. Watson—come here—I want to see you”). To be sure, her words were barely—and not always—discernible. And they came with the squeaky voice of the little girl who had seemingly occupied her body. But now my plastic brain had something to work on. The ENT surgeon attributed the high-pitched voice to the implant’s disproportionately high-pitched stimulation. (The cochlear wire reaches through only about 1.5 of the cochlea’s 2.5 turns—its high-frequency region—beyond which the electrodes would need to be so close together that they would interfere with each other.) But with time, I am assured, my brain will recalibrate, and already that’s happening.

Aural rehabilitation. As pianists, ballerinas, and jugglers can train their plastic brains, so too can those experiencing disabilities, to some extent, retrain their brains. When a stroke patient is prescribed to use only the “bad” hand or leg—constraint-induced therapy, it’s called—dexterity will often increase. One person who had been partially paralyzed by a stroke gradually learned, by cleaning tables with the good hand restrained, to write again and even play tennis with the affected hand. Ditto for CI recipients, who are told to anticipate up to a year of gradually improving speech perception as their plastic brain adjusts to the new input. To assist that process, I am advised to dedicate time each day to watching captioned TV, or to listening to speech with just the CI. If needed, recipients can also undergo word-recognition training.

Happily, I seem not to need word training. Within the first week of receiving the CI’s input, some adaptation was already evident, with speech becoming increasingly intelligible. Even with the hearing aid in my other ear replaced with an earplug, I can, in a quiet room, converse with someone via the CI alone. And with the enhanced binaural hearing I can again attend department meetings and chat with folks amid a coffee hour. I also seemingly benefit from a curious phenomenon of auditory selective attention: I can listen to what sounds like (a) squeaky voices with my left, CI-assisted ear, (b) normal voices with my right, hearing-aid-assisted ear, or (c) the improved hearing from both inputs combined—yet with normal voice perception predominating. Moreover, I am experiencing . . .

A new world of sound. An unanticipated outcome of my CI activation has been various objects coming to life. My implant activation has caused:

- my silent office clock to start audibly clicking the seconds.
- my congregation’s tepid singing to become more vibrant.
- our previously inaudible garbage disposal and car—both of which I have left running overnight—to make noticeable sound. (Not that you’ve ever wondered, but a running car, when left unattended for 10 hours, drinks about a quarter tank.)

What strange CI powers are these . . . to cause previously silent objects to make sound!

Honestly, however, I am awestruck by those of you with normal hearing. You swim amid an ocean of inescapable sound, yet somehow manage, without constant distraction, to filter and attend to pertinent sounds, such as one particular voice. You are amazing.

But then for us all, hearing is a wonder. Imagine a science fiction novel in which alien creatures transferred thoughts from one head to another by pulsating air molecules. That is us! As we converse, we are—in ways we don’t comprehend—converting what’s in our mind into vocal apparatus vibrations that send air molecules bumping through space, creating waves of compressed and expanded air. On reaching our recipient’s eardrums, the resulting vibrations jiggle the middle ear bones, triggering fluid waves down the adjacent cochlea. These bend the hair cell receptors, which trigger neural impulses up the auditory nerve to our brain—which somehow decodes the pulsations into meaning. From mind to air pressure waves to mechanical waves to fluid waves to electrochemical waves to mind, we communicate.

Mind-to-mind communication via jostling air molecules: As the Psalmist exclaimed, we are “wonderfully made.”

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)

12-06-2022
Artificial intelligence—the long-promised computer simulation of human intelligence—has arrived. A striking new example: On December 1st, OpenAI released a chatbot, ChatGPT, that you can play with here.

I was introduced to it by one of my children, a computer systems engineer, who is mightily impressed (and not normally so wowed by new technology that impresses me). He illustrated with a couple of examples of his own. [ChatGPT transcripts shown as images in the original post.] In the second example, the Sherlock joke, though wonderful, is familiar. But consider ChatGPT’s answer to the follow-up question. Impressive!

Then I challenged ChatGPT to write some psychological science. I asked it to write me an essay explaining the difference between classical and operant conditioning. Its response would have merited an A grade from any instructor. Then I reset the conversation and asked it the same question again, and it responded with a new and equally impressive essay. Then I gave it a harder challenge (seeing if it understands a concept that a respected public intellectual and his editor miscast). Kudos to ChatGPT, which grasps that oft-misunderstood psychological concept.

I also wondered if students could ask ChatGPT to improve their writing before handing in a paper. The future is here.

For ideas on how you can explore and play with ChatGPT, or with OpenAI Playground, Ethan Mollick offers a variety of possibilities here. For crisp synopses of AI’s history and future, see here. Or see here how Michelle Huang “trained an ai chatbot on my childhood journal entries” so that she could “engage in real-time dialogue with my ‘inner child.’”

And then consider: How might the new AI enable creative thinking? Tutoring? Cheating? Paper or essay grading? Conversing about important topics?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)

[Image credit: baona/E+/Getty Images]
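An aside for readers who would rather script such exchanges than use the web interface: here is a minimal sketch assuming the official openai Python package (v1-style client) and an API key in the OPENAI_API_KEY environment variable. The model name is a placeholder; the post itself describes only the web chat.

```python
# Minimal sketch: querying a chat model programmatically.
# Assumes the official `openai` package (v1.x) and an OPENAI_API_KEY
# environment variable; the model name below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any available chat model
    messages=[
        {"role": "user",
         "content": "Explain the difference between classical "
                    "and operant conditioning."},
    ],
)
print(response.choices[0].message.content)
```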

11-22-2022
Last week, I spent 3 hours under general anesthesia (while receiving a cochlear implant). Being a curious psychological scientist, and knowing my anesthesiologist, I seized the opportunity for a simple memory test.

First, the reason for my curiosity: A complete night’s sleep serves to process our day’s experiences for permanent memory storage. To sleep well is to remember. Nevertheless, the initial neural recording (the “encoding”) of memories takes a bit of waking time. Rats in experiments will therefore forget what they’ve just experienced if their memory formation gets interrupted by an electric current passed through their brain. Humans are similarly amnesic for what they experience in the moments before receiving electroconvulsive therapy (ECT). And, as I’ve explained in my psychology texts, “Football players and boxers momentarily knocked unconscious typically have no memory of events just before the knockout.”

Would the same be true for someone falling asleep, or for someone lapsing into a drug-induced temporary coma? Are you amnesic for what you were thinking or experiencing just before nodding off?

To enable my experiencing an answer, my anesthesiologist alerted me to his drug administration, indicating that I would soon experience mental lights out. That was my signal to start counting the seconds out loud: “1, 2, 3, . . .”

[Image credit: knape/E+/Getty Images]

On awakening 3.5 hours later, I remembered the trip into the operating room and onto the operating bed. I remembered chatting with the attending staff. I remembered the anesthesiologist connecting the bodily sensors . . . but nothing thereafter. In reality, I learned on awakening, my unremembered conscious engagement continued for about 3 minutes, including my counting to 16. A segment of my life—fully experienced but unrecorded—had vanished into the mind’s black hole.

It was a weird experience. But weirder yet is what I have underappreciated until now: that I—and you—experience this fascinating phenomenon daily. An anesthesia-induced coma is not sleep (and may also be complicated by an amnesic drug effect). Nevertheless, last month when I proposed my anticipated quasi-experiment to Baylor University sleep researcher Michael Scullin, he predicted my experience. The expected memory loss, he said, would be an example of (a new concept to me) mesograde amnesia.[i]

We routinely but unknowingly experience mesograde amnesia as our immediate pre-sleep experience falls into oblivion. The phenomenon was demonstrated in a 1997 experiment by James Wyatt and colleagues: People failed to recall words spoken to them shortly before an EEG recording detected their transition to sleep. (The memory loss—from up to 4 minutes before sleep commenced—was, like mine on the operating table, surprisingly long.)

Weirder yet, as Scullin further explained, sleep-induced mesograde amnesia implies that you and I will typically not remember our short (1- to 4-minute) awakenings during the night—a phenomenon also experimentally confirmed. Thus, university students who send a text message while briefly awake will, the next morning, often have no memory of doing so. And sleep apnea patients will experience multiple brief awakenings without remembering them.

Mesograde amnesia explains one of my own recent sleep experiences. As I slipped into bed alone on a recent warm night, I pushed the blanket down to my feet. The room cooled during the night, and in the morning I awoke to find the blanket pulled up—with my having no memory of how that happened. Had my fairy godmother noticed my chill?

Scullin’s memory tutorial also led to my wondering about an evening after-work experience I have at least weekly—briefly nodding off while watching a British mystery. When I snap back to consciousness, I typically need to replay about 10 minutes for which I have no memory. I’ve assumed that the program gap represents a 10-minute nap. In reality, I now realize, 4 minutes of mesograde amnesia plus 6 minutes of napping could account for the missing 10 minutes.

What’s true for the sleep-experiment participants, and for me, is also true of you. Your falling asleep—at the beginning of your sleeping or napping, and again during your interrupted sleep—makes you amnesic for your immediately preceding life experience. Over time, your mesograde amnesia experiences add up to hours of your conscious life that have vanished, having gone unrecorded on your cerebral hard drive.

Count it as one more example of our wonder-full lives.

(For more such wonders, see my just-published How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. 😊)

[i] Most amnesia is either anterograde (trouble making new memories) or retrograde (trouble accessing old memories). Mesograde (“middle grade”) amnesia is not clearly due to either the inability to store a new memory or the inability to retrieve the memory once stored. Some say it is produced by memory-disruptive bursts of hippocampal activity during the wake-to-sleep transition.

11-15-2022
As President Biden approaches his 80th birthday and contemplates seeking reelection, many wonder: Does someone entering their ninth life decade have—and will they sustain—the energy, mental acuity, and drive to excel? From its early 2022 national survey (with ABC News), the Washington Post reported that “54 percent say they do not think Biden has the mental sharpness it takes to serve as president, while 40 percent say he does.”

Mr. President, I empathize. I, too, turned 80 this fall. So on behalf of you and all of us octogenarians, let me shine the light of psychological science on our capacities.

First, people should understand that the more we age, the less age predicts our abilities. Knowing that James is 8 and Jamal is 18 tells us much about their differences. Not so with two adults who similarly differ by a decade. Many an 80-year-old can outrun, outbike, and outthink a 70-year-old neighbor.

It’s true that we 80s folks have some diminishing abilities. Like you, Mr. President, I can still jog—but not as fast or as far. The stairs we once bounded up have gotten steeper, the newsprint smaller, others’ voices fainter. And in the molasses of our brain, memories bubble more slowly to the surface: We more often experience brain freezes as we try to retrieve someone’s name or the next point we were about to make. Hence your legendary gaffes.

Yet with a lifetime’s accumulation of antibodies, we also suffer fewer common colds and flus than do our grandchildren. Physical exercise, which you and I regularly do, not only sustains our muscles, bones, and hearts; it also stimulates neurogenesis, the birth of new brain cells and neural connections. The result, when compared with sedentary folks like your predecessor, is better memory, sharper judgment, and minimized cognitive decline. Moreover, we either retain or grow three important strengths:

Crystallized intelligence. We can admit to experiencing what researchers document: Our fluid intelligence—our ability to reason and react speedily—isn’t what it used to be. We don’t solve math problems as quickly or learn new technologies as readily, and we’re no match for our grandkids at video games. But the better news is that our crystallized intelligence—our accumulated knowledge and the ability to apply it—crests later in life. No wonder many historians, philosophers, and artists have produced their most noteworthy work later in life than have mathematicians and scientists. Anna Mary Robertson Moses (“Grandma Moses”) took up painting in her 70s. At age 89, Frank Lloyd Wright designed New York City’s Guggenheim Museum. At age 94, my psychologist colleague Albert Bandura co-authored yet another article. Perhaps our most important work is also yet ahead?

Wisdom. With maturity, people’s social skills often increase. They become better able to take multiple perspectives, to offer helpful sagacity amid conflicts, and to appreciate the limits of their knowledge. The wisdom to know when we know a thing and when we do not is born of experience. Working at Berlin’s Max Planck Institute, psychologist Paul Baltes and his colleagues developed wisdom tests that assess people’s life knowledge and judgments about how to conduct themselves in complex circumstances. Wisdom “is one domain in which some older individuals excel,” they report. “In youth we learn, in age we understand,” observed the 19th-century novelist Marie von Ebner-Eschenbach.

Stable emotionality. As the years go by, our feelings mellow. Unlike teens, who tend to rebound up from gloom or down from elation within an hour, our highs are less high and our lows less low. As we age, we find ourselves less often feeling excited or elated. But our lives are also less often disrupted by depression. In later life we are better able to look beyond the moment. Compliments produce less elation; criticisms, less despair. At the outset of my career, praise and criticism would inflate and deflate my head. A publication might have me thinking I was God’s new gift to my profession, while a rejection led me to ponder moving home to join the family business. With experience, both acclaim and reproach become mere iotas of additional feedback atop a mountain of commentary. Thus, when responding to the day’s slings and arrows, we can better take a big-picture, long-term perspective.

Mr. President, I understand these things, as I suspect you do, too. When in my 60s, I assumed—wrongly—that by age 80 I would no longer have the energy to read, to think, to write. Instead, I take joy in entering my office each day at a place called Hope. I relish learning something new daily. I find delight in making words march up a screen. And I’m mellower, as it takes more to make me feel either ecstatic or despondent.

And you? Will you, as a newly minted octogenarian, show your age? Yes, that jog up to the podium will surely slow. You will likely more often misspeak or forget a point. Your sleep will be more interrupted. But you will also benefit from the crystallized intelligence that comes with your lifetime’s experience. You can harness the wisdom that comes with age. And you can give us the gift of emotional maturity that will enable you, better than most, to navigate the “battle between our better angels and our darkest impulses.”

------------

*This essay updates a 2020 essay published at TalkPsych.com. See also Psalm 92:14 🙂

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection: How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)

11-08-2022
This moment is both the oldest you have ever been, and the youngest you will henceforth be. To live is to age. But which age tends to offer the best of life, and which the worst? Should we anticipate our mid- or later-life future with hope, or with dread?

When asked which life eras are the unhappiest, some say it’s the teen years: mood swings, social anxieties, parental demands, peer pressures, and career worries. Some say it’s later life, with its diminished occupational purpose, stamina, recall, and social networks—and with the Grim Reaper looming ever closer. Still others say it’s the in-between “midlife crisis” years, when people (traditionally referring to men) realize their career fantasies are not being realized, their marriage is ho-hum, and their bodies are aging. “While Diana puts on a brave face,” declared a 1991 People magazine cover, “a brooding [42-year-old] Prince Charles grapples with a midlife crisis and retreats to his old girlfriends.”

Three decades ago, when authoring The Pursuit of Happiness: Who Is Happy—and Why, I surmised that, apart from happy and stressful life events, no time of life was notably happier or unhappier. Consider Ronald Inglehart’s 1991 aggregation of worldwide big data, which showed well-being holding steady across age groups.

Follow-up research confirmed the life span stability of well-being, with Robert McCrae, Paul Costa, and colleagues finding “no evidence” of instability in midlife men and women, including one study with 10,000 participants. Moreover, when the Gallup Organization (in data shared with me in 2010) asked 142,682 people worldwide to rate their lives on a ladder, from 0 (“the worst possible life”) to 10 (“the best possible life”), age gave no clue to life satisfaction.

Andrew Jebb and colleagues analyzed more recent Gallup data from 1.7 million people in 166 countries, and although they, too, encountered claims of a U-shaped happiness trend, they similarly observed that a midlife dip is “overblown.” Any age differences, said the researchers, are “trivial.” Amassing newer cross-sectional and longitudinal data, University of Alberta psychologist Nancy Galambos and her collaborators (see here and here) agree: “We cannot conclude that there is a universal U shape in happiness.”

But hold on. Dartmouth College economist David Blanchflower and his colleagues (here and here) beg to differ, citing more than 300 studies, with data from 146 countries, that do find a U shape: “The midlife low occurs in the mid-40s.”

Moreover, new data lend credence to a U-shaped happiness trajectory. A mostly British research team led by economist Osea Giuntella notes that prior research, being mostly cross-sectional, has compared people of differing ages at one point in time. When researchers compare different cohorts (groups born at different times), they also are comparing people raised in different economic, political, cultural, and educational circumstances. The Giuntella team instead harvested international data following some 500,000 people through time. Their conclusion? The midlife crisis is real: “Midlife is a time when people disproportionately take their own lives, have trouble sleeping, are clinically depressed, spend time thinking about suicide, feel life is not worth living, find it hard to concentrate, forget things, feel overwhelmed in their workplace, suffer from disabling headaches, and become dependent on alcohol.” (A toy simulation of this cross-sectional-versus-longitudinal point appears at the end of this post.)

When examining the newest Gallup World Poll survey data, Harvard epidemiologist Tyler VanderWeele and his co-researchers similarly “found the classic U-shaped pattern with life evaluation being higher for younger people and older people (up through about age 80) and lower in middle life.”

The Blanchflower, Giuntella, and VanderWeele findings have changed me from a confident skeptic of U-shaped life span happiness to a curious agnostic—though I still believe the age effect to be modest compared to bigger happiness predictors, such as personality traits, social support, and a sense of life purpose, meaning, and hope.

What’s more certain and concerning is a dramatic decline in psychological well-being among younger Americans. The last decade has witnessed an unprecedented soaring of serious depression among U.S. teens and young adults. In a large survey, VanderWeele’s team similarly found not a U-shaped life curve but rather lower well-being among younger Americans—or, said differently, better well-being among older Americans. So be kind to the young adults in your life (possibly yourself).

And consider more good news. As Nathan DeWall and I documented in Psychology, 13th Edition, older adults become more attentive to positive news, with their brains less responsive to negative events. They experience fewer relationship stresses, including less anxiety and anger. They become more trusting. And they become more likely to remember their lives’ good rather than bad events. In later life, we also become more emotionally stable, our emotional highs less high, our lows less low. Things that used to irritate—slow traffic, poor restaurant service, a friend’s snub—no longer seem like such big deals. With age, compliments trigger less elation and criticisms less despair, as each becomes a mere iota of additional feedback atop a lifetime of accumulated praise and reproach. “At 70,” said Eleanor Roosevelt, “I would say the advantage is that you take life more calmly. You know that ‘this, too, shall pass!’”

Ergo, despite recent indications of a slight midlife dip, people’s happiness and life satisfaction are remarkably stable—with the striking exception being the dramatic recent rise in teen and young adult depression. But the best news is that later life is not to be feared. For most older adults, life seems, on balance, to be, and to have been, mostly good.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection: How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
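For the methodologically curious, here is a toy Python simulation of the cross-sectional-versus-longitudinal point above: when later-born cohorts differ at baseline, a one-time age comparison mixes aging effects with cohort effects. All numbers are invented for illustration.

```python
# Toy simulation: each person's happiness follows a genuine U-shape
# over their own life (midlife low near 47), but later-born cohorts
# start out less happy. A cross-sectional snapshot then confounds the
# two, so the age pattern it shows is not the within-person trajectory.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
birth_year = rng.integers(1940, 2000, n)
age_now = 2022 - birth_year

def true_happiness(age, cohort):
    u_shape = 0.002 * (age - 47) ** 2        # within-person midlife low
    cohort_effect = -0.02 * (cohort - 1940)  # later cohorts less happy
    noise = rng.normal(0, 0.5, np.size(age))
    return 5 + u_shape + cohort_effect + noise

# Cross-sectional snapshot: one observation per person, so age is
# inseparable from birth cohort.
snapshot = true_happiness(age_now, birth_year)
for lo in (20, 40, 60):
    band = (age_now >= lo) & (age_now < lo + 20)
    print(f"ages {lo}-{lo + 19}: mean happiness {snapshot[band].mean():.2f}")
```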

11-01-2022
Reading my discipline’s discoveries leaves me sometimes surprised and frequently fascinated by our mind and its actions. In hopes of sharing those fascinations with the wider world, I’ve authored How Do We Know Ourselves? Curiosities and Marvels of the Human Mind, which I’m pleased to announce is published today by Farrar, Straus and Giroux. Its 40 bite-sized essays shine the light of psychological science on our everyday lives. I take the liberty of sharing this with you, dear readers of this wee blog, partly because the book is also a fund-raiser for the teaching of high school psychology. (All author royalties are pledged to support psychology teaching—half to the American Psychological Foundation to support Teachers of Psychology in Secondary Schools, and half to the Association for Psychological Science Fund for the Teaching and Public Understanding of Psychological Science.) My hope is that some of you—or perhaps some of your students (a Christmas gift idea for their parents?)—might enjoy these brief and playful musings half as much as I enjoyed creating them. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

10-12-2022
"I could stand in the middle of Fifth Avenue and shoot somebody, and I wouldn't lose any voters, OK?" ~ Donald J. Trump, January 23, 2016 The conservative sage and former George W. Bush speech writer Peter Wehner is aghast at what his U.S. Republican Party has come to accept: Republican officials showed fealty to Trump despite his ceaseless lying and dehumanizing rhetoric, his misogyny and appeals to racism, his bullying and conspiracy theories. No matter the offense, Republicans always found a way to look the other way, to rationalize their support for him, to shift their focus to their progressive enemies. As Trump got worse, so did they. Indeed, in the wake of misappropriated top-secret documents, civil suits over alleged business frauds, and the revelations of the January 6 House Select Committee, Donald Trump’s aggregated polling data approval average increased from late July’s 37 percent to today’s 41 percent, a virtual tie with President Biden. Democrats join Wehner in being incredulous at Trump’s resilient approval rating, even as MAGA Republicans are similarly incredulous at Biden’s public approval. In politics as in love, we are often amazed at what others have chosen. Psychological science offers some explanations for why people might be drawn, almost cult-like, to charismatic autocratic leaders on the right or left. Perceived threats and frustrations fuel hostilities. Punitive, intolerant attitudes, which form the core of authoritarian inclinations, surface during times of change and economic frustration. During recessions, anti-Black prejudice has increased. In countries worldwide, low income years and low income people manifest most anti-immigrant prejudice. In the Netherlands and Britain, times of economic or terrorist threat have been times of increased support for right-wing authoritarians and anti-immigrant policies. In the U.S., MAGA support rides high among those with less than a college education living amid high income inequality. The illusory truth effect: Mere repetition feeds belief. In experiments, repetition has a strange power. It makes statements such as ““A galactic year takes 2500 terrestrial years”” seem truer. Hear a made-up smear of a political opponent over and over and it becomes more believable. Adolf Hitler, George Orwell, and Vladimir Putin all have understood the persuasive power of repetitive propaganda. So have Barack Obama (“If they just repeat attacks enough and outright lies over and over again . . . people start believing it”) and Donald Trump (“If you say it enough and keep saying it, they’ll start to believe you”). Conflicts feed social identities. We are social animals. Our ancestral history prepares us to protect ourselves in groups, to cheer for our groups, even to kill or die for our groups. When encountering strangers, we’re primed to make a quick judgment: friend or foe?—and to be less wary of those who look and sound like us. Conflicts—from sporting events to elections to wars—strengthen our social identity: our sense of who we are and who they are. In the U.S., White nationalist rallies serve to solidify and sustain aggrieved identities. Still, I hear you asking: Why do people, once persuaded, persist in supporting people they formerly would have shunned, given shocking new revelations? 
In just-published research, Duke University psychologists Brenda Yang, Alexandria Stone, and Elizabeth Marsh repeatedly observed a curious “asymmetry in belief revision”: People will more often come to believe a claim they once thought false than to unbelieve something they once thought true. The Duke experiments focused on relative trivia, such as whether Michelangelo’s statue of David is located in Venice. But consider two real life examples of people’s reluctance to unbelieve. Sustained Iraq War support. The rationale for the 2003 U.S. war against Iraq was that its leader, Saddam Hussein, was accumulating weapons of mass destruction. At the war’s beginning, Gallup reported that only 38 percent of Americans said the war was justified if there were no such weapons. Believing such would be found, 4 in 5 people supported the war. When no WMDs were found, did Americans then unbelieve in the war? Hardly. Fifty-eight percent still supported the war even if there were no such weapons (with new rationales, such as the supposed liberation of oppressed Iraqi people). Sustained Trump support. In 2011, the Public Religion Research Institute asked U.S. voters if “an elected official who commits an immoral act in their personal life can still behave ethically and fulfill their duties in their public and professional life.” Only 3 in 10 White evangelical Protestants concurred that politicians’ personal lives have no bearing on their public roles. But by July of 2017, after supporting Donald Trump, 7 in 10 White evangelicals were willing to separate the public and personal. It was a “head-spinning reversal,” said the PRRI CEO. Moreover, despite tales of Trump’s sexual infidelity, dishonesty, and other broken Ten Commandments, White evangelicals’ support of Trump continues. Once someone or something is embraced, unbelieving—letting go—is hard. In Stanley Milgram’s famed obedience experiments, people capitulated in small steps—first apparently delivering a mild 15 volts, then gradually delivering stronger and stronger supposed electrical shocks—after progressively owning and justifying their actions. Each repugnant act made the next easier, and also made the commitment more resilient. “With each moral compromise,” observes Peter Wehner, “the next one—a worse one—becomes easier to accept.” In small steps, conscience mutates. Cognitive dissonance subsides as people rationalize their commitment. Confirmation bias sustains belief as people selectively engage kindred views. Fact-free chatter within one’s echo chamber feeds group polarization. And so, after believing in a would-be autocrat—after feeling left behind, after hearing repeated lies, and after embracing a political identity—it becomes hard, so hard, to unbelieve. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

08-24-2022
Language evolves. The fourteenth-century English poet Geoffrey Chaucer, who lived but 25 generations ago, would struggle to communicate with us.

Word meanings change. Yesterday’s evangelicals who led the abolition of Britain’s slave trade would wince at a particular subset of today’s American evangelicals—Trump acolytes who are infrequent worshippers, and for whom “evangelical” has become more of a cultural than a religious identity. In political discourse, a “socialist” becomes not just a person who favors publicly owned industries, but someone who advocates reduced inequality and a stronger social safety net. And “critical race theory,” a once obscure idea in law journals that has been nonexistent in social psychology’s prejudice chapters and books, has been distorted and inflated as a way to malign efforts to ameliorate racism and teach real history.

Similar concept creep has happened in the mental health world, observes University of Melbourne psychologist Nick Haslam. Concepts with a precise meaning have expanded to capture other or less extreme phenomena. Examples:

- “Addiction” (compulsive substance use) has expanded to include money-depleting gambling, sexual fixations, time-draining gaming, and even excessive shopping and social media use, as in: “I’m addicted to my phone.” Thus, between 1970 and 2020, the proportion of academic psychology abstracts mentioning “addiction” increased sixfold.
- “Abuse” still refers to intentional physical harm or inappropriate sexual contact, but in everyday use may now include neglectful omissions and painful mistreatments: hurtful teasing, distressing affronts, or overwrought parents screaming at their children. Accompanying this semantic inflation, the proportion of post-1970 psychology abstracts mentioning “abuse” has multiplied seven times over.
- “Trauma” initially referred to physical injury (as in traumatic brain injury), then expanded to encompass horrific emotional traumas (rape, natural disaster, wartime combat, torture), and now has been inflated to include stressful life experiences within the range of normal human experience—job loss, serious illness, and relationship breakups—and even, reports Harvard psychologist Richard McNally, wisdom tooth extraction, enduring obnoxious jokes, and the normal birthing of a healthy child. So, no surprise, over the past half century the proportion of psychology abstracts mentioning “trauma” has increased tenfold.

Haslam offers other concept-creep examples, such as broadening the “prejudice” of yesterday’s bigotry to include today’s subtler but persistent “implicit biases” and “microaggressions.” And we could extend his list. ADD, ADHD, autism spectrum disorder, and the DSM-5’s new “prolonged grief disorder” all refer to genuine pathologies that have been broadened to include many more people. At least some of yesterday’s normally rambunctious boys, easily distracted adults, socially awkward people, and understandably bereaved parents or spouses are now assigned a psychiatric label and offered mental health or drug therapies. This psychologization or psychiatrization of human problems serves to expand the mental health and pharmacology industries, entailing both benefits and costs.

Concept creep does have benefits. It represents an expansion of our circle of moral concern. As vicious violence, appalling adversity, and blatant bigotry have subsided in Western cultures, albeit with horrendous exceptions, we have become more sensitive to lesser but real harms—to upsetting maltreatment, dysfunctional compulsions, and toxic systemic biases. Progressive and empathic people, being sensitive to harm-doing, mostly welcome the expanded concepts of harm and victimization.

But concept creep, Haslam argues, also risks casting more and more people as vulnerable and fragile—as, for example, helpless trauma victims rather than potentially resilient creatures. “I am beginning to think that our world is in a constant state of trauma,” writes one psychotherapist/columnist. “Living with trauma, PTSD, unregulated depression and anxiety is almost the norm these days.” As is common in many professions, mental health workers may sometimes overreach, broadening their reach and client base: “Your hurt was an abuse, and you need me to help you heal.”

Concept creep also risks trivializing big harms, Haslam notes, by conflating them with lesser harms: “If everyday sadness becomes ‘depression’ and everyday stressors become ‘traumas’ then those ideas lose their semantic punch.” If more and more pet-loving people seek to travel with their “emotional support” companions, the end result may be restricted access for those for whom companion animals serve a vital function.

“Many traumas do indeed have severe and lasting effects that must not be minimized,” Haslam and co-author Melanie McGrath emphasize. “However, as the concept of trauma stretches to encompass less extreme experiences, the tendency to interpret marginal or ambiguous events as traumas is apt to promote hopelessness, submission, and passivity in response to challenges that might be overcome better if placed in a different interpretive frame.”

The bottom line: Addiction, abuse, and trauma are genuine sources of human suffering. But where should we draw the line in defining and treating them?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Macmillan Employee
08-18-2022
As we approach the start of a new semester, psychology teachers can use this 5-minute animation written and narrated by David Myers to help students effectively learn and remember course material: https://t.co/24UIFUnWFy

07-21-2022
“The best time to plant a tree was 20 years ago. The second-best time is now.” ~ Anonymous proverb Character education’s greatest task is instilling a mark of maturity: the willingness to delay gratification. In many studies, those who learn to restrain their impulses—by electing larger-later rather than smaller-now rewards—have gone on to become more socially responsible, academically successful, and vocationally productive. The ability to delay gratification, to live with one eye on the future, also helps protect people from the ravages of gambling, delinquency, and substance abuse. In one of psychology’s famous experiments, Walter Mischel gave 4-year-olds a choice between one marshmallow now, or two marshmallows a few minutes later. Those who chose two later marshmallows went on to have higher college graduation rates and incomes, and fewer addiction problems. Although a recent replication found a more modest effect, the bottom line remains: Life successes grow from the ability to resist small pleasures now in favor of greater pleasures later. Marshmallows—and much more—come to those who wait. The marshmallow choice parallels a much bigger societal choice: Should we prioritize today, with policies that keep energy prices and taxes low? Or should we prioritize the future, by investing now to spare us and our descendants the costs of climate change destruction? “Inflation is absolutely killing many, many people,” said U.S. Senator Joe Manchin, in explaining his wariness of raising taxes to fund climate mitigation. Manchin spoke for 50 fellow senators in prioritizing the present. When asked to pay more now to incentivize electric vehicles and fund clean energy, their answer, on behalf of many of their constituents, is no. But might the cost of inaction be greater? The White House Office of Management and Budget projects an eventual $2 trillion annual federal budget cost of unchecked climate change. If, as predicted, climate warming increases extreme weather disasters, if tax revenues shrink with the economy’s anticipated contraction, and if infrastructure and ecosystem costs soar, then are we being penny-wise and pound-foolish? With the worst yet to come, weather and climate disasters have already cost the U.S. a trillion dollars over the past decade, with the total rising year by year. nattrass /E+/Getty Images The insurance giant Swiss Re also foresees a $23 trillion global economy cost by 2050 if governments do not act now. The Big 4 accounting firm Deloitte is even more apprehensive, projecting a $178 trillion global cost by 2070. What do you think: Are our politicians—by seizing today’s single economic marshmallow—displaying a mark of immaturity: the inability to delay gratification for tomorrow’s greater good? A related phenomenon, temporal discounting, also steers their political judgments. Even mature adults tend to discount the future by valuing today’s rewards—by preferring, say, a dollar today over $1.10 in a month. Financial advisors therefore plead with their clients to do what people are not disposed to do . . . to think long-term—to capitalize on the power of compounding by investing in their future. Alas, most of us—and our governments—are financially nearsighted. We prioritize present circumstances over our, and our children’s, future. And so we defend our current lifestyle by opposing increased gas taxes and clean-energy mandates. The western U.S. 
And so we defend our current lifestyle by opposing increased gas taxes and clean-energy mandates. The western U.S. may be drying up, seawater may be creeping into Miami streets, and glaciers and polar ice may be retreating, but even so, only “1 percent of voters in a recent New York Times/Siena College poll named climate change as the most important issue facing the country, far behind worries about inflation and the economy.”

The best time to plant a tree, or to have invested in climate protection, was 20 years ago. The worst time is 20 years from now, when severe climate destruction will be staring us in the face.

As we weigh our present against our future, psychological science reminds our political representatives, and all of us, of a profoundly important lesson: Immediate gratification makes today easy, but tomorrow hard. Delayed gratification makes today harder, but tomorrow easier.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)

Author
07-05-2022
07:49 AM
With 45,000+ annual U.S. firearm deaths (123 per day, from homicide, suicide, and accidents), America has a gun problem, of which the recent Buffalo, Uvalde, and Highland Park mass killings are horrific examples.

In response, we often hear that the problem is not America’s 400 million guns but its people. “We have a serious problem involving families, involving drugs, involving mental health in this country,” asserted Colorado congressional representative Ken Buck. “We have become a less safe society generally. Blaming the gun for what’s happening in America is small-minded.”

To protect against mass killings by 18-year-olds, as in Buffalo and Uvalde, we are told that we don’t need to match the minimum age for assault rifle purchase (still 18 after the new gun safety act) with the age for beer purchase (21). We don’t need to train and license gun owners as we do drivers. We don’t need safe-storage laws or restrictions on large-capacity magazines. Instead, we need to fix the problem perceived by Texas Governor Greg Abbott and National Rifle Association chief executive Wayne LaPierre: evil people. To solve the gun violence problem, we need better people, enabled by commendable social changes: more married fathers, less pornography, fewer violent video games.

And most importantly, we’re told, we need to deal with mass killers’ mental sickness. “People with mental illness are getting guns and committing these mass shootings,” observed former U.S. Speaker of the House Paul Ryan. While president, Donald Trump agreed: “When you have some person like this, you can bring them into a mental institution.” Mass killers, he later added, are “mentally ill monsters.”

However, reality intrudes. As I documented in an earlier essay, most mentally ill people are nonviolent, most violent criminals and mass shooters have not been diagnosed as mentally ill, and rare events such as mass shootings are almost impossible to predict. As much as we psychologists might appreciate the ostensible high regard, today’s psychological science lacks the presumed powers of discernment.

If mental-health assessments cannot predict individual would-be killers, three other factors (in addition to short-term anger and alcohol effects) do offer some predictive power:

- Demographics. As the recent massacres illustrate, most violent acts are committed by young males. The late psychologist David Lykken made the point memorably: “We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28.”
- Past behavior. It’s one of psychology’s maxims: The best predictor of future behavior is past behavior. The best predictor of future GPA is past GPA. The best predictor of future employee success is past employee success. The best predictor of future class attendance or smoking or exercise is, yes, the recent past. Likewise, recent violent acts predict future violent acts.
- Guns. Compared to Canada, the United States has 3.5 times the number of guns per person and 8.2 times the gun homicide rate. Compared to England, the U.S. has 26 times as many guns per person—and 103 times the gun homicide rate. To check U.S. state variations, I plotted each state’s gun-in-home rate against its gun homicide rate. As you can see, the correlation is strongly positive, ranging from (in the lower left) Massachusetts, where 15 percent of homes have a gun, to Alaska, where 65 percent of homes have a gun—and where the homicide rate is 7 times greater.
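For anyone who wants to try a state-by-state comparison like this themselves, here is a minimal Python sketch of the computation. Apart from the Massachusetts and Alaska gun-ownership percentages quoted above, the numbers are made-up illustrative values, not the dataset behind my plot:

```python
# A minimal sketch (Python 3.10+) of a state-level gun-ownership vs. gun-homicide
# comparison. Only the Massachusetts (15%) and Alaska (65%) gun-in-home rates
# come from the post; every other number is a made-up illustrative value.
from statistics import correlation

# (state, % of homes with a gun, gun homicides per 100,000) -- illustrative
states = [
    ("Massachusetts", 15, 1.5),
    ("New Jersey",    16, 2.2),
    ("Iowa",          38, 2.9),
    ("Missouri",      46, 8.8),
    ("Alabama",       52, 10.1),
    ("Alaska",        65, 10.5),   # roughly 7x Massachusetts', per the post
]

gun_rates = [g for _, g, _ in states]
homicide_rates = [h for _, _, h in states]

# Pearson's r; values near +1 indicate a strongly positive relationship.
print(f"r = {correlation(gun_rates, homicide_rates):.2f}")
```

With any similarly shaped data, the computed r will sit close to +1—the “strongly positive” pattern described above—though, as the post’s closing paragraph stresses, correlation alone cannot settle what causes what.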
Of these three predictor variables, gun policy is the one that, without constraining hunters’ rights, society can manage with some success: When nations restrict gun access, the result has been fewer guns in civilian hands, which means fewer impulsive gun uses and fewer planned mass shootings.

Many people nevertheless believe that, as Senator Ted Cruz surmised after the Uvalde shooting, “What stops armed bad guys is armed good guys.” Never mind that in one analysis of 433 active shooter attacks on multiple people, armed lay citizens took out active shooters in only 12 instances. Many more—a fourth of such attacks—ended in a shooter suicide.

Moreover, if the answer to bad guys with guns is to equip more good guys with guns, then why are states with more armed good guys more homicidal? What explains the state-by-state guns/homicide correlation? Are the more murderous Alaskans (and Alabamians and Louisianans) really more “evil” or “mentally ill”? Or is human nature essentially the same across the states, with the pertinent difference being the weapons?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)