Talk Psych Blog
Showing articles with label Development Psychology.
david_myers
Author
04-25-2024
06:36 AM
Most academic fields are blessed with public intellectuals—people who contribute big ideas to their disciplines and also to public discourse. Economics has had (among others) Paul Krugman and Milton Friedman. History has had Henry Louis Gates, Jr. and Doris Kearns Goodwin. Evolutionary biology has had Richard Dawkins and E. O. Wilson. And psychological science? On my top 10 psychology public intellectuals list—admittedly reflecting my current interests—would be the late Daniel Kahneman, along with Martin Seligman, Elizabeth Loftus, Steven Pinker, Jennifer Eberhardt, Angela Duckworth, Roy Baumeister, Jean Twenge, and Robert Cialdini. With so many deserving candidates, your interests and list will differ.

Likely your list would now also include Jonathan Haidt, whose new book, The Anxious Generation, appeared with a trifecta—as the simultaneous #1 nonfiction bestseller at the New York Times, Publishers Weekly, and Amazon—and with featured reviews in major newspapers and The New Yorker; interviews on TV networks, talk shows, and podcasts; and Haidt’s own The Atlantic feature article. In collaboration with Jean Twenge (my social psychology text coauthor), Haidt aims less to sell books than to ignite a social movement.

Teen depression, anxiety, and suicidal thinking have soared in the smartphone/social media era, Haidt and Twenge observe, and especially so for those teen girls who devote multiple daily hours to social media. For an excellent 7-minute synopsis of their evidence—perfect for class discussion, youth groups, or the family dinner table—see here. Their solution is straightforward: We need to stop overprotecting kids from real-world challenges and under-protecting them in the virtual world. We should decrease the life experience–blocking phone-based childhood and increase resilience-building unrestricted play and in-person social engagement. To make this practical, Haidt offers schools and parents four recommendations:

- No smartphones until high school (flip phones before).
- No social media before age 16.
- Phone-free schools (deposit phones on arrival).
- More free play and unsupervised real-world responsibility.

Given such high-visibility assertions, Haidt and Twenge’s writings are understandably stimulating constructive, open debate that models what Haidt advocated in his earlier The Coddling of the American Mind (2018), and in founding the Heterodox Academy to support “open inquiry, viewpoint diversity, and constructive disagreement.” His colleague critics, including psychologist Candice Odgers writing in Nature and an Oxford research team, question the smartphone effect size and offer alternative explanations for the teen mental health crisis. Although the research story is still being written, my reading of the accumulated evidence supports Haidt and Twenge, whose replies to their skeptics provide a case study in rhetorical argumentation:

- Are they merely offering correlational evidence? No, longitudinal studies and experiments confirm the social media effect, as do quasi-experiments that find mental health impacts when and where social media get introduced.
- Are the effects too weak to explain the huge increase in teen girls’ depression and anxiety? No, five social media hours a day double teen girls’ depression risk. Moreover, social media have collective effects; they infuse kids’ social networks.
- Is teen malaise instead a product of family poverty and financial recession? No, it afflicts the affluent as well, and has increased during an era of economic growth.
- Are the problems related to U.S. politics, culture, or school shootings? No, they cross Western countries.
- Are teens more stressed due to increased school pressures and homework? No; to the contrary, homework pressure has declined.

Two other alternative explanations—that kids are experiencing less independence and less religious engagement—actually dovetail with the social media time-drain evidence. (Haidt, a self-described atheist, includes a chapter on the smartphone-era decline in experiences of spiritual awe, meditation, and community.)

Haidt’s inspiring an international conversation about teens and technology takes my mind back to 2001. A committee of four of us, led by Martin Seligman, evaluated candidates for the first round of Templeton Foundation–funded positive psychology prizes. Our $100,000 top prize winner—recognizing both achievements and promise—was an impressive young scholar named (you guessed it) Jon Haidt. More than we expected, we got that one right. In 2024, our culture is becoming wiser and hopefully healthier, thanks to Haidt’s evidence-based teen mental health advocacy, enabled by his persistent public voice.

(David Myers, a Hope College social psychologist, authors psychology textbooks and trade books, including his recent essay collection, How Do We Know Ourselves? Curiosities and Marvels of the Human Mind.)
Labels
- Current Events
- Development Psychology
- Social Psychology
david_myers
Author
02-20-2024
10:28 AM
“Young Americans are more pro-Palestinian than their elders. Why?” headlined a recent Washington Post article. ’Tis true, as many surveys reveal. In a late October 2023 YouGov poll, 20 percent of adults under age 29, but 65 percent of those 65 and over, reported pro-Israel sympathies in the Israel-Hamas war. In a follow-up Pew survey, 18- to 29-year-olds were less than half as likely as adults ages 65+ to “favor the Biden administration’s response to the Israel-Hamas war.”

Consider other attitudinal generation gaps:

- Politics. In the 2020 U.S. presidential election, Biden won the support of most voters under age 30, while Trump was favored by a slight majority of those ages 65+.
- Climate concerns. In survey after survey, young adults express more concern for the future climate. They are, for example, more than twice as likely as adults ages 65+ to favor phasing out fossil fuels.
- Same-sex marriage. In the latest Gallup survey, 60 percent of those ages 65+, and 89 percent of 18- to 29-year-olds, favored gay marriage. Moreover, a generation gap exists worldwide.
- Religiosity. It’s no secret that worldwide, today’s young adults, compared with their elders, are less often religiously affiliated and engaged. They believe less, attend less, and pray less.

These generational dissimilarities—with more documented by social psychologist Jean Twenge in Generations—have at least two possible explanations:

- A life-cycle explanation observes that attitudes can change with age. Our youthful progressivism may mutate into a more conservative later-life perspective. With life experience, people change.
- A cohort (generational) explanation observes that emerging adults form attitudes in response to their time, and then carry those attitudes throughout life.

There is wisdom in both. We are not fixed entities. Over the last half century, most people, regardless of age, have become more accepting of same-sex marriage. With age, people may increasingly seek to conserve familiar traditions and values. Some agree with the old cliché, “Those who are not socialist by age 20 have no heart. Those who are not conservative by age 40 have no brain.”

Yet, as Twenge and I explain in Social Psychology, Fourteenth Edition, the evidence more strongly supports the cohort/generational explanation. Attitudes form in youth and emerging adulthood, and then become more stable. In surveys of the same people over years, attitudes tend to change more from ages 15 to 25 than from ages 55 to 65. When asked to recall memorable life and world events, adults also tend to reminisce about happenings during their impressionable teens and young adult years. These are also the prime years for recruiting people into cults or to new political views. The teens and early twenties are formative.

In Public Religion Research Institute data depicting generation gaps in religiosity over time, I found more evidence of the cohort/generational effect. Note that in 1996, 20 percent of people in their 20s were religiously unaffiliated; 10 years later, 17 percent of people in their 30s were the same; and, 26 years later, 20 percent of people at roughly midlife were religiously unaffiliated. But surely, you say, some people in each cohort will change as they age, by becoming religiously engaged or disengaged. And overall there has been a slight trend toward disaffiliation in each cohort. Yes, and yes. But what’s striking is each cohort’s overall stability over time.
Today’s older generations were more likely, as youth, to have attended worship and religious education programs—the footprints of which they have retained into their later lives. In explaining the U.S. generation gap in attitudes toward Israelis and Palestinians, the Washington Post also offers a cohort explanation: Each age group has a different “generational memory” of Israel, Dov Waxman, director of the UCLA Younes and Soraya Nazarian Center for Israel Studies, said. Beliefs about the world tend to form in our late teens and early 20s and often don’t change, he said. Older generations, with a more visceral sense of the Holocaust, tend to see Israel as a vital refuge for the Jews. . . . But by the time millennials began forming their understanding of global events, the violence of the second Intifada had concluded in the mid-2000s with enhanced walls and barriers constructed between Israel and the West Bank, and then Gaza. This generation formed its idea of Israel from reports of Palestinians denied access to water, freedom of movement and fair trials. Evidence of cohort stability over time implies two important lessons. First, generational succession is destiny. Today’s older generation, with its ambivalence about gay rights, will be replaced by younger gay-supportive generations. Barring unanticipated events, support for climate change mitigation efforts will grow. In the absence of religious/spiritual renewal—which could happen (the proportion of religious “nones” does appear to have peaked)—secularism will increase. Second, there are few more influential vocations than educating, mentoring, guiding, and inspiring people during their formative teen and college-age years. To be sure, our entire life is a process of becoming and reforming. At every age, we are unfinished products. Yet the foundation of our future selves and of our deepest beliefs and values tends to be laid in the teachings, relationships, and experiences of those seminal years. (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or check out his recent essay collection, How Do We Know Ourselves?: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.) *Photo credit Maskot/Getty Images
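For readers who want to see the bookkeeping behind a cohort analysis, here is a minimal Python sketch. The first cohort’s percentages echo the PRRI figures quoted above; the other rows are hypothetical, invented only to illustrate how repeated cross-sectional surveys can help separate a cohort (generational) effect from a life-cycle effect.

from collections import defaultdict

# Each row: (survey_year, approximate_age, percent_unaffiliated).
# The first three rows echo the PRRI percentages quoted above;
# the remaining rows are hypothetical, for illustration only.
rows = [
    (1996, 25, 20), (2006, 35, 17), (2022, 50, 20),   # one birth cohort, tracked across waves
    (1996, 55, 8),  (2006, 65, 9),  (2022, 80, 10),   # a hypothetical older cohort
    (2022, 25, 36),                                   # today's young adults (hypothetical)
]

# Group survey rows by birth decade (survey year minus age).
by_cohort = defaultdict(list)
for year, age, pct in rows:
    birth_decade = (year - age) // 10 * 10
    by_cohort[birth_decade].append((year, pct))

for birth_decade, series in sorted(by_cohort.items()):
    series.sort()
    change = series[-1][1] - series[0][1]
    print(f"born in the {birth_decade}s: {series}  within-cohort change = {change:+d} points")

# Large gaps BETWEEN cohorts combined with small changes WITHIN each cohort
# point to a cohort (generational) explanation rather than a life-cycle one.

Reading across cohorts at a single survey year compares generations; reading along one cohort’s own series tests whether aging itself changes the attitude.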
Labels
- Development Psychology
- Social Psychology
david_myers
Author
10-31-2023
06:25 AM
It’s a “national youth mental health crisis.” So says U.S. Surgeon General Vivek Murthy of post-2010 soaring teen depression. Today’s teens are sadder, lonelier, and (among girls) more suicide prone. It’s truly a tough time to be a teen. (Image generated by DALL-E 3.)

Converging evidence (as I summarized in a prior essay) points to a culprit: long hours on social media (4.8 hours per day, reports a new Gallup teen survey):

- Correlational evidence reveals not only the simultaneous increase in smartphones and depression, but also an association between daily social media hours and depression risk.
- Longitudinal studies have found that social media use at Time 1 predicts mental health issues at Time 2.
- Experiments that randomly assign people to more or less social media exposure verify causation.
- Quasi-experimental evidence confirms that the rollout of social media in a specific time and place predicts increased mental health issues.

In hindsight, it’s understandable: Daily online hours entail less face-to-face time with friends, less sleep, and more comparison of one’s own mundane life with others’ more glamorous and seemingly successful lives. Others, it seems, are having more fun. As Theodore Roosevelt reportedly observed, “Comparison is the thief of joy.”

Still, this it’s-social-media claim has dissenters. In the latest of her lucid Substack essays, Jean Twenge—psychology’s leading teen mental health sleuth—identifies a baker’s dozen alternative explanations for today’s teen malaise, each of which she rebuts. To sample a few:

- Today’s teens are just more transparent about their bad feelings. But behavioral measures, such as emergency room self-harm admissions, closely track the self-report changes.
- The media/depression correlation is too weak to explain the crisis. But even a small .20 correlation can explain “a good chunk” of the increased depression—with “girls spending 5 hours a day or more on social media [being] twice as likely to be depressed.” The new Gallup survey confirms Twenge’s surmise, reporting that “teens who spend five or more hours per day on social media apps are significantly more likely to report experiencing negative emotions compared with those who spend less than two hours per day.” And Twenge is surely right: “If teens who ate 5 apples a day (vs. none) were three times more likely to be depressed, parents would never let their kids eat that many apples.”
- It’s because of school shootings. But teen mental health risks have similarly surged in countries without school shootings.
- It’s due to increased school pressure and homework. But today’s teens, compared to their 1990s counterparts, report spending less time on homework.
- It’s because their parents are more depressed. But they aren’t. The mental health “crisis of our time” is a teen/young adult crisis.

Of the thirteen alternative explanations, Twenge concedes some credibility to but one—“It’s because children and teens have less independence.” Indeed, compared to yesteryear’s free-range children, today’s kids less often roam their neighborhood, play without adult supervision, and spend time with friends. But this trend, Twenge notes, dovetails with their increased online time. Moreover, the trend toward less teen independence predated the upsurge in both online hours and depression.
Twenge’s conclusion: “If teens were still seeing friends in person about as much, were sleeping just as much, and were not on social media 5 hours a day—all things traceable to the rise of smartphones and social media, I highly doubt teen depression would have doubled in a decade.” (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or check out his new essay collection, How Do We Know Ourselves?: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
Labels
- Development Psychology
- Social Psychology
david_myers
Author
08-30-2023
09:36 AM
Many Americans are indifferent about marriage. In a 2019 Pew survey, 55 percent of 18- to 29-year-olds, and nearly half of all adults, agreed that couples who want to stay together are “just as well off if [they] decide not to marry.” In 2007 to 2009 University of Michigan surveys, high school seniors expressed even less esteem for marriage, with only about a third agreeing that “most people will have fuller and happier lives if they choose legal marriage rather than staying single or just living with someone.”

Yet it’s no secret among those of us who study such things that marriage is a major predictor of health and human flourishing. See, for example, these General Social Survey data, which I extracted from more than 64,000 randomly sampled Americans since 1972 (showing, also, a COVID-related 2021 morale dip).

So does marriage—what anthropologist Joseph Henrich says “may be the most primeval of human institutions”—make for happiness? Before assuming such, critical thinkers should wonder about two other possibilities.

First, does marriage (especially when compared to divorce) predict health and happiness merely because it compares those in surviving happy versus failed marriages? To see if getting married predicts long-term health and well-being across all new marriages, Harvard epidemiologist Tyler VanderWeele, with Ying Chen and colleagues, harvested data from 11,830 nurses who, in the Harvard Nurses’ Health Study, were unmarried in 1989. They identified those who married versus those who didn’t in the next four years, and then tracked their lives for 25 years. Even when including those who later divorced, those who had married were, 25 years later, healthier and less likely to have died. They were also happier, more purpose-filled, and less depressed and lonely.

Ah, but what about the second possibility: Were the to-be-married nurses simply happier, healthier, and richer to begin with? Did happiness → marriage rather than marriage → happiness? Happy people do enjoy better and more stable relationships. Depressed people tend to be irritable, not fun to live with, and vulnerable to divorce. So surely happiness does predict marriage and marital stability. Yet even after controlling for preexisting health and well-being, reports VanderWeele, marriage remains “an important pathway to human flourishing. It increases physical health, mental health, happiness, and purpose.” And not just for straight folks, I would add (as Letha Dawson Scanzoni and I explained in our 2005 book, A Christian Case for Gay Marriage).

Marriage is one effective way to help fulfill the deep human need that Aristotle long ago recognized—the need to belong. Marriage mostly (though not always) works, VanderWeele suspects, because marriage provides companionship. It boosts health and longevity. And it offers sexual satisfaction. Thus, he reasons, societies’ tax, parental leave, and child-support policies should incentivize marriage. And marriage enrichment and counseling should be widely available.

Indeed, mindful that all healthy close relationships support our human need to belong, society should support varied opportunities for companionship and attachment. Our workplaces, our neighborhoods, our worship places, our recreational facilities, and our schools can all work at being places of supportive connection—places where you and I feel like we belong.
(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.) Photo credit: Peter Dazeley/The Image Bank/Getty Images
Labels
- Development Psychology
- Social Psychology
david_myers
Author
05-17-2023
01:33 PM
By now you’ve likely heard: Teen sadness, depression, and suicide attempts have soared since 2010, especially among girls. From 2011 to 2020, suicide-related issues rose fourfold, from 0.9 percent of U.S. pediatric emergency room visits to 4.2 percent. The teen mental health decline is unprecedented. And it is substantial. Although debate over causation continues, varied studies converge in pointing to social media as a likely major culprit. And legislators have taken note, proposing bipartisan state and congressional bills that would limit teen social media access.

The CDC’s new 100-page “Youth Risk Behavior Surveillance” (YRBS) report samples 17,232 ninth to twelfth graders from all U.S. public and private schools, and documents the malaise of today’s teen girls. But another troubled, less-discussed group also caught my eye—what the report refers to as LGBQ+ teens (transgender teens were not separately surveyed). A CDC summary document portrays the significant mental health challenges of LGBQ+ high schoolers across five measures: poor mental health in the last 30 days, persistent feelings of sadness or hopelessness, seriously considering attempting suicide, making a suicide plan in the last year, and attempting suicide.

These data replicate findings in other reports. In 2022, the Trevor Project, which studies and supports the mental health of LGBTQ youth, collected more than 34,000 reports from 13- to 24-year-old LGBTQ youth and young adults. Although the respondents were self-selected, the online survey found, exactly as did the CDC, that “45% of LGBTQ youth seriously considered attempting suicide in the past year.”

What explains the sexual identity–mental health correlation? The common presumption is that the stigma and stress faced by sexual minorities exact a toll. “We must recognize that LGBTQ young people face stressors simply for being who they are that their peers never have to worry about,” observed Trevor Project CEO Amit Paley. A deeper dig into the CDC’s YRBS data suggests that, indeed, even with the growing acceptance of people who are gay, the stigmatization and stressors remain. Kids—and society at large—can be cruel. The report’s other findings document LGBQ+ students more often not going to school because of safety concerns, being bullied online, and being bullied at school.

These data prompt our sobering reflection on the struggles of LGBTQ youth. They also make us wonder: Might sexual-minority youth be less vulnerable to depression, hopelessness, and suicidal thinking if given . . . more access to mental health services? . . . a support group in a safe space? . . . greater public education about the realities of sexual orientation? Or what would you suggest?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.) Image credit: Thomas Baker/Alamy Stock Photo.
Labels
- Current Events
- Development Psychology
- Gender and Sexuality
david_myers
Author
03-09-2023
12:15 PM
At its best, psychological science transparently puts big competing ideas to the test. With varied methods and with replication of noteworthy findings, it winnows truth from the haystack of mere speculation. If the evidence supports an idea, so much the better for it. If the idea collides with a wall of fact, then it gets rejected or revised.

In reality, psychology often falls short of this ideal. A noteworthy finding doesn’t replicate. Confirmation bias drives a researcher to selectively attend to supportive evidence. In rare cases, researchers have stage-managed desired results or even faked data. Yet the psychological science ideal is achievable. My social psychologist colleagues Jonathan Haidt and Jean Twenge exemplify this ideal as they assemble evidence regarding social media effects on teen mental health, and invite others to critique and supplement their data. “It is amazing how much I have learned, and refined my views, just by asking people to make me smarter,” Haidt has told me.

The stimulus for their work is a troubling social phenomenon: As smartphone and social media use have spread among teens, teen depression has soared, especially among girls. Moreover, youth hospitalizations for attempted suicide or self-injury “increased from 49,285 in 2009 to 129,699 in 2019.” The CDC’s 2023 Youth Risk Behavior Survey report illustrates the trend.

Is this simultaneous increase in social media use and teen depression a mere coincidence? Given that other plausible factors such as economic trends, wars, or domestic violence seem not to account for the decade-long trend, Haidt and Twenge conjectured a likely culprit: the shift from face-to-face relationships to screen-based relationships, with in-person time with friends dropping by more than half since 2010. More time online has also displaced sleep and play. And it has increased demoralizing social comparisons. As Cornell University’s Sebastian Deri and his colleagues found across eleven studies, most of us, in the age of selfies, perceive our friends as having more fun: Other folks seem to party more, eat out more, and look happier and prettier. Even teens not on social media are likely affected, Haidt notes. When friends are interacting online several hours a day, those not similarly engaged can feel left out and isolated. (Photo: Halfpoint/iStock/Getty Images.)

To assess their presumption of social media harm, and mindful of lingering skepticism, Haidt and Twenge assembled the available evidence from four psychological science methods: correlational, longitudinal, experimental, and quasi-experimental.

Correlational: First, they asked, do daily social media hours correlate with teen mental health? In a recent Substack essay, Haidt notes that 80 percent of 55 studies answered yes. The correlation is modest when summed across genders and all forms of screen time, but becomes telling when one spotlights girls’ social media exposure, as UK data show.

Longitudinal: Second, they asked, does social media use at Time 1 predict mental health at Time 2? Among 40 longitudinal studies, Haidt reports, in 25 the answer was yes. For example, in a new study, reducing social media use proved “a feasible and effective method of improving body image” among vulnerable young adults.

Experimental: Third, they asked, do experiments that randomly assign participants to social media exposure produce a mental health effect? In 12 of 18 experiments, mostly done with college students and young adults, the answer was, again, yes. Moreover, among the six studies finding no effect, four involved only a brief (week or less) social media diet.

Quasi-experimental: Finally, they asked, do quasi-experiments find that the timing of social media arrival predicts mental health? Was the rollout of Facebook on a campus or the arrival of high-speed internet in a community followed—at that location—by increased mental health problems? In all six studies, Haidt reports, “when social life moves rapidly online, mental health declines, especially for girls.”

Together, these correlational, longitudinal, experimental, and quasi-experimental findings illustrate how psychological science explores life-relevant questions with multiple methods. Moreover, the diverse findings weave a compelling answer to the social media–teen mental health question. In the words of Haidt’s Substack title: “Social Media is a Major Cause of the Mental Illness Epidemic in Teen Girls. Here’s the Evidence.”

Would you agree with Haidt’s conclusion? If yes, would you also agree with recent bipartisan calls to restrict social media to those over 16? Would doing so be supportive of parents, teens, and schools—much as efforts to restrict teen smoking have effectively dropped it from nearly 23 percent in 2000 to 2 percent in 2021? Would you concur with researchers who advise parents to keep phones out of teens’ bedrooms at night? If you are a teen, does this research have any implications for your and your friends’ mental health? Should teens begin smartphone use with texting rather than with selfies and social media? Should they intentionally restrain their daily hours online? And if you don’t agree that social media are a “major cause” of teen girls’ increased depression, what would be your alternate explanation?

The importance of these questions, for teens, families, and society, will drive further research and debate. In the meantime, the complementary insights gleaned from these correlational, longitudinal, experimental, and quasi-experimental studies showcase, methinks, psychological science at its best.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection, How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
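To see why a “modest” correlation need not mean a trivial effect, here is a minimal Python simulation (a sketch with invented numbers, not Haidt’s or Twenge’s data). It assumes a weak linear link between daily social media hours and a continuous depression score, then shows that an overall correlation near .20 can coexist with a roughly doubled rate of “depressed” classification among heavy users (5+ hours a day) compared with light users (under 2 hours).

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical daily social media hours and a depression score with a weak linear link.
hours = rng.gamma(shape=2.0, scale=1.5, size=n)
z_hours = (hours - hours.mean()) / hours.std()
depression = 0.2 * z_hours + rng.normal(size=n)

r = np.corrcoef(hours, depression)[0, 1]              # overall correlation, roughly .20

# Call the top 15% of depression scores "depressed," then compare usage groups.
depressed = depression > np.quantile(depression, 0.85)
heavy, light = hours >= 5, hours < 2
rate_heavy, rate_light = depressed[heavy].mean(), depressed[light].mean()

print(f"correlation r = {r:.2f}")
print(f"depressed, 5+ hours/day: {rate_heavy:.1%}")
print(f"depressed, <2 hours/day: {rate_light:.1%}")
print(f"risk ratio: {rate_heavy / rate_light:.1f}x")

The exact numbers depend on the assumed distributions and cutoffs; the point is only that a small overall r and a meaningful group-level risk ratio can describe the same data.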
Labels
- Development Psychology
- Social Psychology
david_myers
Author
11-15-2022
12:28 PM
As President Biden approaches his 80th birthday and contemplates seeking reelection, many wonder: Does someone entering their ninth life decade have—and will they sustain—the energy, mental acuity, and drive to excel? From its early 2022 national survey (with ABC News), the Washington Post reported that “54 percent say they do not think Biden has the mental sharpness it takes to serve as president, while 40 percent say he does.”

Mr. President, I empathize. I, too, turned 80 this fall. So on behalf of you and all of us octogenarians, let me shine the light of psychological science on our capacities.

First, people should understand that the more we age, the less age predicts our abilities. Knowing that James is 8 and Jamal is 18 tells us much about their differences. Not so with two adults who similarly differ by a decade. Many an 80-year-old can outrun, outbike, and outthink a 70-year-old neighbor.

It’s true that we 80s folks have some diminishing abilities. Like you, Mr. President, I can still jog—but not as fast or far. The stairs we once bounded up have gotten steeper, the newsprint smaller, others’ voices fainter. And in the molasses of our brain, memories bubble more slowly to the surface: We more often experience brain freezes as we try to retrieve someone’s name or the next point we were about to make. Hence your legendary gaffes.

Yet with a lifetime’s accumulation of antibodies, we also suffer fewer common colds and flus than do our grandchildren. Physical exercise, which you and I regularly do, not only sustains our muscles, bones, and hearts; it also stimulates neurogenesis, the birth of new brain cells and neural connections. The result, when compared with sedentary folks like your predecessor, is better memory, sharper judgment, and minimized cognitive decline. Moreover, we either retain or grow three important strengths:

Crystallized intelligence. We can admit to experiencing what researchers document: Our fluid intelligence—our ability to reason and react speedily—isn’t what it used to be. We don’t solve math problems as quickly or learn new technologies as readily, and we’re no match for our grandkids at video games. But the better news is that our crystallized intelligence—our accumulated knowledge and the ability to apply it—crests later in life. No wonder many historians, philosophers, and artists have produced their most noteworthy work later in life than have mathematicians and scientists. Anna Mary Robertson Moses (“Grandma Moses”) took up painting in her 70s. At age 89, Frank Lloyd Wright designed New York City’s Guggenheim Museum. At age 94, my psychologist colleague Albert Bandura has just co-authored yet another article. Perhaps our most important work is also yet ahead?

Wisdom. With maturity, people’s social skills often increase. They become better able to take multiple perspectives, to offer helpful sagacity amid conflicts, and to appreciate the limits of their knowledge. The wisdom to know when we know a thing and when we do not is born of experience. Working at Berlin’s Max Planck Institute, psychologist Paul Baltes and his colleagues developed wisdom tests that assess people’s life knowledge and judgments about how to conduct themselves in complex circumstances. Wisdom “is one domain in which some older individuals excel,” they report. “In youth we learn, in age we understand,” observed the 19th-century novelist Marie Von Ebner-Eschenbach.

Stable emotionality. As the years go by, our feelings mellow.
Unlike teens, who tend to rebound up from gloom or down from elation within an hour, our highs are less high and our lows less low. As we age, we find ourselves less often feeling excited or elated. But our lives are also less often disrupted by depression. In later life we are better able to look beyond the moment. Compliments produce less elation; criticisms, less despair. At the outset of my career, praise and criticism would inflate and deflate my head. A publication might have me thinking I was God’s new gift to my profession, while a rejection led me to ponder moving home to join the family business. With experience, both acclaim and reproach become mere iotas of additional feedback atop a mountain of commentary. Thus, when responding to the day’s slings and arrows, we can better take a big-picture, long-term perspective. Mr. President, I understand these things, as I suspect you do, too. When in my 60s, I assumed—wrongly—that by age 80, I would no longer have the energy to read, to think, to write. Instead, I take joy in entering my office each day at a place called Hope. I relish learning something new daily. I find delight in making words march up a screen. And I’m mellower, as it takes more to make me feel either ecstatic or despondent. And you? Will you, as a newly minted octogenarian, show your age? Yes, that jog up to the podium will surely slow. You will likely more often misspeak or forget a point. Your sleep will be more interrupted. But you will also benefit from the crystallized intelligence that comes with your lifetime’s experience. You can harness the wisdom that comes with age. And you can give us the gift of emotional maturity that will enable you, better than most, to navigate the “battle between our better angels and our darkest impulses.” ------------ *This essay updates a 2020 essay published at TalkPsych.com. See also Psalm 92:14 🙂 (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection: How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
Labels
- Development Psychology
david_myers
Author
11-08-2022
11:33 AM
This moment is both the oldest you have ever been, and the youngest you will henceforth be. To live is to age. But which age tends to offer the best of life, and which the worst? Should we anticipate our mid- or later-life future with hope, or with dread?

When asked which life eras are the unhappiest, some say it’s the teen years: mood swings, social anxieties, parental demands, peer pressures, and career worries. Some say it’s later life, with its diminished occupational purpose, stamina, recall, and social networks—and with the Grim Reaper looming ever closer. Still others say it’s the in-between “midlife crisis” years, when people (traditionally referring to men) realize their career fantasies are not being realized, their marriage is ho-hum, and their bodies are aging. “While Diana puts on a brave face,” declared a 1991 People magazine cover, “a brooding [42-year-old] Prince Charles grapples with a midlife crisis and retreats to his old girlfriends.”

Three decades ago, when authoring The Pursuit of Happiness: Who Is Happy—and Why, I surmised that, apart from happy and stressful life events, no time of life was notably happier or unhappier. Consider Ronald Inglehart’s 1991 aggregated worldwide big data. Follow-up research confirmed the life span stability of well-being, with Robert McCrae, Paul Costa, and colleagues finding “no evidence” of instability in midlife men and women, including one study with 10,000 participants. Moreover, when the Gallup Organization (in data shared with me in 2010) asked 142,682 people worldwide to rate their lives on a ladder, from 0 (“the worst possible life”) to 10 (“the best possible life”), age gave no clue to life satisfaction. Andrew Jebb and colleagues analyzed more recent Gallup data from 1.7 million people in 166 countries, and although they also examined claims of a U-shaped happiness trend, they similarly observed that a midlife dip is “overblown.” Any age differences, said the researchers, are “trivial.” Amassing similar, newer cross-sectional and longitudinal data, University of Alberta psychologist Nancy Galambos and her collaborators (see here and here) agree: “We cannot conclude that there is a universal U shape in happiness.”

But hold on. Dartmouth College economist David Blanchflower and his colleagues (here and here) beg to differ, citing more than 300 studies, with data from 146 countries, that do find a U shape: “The midlife low occurs in the mid-40s.” Moreover, new data lend credence to a U-shaped happiness trajectory. A mostly British research team led by economist Osea Giuntella notes that prior research, being mostly cross-sectional, has compared people of differing ages at one point in time. When researchers compare different cohorts (groups born at different times), they also are comparing people raised in different economic, political, cultural, and educational circumstances. The Giuntella team instead harvested international data following some 500,000 people through time. Their conclusion? The midlife crisis is real: “Midlife is a time when people disproportionately take their own lives, have trouble sleeping, are clinically depressed, spend time thinking about suicide, feel life is not worth living, find it hard to concentrate, forget things, feel overwhelmed in their workplace, suffer from disabling headaches, and become dependent on alcohol.” When examining the newest Gallup World Poll survey data, Harvard epidemiologist Tyler VanderWeele and his co-researchers similarly “found the classic U-shaped pattern with life evaluation being higher for younger people and older people (up through about age 80) and lower in middle life.”

The Blanchflower, Giuntella, and VanderWeele findings have changed me from a confident skeptic of U-shaped life span happiness to a curious agnostic, though I still believe the age effect to be modest compared with bigger happiness predictors, such as personality traits, social support, and a sense of life purpose, meaning, and hope.

What’s more certain and concerning is a dramatic decline in psychological well-being among younger Americans. The last decade has witnessed an unprecedented soaring of serious depression among U.S. teens and young adults. In a large survey, VanderWeele’s team similarly found not a U-shaped life curve but rather lower well-being among younger Americans; or, said differently, better well-being among older Americans. So be kind to the young adults in your life (possibly yourself).

And consider more good news. As Nathan DeWall and I documented in Psychology, 13th Edition, older adults become more attentive to positive news, with their brains less responsive to negative events. They experience fewer relationship stresses, including less anxiety and anger. They become more trusting. And they become more likely to remember their lives’ good rather than bad events. In later life, we also become more emotionally stable, our emotional highs less high, our lows less low. Things that used to irritate—slow traffic, poor restaurant service, a friend’s snub—no longer seem like such big deals. With age, compliments trigger less elation and criticisms less despair, as each becomes a mere iota of additional feedback atop a lifetime of accumulated praise and reproach. “At 70,” said Eleanor Roosevelt, “I would say the advantage is that you take life more calmly. You know that ‘this, too, shall pass!’”

Ergo, despite recent indications of a slight midlife dip, people’s happiness and life satisfaction are remarkably stable—with the striking exception being the dramatic recent rise in teen/young adult depression. But the best news is that later life is not to be feared. For most older adults, life seems, on balance, to be, and to have been, mostly good.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or his new essay collection: How Do We Know Ourselves: Curiosities and Marvels of the Human Mind. Follow him on Twitter: @davidgmyers.)
Labels
- Development Psychology
david_myers
Author
01-07-2022
08:05 AM
In Exploring Psychology, 12th Edition, Nathan DeWall and I report that autism spectrum disorder (ASD) “is now diagnosed in 1 in 38 children in South Korea, 1 in 54 in the United States, 1 in 66 in Canada….” Check that. A new CDC report on 2018 data raises the continually increasing U.S. proportion to 1 in 44 (23 per 1,000 among 8-year-olds in eleven representative communities followed by the Autism and Developmental Disabilities Monitoring Network).

The report also confirms that ASD diagnoses are four times more common among boys than girls. Psychologist Simon Baron-Cohen believes the gender imbalance is because boys tend to be “systemizers”: They more often understand things according to rules or laws, as in mathematical and mechanical systems. Girls, he contends, tend to be “empathizers”: They excel at reading facial expressions and gestures.

And what racial/ethnic group do you suppose has the highest rate of ASD diagnoses? The answer: There are no discernible differences (nor across socioeconomic groups). In 2018, ASD was diagnosed equally often among all racial/ethnic groups.

A final fact to ponder: 4-year-olds, the CDC reports, were “50 percent more likely to receive an autism or special education classification” than were 8-year-olds.

So what do you think? Is the increasing ASD diagnosis rate—over time and among 4-year-olds—a welcome trend? Is Karen Remley, director of the CDC’s National Center on Birth Defects and Developmental Disabilities, right to regard this “progress in early identification” as “good news because the earlier that children are identified with autism the sooner they can be connected to services and support”? Or does the increased labeling of children become a self-fulfilling prophecy that assigns children to a category that includes some social deficiency, and then treats them differently as a result? And does the trend reflect some relabeling of children’s disorders, as reflected in the decreasing diagnoses of “cognitive disorder” and “learning disability”? (The popularity of different psychiatric labels does exhibit cultural variation across time and place.)

In this COVID-19 era of anti-vax fears, this much we know for sure: One thing that does not contribute to rising ASD diagnoses is childhood vaccination. Children receive a measles/mumps/rubella (MMR) vaccination in early childhood, about the time ASD symptoms first get noticed—so some parents naturally presumed the vaccination caused the ensuing ASD. Yet, despite a fraudulent 1998 study claiming otherwise, vaccinations actually have no relationship to the disorder. In one study that followed nearly 700,000 Danish children, those receiving the measles/mumps/rubella vaccine were slightly less likely to later be among the 6,517 ASD-diagnosed children.

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. Follow him on Twitter: @davidgmyers.)
Labels
- Current Events
- Development Psychology
david_myers
Author
04-29-2021
10:49 AM
“I am among [Michigan’s] 300 plus ‘Juvenile Lifers,’” a prisoner known to his friends as Chan wrote me in 1994, kindly passing along a math error he had caught in one of my textbooks. More than half a lifetime ago, Chan, as a 17-year-old, had joined a friend in committing an armed robbery and murder. He expressed “great remorse and regret” for his crime, as well as his hope to learn and grow with the goal of contributing “something of substance and worth.” In the ensuing six years of our occasional correspondence, Chan—an intelligent and now deeply religious man—has been described to me by others, including the retired superintendent of his former prison, as a model prisoner. He is excelling in prison-taught college courses. After taking introductory psychology with my text, he alerted me that Aristotle’s Apothegems is actually spelled Apothegms. Chan, now in his mid-40s, would much rather be contributing to society and paying taxes than having his room and board funded by Michigan taxpayers, whose $2.06 billion prison budget impedes our governor’s fulfilling her campaign pledge to “fix the damn roads.” But does society somehow benefit from keeping those who have committed an impulsive juvenile crime endlessly locked up? Might Chan, if released, still be a risk? Hardly. Teens’ inhibitory frontal lobes lag the development of their emotional limbic system. With brains not yet fully prepared to calculate long-term consequences, the result is teen impulsiveness and emotionality. No wonder arrest rates for rape, assault, and murder soar during the teen years and decline after age 20—to a much lower level by the mid-40s. As psychologist David Lykken noted, “We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28.” By that time, the frontal lobes have matured, testosterone is subsiding, and men are mellowing. Middle-aged men are not just adolescents with inflated waistlines. But if the incarceration of juvenile lifers like Chan is costly to society, might it nevertheless deter future Chans from violent acts? Alas, when committing an impulsive act or a crime of passion, people seldom pause to calmly calculate the long-term consequences. (Even the threat of capital punishment does not predict lower state homicide rates.) Any deterrence effect lies less with the length of a punishment than with its probability—its swiftness and sureness. The immaturity of the teen brain and the diminishing risk of violence with age, as explained in Supreme Court briefs by the American Psychological Association and other health associations, contributed to the Court’s 2012 ruling that mandatory life-without-parole sentences for juveniles violated the constitutional prohibition of cruel and unusual punishment. Even discretionary life-without-parole sentences were unconstitutional, it ruled, except for “the rarest of juvenile offenders, those whose crimes reflect permanent incorrigibility.” Then, last week, the Court qualified that judgment by affirming the life sentence of Mississippian Brett Jones, who—when barely age 15, and after a lifetime of abuse—responded to his grandfather’s reportedly hitting him by impulsively stabbing his grandfather to death. Like Chan, Jones, now 31, is said to be “remorseful for his crime, hardworking and a ‘good kid’” who gets along with everybody. 
How ironic, commentators noted, that the majority opinion—that teens can forever be held responsible for their juvenile misdeeds—was written by Justice Brett Kavanaugh, who had argued during his confirmation hearings that holding him responsible for his high school yearbook page was “a new level of absurdity.” Moreover, responded Justice Sonia Sotomayor, this decision will prevent hundreds of other juvenile defendants, 70 percent of whom are people of color, from securing early release. (Photo: zodebala/E+/Getty Images.)

Nevertheless, there has been increasing bipartisan concern about the human and financial costs of lengthy mass incarceration for long-ago transgressions. The Smarter Sentencing Act, co-sponsored by Senators Mike Lee (R-Utah) and Dick Durbin (D-Illinois), responds to the reality that the sevenfold increase in the federal prison population since 1980 makes such incarceration “one of our nation’s biggest expenditures, dwarfing the amount spent on law enforcement.”

Surely, we can say yes to public protection, but also yes to smarter sentencing—sentencing that holds the Chans and Brett Joneses accountable for their acts, while also recognizing that the impulsive, momentary act of an immature teen needn’t predict one’s distant future. Indeed, how many of us would like to be judged today by the worst moments of our immature adolescence?

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com; follow him on Twitter: @DavidGMyers.)
Labels
- Development Psychology
- Neuroscience
- Social Psychology
david_myers
Author
11-20-2020
06:00 AM
Today, even while wishing President-elect Biden a happy birthday, some wonder: At age 78, does he have—and will he sustain for four years—the energy, mental acuity, and drive to excel in his new role? Or, as he approaches 80, will he embody his opponent’s caricature of “Sleepy Joe”—as someone not to be trusted with the cognitive demands of national and world leadership? Mr. President-elect, I empathize. I, too, turned 78 this fall. So on behalf of you and all of us late-70s folks, let me shine the light of psychological science on our capacities. First, people should understand that the more we age, the less age predicts our abilities. Knowing that James is 8 and Jamal is 18 tells us much about their differences. Not so with two adults who similarly differ by a decade. Many a 78-year-old can outrun and outthink a 68-year-old neighbor. It’s true that we late-70s folks have some diminishing abilities. Like you, Mr. President-elect, I can still jog—but not as fast or far. The stairs we once bounded up have gotten steeper, the newsprint smaller, others’ voices fainter. And in the molasses of our brain, memories bubble more slowly to the surface: We more often experience brain freezes as we try to retrieve someone’s name or the next point we were about to make. Yet with a lifetime’s accumulation of antibodies, we also suffer fewer common colds and flus than do our grandchildren. Physical exercise, which you and I regularly do, not only sustains our muscles, bones, and hearts; it also stimulates neurogenesis, the birth of new brain cells and neural connections. The result, when compared with sedentary folks like your predecessor, is better memory, sharper judgment, and minimized cognitive decline. Moreover, we either retain or grow three important strengths: Crystallized intelligence. We can admit to experiencing what researchers document: Our fluid intelligence—our ability to reason and react speedily—isn’t what it used to be. We don’t solve math problems as quickly or learn new technologies as readily, and we’re no match for our grandkids at video games. But the better news is that our crystallized intelligence—our accumulated knowledge and the ability to apply it—crests later in life. No wonder many historians, philosophers, and artists have produced their most noteworthy work later in life than have mathematicians and scientists. Anna Mary Robertson Moses (“Grandma Moses”) took up painting in her 70s. At age 89, Frank Lloyd Wright designed New York City’s Guggenheim Museum. At age 94, my psychologist colleague Albert Bandura has just co-authored yet another article. Perhaps our most important work is also yet ahead? Wisdom. With maturity, people’s social skills often increase. They become better able to take multiple perspectives, to offer helpful sagacity amid conflicts, and to appreciate the limits of their knowledge. The wisdom to know when we know a thing and when we do not is born of experience. Working at Berlin’s Max Planck Institute, psychologist Paul Baltes and his colleagues developed wisdom tests that assess people’s life knowledge and judgments about how to conduct themselves in complex circumstances. Wisdom “is one domain in which some older individuals excel,” they report. “In youth we learn, in age we understand,” observed the 19th-century novelist Marie Von Ebner-Eschenbach. Stable emotionality. As the years go by, our feelings mellow. Unlike teens, who tend to rebound up from gloom or down from elation within an hour, our highs are less high and our lows less low. 
As we age, we find ourselves less often feeling excited or elated. But our lives are also less often disrupted by depression. We late-70s people are better able to look beyond the moment. Compliments produce less elation; criticisms, less despair. At the outset of my career, praise and criticism would inflate and deflate my head. A publication might have me thinking I was God’s new gift to my profession, while a rejection led me to ponder moving home to join the family business. With experience, both acclaim and reproach become mere iotas of additional feedback atop a mountain of commentary. Thus, when responding to the day’s slings and arrows, we can better take a big-picture, long-term perspective. Mr. President-elect, I understand these things, as I suspect you do, too. When in my 60s, I assumed—wrongly—that by age 78, I would no longer have the energy to read, to think, to write. Instead, I take joy in daily entering my office at a place called Hope. I relish each day learning something new. I find delight in making words march up a screen. And I’m mellower, as it takes more to make me feel either ecstatic or despondent. And you? Will you, as a newly minted 78-year-old, show your age? Yes, that jog up to the podium will surely slow. You will likely more often misspeak or forget a point. Your sleep will be more interrupted. But you will also benefit from the crystallized intelligence that comes with your lifetime’s experience. You can harness the wisdom that comes with age. And you can give us the gift of emotional maturity that will enable you, better than most, to navigate, as you have said, the “battle between our better angels and our darkest impulses.” (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com or follow him on Twitter @DavidGMyers).
Labels
- Development Psychology