Psychology Blog - Page 38

07-18-2016 11:37 AM
Originally posted on September 29, 2015.

My last blog essay reported surveys that show social psychologists are mostly political liberals. But I also noted that “To our credit, we social psychologists check our presumptions against data. We have safeguards against bias. And we aim to let the chips fall where they may.” Fresh examples of such evidence-based reasoning come from two recent analyses. The first analysis has been welcomed by some conservatives (who doubt that sexism is rife in academic hiring). The second has been welcomed by liberals (who see economic inequality as psychologically and socially toxic).

(1) Using both actuarial and experimental studies, Cornell psychologists Stephen Ceci and Wendy Williams looked for possible sexism in academic hiring, but found that “in tenure-track hiring, faculty prefer female job candidates over identically qualified male [candidates].” Their Chronicle of Higher Education defense of their work reminded me of a long-ago experience. Hoping to demonstrate sexism in action, I attempted a class replication of Philip Goldberg’s famous finding that people give higher ratings to an article attributed to a male (John McKay) than to a female (Joan McKay). Finding no such difference, my student Janet Swim (now a Penn State social psychologist) and I searched for other attempts to replicate the finding. Our published meta-analysis, with Eugene Borgida and Geoffrey Maruyama, confirmed Ceci and Williams’ negligible finding. Neither Ceci and Williams today nor we back then question other manifestations of cultural sexism. Rather, in both cases, “Our guiding principle,” to use Ceci and Williams’ words, “has been to follow the data wherever it takes us.”

(2) Following the data also has led social psychologists to see the costs of extreme inequality. As I noted in an earlier TalkPsych essay, “psychologists have found that places with great inequality tend to be less happy places...with greater health and social problems, and higher rates of mental illness.” In soon-to-be-published research, Shigehiro Oishi and Selin Kesebir observe that inequality also explains why economic growth often does not improve human happiness. My most oft-reprinted figure, below, shows that Americans today are no happier than they were in 1957 despite having triple the average income. But average income is not real income for most Americans. If the top 1 percent experience massive income increases, that could raise the average but not the actual income for most. Indeed, real (inflation-adjusted) median U.S. wages have been flat for some years now. With the rising economic tide lifting the yachts but not the rowboats, might we be paying a psychological price for today’s greater inequality? By comparing economic growth in 34 countries, Oishi and Kesebir show that economic growth does improve human morale when it is widely distributed, but not when “accompanied by growing income inequality...Uneven growth is unhappy growth.”

Ergo, it’s neither conservative nor liberal to follow the data, and—as text authors and essayists—to give the data a voice.

07-18-2016 11:23 AM
Originally posted on October 20, 2015.

In response to the big “Reproducibility Project” news that only 36 percent of a sample of 100 psychological science studies were successfully replicated, psychologists have reassured themselves that other fields, including medicine, also have issues with reproducibility. Moreover, differing results sometimes illuminate differing circumstances that produce an effect.

Others have agreed on a lesson for textbook authors. “A finding is not worth touting or inserting in the textbooks until a well-powered, pre-registered, direct replication is published,” argues Brent Roberts. “The conclusions of textbooks should be based not on single studies but on multiple replications and large-scale meta-analyses,” advise Wolfgang Stroebe and Miles Hewstone. Those are high standards that would preclude textbook authors reporting on first-time discoveries, some of which are based on big data. Ironically, such a standard would even preclude reporting on the one-time Reproducibility Project finding (can it be replicated?).

Even so, my introductory psychology co-author, Nathan DeWall, and I are cautious about reporting single-shot findings. Some intriguing new studies end up not in our texts but in our next-edition resource files, marked “needs a replication.” And we love meta-analyses, which give us the bigger picture, digested from multiple studies.

So, I wondered: How did we do? How many of the nonreproducible studies ended up in Psychology, 11th Edition? Checking the list, my projects manager, Kathryn Brownson, found three of the 100 studies in our bibliography—one of which successfully replicated, one of which produced insufficient data for a replication, and one of which failed to replicate.

Thus, from page 504:

In several studies, giving sugar (in a naturally rather than an artificially sweetened lemonade) had a sweet effect: It strengthened people’s effortful thinking and reduced their financial impulsiveness (Masicampo & Baumeister, 2008; Wang & Dvorak, 2010).

will likely become:

In one study, giving sugar (in a naturally rather than an artificially sweetened lemonade) had a sweet effect: It reduced people’s financial impulsiveness (Wang & Dvorak, 2010).

Ergo, out of 5,174 bibliographic citations, one citation—and its five associated text words—will end up on the cutting room floor.
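Such a bibliography check amounts to a simple set intersection. Here is a minimal Python sketch of the idea, using author-year keys; the entries shown are illustrative placeholders, not the actual Reproducibility Project sample or our bibliography.

```python
# Sketch: find which cited studies were part of a replication sample.
# The citation keys below are placeholders for illustration only.

replication_sample = {
    "Masicampo & Baumeister, 2008",
    "Wang & Dvorak, 2010",
    # ...the remaining original studies would be listed here
}

textbook_bibliography = {
    "Masicampo & Baumeister, 2008",
    "Wang & Dvorak, 2010",
    "Goldberg, 1968",
    # ...thousands more citations
}

# Citations appearing in both lists are the ones whose replication status matters.
overlap = sorted(replication_sample & textbook_bibliography)
print(f"{len(overlap)} cited studies were in the replication sample:")
for citation in overlap:
    print(" -", citation)
```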

07-18-2016 11:19 AM
Originally posted on October 27, 2015.

Phantom limb sensations are one of psychology’s curiosities. Were you to suffer the amputation of a limb, your brain might then misinterpret spontaneous activity in brain areas that once received the limb’s sensory input. Thus, amputees often feel pain in a nonexistent limb, and even try to step out of bed onto a phantom leg, or to lift a cup with a phantom hand.

Phantoms also haunt other senses as the brain misinterprets irrelevant brain activity. Thus, those of us with hearing loss may experience the sound of silence—tinnitus (ringing in the ears in the absence of sound). Those with vision loss may experience phantom sights (hallucinations). Those with damaged taste or smell systems may experience phantom tastes or smells.

And now comes word from the Turkish Journal of Psychiatry that 54 percent of 41 patients who had undergone a mastectomy afterwards experienced a continued perception of breast tissue, with 80 percent of those also experiencing “phantom breast pain.” As I shared this result (gleaned from the Turkish journal’s contents in the weekly Current Contents: Social and Behavioral Sciences) with my wife, I wondered: Is there any part of the body we could lose without the possibility of phantom sensations? If an ear were sliced off, should we not expect phantom ear sensations?

The larger lesson here: There’s more to perception than meets our sense receptors. We feel, see, hear, taste, and smell with our brain, which can experience perceptions with or without functioning senses.

07-18-2016 11:18 AM
Originally posted on November 4, 2015.

Writing in the August 2015 Scottish Banner, University of Dundee historian Murray Watson puzzled over having “failed to find a satisfactory answer” for why Scots’ sense of Scottish identity is so much stronger than the English people’s sense of English identity. It’s a phenomenon I, too, have noticed, not only in the current dominance of the Scottish National Party, but also in more mundane ways. When recording their nationality in B&B guest books, I’ve observed people from England responding “British,” while people from Scotland often respond “Scottish” (though the two groups are equally British).

And Watson notes another example: England’s 53 million people outnumber Scotland’s 5+ million by 10 to 1. Yet the U.S. and Canada have, between them, only 9 English clubs (Royal Societies of St. George) and 111 Scottish clubs (St. Andrews Societies). What gives?

Social psychologists have an answer. As the late William McGuire and his Yale University colleagues demonstrated, people’s “spontaneous self-concepts” focus on how they differ from the majority around them. When invited to “tell us about yourself,” children mostly mention their distinctive attributes: Foreign-born children mention their birthplace. Redheads mention their hair color. Minority children mention their race.

This insight—that we are conscious of how we differ from others—explains why gay people are more conscious of their sexual identity than are straight people (except when straight folks are among gays), and why any numerical minority group tends to be conscious of its distinctiveness from the larger, surrounding culture. When occasionally living in Scotland, where my American accent marks me as a foreigner, I am conscious of my national identity and sensitive to how others may react.

Being a numerical British minority, Scots are conscious of their identity and of their rivalries with the English. Thus, rabid fans of Scottish football (soccer) may rejoice in either a Scotland victory or an English defeat. “Phew! They Lost!” headlined one Scottish tabloid after England’s 1996 Euro Cup defeat—by Germany, no less. Likewise, report a New Zealand-Australian research team, the 4 million New Zealanders are more conscious of their New Zealand identity vis-à-vis the 23 million Australians than vice versa, and they are more likely to root for Australia’s sports opponents.

“Self-consciousness,” noted C. S. Lewis in The Problem of Pain, exists only in “contrast with an ‘other,’ a something which is not the self.” So, why do the Scots have a stronger social identity than the English? They have their more numerous and powerful neighbors, the English, to thank for that.

07-18-2016 11:14 AM
Originally posted on November 14, 2015.

Thursday night’s Buffalo Bills versus New York Jets football game was lampooned for its red and green uniforms. “Christmas pjs” in the NFL? But for “colorblind” people there was a bigger problem. As Nathan DeWall and I explain in Psychology, 11th Edition, “Most people with color-deficient vision are not actually ‘colorblind.’ They simply lack functioning red- or green-sensitive cones, or sometimes both.” The classic textbook illustration—which the NFL apparently forgot—reminds us that for some folks (most of whom, like most NFL fans, are male) those red and green uniforms likely looked nearly indistinguishable. Twitter messages flowed.

Note to the NFL, from psychology teachers and text authors: Thanks for the great example!

07-18-2016 10:56 AM
Originally posted on December 1, 2015.

“Happiness doesn’t bring good health,” headlines a December 9 New York Times article. “Go ahead and sulk,” explain its opening sentences. “Unhappiness won’t kill you.” Should we forget all that we have read and taught about the effects of negative emotions (depression, anger, stress) on health? Yes, this is “good news for the grumpy,” one of the study authors is quoted as saying. In this Lancet study, which followed a half million British women over time, “unhappiness and stress were not associated with an increased risk of death,” reported the Times.

A closer look at the study tells a somewhat different story, however. Its title—“Does Happiness Itself Directly Affect Mortality?”—hints at an explanation for the surprising result. Contrary to what the media report suggests, the researchers found that “Compared with those reporting being happy most of the time, women who had reported being unhappy had excess all-cause mortality when adjusting only for age.” Said simply, the unhappy women were 36 percent more likely to die during the study period. But the happy women also exercised more, smoked less, and were more likely to live with a partner and to participate in religious and other group activities. Controlling for those variables “completely eliminated” the happiness-longevity association, and that explains the headline.

In much the same way, one can reduce or eliminate the religiosity-health association by controlling for the factors that mediate the religiosity effect (social support, healthier lifestyle, greater positive emotion). Ditto, one can eliminate the seeming effect of a hurricane by “controlling for” the confounding effect of the wind, rain, and storm surge. A hurricane “by itself,” after eliminating such mediating factors, has little or no “direct effect.” Likewise, happiness “by itself” has little or no direct effect on health—a finding that few researchers are likely to contest.

P.S. For more critique of the happiness-health study, see here.
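To make the controlling-for logic above concrete, here is a minimal simulation sketch in Python. The numbers are invented for illustration (they are not the Lancet data): unhappiness predicts smoking, smoking predicts mortality, and unhappiness has no direct effect. The unadjusted comparison still shows excess mortality among the unhappy women, and that excess disappears when we compare within smokers and within non-smokers separately.

```python
# Toy simulation: a mediator (smoking) creates an unadjusted happiness-mortality
# association even though unhappiness has no direct effect on death here.
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

unhappy = rng.random(n) < 0.30                     # 30% report being unhappy
smokes = rng.random(n) < np.where(unhappy, 0.35, 0.15)  # unhappy women smoke more
died = rng.random(n) < np.where(smokes, 0.04, 0.02)     # only smoking raises risk

def death_rate(mask):
    return died[mask].mean()

# Unadjusted comparison: unhappy women look more likely to die.
crude = death_rate(unhappy) / death_rate(~unhappy)
print(f"Unadjusted mortality ratio, unhappy vs. happy: {crude:.2f}")

# "Controlling for" smoking: compare within smokers and within non-smokers.
for label, stratum in (("smokers", smokes), ("non-smokers", ~smokes)):
    adjusted = death_rate(unhappy & stratum) / death_rate(~unhappy & stratum)
    print(f"Within {label}: mortality ratio about {adjusted:.2f}")
```

Run it and the unadjusted ratio exceeds 1, while both within-stratum ratios hover near 1.0, which is the pattern the study’s headline rests on.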

07-18-2016 09:51 AM
Originally posted on January 12, 2016.

Recent presidential debates offered a consensus message: be afraid. “They’re trying to kill us all,” warned Lindsey Graham. “America is at war,” echoed Ted Cruz. “Think about the mothers who will take those children tomorrow morning to the bus stop wondering whether their children will arrive back on that bus safe and sound,” cautioned Chris Christie.

The terrorist threat is real, and its results horrific. With scenes from the Paris and San Bernardino attacks flooding our minds, the politics of fear has grown. Twenty-seven percent of Americans recently identified terrorism as their biggest worry—up from 8 percent just before the Paris attacks. In two new national surveys (here and here), terrorism topped the list of “most important” issues facing the country. We are, observed Senator Marco Rubio, “really scared and worried”...and thus fearful of Syrian refugees, or even of all Muslims.

We may, however, be too afraid of terrorism, and too little afraid of other much greater perils. Moreover, fearing the wrong things has social and political consequences, as I explain here (a site that also offers other behavioral scientists’ reflections on important scientific news).

07-18-2016 09:48 AM
Originally posted on January 22, 2016.

At the invitation of Princeton University Press, I have just read a fascinating forthcoming book, Stranger in the Mirror: The Scientific Search for the Self, by Fresno State psychologist Robert Levine. In one chapter, Levine, who is one of psychology’s most creative writers, recalls a time when ideas rushed into his head, which he quickly put on paper. “It felt as if there was a very clever fellow somewhere inside me, a guy who came up with better ideas than I ever could. What right did I have to pat myself on the back? I was little more than a recording secretary.”

Levine recounts people’s experiences of ideas popping to mind unbidden. Many writers report feeling like scribes for story lines and sentences that come from, to use Charles Dickens’ words, “some beneficent power.” An artist friend of mine tells me of his delight in observing what his hand is painting. “The writer Robertson Davies summed it up neatly,” reports Levine: “‘I am told the story. I record the story.’”

As a writer, that, too, is my frequent experience. As I make words march up the screen, I often feel more like a secretary, a mere recorder of ideas and words that come from I know not where. And yet I also know that if I keep reading and reflecting—and feeding the friendly little genie that each of us has in our heads—it will keep dictating, and I will continue transcribing.

To a 21st-century psychological scientist, the genie-like muse is an eruption of our active unconscious mind. In study after study, people benefit from letting their mind work on a problem while not consciously thinking about it. Facing a difficult decision, we’re wise to gather information, and then say, “Give me some time to not think about this.” After letting it incubate, perhaps even sleeping on it, a better answer—or a better narrative—may appear unbidden.

To others, the voice in one’s head may seem like “the Spirit at work,” or even the still small voice of God. Or, perhaps, it is both?

07-18-2016 09:45 AM
Originally posted on February 2, 2016.

You’ve likely heard the NPR ads for brain fitness games offered by Lumosity. “70 Million brain trainers in 182 countries challenge their brains with Lumosity,” declares its website. The hoped-for results range from enhanced cognitive powers to increased school and work performance to decreased late-life cognitive decline or dementia.

But do brain-training games really make us smarter or enlarge our memory capacity? In our just-released Exploring Psychology, 10th Edition, Nathan DeWall and I suggest “that brain training can produce short-term gains, but mostly on the trained tasks and not for cognitive ability in general.” As an earlier TalkPsych blog essay reported, Zachary Hambrick and Randall Engle have “published studies and research reviews that question the popular idea that brain-training games enhance older adults’ intelligence and memory. Despite the claims of companies marketing brain exercises, brain training appears to produce gains only on the trained tasks (without generalizing to other tasks).”

And that is also the recently announced conclusion of the Federal Trade Commission (FTC), which fined Lumosity’s maker, Lumos Labs, $2 million for false advertising. As FTC spokesperson Michelle Rusk told Science, “The most that they have shown is that with enough practice you get better on these games, or on similar cognitive tasks...There’s no evidence that training transfers to any real-world setting.” Although this leaves open the possibility that certain other brain-training programs might have cognitive benefits, the settlement affirms skeptics who doubt that brain games have broad cognitive benefits.

07-18-2016 09:41 AM
Originally posted on February 10, 2016.

Three items from yesterday’s reading:

1. The Society for Industrial and Organizational Psychology (SIOP) has just offered a nice video introduction to I/O Psychology (here). At 4 minutes, it’s well suited to class use.

2. Not brand new—but new to me—is a wonderful 7½-minute illustrated synopsis of social-cognitive explanations of why, despite converging evidence, so many people deny human-caused climate change. The video, from biologist-writer Joe Hanson and PBS Digital Studios, is available here. For more on how psychological science can contribute to public education about climate change—and to a pertinent new U.N. Climate Panel conference—see here.

3. Does witnessing peers academically excelling inspire other students to excel? Or does it discourage them? Schools, with their love of prizes and awards, seem to assume the former. Researchers Todd Rogers and Avi Feller report (here) that exposure to exemplary peers can deflate, discourage, and demotivate other students (and increase their dropping out of a MOOC).

07-18-2016 09:37 AM
Originally posted on March 1, 2016.

Amid concerns about the replicability of psychological science findings comes “a cause for celebration,” argue behavior geneticist Robert Plomin and colleagues (here). They identify ten “big” take-home findings that have been “robustly” replicated. Some of these are who-would-have-guessed surprises.

1. “All psychological traits show significant and substantial genetic influence.” From abilities to personality to health, twin and adoption studies consistently reveal hereditary influence.

2. “No traits are 100% heritable.” We are knitted of both nature and nurture.

3. “Heritability [differences among individuals attributable to genes] is caused by many genes of small effect.” There is no single “smart gene,” “gay (or straight) gene,” or “schizophrenia gene.”

4. “Correlations between psychological traits show significant and substantial genetic mediation.” For example, genetic factors largely explain the correlation found among 12-year-olds’ reading, math, and language scores.

5. “The heritability of intelligence increases throughout development.” I would have guessed—you, too?—that as people mature, their diverging life experiences would reduce the heritability of intelligence. Actually, heritability increases, from about 41% among 9-year-olds to 66% among 17-year-olds, and to even more in later adulthood, studies suggest.

6. “Age-to-age stability is mainly due to genetics.” This—perhaps the least surprising finding—indicates that our trait stability over time is genetically disposed.

7. “Most measures of ‘environment’ show significant genetic influence.” Another surprise: many measures of environmental factors—such as parenting behaviors—are genetically influenced. Thus, if physically punitive parents have physically aggressive children, both may share genes that predispose aggressive responding.

8. “Most associations between environmental measures and psychological traits are significantly mediated genetically.” For example, parenting behaviors and children’s behaviors correlate partly due to genetic influences on both.

9. “Most environmental effects are not shared by children growing up in the same family.” As Nathan DeWall and I report in Psychology, 11th Edition, this is one of psychology’s most stunning findings: “The environment shared by a family’s children has virtually no discernible impact on their personalities.”

10. “Abnormal is normal.” Psychological disorders are not caused by qualitatively distinct genes. Rather, they reflect variations of genetic and environmental influences that affect us all.

From this “firm foundation of replicable findings,” Plomin and colleagues conclude, science can now build deeper understandings of how nature and nurture together weave the human fabric.

07-18-2016 09:32 AM
Originally posted on March 4, 2016.

Would you agree or disagree with these statements from the Narcissistic Personality Inventory?

1. I know that I am good because everybody keeps telling me so.
2. People love me. And you know what, I have been very successful.
3. I really like to be the center of attention.
4. Some people would say I’m very, very, very intelligent.

Excuse my fibbing. The even-numbered sentences are actually the words of Donald Trump—a “remarkably narcissistic” person, surmises developmental psychologist Howard Gardner. Trump’s narcissism even extends to his self-perceived superior humility (which brings to mind the wisdom of C. S. Lewis: “If a man thinks he is not conceited, he is very conceited indeed”).

So how do self-important, self-focused, self-promoting narcissists fare over time? Does their self-assurance, charm, and humor make a generally favorable impression, especially in leadership roles? Or is their egotism, arrogance, and hostility off-putting? In a recent Journal of Personality and Social Psychology article, Marius Leckelt and colleagues report that narcissists make good first impressions, but over time their arrogance, bragging, and aggressiveness get old. Their findings replicate what Delroy Paulhus long ago observed in a seven-session study of small teams: People’s initially positive impressions of narcissists turned negative by the end.

So, will this phenomenon hold true for Trump and eventually deflate his popularity during this U.S. presidential campaign season? What do you think?

07-18-2016 09:19 AM
Originally posted on March 29, 2016.

Some psychological science findings are just plain fun. Few are more so than the studies of what Brett Pelham and his colleagues call “implicit egotism”—our tendency to like what’s associated with us. We tend to like letters that appear in our name, numbers that resemble our birthdate, politicians whose faces are morphed to include features of our own, and even—here comes the weird part—places and careers with names resembling our own name.

Believe it or not, Philadelphia has a disproportionate number of residents named Phil. Virginia Beach has a disproportionate number of people named Virginia, as does St. Louis with men named Louis. And California and Toronto have an excess number of people whose names begin, respectively, with Cali (as in Califano) and Tor. Pelham and his colleagues surmise that “People are attracted to places that resemble their names” . . . and to name-related careers, with American dentists being twice as likely to be named “Dennis” as the equally popular names “Jerry” or “Walter.”

As I mentioned in a previous blog essay, Pelham’s work has been criticized. Pelham replied, and now, with Mauricio Carvallo, offers Census data showing that people named Baker, Butcher, Carpenter, Mason, Farmer, and so forth disproportionately gravitate toward occupations that bear their names (despite the separation of countless generations from the original “Baker” who was a baker). And “men named Cal and Tex disproportionately moved to states resembling their names.”

Moreover, in unpublished work, Pelham and his colleagues found that a century and more ago, when most people were born at home and birth certificates were completed later, people tended to declare birth dates associated with a positive identity. Assuming that births (before induced labor and C-sections) were randomly distributed, people between 1890 and 1910 over-claimed Christmas Day birthdays by 66 percent and New Year’s Day birthdays by 62 percent. Parents also over-claimed birthdays associated with famous people’s birthdays, such as George Washington’s—though only U.S. immigrants from Ireland strongly over-claimed St. Patrick’s Day birthdays (at more than three times the expected rate).

The birth registration process once allowed wiggle room, and “where there is wiggle room, there is often wiggling,” report Pelham and his team. “And a potent motivation for wiggling might be the desire to claim a positive social identity.” Implicit egotism rides again.

07-18-2016 09:14 AM
Originally posted on April 7, 2016.

I cut my eye teeth in social psychology with a dissertation followed by a decade of research exploring group polarization. Our repeated finding: When like minds interact, their views often become more extreme. For example, when high-prejudice students discussed racial issues, they became more prejudiced, and vice versa when we grouped low-prejudice students with one another.

When doing that research half a lifetime ago, I never imagined the benefits, and the dangers, of virtual like-minded groups...with both peacemakers and conspiracy theorists reinforcing their kindred spirits. In a recent New York Times essay, University of North Carolina professor Zeynep Tufekci studied the Twitter feeds of Donald Trump supporters, and observed cascading self-affirmation:

“People naturally thrive by finding like-minded others, and I watch as Trump supporters affirm one another in their belief that white America is being sold out by secretly Muslim lawmakers, and that every unpleasant claim about Donald Trump is a fabrication by a cabal that includes the Republican leadership and the mass media. I watch as their networks expand, and as followers find one another as they voice ever more extreme opinions.”

In the echo chamber of the virtual world, as in the real world, separation + conversation = polarization.

The Internet has such wonderful potential to create Mark Zuckerberg’s vision of “a more connected world.” But it also offers a powerful mechanism for deepening social divisions and promoting extremist views and actions. On my list of the future’s great challenges, somewhere not far below restraining climate change, is learning how to harness the great benefits of the digital future without exacerbating group polarization.

07-18-2016 09:04 AM
Originally posted on May 4, 2016.

“Self-made” people underestimate their fortunate circumstances and their plain good luck. That’s the argument, in the May Atlantic, of Robert Frank, a Cornell economist whose writings I have admired. Drawing from his new book, Success and Luck: Good Fortune and the Myth of Meritocracy, Frank notes that “Wealthy people overwhelmingly attribute their own success to hard work rather than to factors like luck or being in the right place at the right time.”

This brings to mind Albert Bandura’s description of the enduring significance of chance events that can deflect us down a new vocational road, or into marriage. My favorite example is his anecdote of the book editor who came to Bandura’s lecture on the “Psychology of Chance Encounters and Life Paths” and ended up marrying the woman seated next to him.

Frank notes that when wealthy people discount both others’ support and plain luck (which includes not being born in an impoverished place), the result is “troubling, because a growing body of evidence suggests that seeing ourselves as self-made—rather than as talented, hardworking, and lucky—leads us to be less generous and public spirited.” “Surely,” he adds, “it’s a short hop from overlooking luck’s role in success to feeling entitled to keep the lion’s share of your income—and to being reluctant to sustain the public investments that let you succeed in the first place.” In a presumed just world, the rich get the riches they deserve, which they don’t want drained by high taxes that support the less deserving.

I am keenly aware of my own good luck. My becoming a textbook author, and all that has followed from that—including trade books and other science writing and speaking—is an outgrowth of my (a) being invited in 1978 to a small international retreat of social psychologists near Munich, (b) being seated throughout the conference near a distinguished American colleague, who (c) chanced to mention my name the following January when McGraw-Hill called him seeking an author for a new social psychology text. I could live my life over and the combined probability of those convergent events would be essentially nil.

The resulting book, and the introductory texts that followed, were not my idea. But they are an enduring reminder that chance or luck—or I might call it Providence—can channel lives in new directions.