Born to Learn From Us
Originally posted on March 27, 2015.
At a recent foundation consultation at Stanford, I enjoyed meeting Andrew Meltzoff, the amiable and articulate co-director of the University of Washington’s Institute for Learning and Brain Sciences in my home city (where he lives but a short walk from my former high school).
Meltzoff is known to psychology teachers and students for his many studies of infant imitation, including his classic 1977 Science report on 2- to 3-week-old infants imitating his facial gestures. It was, he reported, a powerful experience to stick out his tongue and have newborns do the same. “This demonstrates to me the essential socialness of human beings.”
I’ve always wondered what newborns really are capable of visually perceiving, and he reminded me that it’s not much—but that they have their best acuity for the distance between their mother’s breast and eyes, which also was the distance between his face and the infants’ eyes.
His lab is now reading infants’ brains using the world’s only infant brain-imaging MEG (magnetoencephalography) machine, which reads the brain’s magnetic activity more finely than is possible with EEG.
He reports that “When a brain sees, feels, touches, or hears, its neuronal activity generates weak magnetic fields that can be pinpointed and tracked millisecond-by-millisecond by a MEG machine.” That is allowing Meltzoff and his colleagues to visualize an infant’s working brain as the infant listens to language, experiences a simple touch on the hand, or (in future studies) engages in social imitation and cognitive problem solving.
On the horizon, he envisions future studies of how children develop empathy, executive self-control, and identity. He also anticipates exploring how children’s brains process information from two-dimensional digital media versus their three-dimensional everyday world, and how technology can best contribute to children’s development. In such ways, he and his colleagues hope to “help children maximize their learning capabilities.”