
Inside the Science of Memory

When Rick Huganir, Ph.D., was a teenager, he set out to better understand the physical and emotional changes of adolescence. “I was wondering what was happening to me, and I realized it was my brain changing,” says Huganir, director of the Johns Hopkins Department of Neuroscience.

That led to a senior project on protein synthesis and memory in goldfish, as well as a lifelong fascination with how we learn and remember things.

“Memories are who we are,” says Huganir. “But making memories is also a biological process.” This process raises many questions. How does the process affect our brain? How do experiences and learning change the connections in our brains and create memories?

Those are just some of the issues Huganir and his colleagues are studying. Their work may lead to new treatments for post-traumatic stress disorder, as well as ways to improve memory in people with dementia and other cognitive problems.

Memory: It’s All About Connections

When we learn something—even as simple as someone’s name—we form connections between neurons in the brain. These synapses create new circuits between nerve cells, essentially remapping the brain. The sheer number of possible connections gives the brain unfathomable flexibility—each of the brain’s 100 billion nerve cells can have 10,000 connections to other nerve cells.
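
Taken at face value, those round figures imply an astronomical number of connections. A quick, purely illustrative back-of-the-envelope calculation, using only the numbers quoted above:

```python
# Back-of-the-envelope count using the round figures quoted above
# (100 billion neurons, ~10,000 connections each); real counts vary widely.
neurons = 100e9                 # nerve cells in the brain (approximate)
connections_per_neuron = 1e4    # synaptic connections per nerve cell (approximate)

total_connections = neurons * connections_per_neuron
print(f"~{total_connections:.0e} synaptic connections")  # ~1e+15, i.e. a quadrillion
```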

Those synapses get stronger or weaker depending on how often we’re exposed to an event. The more we’re exposed to an activity (like a golfer practicing a swing thousands of times) the stronger the connections. The less exposure, however, the weaker the connection, which is why it’s so hard to remember things like people’s names after the first introduction.

“What we’ve been trying to figure out is how does this occur, and how do you strengthen synapses at a molecular level?” Huganir says.
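
A classic textbook cartoon of that use-dependent strengthening is a Hebbian-style rule: a connection strengthens when the cells on either side are repeatedly active together, and drifts back toward baseline when it goes unused. The sketch below illustrates only that cartoon; it is not Huganir’s molecular model, and all parameters are arbitrary:

```python
# Toy illustration of use-dependent synaptic strengthening (Hebbian-style).
# A textbook cartoon for intuition only, not the molecular mechanism studied in the lab.

def update_weight(w, pre_active, post_active, lr=0.1, decay=0.01):
    """Strengthen the connection when both cells fire together; otherwise let it decay."""
    if pre_active and post_active:
        w += lr * (1.0 - w)   # repeated pairing pushes the weight toward its ceiling
    else:
        w -= decay * w        # disuse slowly weakens the connection
    return w

w = 0.2
for _ in range(50):           # many repetitions (the golfer practicing a swing)
    w = update_weight(w, pre_active=True, post_active=True)
print(f"after practice: {w:.2f}")

for _ in range(50):           # a long stretch without reactivation
    w = update_weight(w, pre_active=False, post_active=False)
print(f"after disuse:   {w:.2f}")
```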

New Discoveries in Memory

Many of the research questions surrounding memory may have answers in complex interactions between certain brain chemicals—particularly glutamate—and neuronal receptors, which play a crucial role in the signaling between brain cells. Huganir and his team discovered that when mice are exposed to traumatic events, the level of neuronal receptors for glutamate increases at synapses in the amygdala, the fear center of the brain, and encodes the fear associated with the memory. Removing those receptors, however, reduces the strength of these connections, essentially erasing the fear component of the trauma but leaving the memory.

Now Huganir and his lab are developing drugs that target those receptors. The hope is that inactivating the receptors could help people with post-traumatic stress disorder by reducing the fear associated with a traumatic memory, while strengthening them could improve learning, particularly in people with cognitive dysfunction or Alzheimer’s disease.

#TomorrowsDiscoveries: Using Data to Diagnose Brain Diseases | Michael I. Miller, Ph.D.


Johns Hopkins researcher Michael Miller explains how we can use data to create better diagnostic tools for neurodegenerative disorders like Alzheimer's disease.

Definitions

Dementia (di-men-sha) : A loss of brain function that can be caused by a variety of disorders affecting the brain. Symptoms include forgetfulness, impaired thinking and judgment, personality changes, agitation and loss of emotional control. Alzheimer’s disease, Huntington’s disease and inadequate blood flow to the brain can all cause dementia. Most types of dementia are irreversible.

Post-traumatic stress disorder (PTSD) : A disorder in which your “fight or flight,” or stress, response stays switched on, even when you have nothing to flee or battle. The disorder usually develops after an emotional or physical trauma, such as a mugging, physical abuse or a natural disaster. Symptoms include nightmares, insomnia, angry outbursts, emotional numbness, and physical and emotional tension.


Stanford researchers observe memory formation in real time


By Alan Toth

Why is it that someone who hasn’t ridden a bicycle in decades can likely jump on and ride away without a wobble, but could probably not recall more than a name or two from their 3rd grade class?

This may be because physical skills — dubbed motor memories by neuroscientists — are encoded differently in our brains than our memories for names or facts.

Now, a new study by scientists with the Wu Tsai Neurosciences Institute is revealing exactly how motor memories are formed and why they are so persistent. It may even help illuminate the root causes of movement disorders like Parkinson’s disease.

“We think motor memory is unique,” said Jun Ding , an associate professor of neurosurgery and of neurology. “Some studies on Alzheimer’s disease included participants who were previously musicians and couldn’t remember their own families, but they could still play beautiful music. Clearly, there’s a huge difference in the way that motor memories are formed.”

Memories are thought to be encoded in the brain in the pattern of activity in networks of hundreds or thousands of neurons, sometimes distributed across distant brain regions. The concept of such a memory trace — sometimes called a memory engram — has been around for more than a century, but identifying exactly what an engram is and how it is encoded has proven extremely challenging. Previous studies have shown that some forms of learning activate specific neurons, which reactivate when the learned memory is recalled. However, whether memory engram neurons exist for motor skill learning remains unknown.

Ding and postdoctoral scholars Richard Roth and Fuu-Jiun Hwang wanted to know how these engram-like groups of cells get involved in learning and remembering a new motor skill.

“When you’re first learning to shoot a basketball, you use a very diverse set of neurons each time you throw, but as you get better, you use a more refined set that’s the same every time,” said Roth. “These refined neuron pathways were thought to be the basis of a memory engram, but we wanted to know exactly how these pathways emerge.”

In their new study, published July 8, 2022 in Neuron , the researchers trained mice to use their paws to reach food pellets through a small slot. Using genetic wizardry developed by the lab of Liqun Luo , a Wu Tsai Neurosciences Institute colleague in the Department of Biology, the researchers were able to identify specific neurons in the brain’s motor cortex — an area responsible for controlling movements — that were activated during the learning process. The researchers tagged these potential engram cells with a fluorescent marker so they could see if they also played a role in recalling the memory later on.

When the researchers tested the animals’ memory of this new skill weeks later, they found that those mice that still remembered the skill showed increased activity in the same neurons that were first identified during the learning period, showing that these neurons were responsible for encoding the skill: the researchers had observed the formation of memory engrams.

But how do these particular groups of neurons take on responsibility for learning a new task in the first place? And how do they actually improve the animal’s performance?

To answer these questions, the researchers zoomed in closer. Using two-photon microscopy to observe these living circuits in action, they observed the so-called “engram neurons” reprogram themselves as the mice learned. Motor cortex engram cells took on new synaptic inputs — potentially reflecting information about the reaching movement — and themselves formed powerful new output connections in a distant brain region called the dorsolateral striatum — a key waystation through which the engram neurons can exert refined control over the animal’s movements. It was the first time anyone had observed the creation of new synaptic pathways on the same neuron population — both at the input and the output levels — in these two brain regions.


The ability to trace new memories forming in the mouse brain allowed the research team to weigh in on a long-standing debate about how skills are stored in the brain: are they controlled from one central memory trace, or engram, or is the memory redundantly stored across many different brain areas? Though this study cannot discount the idea of centralized memory, it does lend credibility to the opposing theory. Another fascinating question is whether the activation of these engram neurons is required for the performance of already learned motor tasks. The researchers speculated that even if the activity of the neurons identified as part of the motor cortex memory engram were suppressed, the mice would probably still be able to perform the task.

“Think of memory like a highway. If 101 and 280 are both closed, you could still get to Stanford from San Francisco; it would just take a lot longer,” said Ding.

These findings suggest that, in addition to being dispersed, motor memories are highly redundant. The researchers say that as we repeat learned skills, we are continually reinforcing the motor engrams by building new connections — refining the skill. It’s what is meant by the term muscle memory — a refined, highly redundant network of motor engrams used so frequently that the associated skill seems automatic.


Ding believes that this constant repetition is one reason for the persistence of motor memory, but it’s not the only reason. Memory persistence may also be affected by a skill being associated with a reward, perhaps through the neurotransmitter dopamine. Though the research team did not directly address it in this study, Ding’s previous work in Parkinson’s disease suggests the connection.

“Current thinking is that Parkinson’s disease is the result of these motor engrams being blocked, but what if they’re actually being lost and people are forgetting these skills?” said Ding. “Remember that even walking is a motor skill that we all learned once, and it can potentially be forgotten.”

It’s a question that the researchers hope to answer in a follow-up study, because it may be the key to developing effective treatments for motor disorders. If Parkinson’s disease is the result of blocked motor memories, then patients should be able to improve their movement abilities by practicing and reinforcing these motor skills. On the other hand, if Parkinson’s destroys motor engrams and inhibits the creation of new ones — by targeting motor engram neurons and the synaptic connections observed in the team’s new study — then a completely different approach must be taken to deliver effective treatments.

“Our next goal is to understand what’s happening in movement disorders like Parkinson’s,” Ding said. “Obviously, we’re still a long way from a cure, but understanding how motor skills form is critical if we want to understand why they’re disrupted by disease.”

The research was published July 8 in Neuron: https://doi.org/10.1016/j.neuron.2022.06.006

Study authors were Fuu-Jiun Hwang, Richard H. Roth, Yu-Wei Wu, Yue Sun, Destany K. Kwon, Yu Liu, and Jun B. Ding.

The research was supported by the National Institutes of Health (NIH) and the National Institute of Neurological Disorders and Stroke (NINDS); the Klingenstein Foundation’s Aligning Science Across Parkinson’s initiative; the GG gift fund; the Stanford School of Medicine Dean’s Postdoctoral Fellowship; and the Parkinson’s Foundation Postdoctoral Fellowship.



Focus on learning and memory

Nature Neuroscience 22, 1535 (2019). Published 24 September 2019. https://doi.org/10.1038/s41593-019-0509-x


In this special issue of Nature Neuroscience, we feature an assortment of reviews and perspectives that explore the topic of learning and memory.

Learning new information and skills, storing this knowledge, and retrieving, modifying or forgetting these memories over time are critical for flexibly responding to a changing environment. How these processes occur has fascinated philosophers, psychologists, and neuroscientists for generations, and the question continues to inspire research encompassing diverse approaches. In this special issue, Nature Neuroscience presents a collection of reviews and perspectives that reflects the breadth and vibrancy of this field. Many of these pieces touch on topics that have animated decades of investigation, including the roles of synaptic plasticity, adult neurogenesis, neuromodulation, and sleep in learning and memory. Yet recently developed technologies continue to provide novel insights in these areas, leading to the updated views presented here.

Synaptic plasticity, such as long-term potentiation and depression, remains the prevailing cellular model for learning and memory. While many presume that these processes are engaged by learning and mediate lasting changes in behavior, this link has yet to be conclusively demonstrated in vivo. Humeau and Choquet ( https://doi.org/10.1038/s41593-019-0480-6 ) outline the latest tools that can be used to visualize and manipulate synaptic activity and signaling in behaving animals, and they discuss further advances that are needed to help bridge this gap in our understanding.

Neuroscientists have also long been intrigued by the role that the formation of new neurons could play in the formation and maintenance of memories. Miller and Sahay ( https://doi.org/10.1038/s41593-019-0484-2 ) integrate recent research on adult hippocampal neurogenesis to present a model of how the maturation of adult-born dentate granule cells contributes to memory indexing and interference.

While the neural mechanisms underlying memory acquisition and consolidation are relatively well-described, less is known about how memories are retrieved. Frankland, Josselyn, and Köhler ( https://doi.org/10.1038/s41593-019-0493-1 ) discuss how recent approaches that enable the manipulation of memory-encoding neural ensembles (termed ‘engrams’) have informed our current understanding of retrieval. They highlight the ways in which retrieval success is influenced by retrieval cues and the congruence between encoding and retrieval states. They also discuss important open questions in the field.

External stimuli and internal states can affect various aspects of learning and memory, which is mediated in part by neuromodulatory systems. Likhtik and Johansen ( https://doi.org/10.1038/s41593-019-0503-3 ) detail how acetylcholine, noradrenaline, and dopamine systems participate in fear encoding and extinction. They discuss emergent themes, including how neuromodulation can act throughout the brain or in specifically targeted regions, how it can boost selected neural signals, and how it can tune oscillatory relationships between neural circuits.

The efficacy of memory storage is also influenced by sleep. Klinzing, Niethard, and Born ( https://doi.org/10.1038/s41593-019-0467-3 ) review evidence from rodent and human studies that implicates reactivation of memory ensembles (or ‘replay’), synaptic scaling, and oscillations during sleep in memory consolidation. They also discuss recent findings that suggest that the thalamus coordinates these processes.

Effective learning requires us to identify critical information and ignore extraneous details, and what counts as critical or extraneous varies depending on the task at hand. Yael Niv ( https://doi.org/10.1038/s41593-019-0470-8 ) discusses computational and neural processes involved in the formation of such task representations, how factors such as attention and context affect these representations, and how we use task representations to make decisions.

The ability to issue appropriate outputs in response to neural activity is a critical brain function, and is often disrupted in injury and disease. Maryam Shanechi ( https://doi.org/10.1038/s41593-019-0488-y ) discusses how ‘closed-loop’ brain–machine interfaces (BMIs) have been used to monitor motor impulses and in turn control prosthetic or paralyzed limbs in order to restore function. Furthermore, she discusses how manipulation of BMI parameters can aid the study of learning. Finally, she explores how BMIs could be used in a similar vein to monitor and correct aberrant mood processes in psychiatric disorders.

By highlighting the topic of learning and memory, we honor its importance and centrality in neuroscience, while also celebrating the ways that other disciplines, including psychology, cellular and molecular biology, computer science, and engineering fuel insights in this area. We hope to continue to publish outstanding research in this area, particularly studies that resolve long-standing questions, that develop or leverage new methodologies, and that integrate multiple approaches.


CHAPTER 7: FORGETTING & AMNESIA

Chances are that you have experienced memory lapses and been frustrated by them. You may have had trouble remembering the definition of a key term on an exam or found yourself unable to recall the name of an actor from one of your favorite TV shows. Maybe you forgot to call your aunt on her birthday or you routinely forget where you put your cell phone. Oftentimes, the bit of information we are searching for comes back to us, but sometimes it does not. Clearly, forgetting seems to be a natural part of life. Why do we forget? And is forgetting always a bad thing?


Causes of Forgetting

One very common and obvious reason why you cannot remember a piece of information is that you did not learn it in the first place. If you fail to encode information into memory, you are not going to remember it later on. Usually, encoding failures occur because we are distracted or are not paying attention to specific details. For example, people have a lot of trouble recognizing an actual penny out of a set of drawings of very similar pennies, or lures, even though most of us have had a lifetime of experience handling pennies (Nickerson & Adams, 1979). However, few of us have studied the features of a penny in great detail, and since we have not attended to those details, we fail to recognize them later. Similarly, it has been well documented that distraction during learning impairs later memory (e.g., Craik, Govoni, Naveh-Benjamin, & Anderson, 1996). Most of the time this is not problematic, but in certain situations, such as when you are studying for an exam, failures to encode due to distraction can have serious repercussions.

Another proposed reason why we forget is that memories fade, or  decay , over time. It has been known since the pioneering work of Hermann Ebbinghaus ( 1885/1913 ) that as time passes, memories get harder to recall. Ebbinghaus created more than 2,000 nonsense syllables, such as  dax ,  bap , and  rif , and studied his own memory for them, learning as many as 420 lists of 16 nonsense syllables for one experiment. He found that his memories diminished as time passed, with the most forgetting happening early on after learning. His observations and subsequent research suggested that if we do not rehearse a memory and the neural representation of that memory is not reactivated over a long period of time, the memory representation may disappear entirely or fade to the point where it can no longer be accessed. As you might imagine, it is hard to definitively prove that a memory has decayed as opposed to it being inaccessible for another reason. Critics argued that forgetting must be due to processes other than simply the passage of time, since disuse of a memory does not always guarantee forgetting ( McGeoch, 1932 ). More recently, some memory theorists have proposed that recent memory traces may be degraded or disrupted by new experiences ( Wixted, 2004 ). Memory traces need to be  consolidated , or transferred from the hippocampus to more durable representations in the cortex, in order for them to last ( McGaugh, 2000 ). When the consolidation process is interrupted by the encoding of other experiences, the memory trace for the original experience does not get fully developed and thus is forgotten.
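
Textbooks often idealize the Ebbinghaus forgetting curve as exponential decay, with retention dropping fastest soon after learning. The sketch below uses that common simplification (retention = e^(-t/s), for elapsed time t and a stability parameter s); it is not Ebbinghaus’s original fitted equation, and the numbers are illustrative only:

```python
import math

# Common textbook idealization of the forgetting curve (not Ebbinghaus's own equation):
# retention decays exponentially, dropping fastest soon after learning.
def retention(t_hours, stability=20.0):
    """Fraction retained after t_hours, for an illustrative memory 'stability'."""
    return math.exp(-t_hours / stability)

for t in (0, 1, 9, 24, 48, 144):   # illustrative delays, from immediately to several days
    print(f"{t:>4} h: {retention(t):.0%} retained")
```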


Both encoding failures and decay account for more permanent forms of forgetting, in which the memory trace does not exist, but forgetting may also occur when a memory exists yet we temporarily cannot access it. This type of forgetting may occur when we lack the appropriate retrieval cues for bringing the memory to mind. You have probably had the frustrating experience of forgetting your password for an online site. Usually, the password has not been permanently forgotten; instead, you just need the right reminder to remember what it is. For example, if your password was “pizza0525,” and you received the password hints “favorite food” and “Mom’s birthday,” you would easily be able to retrieve it. Retrieval hints can bring back to mind seemingly forgotten memories (Tulving & Pearlstone, 1966). One real-life illustration of the importance of retrieval cues comes from a study showing that whereas people have difficulty recalling the names of high school classmates years after graduation, they are easily able to recognize the names and match them to the appropriate faces (Bahrick, Bahrick, & Wittlinger, 1975). The names are powerful enough retrieval cues that they bring back the memories of the faces that went with them. The fact that the presence of the right retrieval cues is critical for remembering adds to the difficulty in proving that a memory is permanently forgotten as opposed to temporarily unavailable.

Retrieval failures can also occur because other memories are blocking or getting in the way of recalling the desired memory. This blocking is referred to as  interference . For example, you may fail to remember the name of a town you visited with your family on summer vacation because the names of other towns you visited on that trip or on other trips come to mind instead. Those memories then prevent the desired memory from being retrieved. Interference is also relevant to the example of forgetting a password: passwords that we have used for other websites may come to mind and interfere with our ability to retrieve the desired password. Interference can be either proactive, in which old memories block the learning of new related memories, or retroactive, in which new memories block the retrieval of old related memories. For both types of interference, competition between memories seems to be key ( Mensink & Raaijmakers, 1988 ). Your memory for a town you visited on vacation is unlikely to interfere with your ability to remember an Internet password, but it is likely to interfere with your ability to remember a different town’s name. Competition between memories can also lead to forgetting in a different way. Recalling a desired memory in the face of competition may result in the inhibition of related, competing memories ( Levy & Anderson, 2002 ). You may have difficulty recalling the name of Kennebunkport, Maine, because other Maine towns, such as Bar Harbor, Winterport, and Camden, come to mind instead. However, if you are able to recall Kennebunkport despite strong competition from the other towns, this may actually change the competitive landscape, weakening memory for those other towns’ names, leading to forgetting of them instead.

Five impediments to remembering:

  1. Encoding failures – we don’t learn the information in the first place.
  2. Decay – memories fade over time.
  3. Inadequate retrieval cues – we lack sufficient reminders.
  4. Interference – other memories get in the way.
  5. Trying not to remember – we deliberately attempt to keep things out of mind.

Finally, some memories may be forgotten because  we deliberately attempt to keep them out of mind . Over time, by actively trying not to remember an event, we can sometimes successfully keep the undesirable memory from being retrieved either by inhibiting the undesirable memory or generating diversionary thoughts ( Anderson & Green, 2001 ). Imagine that you slipped and fell in your high school cafeteria during lunch time, and everyone at the surrounding tables laughed at you. You would likely wish to avoid thinking about that event and might try to prevent it from coming to mind. One way that you could accomplish this is by thinking of other, more positive, events that are associated with the cafeteria. Eventually, this memory may be suppressed to the point that it would only be retrieved with great difficulty ( Hertel & Calcaterra, 2005 ).

Adaptive Forgetting


We have explored five different causes of forgetting. Together they can account for the day-to-day episodes of forgetting that each of us experiences. Typically, we think of these episodes in a negative light and view forgetting as a memory failure. Is forgetting ever good? Most people would reason that forgetting that occurs in response to a deliberate attempt to keep an event out of mind is a good thing. No one wants to be constantly reminded of falling on their face in front of all of their friends. However, beyond that, it can be argued that forgetting is adaptive, allowing us to be efficient and hold onto only the most relevant memories (Bjork, 1989; Anderson & Milson, 1989). Shereshevsky, or “S,” the mnemonist studied by Alexander Luria (1968), was a man who almost never forgot. His memory appeared to be virtually limitless. He could memorize a table of 50 numbers in under 3 minutes and recall the numbers in rows, columns, or diagonals with ease. He could recall lists of words and passages that he had memorized over a decade before. Yet Shereshevsky found it difficult to function in his everyday life because he was constantly distracted by a flood of details and associations that sprang to mind. His case history suggests that remembering everything is not always a good thing. You may occasionally have trouble remembering where you parked your car, but imagine if every time you had to find your car, every single former parking space came to mind. Sorting through all of those irrelevant memories would make the task impossibly difficult. Thus, forgetting is adaptive in that it makes us more efficient. The price of that efficiency is those moments when our memories seem to fail us (Schacter, 1999).


Clearly, remembering everything would be maladaptive, but what would it be like to remember nothing? We will now consider a profound form of forgetting called amnesia that is distinct from more ordinary forms of forgetting. Most of us have had exposure to the concept of amnesia through popular movies and television. Typically, in these fictionalized portrayals of amnesia, a character suffers some type of blow to the head and suddenly has no idea who they are and can no longer recognize their family or remember any events from their past. After some period of time (or another blow to the head), their memories come flooding back to them. Unfortunately, this portrayal of amnesia is not very accurate. What does amnesia typically look like?

The most widely studied amnesic patient was known by his initials H. M. ( Scoville & Milner, 1957 ). As a teenager, H. M. suffered from severe epilepsy, and in 1953, he underwent surgery to have both of his medial temporal lobes removed to relieve his epileptic seizures. The  medial temporal lobes  encompass the hippocampus and surrounding cortical tissue. Although the surgery was successful in reducing H. M.’s seizures and his general intelligence was preserved, the surgery left H. M. with a profound and permanent memory deficit. From the time of his surgery until his death in 2008, H. M. was unable to learn new information, a memory impairment called  anterograde amnesia . H. M. could not remember any event that occurred since his surgery, including highly significant ones, such as the death of his father. He could not remember a conversation he had a few minutes prior or recognize the face of someone who had visited him that same day. He could keep information in his short-term, or working, memory, but when his attention turned to something else, that information was lost for good. It is important to note that H. M.’s memory impairment was restricted to  declarative memory , or conscious memory for facts and events. H. M. could learn new motor skills and showed improvement on motor tasks even in the absence of any memory for having performed the task before ( Corkin, 2002 ).

In addition to anterograde amnesia, H. M. also suffered from  temporally graded retrograde amnesia .  Retrograde amnesia  refers to an inability to retrieve old memories that occurred before the onset of amnesia. Extensive retrograde amnesia in the absence of anterograde amnesia is very rare ( Kopelman, 2000 ). More commonly, retrograde amnesia co-occurs with anterograde amnesia and shows a temporal gradient, in which memories closest in time to the onset of amnesia are lost, but more remote memories are retained ( Hodges, 1994 ). In the case of H. M., he could remember events from his childhood, but he could not remember events that occurred a few years before the surgery.

Amnesiac patients with damage to the hippocampus and surrounding medial temporal lobes typically manifest a clinical profile similar to that of H. M. The degree of anterograde and retrograde amnesia depends on the extent of the medial temporal lobe damage, with greater damage associated with a more extensive impairment (Reed & Squire, 1998). Anterograde amnesia provides evidence for the role of the hippocampus in the formation of long-lasting declarative memories, as damage to the hippocampus results in an inability to create this type of new memory. Similarly, temporally graded retrograde amnesia can be seen as providing further evidence for the importance of memory consolidation (Squire & Alvarez, 1995). A memory depends on the hippocampus until it is consolidated and transferred into a more durable form that is stored in the cortex. According to this theory, an amnesiac patient like H. M. could remember events from his remote past because those memories were fully consolidated and no longer depended on the hippocampus.

The classic amnesiac syndrome we have considered here is sometimes referred to as organic amnesia, and it is distinct from functional, or dissociative, amnesia. Functional amnesia involves a loss of memory that cannot be attributed to brain injury or any obvious brain disease and is typically classified as a mental disorder rather than a neurological disorder ( Kihlstrom, 2005 ). The clinical profile of dissociative amnesia is very different from that of patients who suffer from amnesia due to brain damage or deterioration. Individuals who experience  dissociative amnesia  often have a history of trauma. Their amnesia is retrograde, encompassing autobiographical memories from a portion of their past. In an extreme version of this disorder, people enter a dissociative fugue state, in which they lose most or all of their autobiographical memories and their sense of personal identity. They may be found wandering in a new location, unaware of who they are and how they got there. Dissociative amnesia is controversial, as both the causes and existence of it have been called into question. The memory loss associated with dissociative amnesia is much less likely to be permanent than it is in organic amnesia.

Just as the case study of the mnemonist Shereshevsky illustrates what a life with a near perfect memory would be like, amnesiac patients show us what a life without memory would be like. Each of the mechanisms we discussed that explain everyday forgetting—encoding failures, decay, insufficient retrieval cues, interference, and intentional attempts to forget—help to keep us highly efficient, retaining the important information and for the most part, forgetting the unimportant. Amnesiac patients allow us a glimpse into what life would be like if we suffered from profound forgetting and perhaps show us that our everyday lapses in memory are not so bad after all.


Discussion Questions

  • Is forgetting good or bad? Do you agree with the authors that forgetting is an adaptive process? Why or why not?
  • Can we ever prove that something is forgotten? Why or why not?
  • Which of the five reasons for forgetting do you think explains the majority of incidences of forgetting? Why?
  • How is real-life amnesia different than amnesia that is portrayed on TV and in film?

References

  • Anderson, J. R., & Milson, R. (1989). Human memory: An adaptive perspective. Psychological Review, 96, 703–719.
  • Anderson, M. C., & Green, C. (2001). Suppressing unwanted memories by executive control. Nature, 410, 366–369.
  • Bahrick, H. P., Bahrick, P. O., & Wittlinger, R. P. (1975). Fifty years of memory for names and faces: A cross-sectional approach. Journal of Experimental Psychology: General, 104, 54–75.
  • Bjork, R. A. (1989). Retrieval inhibition as an adaptive mechanism in human memory. In H. L. Roediger, III, & F. I. M. Craik (Eds.), Varieties of Memory and Consciousness (pp. 309–330). Hillsdale, NJ: Erlbaum.
  • Corkin, S. (2002). What’s new with the amnesic patient H. M.? Nature Reviews Neuroscience, 3, 153–160.
  • Craik, F. I. M., Govoni, R., Naveh-Benjamin, M., & Anderson, N. D. (1996). The effects of divided attention on encoding and retrieval processes in human memory. Journal of Experimental Psychology: General, 125, 159–180.
  • Ebbinghaus, H. (1913). Memory: A contribution to experimental psychology. New York: Teachers College/Columbia University (Engl. ed.). (Original work published 1885.)
  • Hertel, P. T., & Calcaterra, G. (2005). Intentional forgetting benefits from thought substitution. Psychonomic Bulletin & Review, 12, 484–489.
  • Hodges, J. R. (1994). Retrograde amnesia. In A. Baddeley, B. A. Wilson, & F. Watts (Eds.), Handbook of Memory Disorders (pp. 81–107). New York: Wiley.
  • Kihlstrom, J. F. (2005). Dissociative disorders. Annual Review of Clinical Psychology, 1, 227–253.
  • Kopelman, M. (2000). Focal retrograde amnesia and the attribution of causality: An exceptionally critical review. Cognitive Neuropsychology, 17, 585–621.
  • Levy, B. J., & Anderson, M. C. (2002). Inhibitory processes and the control of memory retrieval. Trends in Cognitive Sciences, 6, 299–305.
  • Luria, A. R. (1968). The mind of a mnemonist: A little book about a vast memory (L. Solotaroff, Trans.). New York: Basic Books.
  • McGaugh, J. L. (2000). Memory: A century of consolidation. Science, 287, 248–251.
  • McGeoch, J. A. (1932). Forgetting and the law of disuse. Psychological Review, 39, 352–370.
  • Mensink, G., & Raaijmakers, J. G. (1988). A model for interference and forgetting. Psychological Review, 95, 434–455.
  • Nickerson, R. S., & Adams, M. J. (1979). Long-term memory for a common object. Cognitive Psychology, 11, 287–307.
  • Reed, J. M., & Squire, L. R. (1998). Retrograde amnesia for facts and events: Findings from four new cases. Journal of Neuroscience, 18, 3943–3954.
  • Schacter, D. L. (1999). The seven sins of memory: Insights from psychology and cognitive neuroscience. American Psychologist, 54, 182–203.
  • Scoville, W. B., & Milner, B. (1957). Loss of recent memory after bilateral hippocampal lesions. Journal of Neurology, Neurosurgery, & Psychiatry, 20, 11–21.
  • Squire, L. R., & Alvarez, P. (1995). Retrograde amnesia and memory consolidation: A neurobiological perspective. Current Opinion in Neurobiology, 5, 169–177.
  • Tulving, E., & Pearlstone, Z. (1966). Availability versus accessibility of information in memory for words. Journal of Verbal Learning and Verbal Behavior, 5, 381–391.
  • Wixted, J. T. (2004). The psychology and neuroscience of forgetting. Annual Review of Psychology, 55, 235–269.

Cognition Copyright © 2023 by Karenna Malavanti is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


Earliest Memories Start at Age Two and a Half, Study Finds


Key Takeaways

  • New research shows that our earliest memories may begin at age 2.5, about a year sooner than previously thought.
  • How far back you can remember depends on a long line-up of factors, including your culture, gender, family, and the way in which you’re asked to recall memories.
  • You may be able to remember further back when asked repeatedly over time what your earliest memory is.

How far back can you remember? The answer might be even earlier than you think, according to new research.

In a study recently published in the journal Memory, researchers found that people could recall things that happened to them from as far back as age 2.5 on average—about a year earlier than previously estimated.

The research also suggests that there’s actually a “pool of potential memories” that people can pull from, rather than a fixed beginning, and you may be able to recall even older memories when interviewed repeatedly about them.

Here’s what the latest research says about how far back our memory actually goes and why it matters for the narrative of your life.

For this study, researcher Carole Peterson, PhD , professor in the department of psychology at Memorial University of Newfoundland, reviewed previous research on childhood amnesia and analyzed data collected in her laboratory over the last two decades to better understand early memories .

The data showed that people’s earliest memories can often be traced back to age 2.5. Scientists previously believed that a person’s memory clock started at around 3.5 years old.  


“This article explored the idea of infantile amnesia—this is an idea that researchers have considered for years and it states that people do not remember much (or anything) from their first 2 to 3 years of life,” explains David Copeland, PhD , associate professor of psychology at the University of Nevada, Las Vegas. “This line of research is suggesting that we might have memories a little bit earlier than that.”

The research also found that just how far back any one individual’s memory goes depends on a variety of factors, such as: 

  • nationality
  • home environment (urban vs. rural)
  • how your parent recalls their memories
  • intelligence
  • birth order
  • the size of your family


“This study will lend validation to people that even from a young age, children do see and are impacted by their environment, the people in them, and events around them,” says Cassandra Fallon, LMFT , a therapist at Thriveworks.

Fallon continues, “The fact that recalling memories is a challenge and that this study gives permission for this to be acceptable is helpful for validating that we may not ever know some details, like dates and times, but that it does not take away from the fact that we experienced or felt what we did and that it impacts us.”

Another important factor in how far you can remember is how you’re asked to recall your earliest memory, the study found. Your earliest memory may not be permanently fixed. Instead, extensive interviews and multiple follow-ups over the span of months or years could help you pull even earlier recollections from your memory bank in some cases.

“This aligns with what I observe in my clinic. I advise my patients to create timelines of their life, and this helps them access early memories,” says Leela Magavi, MD , psychiatrist and regional medical director at  Community Psychiatry  in Newport Beach, California. “They are often surprised by how much they can remember once they complete this activity.”

The research concluded there’s fluidity in retrieving early experiences and that one’s earliest memory may actually be malleable.

“In other words, it might be difficult to pinpoint the one true ‘earliest memory’ for anyone,” adds Copeland.

Why Early Memories Matter

Regardless of how far back they go, your earliest memories may provide therapeutic opportunities.

“Early memories often align with individuals’ core values, fears, hopes, and dreams. Learning about early memories can allow individuals to nurture their inner child and heal from the stressful or traumatic situations they have endured throughout their life,” says Dr. Magavi. “It can also help them gain clarity and embrace what matters the most to them.”


Early memories—even those that have been reconstructed from external sources beyond what’s in our minds—can also play an important role in constructing the overall narrative of your life, says Copeland.

“For example, whether someone truly remembers the experience of falling off of a tricycle at age 3 or they learn about it from family members’ stories or from seeing pictures, it might not matter—as long as the event actually happened, it can be a part of one’s life narrative,” he says. “Someone might use it as a theme in their life of overcoming difficulties ever since they were young.”

Overall, these early memories help us to better understand ourselves, which can help us lead more fulfilling lives.

“The better we know ourselves, both attributes and challenges, the better we are able to make changes or maintain awareness for consistency. It is a powerful thing to know our strengths to continue using them and to know our weaknesses so that we can grow and learn to become a better version of ourselves,” says Fallon.

She adds: “This improves self-confidence, eases anxiety, reduces depression, and builds our grit, determination, and resiliency to handle anything life throws at us.”

What This Means For You

Your earliest memories can teach you a lot about yourself. Just how far back you can recall depends on a variety of factors, but new research shows that our memory bank may start at age 2.5 on average.

Repeatedly being interviewed about your earliest memories may allow you to remember things that happened at an even younger age. But experts say the age at which your earliest memory occurred doesn’t matter quite as much as putting that information into the context of your life and finding ways to grow from it. These memories, when placed into our overall narratives, provide opportunities to heal from trauma and handle the obstacles of life. 

Peterson C. What is your earliest memory? It depends. Memory. 2021;29(6):811-822. doi:10.1080/09658211.2021.1918174

By Joni Sweet


News Release

Monday, March 7, 2022

Researchers uncover how the human brain separates, stores, and retrieves memories

NIH-funded study identifies brain cells that form boundaries between discrete events.


Researchers have identified two types of cells in our brains that are involved in organizing discrete memories based on when they occurred. This finding improves our understanding of how the human brain forms memories and could have implications in memory disorders such as Alzheimer’s disease. The study was supported by the National Institutes of Health’s Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative and published in Nature Neuroscience.

“This work is transformative in how the researchers studied the way the human brain thinks,” said Jim Gnadt, Ph.D., program director at the National Institute of Neurological Disorders and Stroke and the NIH BRAIN Initiative. “It brings to human neuroscience an approach used previously in non-human primates and rodents by recording directly from neurons that are generating thoughts.”

This study, led by Ueli Rutishauser, Ph.D., professor of neurosurgery, neurology and biomedical sciences at Cedars-Sinai Medical Center in Los Angeles, started with a deceptively simple question: how does our brain form and organize memories? We live our awake lives as one continuous experience, but it is believed, based on human behavior studies, that we store these life events as individual, distinct moments. What marks the beginning and end of a memory? This theory is referred to as “event segmentation,” and we know relatively little about how the process works in the human brain.

To study this, Rutishauser and his colleagues worked with 20 patients who were undergoing intracranial recording of brain activity to guide surgery for treatment of their drug-resistant epilepsy. They looked at how the patients’ brain activity was affected when shown film clips containing different types of “cognitive boundaries”—transitions thought to trigger changes in how a memory is stored and that mark the beginning and end of memory “files” in the brain.

The first type, referred to as a “soft boundary,” is a video containing a scene that then cuts to another scene that continues the same story. For example, a clip of a baseball game shows a pitch being thrown and, when the batter hits the ball, cuts to a shot of the fielder making a play. In contrast, a “hard boundary” is a cut to a completely different story—imagine if the batted ball were immediately followed by a cut to a commercial.

Jie Zheng, Ph.D., postdoctoral fellow at Children’s Hospital Boston and first author of the study, explained the key difference between the two boundaries.

“Is this a new scene within the same story, or are we watching a completely different story? How much the narrative changes from one clip to the next determines the type of cognitive boundary,” said Zheng.  

The researchers recorded the brain activity of participants as they watched the videos, and they noticed two distinct groups of cells that responded to different types of boundaries by increasing their activity. One group, called “boundary cells,” became more active in response to either a soft or hard boundary. A second group, referred to as “event cells,” responded only to hard boundaries. This led to the theory that the creation of a new memory occurs when there is a peak in the activity of both boundary and event cells, which is something that only occurs following a hard boundary.

One analogy to how memories might be stored and accessed in the brain is how photos are stored on your phone or computer. Often, photos are automatically grouped into events based on when and where they were taken and then later displayed to you as a key photo from that event. When you tap or click on that photo, you can drill down into that specific event.

“A boundary response can be thought of like creating a new photo event,” said Dr. Rutishauser. “As you build the memory, it’s like new photos are being added to that event. When a hard boundary occurs, that event is closed and a new one begins. Soft boundaries can be thought of to represent new images created within a single event.” 
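
Rutishauser’s photo-library analogy can be made concrete with a toy sketch. Everything below is hypothetical and purely illustrative; it is not the study’s analysis code: soft boundaries add a new “photo” to the current event, while hard boundaries close the event and open a new one.

```python
# Toy sketch of the photo-library analogy for event segmentation.
# Hypothetical structure and names, for illustration only.

events = [[]]                 # list of "photo events", each a list of snapshots

def store(moment, boundary=None):
    """Add a moment to memory; a hard boundary closes the current event first."""
    if boundary == "hard":
        events.append([])     # close the current event, start a new one
    events[-1].append(moment) # soft (or no) boundary: add a snapshot to the current event

store("pitcher winds up")
store("batter hits the ball", boundary="soft")   # same story, new scene
store("fielder makes the play", boundary="soft")
store("commercial begins", boundary="hard")      # completely different story

print(events)
# [['pitcher winds up', 'batter hits the ball', 'fielder makes the play'], ['commercial begins']]
```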

The researchers next looked at memory retrieval and how this process relates to the firing of boundary and event cells. They theorized that the brain uses boundary peaks as markers for “skimming” over past memories, much in the way the key photos are used to identify events. When the brain finds a firing pattern that looks familiar, it “opens” that event.

Two different memory tests designed to study this theory were used. In the first, the participants were shown a series of still images and were asked whether they were from a scene in the film clips they just watched. Study participants were more likely to remember images that occurred soon after a hard or soft boundary, which is when a new “photo” or “event” would have been created.

The second test involved showing pairs of images taken from film clips that they had just watched. The participants were then asked which of the two images had appeared first. It turned out that they had a much harder time choosing the correct image if the two occurred on different sides of a hard boundary, possibly because they had been placed in different “events.”

These findings provide a look into how the human brain creates, stores, and accesses memories. Because event segmentation is a process that can be affected in people living with memory disorders, these insights could be applied to the development of new therapies.

In the future, Dr. Rutishauser and his team plan to look at two possible avenues to develop therapies related to these findings. First, neurons that use the chemical dopamine, which are best known for their role in reward mechanisms, may be activated by boundary and event cells, suggesting a possible target to help strengthen the formation of memories.

Second, one of the brain’s normal internal rhythms, known as the theta rhythm, has been connected to learning and memory. If event cells fired in time with that rhythm, the participants had an easier time remembering the order of the images that they were shown. Because deep brain stimulation can affect theta rhythms, this could be another avenue for treating patients with certain memory disorders.

This project was made possible by a multi-institutional consortium through the NIH BRAIN Initiative’s Research on Humans program. Institutions involved in this study were Cedars-Sinai Medical Center, Children’s Hospital Boston (site PI Gabriel Kreiman, Ph.D.), and Toronto Western Hospital (site PI Taufik Valiante, M.D., Ph.D.). The study was funded by the NIH BRAIN Initiative (NS103792, NS117839), the National Science Foundation, and Brain Canada.


Zheng J. et al. Neurons detect cognitive boundaries to structure episodic memories in humans. Nature Neuroscience. March 7, 2022. DOI: 10.1038/s41593-022-01020-w


Stanford University


The smartphones that are now ubiquitous were just gaining popularity when Anthony Wagner became interested in the research of his Stanford colleague, Clifford Nass, on the effects of media multitasking and attention. Though Wagner, a professor of psychology at Stanford University and director of the Stanford Memory Laboratory , wasn’t convinced by the early data, he recommended some cognitive tests for Nass to use in subsequent experiments. More than 11 years later, Wagner was intrigued enough to write a review on past research findings, published in Proceedings of the National Academy of Sciences , and contribute some of his own.


The paper , co-authored with neuroscientist Melina Uncapher of the University of California, San Francisco, summarizes a decade’s worth of research on the relationship between media multitasking and various domains of cognition, including working memory and attention. In doing that analysis, Wagner noticed a trend emerging in the literature: People who frequently use many types of media at once, or heavy media multitaskers, performed significantly worse on simple memory tasks.

Wagner spoke with Stanford Report to explain the findings from his review on media multitasking and cognition, and discuss why it’s premature to determine the impact of these results.

How did you become interested in researching media multitasking and memory?

I was brought into a collaboration with Cliff Nass, a Stanford faculty member in communication who passed away a few years ago, and his master’s student, Eyal Ophir. They had this question: With the explosion of media technologies that has resulted in there being multiple simultaneous channels available that we can switch between, how might this relate to human cognition? Eyal and Cliff would come chat with me about their early findings and – I have to say – I thought it was complete hooey. I was skeptical. But, after a few experiments, the data were increasingly pointing to a link between media multitasking and attention. Their findings struck me as potentially important given the way we’re living as humans in this attention economy. Years later, as a memory scientist my interests continued to grow. Given that attention and cognitive control are so fundamental for memory, I wanted to see if there was a relationship between media multitasking and memory.

How do you define media multitasking, and can you give hypothetical examples of people that would be “heavy” and “light” media multitaskers?

Well, we don’t multitask. We task switch. The word “multitasking” implies that you can do two or more things at once, but in reality our brains only allow us to do one thing at a time and we have to switch back and forth.

Heavy media multitaskers have many media channels open at once and they switch between them. A heavy media multitasker might be writing an academic paper on their laptop, occasionally checking the Stanford basketball game on TV, responding to texts and Facebook messages, then getting back to writing – but then an email pops up and they check it. A light media multitasker would only be writing the academic paper or may only switch between a couple of media. They may turn off Wi-Fi, put away their phone or change their settings so they only get notified every hour. Those are some extreme examples, but they provide a sense of how people differ in their media use. Moreover, because our media landscape has continued to accelerate and change, those who are considered a heavy or light media multitasker today may not be the same as those a decade ago.

How do scientists assess someone’s memory?

There are many forms of memory, and thus many ways of probing memory in the lab. For working memory – the ability to keep a limited amount of information active in mind – we often use simple short-delay memory tasks. For example, in one test we show a set of oriented blue rectangles, then remove them from the screen and ask the subject to retain that information in mind. Then we’ll show them another set of rectangles and ask if any have changed orientation. To measure memory capacity, we do this task with a different number of rectangles and determine how performance changes with increasing memory loads. To measure the ability to filter out distraction, sometimes we add distractors, like red rectangles that the subjects are told to ignore.
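Results from such a change-detection task are often boiled down to a single capacity estimate. Purely as an illustration (this sketch is mine, not part of the interview), the snippet below applies one commonly used estimator, Cowan's K = set size × (hit rate − false alarm rate), to hypothetical trial counts; all numbers and function names are made up for the example.

```python
def cowan_k(set_size, hits, misses, false_alarms, correct_rejections):
    """Estimate working memory capacity (Cowan's K) from single-probe
    change-detection counts at one set size:
        K = set_size * (hit_rate - false_alarm_rate)
    """
    hit_rate = hits / (hits + misses)                              # P(say "changed" | change)
    fa_rate = false_alarms / (false_alarms + correct_rejections)   # P(say "changed" | no change)
    return set_size * (hit_rate - fa_rate)

# Hypothetical counts for a 6-rectangle display (illustrative only).
print(round(cowan_k(set_size=6, hits=40, misses=10,
                    false_alarms=12, correct_rejections=38), 2))
# -> 3.36, i.e., roughly three to four rectangles' worth of information retained
```

Under these made-up numbers the estimate lands near three to four items, in line with the capacity figures discussed later in this collection.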

What overall trends did you notice when you were looking through the literature to write this review?

In about half of the studies, the heavy media multitaskers are significantly underperforming on tasks of working memory and sustained attention. The other half are null results; there’s no significant difference. It strikes me as pretty clear that there is a negative relationship between media multitasking and memory performance – that high media multitasking is associated with poor performance on cognitive memory tasks. There’s not a single published paper that shows a significant positive relationship between working memory capacity and multitasking.

In the review we noticed an interesting potential emerging story. One possibility is that reduced working memory occurs in heavy media multitaskers because they have a higher probability of experiencing lapses of attention. When demands are low, they underperform. But, when the task demands are high, such as when the working memory tasks are harder, there’s no difference between the heavy and light media multitaskers. This observation, combined with the negative relationship between multitasking and performance on sustained attention tasks, prompted us to start looking at intrasubject variability and moment-to-moment fluctuations in a person’s ability to use task goals to direct attention in a sustained manner.

How do these findings affect how people should engage with media, or should they at all?

I would never tell anyone that the data unambiguously show that media multitasking causes a change in attention and memory. That would be premature. It’s too early to definitively determine cause and effect.

One could choose to be cautious, however. Many of us have felt like our technology and media are controlling us – that email chime or text tone demands our attention. But we can control that by adopting approaches that minimize habitual multitasking; we can decide to be more thoughtful and reflective users of media.

That said, multitasking isn’t efficient. We know there are costs of task switching. So that might be an argument to do less media multitasking – at least when working on a project that matters academically or professionally. If you’re multitasking while doing something significant, like an academic paper or work project, you’ll be slower to complete it and you might be less successful.

Introduction to Psychology (MIT OpenCourseWare)

Instructor: Prof. John D. E. Gabrieli
Department: Brain and Cognitive Sciences

This exam covers material from Introduction through Learning.

Once you are comfortable with the content of these sessions, you can review further by trying some of the practice questions before proceeding to the exam.

These optional practice questions and solutions are from prior years’ exams.

  • 2010: Practice Exam 1 Questions (PDF) ; Practice Exam 1 Solutions (PDF)
  • 2009: Practice Exam 1 Questions (PDF) ; Practice Exam 1 Solutions (PDF)

The exam should be completed in 90 minutes. This is a closed book exam. You are not allowed to use notes, equation sheets, books or any other aids.

  • Exam 1 Questions (PDF)
  • Exam 1 Solutions (PDF)



November 28, 2022

Study finds that cell organization in the hippocampus matters for memory formation

by University of Tsukuba


Although we know that groups of cells working together in a specific brain region—the hippocampus—are vital for making, storing, and retrieving many types of memories, we still don't have a clear idea of how these cells are organized.

Researchers in Japan have recently identified an important piece of this puzzle; in rats, fear-based memories were made when cells in the hippocampus formed discrete clusters, suggesting that memory formation requires cells to be organized in a specific arrangement. The research also indicates that sleep is important for the stability of these cell clusters.

Most previous studies of the cellular organization of memories have used a technique called electrophysiology, which records the electrical signals that brain cells use to talk to one another. A major limitation of this technique is that it allows researchers to examine only a relatively small number of cells at a time, within a limited area. Researchers from the University of Tsukuba took a different approach.

"A technique called 'immediate early gene imaging' allowed us to visualize cells that were active at a specific time within the entire rat hippocampus, rather than just a small part of it," explains Dr. Jiyeon Cho, lead author of the study. "We were able to see that, when memories were being formed, groups of active cells were organized in small, compact clusters throughout the hippocampus."

The researchers had previously used the same technique to identify similar small clusters of active cells during the formation of two other kinds of hippocampal-dependent memory. Together, their findings suggest that memory-encoding cells in the hippocampus need to be organized in a certain way in order to form memories.

Because sleep is vital for memory formation, the research team then examined whether sleep had any effect on cluster organization. When rats were allowed to sleep after being trained to remember a fear-inducing stimulus (a small electric shock to the paws), they had much stronger memories of the fear, and there were also more clusters of active cells in their hippocampi.

"Together, our results demonstrate that the organization of cell clusters in the hippocampus is important for memory formation , and suggest that sleep helps to stabilize cell clusters to improve memory," says senior author of the study Professor Constantine Pavlides. "These findings take us one step closer to understanding exactly how memory works."

A better understanding of memory at the cellular level, and of how networks of brain cells work together to support it, may one day help improve the quality of life for the millions of people living with dementia and other memory-related disorders, which are currently very difficult to treat.

The research was published in Hippocampus .


Social Media Use Is Linked to Brain Changes in Teens, Research Finds

Teens who frequently checked social media showed an increasing sensitivity to peer feedback, although the cause of the changes was not clear.


By Ellen Barry

The effect of social media use on children is a fraught area of research, as parents and policymakers try to ascertain the results of a vast experiment already in full swing. Successive studies have added pieces to the puzzle, fleshing out the implications of a nearly constant stream of virtual interactions beginning in childhood.

A new study by neuroscientists at the University of North Carolina tries something new, conducting successive brain scans of middle schoolers between the ages of 12 and 15, a period of especially rapid brain development.

The researchers found that children who habitually checked their social media feeds at around age 12 showed a distinct trajectory, with their sensitivity to social rewards from peers heightening over time. Teenagers with less engagement in social media followed the opposite path, with a declining interest in social rewards.

The study , published on Tuesday in JAMA Pediatrics, is among the first attempts to capture changes to brain function correlated with social media use over a period of years.

The study has important limitations, the authors acknowledge. Because adolescence is a period of expanding social relationships, the brain differences could reflect a natural pivot toward peers, which could be driving more frequent social media use.

“We can’t make causal claims that social media is changing the brain,” said Eva H. Telzer, an associate professor of psychology and neuroscience at the University of North Carolina, Chapel Hill, and one of the authors of the study.

But, she added, “teens who are habitually checking their social media are showing these pretty dramatic changes in the way their brains are responding, which could potentially have long-term consequences well into adulthood, sort of setting the stage for brain development over time.”

A team of researchers studied an ethnically diverse group of 169 students in the sixth and seventh grades from a middle school in rural North Carolina, splitting them into groups according to how often they reported checking Facebook, Instagram and Snapchat feeds.

At around age 12, the students already showed distinct patterns of behavior. Habitual users reported checking their feeds 15 or more times a day; moderate users checked between one and 14 times; nonhabitual users checked less than once a day.
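Purely to make the grouping rule concrete (the thresholds come from the article; the function itself is hypothetical), the classification could be expressed as follows.

```python
def classify_checker(checks_per_day):
    """Group a participant by reported social media checking frequency,
    using the cutoffs described in the study write-up."""
    if checks_per_day >= 15:
        return "habitual"
    elif checks_per_day >= 1:
        return "moderate"
    else:
        return "nonhabitual"

print(classify_checker(20))   # habitual
print(classify_checker(5))    # moderate
print(classify_checker(0.5))  # nonhabitual
```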

The subjects received full brain scans three times, at approximately one-year intervals, as they played a computerized game that delivered rewards and punishment in the form of smiling or scowling peers.

While carrying out the task, the frequent checkers showed increasing activation of three brain areas: reward-processing circuits, which also respond to experiences like winning money or risk-taking behavior; brain regions that determine salience, picking out what stands out in the environment; and the prefrontal cortex, which helps with regulation and control.

The results showed that “teens who grow up checking social media more often are becoming hypersensitive to feedback from their peers,” Dr. Telzer said.

The findings do not capture the magnitude of the brain changes, only their trajectory. And it is unclear, authors said, whether the changes are beneficial or harmful. Social sensitivity could be adaptive, showing that the teenagers are learning to connect with others; or it could lead to social anxiety and depression if social needs are not met.

Researchers in the field of social media warned against drawing sweeping conclusions based on the findings.

“They are showing that the way you use it at one point in your life does influence the way your brain develops, but we don’t know by how much, or whether it’s good or bad,” said Jeff Hancock, the founding director of the Stanford Social Media Lab, who was not involved in the study. He said that many other variables could have contributed to these changes.

“What if these people joined a new team — a hockey team or a volleyball team — so started getting a lot more social interaction?” he said. It could be, he added, that the researchers are “picking up on the development of extroversion, and extroverts are more likely to check their social media.”

He described the paper as “a very sophisticated piece of work,” contributing to research that has emerged recently showing that sensitivity to social media varies from person to person.

“There are people who have a neurological state that means they are more likely to be attracted to checking frequently,” he said. “We’re not all the same, and we should stop thinking that social media is the same for everyone.”

Over the last decade, social media has remapped the central experiences of adolescence, a period of rapid brain development.

Nearly all American teenagers engage through social media, with 97 percent going online every day and 46 percent reporting that they are online “almost constantly,” according to the Pew Research Center. Black and Latino adolescents spend more hours on social media than their white counterparts, research has shown.

Researchers have documented a range of effects on children’s mental health. Some studies have linked use of social media with depression and anxiety, while others found little connection. A 2018 study of lesbian, gay and bisexual teenagers found that social media provided them validation and support, but also exposed them to hate speech.

Experts who reviewed the study said that because the researchers measured students’ social media use only once, around age 12, it was impossible to know how it changed over time, or to rule out other factors that might also affect brain development.

Without more information about other aspects of the students’ lives, “it is challenging to discern how specific differences in brain development are to social media checking,” said Adriana Galvan, a specialist in adolescent brain development at the University of California Los Angeles, who was not involved in the study.

Jennifer Pfeifer, a professor of psychology at the University of Oregon and co-director of the National Scientific Council on Adolescence , said, “All experience accumulates and is reflected in the brain.”

“I think you want to put it into this context,” she said. “So many other experiences that adolescents have will also be changing the brain. So we don’t want to get into some kind of moral panic about the idea that social media use is changing adolescents’ brains.”

Dr. Telzer, one of the study’s authors, described the rising sensitivity to social feedback as “neither good nor bad.”

“It’s helping them connect to others and obtain rewards from the things that are common in their social world, which is engaging in social interactions online,” she said.

“This is the new norm,” she added. “Understanding how this new digital world is influencing teens is important. It may be associated with changes in the brain, but that may be for good or for bad. We don’t necessarily know the long-term implications yet.”

Ellen Barry covers mental health. She has served as The Times’s Boston bureau chief, London-based chief international correspondent and bureau chief in Moscow and New Delhi. She was part of a team that won the 2011 Pulitzer Prize for International Reporting.


George Miller’s Magical Number of Immediate Memory in Retrospect: Observations on the Faltering Progression of Science

Nelson Cowan

University of Missouri

Miller’s (1956) article about storage capacity limits, “The magical number seven plus or minus two…,” is one of the best-known articles in psychology. Though influential in several ways, for about 40 years it was oddly followed by rather little research on the numerical limit of capacity in working memory, or on the relation between three potentially related phenomena that Miller described. Given that the article was written in a humorous tone and was framed around a tongue-in-cheek premise (persecution by an integer), I argue that it may have inadvertently stymied progress on these topics as researchers attempted to avoid ridicule. This commentary relates some correspondence with Miller on his article and concludes with a call to avoid self-censorship of our less conventional ideas.

How did it come about that a widely-cited work on a subject of fundamental and obvious interest could halt some areas of research rather than inspire them? I would argue that the famous article of George Miller (1956) on “the magical number seven plus or minus two” did just that. It was followed by a 40-year hiatus of work on the topic of item capacity limits in working memory. It seems a paradox for such a widely cited and esteemed source to inspire little closely-related follow-up work for such a long period. I will explore the situation, partly based on published sources and partly based on my own e-mail communications in 2000 with Miller, who died in 2012 ( APS Observer, 2012 ; Pinker, 2013 ; Vitello, 2012 ).

One of the key concepts in the field of cognitive science is that of working memory, often called short-term memory or immediate memory , terms that all refer to the temporarily heightened availability of information about a small number of recent events and thoughts. The terms have somewhat different connotations and detailed meanings, but these are inconsistent among investigators and unimportant for the present purposes. Google Scholar lists over 2.5 million entries for these three phrases. The concept of immediate memory was made popular by George A. Miller’s (1956) article on capacity limits in information processing, suggesting that it is limited to about seven units. It is one of the best-known works in the cognitive and psychological sciences, with about 20,000 scientific citations as of this writing (17 October, 2014). Its wider popular appeal is illustrated in a Google search for the key phrase from the article’s title, the magical number seven (or 7), which yielded about 873,000 results. Yet, for over 40 years, there was very little follow-up research on the specific processing limitations mentioned in the article. During most of that time, emphasis of the field shifted away from the item limits that Miller discussed, toward limits in the persistence or decay of items across time, and toward interference between items based on their similarity, rather than on capacity limits (following the seminal lead of Baddeley & Hitch, 1974 ). The investigation of item limits finally picked up again with a surge of research on visual working memory item limits after groundwork by Luck & Vogel (1997) and renewed interest based on a reappraisal of the limits in various domains ( Baddeley, 2000 , 2001 ; Cowan, 1999 , 2001 ). Currently, research on item limits is thriving (e.g., see Cowan, Rouder, Blume, & Saults, 2012 ; Ma, Husain, & Bays, 2014 ).

During the 40-year hiatus there were, to be sure, important works that made use of an item limit in working memory in order to model the human information processing system at large (e.g., Atkinson & Shiffrin, 1968; Broadbent, 1958). Most often, though, the authors fit or explained their data by assuming a somewhat smaller capacity limit, closer to 4 items (for a review see Cowan, 2001). Yet no one denies that adult humans typically can repeat, without error, lists of up to about 7 items, such as random words or digits. Given such discrepancies, it has often been suggested that item capacity limits are highly task-specific or “just depend” on the circumstances.

Why did the field settle for such a vague pronouncement for so long rather than investigating the discrepancies directly? The hiatus in research on this topic may stem largely from the manner in which Miller (1956) wrote his famous article. To explain this, I will review the article briefly and then will describe his autobiographical remarks about how it was written, finally extracting some lessons from these remarks for the pursuit of science.

Description of Miller (1956)

In the 1956 article, Miller said he was persecuted by an integer: 7. He reviewed his own work along with other work in the current literature, discussing three kinds of tasks in which human abilities are shown to be limited to about 7 things. (1) In absolute judgment (or absolute identification) tasks, individuals experience one stimulus at a time and must indicate the category to which the stimulus belongs. Examples of simple stimuli include tones of different frequencies (perceived as different pitches) and lines of different lengths. The response to each stimulus is to be made in terms of arbitrary category labels provided by the experimenter (e.g., 1–10) that the participant has learned in a training phase. The basic finding is that participants can effectively use only about 5 to 9 different categories, that is, 7 plus or minus two as in the article’s title. The limit was thought to be expressed in binary choices or bits, in line with thinking of the day in which the conception of human processing was influenced by the blossoming field of computers (e.g., Newell & Simon, 1956 ). (2) A second type of task with a limit, the one remembered best by the public, was for memory span, the maximal length of a random word list that could be recalled with the items in order. Again, the answer was that roughly 7 items could be recalled in normal adults, give or take a few. Here, however, it was clear that the result could not be expressed in binary choices. People can remember lists of about 7 digits, letters, or words even though the response choices include 10 digits, 26 letters in English, and many thousands of words. Miller offered the concept of a chunk or unit of information that is coherent to the participant, suggesting that the limit of memory was about seven familiar chunks. For example, the letter string FBICIAUSA can be remembered without much difficulty if it is parsed into 3 chunks, each of which is an acronym representing an American agency if you know it: FBI , CIA , and USA . This idea about chunks, rather than the limit to about 7 items, may be the most important specific contribution of the article. (3) In the third type of task, which is known as subitizing, it was suggested that people can quickly assess, without counting, the number of simple objects in a collection of up to about 7, but no more (e.g., a very early study in which a handful of beans was dropped in a heap onto a table to be estimated: Jevons, 1871 ). Miller concluded his grand review with a question as to why these limits are similar, and with the answer that the similarity is probably coincidental.
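To make the chunking idea concrete, here is a small toy sketch of my own (it is an analogy for illustration, not Miller's procedure): a letter string is parsed greedily against a set of familiar acronyms, so that nine letters collapse into three chunks, while an unfamiliar string stays at one item per letter.

```python
def chunk(letters, known_chunks, max_len=3):
    """Greedily parse a letter string into familiar chunks,
    falling back to single letters when no known chunk matches."""
    result, i = [], 0
    while i < len(letters):
        for size in range(max_len, 0, -1):     # prefer the longest familiar chunk
            piece = letters[i:i + size]
            if size == 1 or piece in known_chunks:
                result.append(piece)
                i += size
                break
    return result

print(chunk("FBICIAUSA", {"FBI", "CIA", "USA"}))  # ['FBI', 'CIA', 'USA'] -> 3 chunks, not 9 letters
print(chunk("XQZ", {"FBI", "CIA", "USA"}))        # ['X', 'Q', 'Z'] -> no familiar chunks, 3 items
```

The point of the sketch is only that prior knowledge determines how many units the same input occupies, which is what Miller's chunk concept captures.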

Autobiographical Account of Miller (1956)

The retrospective account offered by Miller (1989) indicates that the 1956 article was based on a public address that he had to be cajoled into giving to the Eastern Psychological Association. He did not think that his two lines of research, on absolute judgments and memory span, provided the basis for a coherent hour-long address, and initially declined the invitation. Following the organizer’s persistence, he accepted the invitation when he realized that he could knit his findings into a story about their commonality: the number of bits of capacity in absolute judgment amounted to about 7 categories, similar to the number of chunks making up the limit in memory span. To make the story even better, he threw in the newly-awakening subitizing literature. When writing in 1989, Miller was unsure why the 1956 paper was so popular, compared to other work he had done.

When my follow-up, 2001 article on item capacity limits ( Cowan, 2001 ) was accepted for publication, the editor, Stevan Harnad, invited Miller to write a preamble to the article, but Miller, having focused on other topics after 1956, felt unwilling (and unqualified) to make a published statement. He did, however, offer some striking observations in emails. On 10 January, 2000 he wrote:

Nice to hear from you and to learn that 7 is still of interest. As you know, I have not been working on short-term memory problems or absolute judgment for many years. I have only occasionally commented on it when someone who apparently hadn’t read it over-generalized the conclusions–I had an interesting interaction with the billboard industry during the Ladybird Johnson era, which made me revisit the topic briefly… More recently, Steve Malinowsky put it up on his web site and I had an excuse to read it once again. I had rather dismissed the paper, assuming that its notoriety was attributable to the amusing idea of putting a confidence interval around a magical number. But on rereading I decided that, even after all these years, it was a good piece of work. I think there is some art in it, as well as ideas.

To my further surprise and delight, after reading my in-press target article, Miller (on 11 January, 2000) magnanimously indicated that he liked it and, further, was not surprised by the discrepancy between my finding and his own in 1956. He said:

I have now had a chance to read your BBS article. I think it is great! There were some ideas I thought you might have missed, but you didn’t. A good job…Herb Simon used to say “George had the right idea, but the wrong number.” I think Herb favored 3. I never argue with Herb Simon. I was aware, even in 1954 when I was writing my invited address, that the running memory span was only about 4, and my introspections convinced me that the 7 that was standard on the intelligence tests at that time must have been a hooking together of a rehearsed initial segment with a final segment from what Nancy Waugh used to call the “echo box.” But I was stuck with 7 because of the absolute judgment results (the first half of the article that people forget).

Miller went on to explain about his autobiographical chapter ( Miller, 1989 ) and with great humor indicated that “that is how I came to own the number 7.” Continuing, he explained his perspective on the field after 1956:

Since 1956, of course, my attention has wandered off in many other directions. As I recall it now, my interest was in the use of chunking to surmount the limits of short-term memory, but everyone else wanted to fight about the size of the limit, or whether there was any limit at all–questions of little interest to me. I was happy to let Alan Baddeley claim the stage…Now I am delighted that you have slogged through the accumulated “literature” and produced a thoughtful, well-organized interpretation. I have nothing more to add to what you have said so well. Congratulations…

The field thus gravitated toward time limits and away from item (or chunk) limits for many years after Miller (1956) . To lend further perspective to how the field changed throughout the years following Miller’s article, it is interesting to examine the changing views of Herb Simon, the 1978 winner of the Nobel Prize in economics whom Miller mentioned in his correspondence to me. Simon (1974) was one of very few early articles illustrating the limit in working memory to just several chunks regardless of their size. Simon later served as one of about 8 peer reviewers of my article ( Cowan, 2001 ) when it was first evaluated, and signed his review. Now, publication in Behavioral and Brain Sciences requires that the thesis be controversial enough to serve as a good basis for many following commentaries and my work met that criterion, I believe, because the reviewers radically differed. Whereas several reviewers thought that my conclusion was not newsworthy because it was already known, several others, including Simon, thought that I was wrong and that there was no clearly identifiable item limit. I was surprised that Simon seemed to be departing from the view in his 1974 article that favored fixed limits. I now wonder whether Simon might have lost enthusiasm for that view in part because his and Miller’s early work was met by such a dearth of follow-up studies on capacity limits.

Why Was Miller (1956) Revered and Passed Over at the Same Time?

I have touched on at least three possible reasons why Miller’s famous article did not engender a groundswell of research into item capacity limits that one might expect in the immediately following years. (1) For one thing, the limit seemed to depend on circumstances or task demands in ways that were not understood. Therefore, it was not clear that there was any fixed limit to be examined. (2) Also, the more important contribution of Miller (1956) was probably his observation that measurement of information in bits so popular in the engineering world could not explain human (as opposed to computer) working memory limits. Instead, what mattered for humans was the number of coherent, meaningful units in mind, or what has become known as chunks. This observation was clear and practically self-evident, with few researchers disputing it, so the case seemed settled. (3) Last, Miller juxtaposed descriptions of several very different item-related limits. He began with the tongue-in-cheek complaint that he was being persecuted by an integer and concluded with the coy statement that he suspected the similarity between the phenomena he discussed to be no more than “a pernicious, Pythagorean coincidence.” Scientists shy away from topics that could make them the butt of a joke (e.g., cold fusion), so research on possible real commonalities between the phenomena was thereby discouraged, inadvertently I would assume.

The Case for Returning to Research on the Topics of Miller (1956)

Each of the reasons why not much research was done on the specific topics Miller described leads me to a response about how vital that endeavor actually is.

  • (1) First is the notion that item capacity limits just depend on circumstances. I have tried to establish that meaningful sets of boundary conditions can in fact be found ( Cowan, 2001 ). Consider that telephone numbers around the world typically contain 6 to 10 digits, but consider also that the numbers are divided into groups of 2 to 4 digits. When attention must be focused on an ensemble of items all at once, the limit seems severe, in the range of about 3 for adults. The field now argues about whether the capacity limit is a matter of there being a fixed number of items in working memory or, instead, whether the capacity limit resembles a fluid resource that can be spread thinly over all stimulus items, sometimes too thinly to contribute to the necessary response for all items ( Ma et al., 2014 ). Now, at least, the issue is finally seen as an intensely interesting and tractable one to investigate (largely following a groundbreaking article by Zhang & Luck, 2008 ).
  • (2) Next is the sentiment that the process of chunking is self-evident. Although that may well be the case, the issue of capacity limits cannot be solved until the actual chunks that people form and use can be consistently identified across many situations. Therefore, it can be argued that effort should be invested in trying to find out if capacity is a constant number of identified chunks (cf. Chase & Simon, 1973 ; Gobet & Clarkson, 2004 ). This generally has seemed too difficult for the field and, what is more, time limits of working memory probably have seemed simpler, to such an extent that one might believe that only time limits, without item limits, could explain working memory performance (cf. Baddeley, 1986 ). However, one can sometimes observe either apparent time limits or apparent item limits, depending on whether verbal rehearsal of the material has been curtailed ( Chen & Cowan, 2009 ).
  • (3) Last, people were probably discouraged from exploratory research by Miller’s (1956) comment that the commonality between the tasks he observed was probably due to a coincidence. In fact, though, there are good reasons to suspect that common principles may be at work across all three of the phenomena that Miller described (immediate memory of lists, absolute identification of stimuli, and estimation of small numbers of items). The reasons are as follows.

Absolute judgment is one of the simplest tasks imaginable and seems like a process that is often carried out in ordinary life. A single stimulus is delivered to the research participant, who only needs to decide to which of several pre-taught categories the stimulus belongs. The apparent simplicity of the task does not, however, ensure that the mental processes are in fact that simple. Holland and Lockhead (1968) offered evidence on the absolute judgment of tone loudness indicating that these judgments are made with respect to previous trials. To judge the tone on Trial N, the tones presented on Trial N-1 had an assimilative effect: when N-1 had been more intense it made the tone on Trial N seem louder, as well; and vice versa. The previous few tones instead had a small contrast effect: when these tones had been more intense they made the tone on Trial N seem quieter and vice versa. The effect seemed to extend back to about Tone N-5, in the range of Miller’s magic number. Although the mechanism of these effects is still debatable, it appears that across trials the task involves a list from the participant’s point of view, and the sequential effects are subject to something akin to the usual working memory constraints. Similarly, Siegel and Siegel (1972, Figure 2) showed that when feedback was given, there was a large effect of the number of trials intervening between the stimulus on Trial N and the trial in which that same stimulus was last presented, again in the range of several items. These, however, were isolated investigations and it is only recently that a model of absolute judgment has been proposed, incorporating working memory effects ( Brown, Marley, Donkin, & Heathcote, 2008 ). Recent research also has finally overcome the notion that absolute identification of objects on a simple continuum cannot be improved through training; it can if participants can find a way to link the judgment to some stable frame of reference, somewhat analogous to how immediate memory can be improved through the application of knowledge for chunk formation. Several recent studies on training of absolute judgment have been quite successful ( Dodds, Donkin, Brown, & Heathcote, 2011 ; Rouder, Morey, Cowan, & Pfaltz, 2004 ).

Rapid enumeration of small clusters of items, the third topic described by Miller (1956) , also may rely on similar working-memory mechanisms. In one version of working memory theory, the item capacity limit is identified with the capacity of the human focus of attention and how many objects can be attended at once ( Cowan, 2001 ). The hallmark of small-set enumeration, or subitizing, is that the items must be attended and individuated – perceived as separate objects – within a short period. The typical limit in the literature is not about 7 as in the suggestion of Miller, but on the order of 3 or 4 as in the literature on memory for items in an array ( Luck & Vogel, 1997 ), and performance levels on the two tasks are highly correlated and appear to share a common limited-capacity resource ( Piazza, Fumarola, Chinello, & Melcher, 2011 ).

The Subsistence and Revival of Chunk Capacity Limits

It would be too extreme to say that all research on item and chunk capacity limits ceased after Miller’s 1956 article. As an analogy, although behaviorism had an inhibiting effect on research on mental mechanisms, such research did not totally cease during that era. There were, likewise, important studies involving item or chunk capacity limits in the years following Miller (e.g., Broadbent, 1975 ; Graesser & Mandler, 1978 ; Simon, 1974 ; Tulving & Patkau, 1962 ; Waugh & Norman, 1965 ; Zhang and Simon, 1985 ). As I ( Cowan, 2001 ) have previously pointed out, there were also a number of other works that assumed a limited capacity as part of a larger model of cognitive processes, just without a primary focus on the assumed capacity limit and without an independent, direct evaluation of it (e.g., the seminal work of Atkinson & Shiffrin, 1968 ). Thus, I think that the work on chunk capacity limits was stymied, but not halted, by Miller’s humorous presentation.

When it was suggested that much of working memory has a time limit ( Baddeley & Hitch, 1974 ; Brown, 1958 ; Peterson & Peterson, 1959 ), that concept easily predominated in the field, a fact that I attribute largely to the weakness of support in that era for basic item or chunk limits. Indeed, my recollection is that for many years, sophisticated researchers tended to hold the opinion that, although the number of items in working memory is somehow limited, the limit just depends on the circumstances in a complex way that cannot easily be pinned down. Naturally, such opinions did not result in many publications articulating that view, which basically says that we don’t know enough to begin to measure a capacity limit expressed in chunks. That is clearly a view that was held by a number of the initial critical reviewers of Cowan (2001) , and by some of the published commentaries included in that work.

In the mid-1970s, the field rather readily adopted Baddeley’s general assumption that a time limit took the place of an item limit (cf. Barrouillet, Portrat, & Camos, 2011 ; Case, Kurland, & Goldberg, 1982 ), with apparent item limits emerging indirectly, because of time limits (e.g., Schweickert & Boruff, 1986 ). Beginning with my graduate training starting in 1974, I, too, focused mostly on time limits, but my reading in the field led me to a nagging sense that different people were saying things that badly conflicted with one another, given the stark difference between item- and time- limited effects. The question for me was whether both kinds of limits could have validity. I thought they could, suggesting ( Cowan, 1988 , p. 166):

“Estimates of short-term storage capacity may be inflated by contributions of the long-term store. To obtain pure estimates of short-term storage, some investigators (e.g., Glanzer & Razel, 1974; Watkins, 1974) have subtracted out the assumed contribution of long-term storage. The resulting estimate for adults is two or three items in short-term storage. Perhaps the number of activated memory items is limited to about seven, whereas the subset of these items in awareness and voluntary attention is limited to two or three…Other researchers (Baddeley, Thomson, & Buchanan, 1975; Schweickert & Boruff, 1986) have suggested that verbal short-term memory is limited in the duration of storage as well as the number of items. When the list contains no organizing cues and rote rehearsal must be used, subjects appear able to recall as much as they can rehearse in 1.5–2.0 s. (This duration does not necessarily estimate simple memory decay, because the process of rehearsing or recalling one part of a sequence could interfere with memory for another part.) Thus, there appear to be constraints in both the number of items and the duration of pronounceable sequences in short-term storage. Although it is not clear how these two constraints work together, it might be possible to retain up to two or three chunks nonverbally while rehearsing other information (cf. Zhang & Simon, 1985). At least, some studies (Brooks, 1968; Scarborough, 1972) suggest that there are separate verbal and nonverbal components of short-term memory that can be used together.”

This kind of dual, item-plus-time-limit approach seems to have some validity (e.g., Cowan, Lichty, & Grove, 1990 ; Chen & Cowan, 2009 ), though both kinds of limits must be painstakingly disentangled from interference effects ( Cowan, Saults, & Blume, 2014 ; Oberauer, Lewandowsky, Farrell, Jarrold, & Greaves, 2012 ; Ricker & Cowan, 2014 ).

In the 1970s through the 1990s, personally it felt all right to have item capacity limits in the background, but I sometimes felt ridiculed if these capacity limits were foregrounded in conversation. For example, the feeling of ridicule sometimes came up in conversations at a conference at the University of Colorado in 1997 that resulted in a book of conflicting theoretical viewpoints ( Miyake & Shah, 1999 ). The conference also gave me encouragement, though, as I could see that (1) views on working memory were quite disparate, (2) some researchers agreed with me about the existence of chunk capacity limits, and (3) I was able to make some headway in convincing others and refining my own views. The conference helped in the formulation of the 2001 article in Behavioral and Brain Sciences , though I felt scorn from a number of reviewers. The feeling subsided when the paper was accepted, albeit by a journal that thrived upon controversy.

Baddeley (1994) correctly pointed out that Miller (1956) did have the profound effect of ending a general quest, in the early formative years of cognitive psychology: a quest to explain human limits in terms of the bits (binary choices) of information theory. Miller argued that bits were not the basic units of immediate memory; meaningful chunks were. While disproving bit limits, however, item and chunk limits were not explored and pinned down nearly as much as I would have expected and hoped, until the recent research boom in that area.

Concluding Remarks: Lessons for the Progress of Science

Talk about things being pernicious (Miller, 1956); one of the most important constraints that science faces is the restriction of topics that individual scientists pursue. They place these restrictions on themselves because they do not wish to be perceived in a manner that would hurt their careers, discourage funding, or make them seem foolish or laughable. These concerns are not without a basis in reality. For example, accounts of the career of Judah Folkman, who pioneered the now well-accepted theory of tumor angiogenesis, describe him as a laughingstock whose findings were often thoughtlessly dismissed until the theory was finally accepted (Newsweek, 28 January 2008, “A quiet hero in the cancer war: Dr. Judah Folkman, 74”). It is important for reviewers to try to be open-minded to unconventional ideas, albeit without lowering the bar for the requirement of solid evidence. George Miller was a humble man who never would have dreamed that his article would become so important, nor that the entertaining manner in which it was presented might discourage others from pursuing the basic phenomena described within.

Acknowledgments

I thank Ed Awh, Mike Kane, Steve Luck, and my wife, Jean Ispa, for helpful comments. This work was completed with support from NIH Grant R01-HD21338.

  • APS Observer. Remembering George A. Miller. 2012 Oct;25(8).
  • Atkinson RC, Shiffrin RM. Human memory: A proposed system and its control processes. In: Spence KW, Spence JT, editors. The psychology of learning and motivation: Advances in research and theory. Vol. 2. New York: Academic Press; 1968. pp. 89–195.
  • Baddeley AD. Working memory. Oxford, England: Clarendon Press; 1986.
  • Baddeley A. The magical number seven: Still magical after all these years? Psychological Review. 1994;101:353–356.
  • Baddeley A. The episodic buffer: a new component of working memory? Trends in Cognitive Sciences. 2000;4:417–423.
  • Baddeley A. The magic number and the episodic buffer. Behavioral and Brain Sciences. 2001;24:117–118.
  • Baddeley AD, Hitch G. Working memory. In: Bower GH, editor. The psychology of learning and motivation. Vol. 8. New York: Academic Press; 1974. pp. 47–89.
  • Barrouillet P, Portrat S, Camos V. On the law relating processing to storage in working memory. Psychological Review. 2011;118:175–192.
  • Broadbent DE. Perception and communication. New York: Pergamon Press; 1958.
  • Broadbent DE. The magic number seven after fifteen years. In: Kennedy A, Wilkes A, editors. Studies in long term memory. Oxford, England: John Wiley & Sons; 1975. pp. 3–18.
  • Brooks LR. Spatial and verbal components of the act of recall. Canadian Journal of Psychology. 1968;22:349–368.
  • Brown J. Some tests of the decay theory of immediate memory. Quarterly Journal of Experimental Psychology. 1958;10:12–21.
  • Brown SD, Marley AAJ, Donkin C, Heathcote A. An integrated model of choices and response times in absolute identification. Psychological Review. 2008;115:396–425.
  • Case R, Kurland DM, Goldberg J. Operational efficiency and the growth of short-term memory span. Journal of Experimental Child Psychology. 1982;33:386–404.
  • Chase W, Simon HA. The mind’s eye in chess. In: Chase WG, editor. Visual information processing. New York: Academic Press; 1973. pp. 215–281.
  • Chen Z, Cowan N. Core verbal working memory capacity: The limit in words retained without covert articulation. Quarterly Journal of Experimental Psychology. 2009;62:1420–1429.
  • Costello F, Watts P. Surprisingly rational: Probability theory plus noise explains biases in judgment. Psychological Review. 2014;121:463–480.
  • Cowan N. Evolving conceptions of memory storage, selective attention, and their mutual constraints within the human information-processing system. Psychological Bulletin. 1988;104:163–191.
  • Cowan N. An embedded-processes model of working memory. In: Miyake A, Shah P, editors. Models of working memory: Mechanisms of active maintenance and executive control. Cambridge, U.K.: Cambridge University Press; 1999. pp. 62–101.
  • Cowan N. The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences. 2001;24:87–185.
  • Cowan N, Lichty W, Grove TR. Properties of memory for unattended spoken syllables. Journal of Experimental Psychology: Learning, Memory, & Cognition. 1990;16:258–269.
  • Cowan N, Rouder JN, Blume CL, Saults JS. Models of verbal working memory capacity: What does it take to make them work? Psychological Review. 2012;119:480–499.
  • Cowan N, Saults JS, Blume CL. Central and peripheral components of working memory storage. Journal of Experimental Psychology: General. 2014;143:1806–1836.
  • Dodds P, Donkin C, Brown SD, Heathcote A. Increasing capacity: Practice effects in absolute identification. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2011;37:477–492.
  • Gobet F, Clarkson G. Chunks in expert memory: Evidence for the magical number four – or is it two? Memory. 2004;12:732–747.
  • Glanzer M, Razel M. The size of the unit in short-term storage. Journal of Verbal Learning & Verbal Behavior. 1974;13:114–131.
  • Graesser A II, Mandler G. Limited processing capacity constrains the storage of unrelated sets of words and retrieval from natural categories. Journal of Experimental Psychology: Human Learning and Memory. 1978;4:86–100.
  • Holland M, Lockhead GR. Sequential effects in absolute judgments of loudness. Perception & Psychophysics. 1968;3:409–414.
  • Jevons WS. The power of numerical discrimination. Nature. 1871;3:281–282.
  • Luck SJ, Vogel EK. The capacity of visual working memory for features and conjunctions. Nature. 1997;390:279–281.
  • Ma WJ, Husain M, Bays PM. Changing concepts of working memory. Nature Neuroscience. 2014;17:347–356.
  • Miller GA. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review. 1956;63:81–97.
  • Miller GA. George A. Miller. In: Lindzey G, editor. A history of psychology in autobiography. Vol. VIII. Stanford, CA: Stanford University Press; 1989. pp. 391–418.
  • Miyake A, Shah P, editors. Models of working memory: Mechanisms of active maintenance and executive control. Cambridge, U.K.: Cambridge University Press; 1999.
  • Newell A, Simon HA. The logic theory machine: A complex information processing system. Santa Monica, CA: Rand Corporation; 1956.
  • Oberauer K, Lewandowsky S, Farrell S, Jarrold C, Greaves M. Modeling working memory: An interference model of complex span. Psychonomic Bulletin & Review. 2012;19:779–819.
  • Peterson LR, Peterson MJ. Short-term retention of individual verbal items. Journal of Experimental Psychology. 1959;58:193–198.
  • Piazza M, Fumarola A, Chinello A, Melcher D. Subitizing reflects visuo-spatial object individuation capacity. Cognition. 2011;121:147–153.
  • Pinker S. George A. Miller (1920–2012). American Psychologist. 2013;68:467–468.
  • Ricker TJ, Cowan N. Differences between presentation methods in working memory procedures: A matter of working memory consolidation. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2014;40:417–428.
  • Rouder JN, Morey RD, Cowan N, Pfaltz M. Learning in a unidimensional absolute identification task. Psychonomic Bulletin & Review. 2004;11:938–944.
  • Scarborough DL. Stimulus modality effects on forgetting in short-term memory. Journal of Experimental Psychology. 1972;95:285–289.
  • Schweickert R, Boruff B. Short-term memory capacity: Magic number or magic spell? Journal of Experimental Psychology: Learning, Memory, and Cognition. 1986;12:419–425.
  • Simon HA. How big is a chunk? Science. 1974;183:482–488.
  • Tulving E, Patkau JE. Concurrent effects of contextual constraint and word frequency on immediate recall and learning of verbal material. Canadian Journal of Psychology. 1962;16:83–95.
  • Vitello P. George A. Miller, a pioneer in cognitive psychology, is dead at 92. New York Times. 2012 Aug 2:A19.
  • Watkins MJ. Concept and measurement of primary memory. Psychological Bulletin. 1974;81:695–711.
  • Waugh NC, Norman DA. Primary memory. Psychological Review. 1965;72:89–104.
  • Zhang W, Luck SJ. Discrete fixed-resolution representations in visual working memory. Nature. 2008;453:23–35.
  • Zhang G, Simon HA. STM capacity for Chinese words and idioms: Chunking and acoustical loop hypotheses. Memory and Cognition. 1985;13:193–201.

Child maltreatment and memory

Affiliation.

  • 1 Department of Psychology, University of California, Davis, California 95616, USA. [email protected]
  • PMID: 19575622
  • DOI: 10.1146/annurev.psych.093008.100403

Exposure to childhood trauma, especially child maltreatment, has important implications for memory of emotionally distressing experiences. These implications stem from cognitive, socio-emotional, mental health, and neurobiological consequences of maltreatment and can be at least partially explained by current theories concerning the effects of childhood trauma. In this review, two main hypotheses are advanced: (a) Maltreatment in childhood is associated with especially robust memory for emotionally distressing material in many individuals, but (b) maltreatment can impair memory for such material in individuals who defensively avoid it. Support for these hypotheses comes from research on child abuse victims' memory and suggestibility regarding distressing but nonabusive events, memory for child abuse itself, and autobiographical memory. However, more direct investigations are needed to test precisely when and how childhood trauma affects memory for emotionally significant, distressing experiences. Legal implications and future directions are discussed.

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adaptation, Psychological / physiology
  • Avoidance Learning / physiology
  • Child Abuse / psychology*
  • Child, Preschool
  • Emotions / physiology*
  • Life Change Events
  • Memory / physiology*
  • Stress, Psychological / physiopathology

Research explains how the brain finds Waldo


At any given moment, the world bombards the senses with more information than the brain can process, and for more than a century scientists and psychologists have debated how the brain filters out distractions and focuses attention on the things that matter.

Using the visual system as a model, Professor Robert Desimone, director of the McGovern Institute for Brain Research at MIT, and his former colleagues at the National Institutes of Health show that neurons synchronize their signals to command attention, like a chorus rising above the din of noisy chatter in a crowded room.

"We think that synchronizing signals could be a general way the brain focuses on what's important," says Desimone, who also holds an appointment through MIT's Department of Brain and Cognitive Sciences. "Attention is a general problem for the brain, and maybe it has a general solution."

This new study, published in a recent issue of Science, addresses a central question that anyone who has tackled a "Where's Waldo?" book can appreciate. When looking for Waldo on the crowded page, does the brain scan the page spatially (serial processing), like a mental spotlight moving across an otherwise dark page? Or does the brain take in the whole page at once and gradually zoom in on relevant features such as color and shape (parallel processing)?

In the first model, the spotlight of attention would track across the page, checking each detail against a mental image of Waldo's red stocking cap and striped shirt. In the second model, the color red and stocking-cap shapes would gradually come to the foreground and other shapes and colors would recede.
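
To make the two models concrete, here is a minimal sketch in Python; the scene items, features, and "Waldo template" are invented for illustration and are not taken from the study. The serial version checks items one fixation at a time against the target template, while the parallel version scores every item's features at once so the closest matches stand out immediately.

    # Toy contrast between serial ("spotlight") and parallel ("feature") search.
    # The scene items and target template below are hypothetical examples.
    items = [
        {"name": "beach ball",    "color": "blue",   "shape": "round"},
        {"name": "striped towel", "color": "red",    "shape": "striped"},
        {"name": "Waldo",         "color": "red",    "shape": "striped cap"},
        {"name": "sandcastle",    "color": "yellow", "shape": "conical"},
    ]
    target = {"color": "red", "shape": "striped cap"}

    def serial_search(items, target):
        """Scan items one at a time, like a spotlight sweeping the page."""
        for fixation, item in enumerate(items, start=1):
            if all(item[feature] == value for feature, value in target.items()):
                return item["name"], fixation      # how many fixations were needed
        return None, len(items)

    def parallel_search(items, target):
        """Score all items at once on shared features; the best match pops out."""
        scores = [sum(item[f] == v for f, v in target.items()) for item in items]
        best = scores.index(max(scores))
        return items[best]["name"], scores

    print(serial_search(items, target))    # ('Waldo', 3): found on the third fixation
    print(parallel_search(items, target))  # ('Waldo', [0, 1, 2, 0]): the red, striped item stands out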

For decades, scientists were divided into two camps over these models, but recent evidence led some to suspect that the brain uses a combination of the two. "What's cool about this paper is that it shows both processes are going on in the same chunk of the brain and in the same neurons," says Jeremy Wolfe, professor of ophthalmology at Harvard Medical School, who wrote an accompanying review article in Science.

To explore visual attention, researchers study macaque monkeys, recording the activity of specific neurons, along with the animals' eye movements, while the monkeys scan a complex array in an experimental equivalent of looking for Waldo. The neurons belong to the V4 area, a midregion of the visual cortex known to be important for attention.

Neurons specialize in what they detect best. A "red" neuron gives off a stronger signal when red appears in the field of view, and the signal is even stronger if the monkey is actively searching for red. Moreover, if the monkey is searching for a red object, red neurons turn up their activity before the eyes even move toward the red item, as if the louder signal were calling: Look over here! "We think the yelling neurons are commanding the eyes to move toward a feature that matches something in the mental image," Desimone says.

Even so, the ability of a neuron to raise its lone voice does not explain how it gets heard over the cacophony of all the other neurons. "We think it's not just a question of the individual neuron," he says. "It's how it cooperates with other neurons to make their voices heard. We showed that to increase the signal, the neurons synchronize their activity."

Desimone uses the analogy of a room full of people talking. If random individuals raise their voices, the room just gets louder. If a group of people starts chanting in unison, their voices rise above the background noise.
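
A rough numerical sketch of that analogy (all numbers here are illustrative, not measurements from the study): when many oscillating signals are added with random phases, the peak of the sum grows only about as the square root of the group size, whereas phase-locked signals add linearly, so even a modest synchronized group can rise above a much larger asynchronous background.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 0.25, 1000)   # 250 ms of "activity"; the sampling is arbitrary
    freq = 40.0                      # a gamma-band-like rhythm, chosen only for illustration

    def peak_of_sum(n_cells, synchronized):
        """Peak amplitude of the summed signal from n_cells unit-amplitude oscillators."""
        phases = np.zeros(n_cells) if synchronized else rng.uniform(0, 2 * np.pi, n_cells)
        summed = np.sin(2 * np.pi * freq * t[:, None] + phases).sum(axis=1)
        return np.abs(summed).max()

    print(peak_of_sum(1000, synchronized=False))  # roughly 30 (varies with the random phases)
    print(peak_of_sum(100, synchronized=True))    # about 100: a small group chanting in unison

In this toy version the hundred phase-locked "chanting" cells outvoice a background ten times their size, which is the intuition behind a synchronized signal rising above the neural din.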

Synchronization of the signals helps explain how the brain uses parallel processing to concentrate on relevant features in a complex scene. Then the brain switches to serial processing, scrutinizing relevant objects sequentially to find the object of desire.

The study was funded by the National Institute of Mental Health.

A version of this article appeared in MIT Tech Talk on June 8, 2005.

Eurasian jays can use 'mental time travel' like humans, study finds

Study finds jays remember incidental details, similar to episodic memory in humans.

Eurasian jays can remember incidental details of past events, which is characteristic of episodic memory in humans, according to a study published May 15, 2024, in the open-access journal PLOS ONE by James Davies of the University of Cambridge, UK and colleagues.

When remembering events, humans have the ability of "mental time travel," consciously reimagining past experiences and potentially recalling details that seemed unimportant at the time. Some researchers have suggested that this "episodic memory" is unique to humans. In this study, Davies and colleagues ran a memory experiment to test for episodic-like memory in seven Eurasian jays, birds that excel at remembering the location of stored food.

In the experiment, the birds watched food get placed beneath one cup in a line of four identical cups and were then rewarded for correctly selecting the baited cup. Over several trials, the birds were trained to identify the correct cup by remembering its position in the line. Then, at test, the jays were given an unexpected memory assessment: they watched food get placed beneath one of the cups, which now all had unique visual characteristics, but they were then separated from the cups for 10 minutes while the cups were relocated and rearranged. Despite the changed positions of the cups and the added time delay, the birds still correctly identified the baited cup by its visual characteristics 70% of the time.
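
As a rough sanity check on that figure (the per-bird trial count below is a hypothetical stand-in, not the paper's actual number): with four cups, guessing yields 25% correct on average, so a 70% hit rate over even a modest number of trials would be very unlikely by chance.

    from math import comb

    def prob_at_least(n_trials, n_correct, p_chance):
        """P(X >= n_correct) for X ~ Binomial(n_trials, p_chance): doing this well by guessing."""
        return sum(comb(n_trials, k) * p_chance**k * (1 - p_chance)**(n_trials - k)
                   for k in range(n_correct, n_trials + 1))

    n_trials = 20                        # hypothetical number of test trials, for illustration only
    n_correct = round(0.70 * n_trials)   # the reported 70% success rate
    p_chance = 1 / 4                     # one baited cup out of four

    print(prob_at_least(n_trials, n_correct, p_chance))   # ≈ 3e-5: far beyond guessing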

These results suggest that even though visual differences between the cups were unimportant during training, the birds were able to notice those differences at test and recall them later, similar to episodic memory in humans. This study indicates that episodic-like memory might aid jays in finding food stores, and the researchers suggest that future studies might investigate whether the birds can perform similar feats of memory in other non-food-related scenarios.

The authors add: "As the jays were able to remember details that held no specific value or relevance at the time that the memory was created, this suggests that they are able to record, recall, and access incidental information within a remembered event. This is an ability that characterises the type of human memory through which we mentally 'relive' past events (or episodes), known as 'episodic' memory."

Story Source:

Materials provided by PLOS.

Journal Reference:

  • James R. Davies, Elias Garcia-Pelegrin, Nicola S. Clayton. Eurasian jays (Garrulus glandarius) show episodic-like memory through the incidental encoding of information. PLOS ONE, 2024; 19(5): e0301298. DOI: 10.1371/journal.pone.0301298
