Each week, we read what's going on in the world of science and bring the wildest findings straight to you. Here's the latest:
Your brain is listening even while you’re sleeping
Scientists have classically described sleep as a state of isolation: your brain shuts down and no longer receives information from the outside world. But this description doesn’t always match everyday observations about sleep, says Thomas Andrillon, a research fellow at Monash University in Australia.
If our senses are shut off, how do we wake up at the right moments, to the right sensory intrusions? It’s been reported that people wake up when something important happens in their environment. People are more likely to wake to relevant sounds, Andrillon says, like someone saying their name or the sound of a baby crying, than to louder but less relevant ones.
“It seems therefore that the sleeping brain remains somewhat vigilant,” he tells me. In a new study published in Nature Human Behaviour, Andrillon and his colleagues took a closer look at what happens when sleepers are exposed to sounds but don’t wake up, to see whether their brains pay more attention to some sounds than to others.
They applied something called the ‘cocktail party problem’ to sleeping study participants. It’s a situation many of us have found ourselves in: confronted with many voices, we can select one to focus on while ignoring the others. This is why, when you’re at a bar, you can hold a conversation with the friend you came with without simultaneously listening to every other person around you.
“This is not a simple job,” Andrillon says. Unlike vision, where objects are spaced out, auditory information overlaps and mixes when it hits our ears. “The job of the brain is to disentangle these overlapping streams to process them independently.”
To replicate this for the sleeping people in their study, the researchers played two voices, one in each ear, while the subjects slept. One voice spoke meaningful phrases, collected from Wikipedia articles and movie dialogue; the other spoke a kind of gibberish called Jabberwocky, which sounded like French (the experiment was done in Paris) but is meaningless.
Then the researchers used a technique called stimulus reconstruction, which takes electrical brain activity recorded with an electroencephalogram, or EEG, and reconstructs what people were listening to. The activity in the sleeping brains more closely reflected the meaningful phrases than the Jabberwocky.
“As if the brain was turning its internal volume up for this particular speaker,” Andrillon says.
Importantly, when the Jabberwocky was played alone, the brain tracked it too. The study showed that when presented with two options, our brains prefer speech that makes sense.
Does that mean that we have any conscious awareness of what’s said to us in our sleep? It’s a difficult question to answer. Future studies could get at that by waking up people right after they heard the voices and asking them then. This study’s participants didn’t remember the sentences played to them, but “that does not mean though that they were not aware or not understanding them, just that they cannot remember,” Andrillon tells me. “Same thing goes with dreaming: it is not because you don’t remember your dreams that you don’t dream.”
Our brains reconstruct memories in reverse order, which means we might sacrifice details for the overall gist
Something we already know, but find uncomfortable to acknowledge, is that our memories are reconstructions. They’re not 100 percent accurate snapshots of past experiences. And new findings suggest that when we recall them, they may become abstract and gist-like rather than detail-oriented.
But let’s back up. What happens in your brain when you look at an object? For the most part, scientists know, and it goes something like this: When we see a complex image, say a teapot, visual brain areas begin analyzing its physical properties, like its shape, textures, and colors, within 100 milliseconds. Another 100 to 200 milliseconds later, the brain starts to extract more abstract information, like recognizing the object as a teapot and knowing what it’s used for. Visual information travels through the brain in a systematic, hierarchical way.
The new study, published in Nature Communications, found that the way we remember visual things is different from the way we first saw them, says senior author Maria Wimber, a psychologist at the University of Birmingham in the UK.
In the new research, Wimber and her colleagues watched how information travels through the brain when an image is reconstructed from memory rather than encountered in person. They gave people a simple learning task, associating words with images, then later showed them the words alone and asked them to remember the accompanying images in as much detail as possible. They used 128 small electrodes attached to the scalp to monitor their participants’ brains while they were remembering.
They saw that when their subjects saw an image for the first time, their brain activity proceeded in the expected order: It first decoded visual details like color, and then more abstract information. “Interestingly, this pattern completely flipped when we asked them to reconstruct the images from memory,” Wimber says. “Now we saw that the abstract, semantic attributes came online first, followed only later by the physical details.”
“If we first recover the semantics, it means that our memories can be strongly biased or distorted by our own interpretations of what happened,” Wimber says. We might be able to pay closer attention to details and boost our memory for specifics, but it’s possible that the reverse order in which we retrieve memories is hardwired into the brain. “This is because the brain areas that process semantics are connected more closely with the hippocampus, one of the most important memory regions of the brain,” Wimber tells me. “Because of these anatomical connections, memory in real life will probably always be biased towards semantics.”
It’s surprisingly easy to create the illusion that you have stretchy fingers (here’s a video of it)
We take for granted the sensation that “you” inhabit your own body, and that your body belongs to you. But out-of-body and body-ownership illusions quickly reveal how fragile that sensation is: it’s quite easy to come to think a rubber hand is your own, or to see the world from the perspective of a virtual reality avatar.
A new study in Perception sought to further explore the limits of body ownership. Previous studies have shown that we don’t easily take to feeling like an inanimate object, like a block of wood, is part of our body. But we can experience distorted body parts, like longer fingers and arms, as our own.
Catharine Preston, a cognitive neuroscientist at the University of York, says the key to these illusions is matching what people see with what they feel. In the new work, the researchers gently pulled on a participant’s finger, which was hidden from sight, while simultaneously miming the same pulling motion in view, faster and extending beyond the length of the actual finger, as if they were stretching out an invisible finger. “Because what they see (an invisible finger being stretched) matches what they feel (namely, their own finger being pulled), the brain interprets this as one event—so they feel like their finger is stretching,” she tells me.
She has done other preliminary research suggesting that these kinds of illusions might help people with chronic pain. When she and others created the illusion of stretching or shrinking the hands of 20 people with arthritic pain, 85 percent of participants said their pain was halved. But that study induced the illusion using expensive video equipment; the new work shows that the feeling of your fingers being stretched can be created much more simply.
“This tells us that we can adapt to impossible perceptual changes to our body without it actually being in view,” she says.
You can see some videos of these illusions here, or even try the finger-stretching one at home! She says you can use an old shoe box, or something comparable, as a makeshift platform to conceal your fingers, and enlist a friend to pull on both your real finger and the invisible, elongated one.
“This demonstrates how flexible our body representation is,” Preston says.
Your weekly science and health reading list
Most Personality Quizzes Are Junk Science. Take One That Isn’t. By Maggie Koerth-Baker and Julia Wolfe in FiveThirtyEight.
Who can resist a quiz that will divulge information about your favorite subject (you)? This one from FiveThirtyEight is based on the Big Five, the traits psychologists actually use to study personality.
Have Aliens Found Us? A Harvard Astronomer on the Mysterious Interstellar Object ‘Oumuamua.’ By Isaac Chotiner in The New Yorker.
Chotiner interviews the chair of Harvard’s astronomy department, Avi Loeb, on why we shouldn’t be so quick to discount the weird object that unexpectedly entered our solar system.
The Smartphone Psychiatrist. By David Dobbs in The Atlantic.
An excellent profile of Tom Insel, the former director of the National Institute of Mental Health, and his turn to technology to address rising mental health issues.
The Statistician Who Debunked Sexist Myths About Skull Size and Intelligence By Leila McNeill in Smithsonian Magazine.
Alice Lee challenged the once widely held belief that skull size was related to intelligence. She did so by marching into an all-male Anatomical Society meeting at Trinity College and measuring the heads of 35 of them. “Lo and behold—some of the most well-regarded intellects in their field turned out to possess rather small, unremarkable skulls.”
Sign up for our newsletter to get the best of Tonic delivered to your inbox.