Around 15 years ago, University of Wisconsin-Madison psychologist Seth Pollak recruited a couple hundred children to study the relationship between childhood stress and certain immune system markers. “We had a whole range [of participants], from kids with really boring, stable, average lives all the way up to kids with severe child abuse, and neglect, and poverty, and really extreme childhood stressors,” Pollak says. He published his research in the esteemed journal Proceedings of the National Academy of Sciences (PNAS).
A decade later, Pollak and his collaborators began scouring the Internet, trying to find these kids again. They were specifically looking for the participants with the highest and the lowest childhood stress exposure. Eventually they tracked everyone down, and thus began a previously unplanned second study of the same people, now between 20 and 23 years old.
This time, Pollak was interested in what stress ultimately did to their brains. Over the past 20 years, Pollak says, “there has been a tremendous amount of research that children who come from very, very stressful early childhood backgrounds have a real range of problems.” From underemployment and low levels of school achievement to health complications and behavioral issues, early life stress is associated with poor outcomes later. But the research to date is mostly descriptive, establishing links between, for example, self-reported stress and later school and medical records.
Pollak wanted to know how childhood stress continues to affect people long after they’ve left the house. So he decided to study one particular aspect of neurological functioning: decision-making and risk-reward processing. His latest study, also published in PNAS, found that adults with high-stress childhoods responded to potential losses and rewards differently.
For the study, Pollak and his collaborators administered a series of gambling experiments designed to activate regions of the brain associated with risk-taking. Participants performed each task while hooked up to a brain imaging machine. The people who’d grown up in high-stress environments didn’t accurately weigh risk, and they took longer to make decisions than participants who’d grown up with low stress, yet ultimately made worse ones. For instance, high-childhood-stress participants often placed large bets on trials they were unlikely to win, while failing to place large bets on trials they probably would win. Even after repeated losses, these participants failed to change their behavior.
Differences between the high- and low-childhood-stress groups manifested neurologically: high-stress participants had less activation in brain regions linked with anticipation of potential losses and potential rewards. But the regions associated with the actual experience of losing a bet were significantly more active in the high-stress group. In short, participants who’d grown up in high-stress environments under-reacted to potential losses but overreacted when they actually lost. (In case you’re wondering whether these differences are the result of preexisting cognitive deficits, consider that there were no differences between the groups on myriad other neurocognitive tasks unrelated to decision-making.)
But Pollak wondered whether his controlled experiments would tie into participants’ everyday lives. It turned out that their lab behavior was strongly correlated with their real-world risk-taking, such as not wearing a seatbelt, being hurt or sick and not calling a doctor, and smoking. In other words, Pollak’s experimental results echoed the participants’ actual decision-making behaviors. Brain activation in risk-perceiving regions “completely predicted what people said they did in their everyday lives.”
Notably, however, their everyday lives seemed not to affect their behaviors in the lab. Pollak found that current life stress was not correlated with poor decision-making. Only childhood stress was. “So we know it's a childhood experience from decades ago that is influencing people's current decision-making, not what's going on in their lives right now. That’s very powerful.”
The discovery that adults with high-stress childhoods aren’t as good at perceiving potential losses has broad implications, Pollak says. “So many aspects of our judicial and social services system are predicated on the idea that people understand punishment.” For example, judges are more lenient on first offenses but less lenient on repeated offenses, and group homes for delinquent youth post “three strikes and you’re out” rules. These kinds of warnings are based on the notion that everyone processes the consequences of their behavior equally. “We assume people learn from punishment experiences, but what if this information is truly not getting into [some people’s] brains?” he asks.
Now that research is beginning to flesh out how stress might impair later brain function, Pollak’s next question is, “Is this reversible?” Can we train people to pay attention to things that might lead to a bad outcome? Can we help people learn when they’re about to make a bad decision? Pollak is optimistic. “If we can train people to learn math and foreign languages, and we can train people's attention, I don't see any reason why not.”