Every morning around 9 or 10, I take a break from my work and do an online crossword puzzle. I’ve gotten pretty good at them: I can solve almost any puzzle without asking the computer for extra hints. I can even complete the extra-hard “Sunday Challenge” most of the time. But the question that has puzzled scientists for years is whether skills like crossword-solving transfer to other, seemingly unrelated tasks. Does my skill at crosswords make me a better writer, or help me remember what I need at the grocery store?
For decades, it seemed that the answer was, in nearly every case, “no.” In 1973, William Chase and Herbert Simon conducted a seminal study, which found that chess experts were better than novices at remembering the configuration of pieces on a chess board. But the effect disappeared if the pieces were arranged in a random configuration instead of a position from a real game.
Chess experts, it seemed, were better than non-experts at remembering chess games, but they are no better than anyone else at recalling their neighbor’s name when they run into them at the post office.
But within the last decade, a number of studies have seemed to show that some abilities do transfer. I blogged about one such study, by C. Shawn Green and Daphne Bavelier, conducted in 2003 and published in Nature. They found that avid video gamers were significantly better than non-gamers at certain tests of visual ability. Even more impressively, among non-gamers, just playing an “action video game” for about 10 hours led to a significant improvement in key abilities that might make them more aware of their surroundings, potentially reducing automobile accidents.
Studies like Green and Bavelier’s have spawned a burgeoning “brain training” industry based on the idea that playing a few simple games each day can improve your cognitive fitness. But a new study questions whether brain training is really as effective as all the hype. Tal Yarkoni, Christian Jarrett, and Daniel Simons all blogged about the study. A team led by Adrian Owen recruited over 11,000 online participants via the BBC program Bang Goes the Theory. Participants were tested on a number of cognitive skills, then randomly assigned to one of three groups. Two of the groups did typical “brain training” tasks several times a week, and as a control, the third group had to search the internet for answers to trivia questions. After six weeks, everyone was tested again. While each group showed a marginal improvement on the cognitive tasks, there wasn’t much of a difference between any of the groups. The conclusion: Any improvement was simply a result of having taken the cognitive tests once before, at the beginning of the study. The research was published in Nature.
There was considerable backlash on pro-brain-training blogs. Alvaro Fernandez, who runs a brain-training consulting business, points out that the study’s “dosage” of brain-training was relatively low, averaging about 10 hours of total training. By contrast, many brain-training studies offer as many as 30 hours of training. In a guest post on Fernandez’s blog, Elizabeth Zelinski, a researcher who has conducted brain-training studies, was critical of the study’s methods for analyzing results. She said the study also restricted its analysis to participants under 60 years old. Many brain-training products are marketed specifically to the elderly as a means of stemming loss of cognitive function later in life. This study, Zelinski and Fernandez say, sidesteps that issue.
Daniel Simons, a renowned cognitive scientist who himself co-authored a key attempt to replicate the original Green and Bavelier study, acknowledges that transfer of skills may occur in older adults, but sticks to his guns about the lack of benefit of brain training for younger adults. He says that in young adults, physical exercise has been shown to be much more important to brain fitness than any sort of brain training.
But if brain training can be effective in older individuals, why doesn’t it work in younger people? I suspect the reason may be that young people—and especially the sorts of young people who sign up for brain-training studies—are already very mentally active. It’s hard to find a college student who doesn’t play video games or do other challenging mental activities on a regular basis. If someone is already a marathon runner, asking them to walk an extra half-mile every day probably won’t do much for their fitness level.
For senior citizens, any number of activities might represent a significant change in their mental activity, which in turn could help prevent cognitive decline.
But what about that original Green and Bavelier study that found an improvement in college students’ visual skills after just 10 hours of gaming? Simons says his attempt to replicate it, teamed with four other psychologists and led by Walter Boot, was unsuccessful. Non-gamers who played the action game for 10 hours didn’t show any more gains in visual ability than others who played no games. It’s possible that the original Green and Bavelier study was a statistical anomaly.
Brain-training advocates counter that 10 hours of training isn’t enough, and that these researchers are testing the wrong abilities. But everyone agrees that the groups most likely to benefit from brain-training exercises are the very old and the very young. In the case of young people, brain-training has another name: school.
Dave Munger is editor of ResearchBlogging.org. He also blogs at The Daily Monthly. Each week, he writes about recent posts on peer-reviewed research from across the blogosphere. See previous Research Blogging columns »
Originally published May 19, 2010