That little bit of exercise
Read the post here.
Brian Mossop is currently the Community Editor at Wired, where he works across the brand, both magazine and website, to build and maintain strong social communities. Brian received a BS in Electrical Engineering from Lafayette College, and a PhD in Biomedical Engineering from Duke University in 2006. His postdoctoral work was in neuroscience at UCSF and Genentech.
Brian has written about science for Wired, Scientific American, Slate, Scientific American MIND, and elsewhere. He primarily covers neuroscience, development, behavior change, and health.
Contact Brian at brian.mossop@gmail.com, on Twitter (@bmossop), or visit his personal website.
Why I Run
In a guest post for Mary Knudson's HeartSense blog, I talk about why I started running:
Being a sprinter, I had never done much long distance work. In the past, making it around the 400m track just once was an accomplishment for me. Plus, my closest friends from college are hard-core distance runners. And by that, I mean they are really, really fast. Like 2:30ish marathon fast. Top 50 in the Boston Marathon fast. Fast fast. You get the point. So getting into this road racing business was a bit intimidating. I didn’t even tell my best friends what I was doing until shortly before my first race.
I started out slowly, running just twice per week, a sluggish mile or two at a time. Week by week, the runs became easier, and I found myself pushing to go farther, and faster. I started watching what I ate, making smarter choices on trips to the refrigerator. As the months passed, I began feeling better than ever, and had wrangled my waistline back to its proper diameter. My annual physical revealed more good news: my cholesterol and blood pressure were now in check.
You can read the full post here.
What Did the NIH Report on Lifestyle Modification/Alzheimer's Really Say?
My inbox flooded with links to the report released by the NIH (and evangelized by TIME) stating that lifestyle interventions (diet, physical activity, mental exercises, etc.) may not be that effective in preventing Alzheimer's Disease. Before I mount my full counterattack, I need to carefully read through the studies the meta-analysis cites. Still, a quick glance at the exclusion criteria reveals that the authors limited their review to studies of patients over the age of fifty. So really, these results imply that lifestyle modifications may not prevent, delay, or treat Alzheimer's Disease if you start making these changes later in life.
My second point is that not all lifestyle modifications are created equal. Evidence from animal studies suggests that of all the interventions, aerobic exercise is our best chance of staving off cognitive decline. In fact, this meta-analysis also found some correlation between exercise and preserved or improved cognitive ability.
There's a good article in The Economist that discusses the failures of the drug industry to find a solution to treating Alzheimer's Disease. One particular quote resonates with my feelings on the NIH report:
Another fundamental problem is that, whatever is causing the damage, treatment is starting too late. By the time someone presents behavioural symptoms, such as forgetfulness, his brain is already in a significant state of disrepair. Even a “cure” is unlikely to restore lost function.
Internet: Good or Bad for the Brain?
I was fairly quiet on the blogs and Twitter during the latter part of last week, because I spent Thursday and Friday at the Health Horizons Conference, sponsored by the Institute for the Future (IFTF). I’ll post some reflections soon, but first I want to comment on an interesting discussion that was brewing last week. Over at Neuron Culture, David Dobbs has some nice insight into the ongoing debate between renowned science/tech writers Steven Pinker and Nicholas Carr.
In his new book, The Shallows (which I have not read), Carr apparently argues that the internet might be killing our brains with ever-increasing distractions. Pinker, on the other hand, thinks that while many people are initially panicked by new media technology, society will one day see the internet for what it truly is: a way of richly organizing an ever-increasing abyss of information.
A second showdown, this time between writers Jonah Lehrer and Clay Shirky, tackles the question: Is the internet better for creating a ‘Cognitive Surplus’ than television? Shirky believes the era of the mindless television sitcom moved us away from social interaction and deep thought, but the advent of online social exchange – even in the form of inane material such as the lolcats at icanhascheezburger.com – is once again bolstering our feeble brains. Lehrer fires back, saying that television and internet alike can fuel passionate offline discussions and detailed analysis.
From a neuroscience perspective: Is the never-ending online information flow good or bad for our brains? (Or, for that matter, is technology good for our brains?) Is one technology (television) better or worse than another (the internet)?
As with most aspects of life, I suspect brain growth is all about balance. For me, Twitter serves as a filter for my information stream. I follow people whose insight and opinion I respect (whether I agree with them or not doesn’t matter). But sitting down to write a blog post takes me away from the cacophony of Twitter for a moment to think critically about a particular topic. Often I engage my colleagues and friends in a discussion, either in person or over email/chat, regarding the ideas in my head long before I publish anything online. I balance the real-time information flow with real-life conversations.
One could argue that any information stream – be it reading a book, watching a movie, or surfing the net – could deaden our brains if we don’t pause for reflection. Intelligently analyzing, as opposed to passively experiencing, the information that enters our brains is no doubt one of the distinguishing factors that makes us human. So don’t be afraid of technology, and don’t quibble over which technologies are good or bad. Rather, simply use technology to augment human social interaction.
What Did We Really Learn From the BBC Brain-Training Software Study?
Ever since I saw the press releases yesterday telling of a new article to be released in Nature showing that brain-training software was ineffective, I knew a storm was brewing. The paper was still under embargo at that point, so I was anxiously awaiting its release today. Slowly but surely, the mainstream media got wind of the paper, running headlines like “Brain Games Don’t Make You Smarter”. Then the blogosphere lit up, with chatter continuing throughout the day on this controversial paper. I was stuck in the lab all day and couldn’t put a post together, so I’m a little late to the party. But I wanted to give you a rundown of what exactly the study found, and point out a few intricacies of the findings.
When I began graduate school, there was a savvy postdoc in our lab who showed the newbies the ropes. One of the best pieces of advice he offered was, “Don’t believe everything you read, and always check who did the study.” I try to live by these words every time I read a study.
The group behind the Nature paper was led by Adrian Owen, a researcher at the MRC Cognition and Brain Sciences Unit in Cambridge, UK. Owen developed this brain-training program and study in collaboration with the BBC. A quick look at Owen’s PubMed listing shows he’s primarily known for using fMRI to show that some people in a persistent vegetative/minimally conscious state are, in fact, self-aware (a controversial field and a bold claim, which I’m not going to get into right now). But the point is: Owen isn’t an expert in brain plasticity or behavioral training-induced cognitive change.
Making brain-training software isn’t a task you just jump into; experts spend years validating and refining approaches in animal models. But it appears that Owen woke up one day and suddenly decided he had the insight to figure out whether the cognitive benefits claimed by brain-training software were real.
Even if we give Owen the benefit of the doubt and assume he knows what he’s doing, not all brain-training programs are created equal. I try, whenever possible, to refrain from using the term “brain games”, because when training modules are built on sound preclinical and clinical research, they’re really much more than games. Owen and the BBC tested only their own program, so the results simply say that their program doesn’t work. This finding does not generalize across the industry.
SharpBrains has the best rundown I’ve seen of what’s wrong with this report, including the nitty-gritty details showing that participants in the Owen/BBC study used the brain-training software for considerably less time than most programs require. The training sessions were also unsupervised, leaving participants prone to distraction.
While I’m moderately annoyed with the authors’ overreaching conclusions, I’m even more ticked at the mainstream media headlines. We spend billions of dollars bringing drugs to market, and things often go wrong during drug trials. Companies miss clinical endpoints, or worse, someone has an adverse event. Yet when this happens, I have to scour the net just to find a mention of the problem. The brain-training software industry is still in its infancy, and there will inevitably be bumps in the road. But the truth is, these studies cost a fraction of what it takes to bring a drug to market, and despite what this rogue Nature paper says, they have huge potential to help millions of people.
Sugar-coated Laziness
Check out this study. Researchers found that when "teenage" rats (30-45 days old) consumed massive amounts of sugar, they became extremely difficult to train as adults. For two weeks or so during adolescence, one group of rats had free access to a tasty 5% sucrose solution, while the control group only had water available. Similar to some American teenagers, the experimental group of rats consumed about 20% of their daily caloric intake as simple sugar.
To give you some background: it's extremely easy to train adult rats to perform simple tasks, such as pulling levers or pressing buttons in return for a food reward. However, the researchers couldn't motivate the rats that had consumed large amounts of sugar as teenagers to learn the task. My first reaction while reading this paper was: "Big deal. That group of rats just had sugar overload. The reward no longer had any real value for them, so there was no incentive to learn the new task."
But here's where the story gets interesting: repeat the experiment with adult rats in place of teenage rats, and you get strikingly different results. When adult rats have free access to a sugary drink for two weeks, they never lose motivation for the sweet reward, and they easily learn the new lever-pull task later in life. So it's not that the rats are simply sick of the sweet reward; rather, it seems the sweet drink over-stimulated the reward pathway in the brain during adolescent development, leading to problems with motivation in adulthood.
Were the calories in the sugary drink to blame for hyper-activating the brain's reward circuits, or was it the sweet taste? To answer this, the authors took another group of teenage rats and gave them free access to a drink flavored with artificial sweetener, which has no calories. These rats were also unmotivated and rather difficult to train later in life, so the authors concluded that the sweet taste, and not the calories, was hyper-activating the brain's reward circuits.
Besides, ahem, crazy neuroscientists writing for health blogs, who cares about lazy rats? Well, the authors argue that a lack of motivation to perform simple tasks is a sign of depression in rodents. Given that incidence rates for depression and other psychological illnesses are rising in today's society, it's interesting to see how seemingly benign events during adolescence -- a critical time in brain development -- affect the mental state of adult animals.
How "The Science of Success" Redefines Psychology
I just finished reading David Dobbs' new article in the December issue of The Atlantic, "The Science of Success". Dobbs turns the classic question of Nature vs. Nurture, whether our genes or our environment are the deterministic drivers of our fate, on its head. Traditionally, those who support "nature" say that our genes are most influential in defining us, while those who support "nurture" say that our environment plays the more important role.

Based on new research, Dobbs introduces the idea of two types of people, "dandelions" and "orchids". Dandelions can thrive anywhere, regardless of their environment or upbringing. Orchids, however, are more temperamental, and require a stable environment to survive. At first glance, the orchids may seem like a liability, and in fact, they often carry genes that make them susceptible to mood disorders and psychological disease. The astounding part of Dobbs' report is that, given the right care, or environment, the orchids don't just do OK, but far surpass the dandelions in performance. In other words, given the right training, orchids may in fact be destined for greatness.
This finding redefines conditions we have typically classified as undesirable. ADHD, depression, and generalized anxiety disorder are no longer conditions to dread, because given the right training, people with these predispositions may in fact be the true "movers and shakers" of the world.
Please read the full article for yourself. And, as always, I'd welcome a discussion here...
Why Behavior Change Is (Still) Better Medicine Than Drugs
While I was attending the Institute for the Future's Health Horizons Fall Conference on Monday, one thing became abundantly clear: the 21st century will be the era of the brain, the last great scientific frontier. Due to societal shifts, environmental changes, and the fact that we are simply living longer, we are poised to see a sharp rise in cases of diseases such as Alzheimer's, Parkinson's, autism, and post-traumatic stress disorder. The only thing worse than the increasing prevalence of brain disease is the sobering fact that few viable treatments currently exist.
For years, we've heard the mantra of behavior change and health. Exercise more and you'll cut your risk for heart disease and stroke. Eat more fruits and vegetables and you'll decrease your risk for colon cancer (or possibly prostate cancer, as discussed in a previous Decision Tree post, "Why Behavior Change is Better Medicine than Drugs"). Could behavior change serve our brain health as well as it serves the rest of the body?
On Monday, the neurotechnology community drew a definitive line in the sand with regard to treating the brain. On one side were panelists who believed that society is under-medicated for mental disorders, including ADHD in children. On the other side, proponents of behavioral training argued that brain plasticity, the innate ability of the brain to rewire itself continuously throughout life, is our best bet to combat brain disease.
Consider the use of ADHD drugs in children, or of cognitive-enhancing drugs such as modafinil by professionals (including a large group of scientists) in the workplace. Proponents of medication say that cognitive enhancers are not doing anything unnatural; rather, they are taking someone who's a mediocre performer in terms of concentration and simply moving them into the 90th percentile. By the same logic, I suppose these panelists would also support legalizing steroids in Major League Baseball. After all, the steroids are not making the athletes superhuman. Rather, they're taking middle-of-the-road performers and nudging them to the upper echelon of the sport. Hmmm....
My former postdoc advisor, Dr. Michael Merzenich of the University of California, San Francisco, led the charge for behavioral training as a better alternative to drugs for diseases of the brain. Mike's lifelong work has focused on the neuroscience of learning, and on how brain plasticity occurs at various stages of development. He believes that many ailments of the brain, including ADHD, occur because we are using our brains "incorrectly", but that specific behavioral training can reverse these deficits.
The wonders of behavioral training and brain plasticity are not limited to sparse findings in a dark lab. In fact, Mike's most promising research has been translated into several commercial computer software applications, which have enhanced the reading capabilities of dyslexic children, as well as improved the speech processing and memory of senior citizens.
Whether or not you are sold on behavioral training as a feasible alternative to drug therapy for brain illness, one point remains solid: the cost of conducting clinical trials for behavioral training regimens is a mere fraction of the cost of drug trials. Given that it's terribly expensive to run drug trials, and that only a small fraction of the drugs in a pharma company's pipeline succeeds in the clinic, we clearly can't afford to ignore behavioral training as a new way to treat the brain.
White Noise and the Developing Brain
Usually, we think of preventive medicine as a first-person experience, i.e., what we can do to keep ourselves healthy. But preventive medicine includes steps to keep our families healthy as well, as in the case of an elderly relative or a newborn baby. My first postdoc stint was in a developmental neuroscience lab at UCSF, where many talented researchers spent years answering questions like, "How do different types of environmental noise affect the development of the auditory system?" So when a friend of mine sent me a message the other day asking about using a white-noise generator to soothe her crying, colicky baby, alarm bells immediately went off in my head. Because I've been asked this question several times over the past few months, I decided to post my take here.
There's a lot of anecdotal evidence that white noise calms a crying baby; in fact, some parents swear by the method. But this is a clear case where the science argues against the hype. In 2003, our lab at UCSF published a study in Science with a striking finding: the auditory system of newborn rat pups, which normally develops like clockwork, was under-developed in pups exposed to white noise compared to animals raised under normal conditions. But why would white noise cause a problem with brain development?
First, let's look at what happens to the auditory system during normal development. When rats are born, the area of the brain responsible for making sense of sounds, the auditory cortex, undergoes constant change. Scientists refer to this phenomenon as brain plasticity. Newborns are unique because this plasticity occurs through mere passive exposure to sounds during a well-defined window in development called the "critical period", which lasts through several postnatal days in rats. A human correlate might be the way children pick up new languages just by being exposed to them, while adults have to spend hours studying, memorizing, and practicing. This developmental window is a crucial time for a newborn, when the brain "sets" itself to efficiently process its native language.
When white noise was played for the newborn rats, the lab found that the critical period remained open indefinitely, meaning normal brain development was delayed. For this reason, members of the lab were against using white-noise generators on newborn babies. The theory is that white noise might interfere with a newborn's ability to grasp its native language, leading to progressive developmental problems.
Even if the results found in rats don't carry over directly to humans, I really feel that you just shouldn't screw with Mother Nature when it comes to brain development. I'm not a parent, and I can only imagine the empathy, or even frustration, that ensues when a baby cries for hours on end. But using white-noise generators just doesn't seem like the best answer. Our brains evolved to process biologically and socially relevant sounds, and exposing newborns to extremely unnatural sounds seems like a needless gamble.
Cheeseburgers on the Mind
Making a choice that leads to better health is not always easy. Otherwise, we would have many more ex-smokers and far fewer holiday pounds to shed. We would have no need for nicotine gum and patches, or Weight Watchers meetings. So if it's that difficult, why bother? For years, physicians have told the American public that reducing calorie intake, eating a diet low in salt, sugar, and saturated fat, and exercising 3-5 days per week will reduce the risk for heart disease and diabetes. Now, new research suggests that the benefits of a healthy lifestyle are even more far-reaching than initially thought -- diet and exercise can affect our minds.

About 5-8% of people over the age of 65, and nearly 50% of people in their 80s, show signs of dementia. As the baby-boomer generation swells the U.S. population aged 55-64 from 29 to 40 million by 2014, and life expectancy continues to rise, the number of people affected by dementia is poised to increase as well. Recent studies suggest that regular exercise may be a potent protector against dementia and Alzheimer's Disease. In one study, those who exercised 3 or more days per week had a 32% reduction in the risk of developing dementia compared to those who exercised less. Similar studies have linked exercise to moderate cognitive improvements in adults at risk for Alzheimer's Disease, as well as to a lower occurrence of vascular dementia.
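To make that 32% figure concrete, here's a quick back-of-the-envelope sketch of how a relative risk reduction plays out. The 32% comes from the study above; the 10% baseline risk is a hypothetical number I've picked purely for illustration:

```python
# Illustrating a relative risk reduction. The 32% figure is from the exercise
# study discussed above; the 10% baseline risk is hypothetical, chosen only
# to make the arithmetic concrete.

baseline_risk = 0.10    # assumed dementia risk for infrequent exercisers (made up)
risk_reduction = 0.32   # relative risk reduction for exercising 3+ days per week

exerciser_risk = baseline_risk * (1 - risk_reduction)

print(f"Infrequent exercisers: {baseline_risk:.1%}")  # 10.0%
print(f"Frequent exercisers:   {exerciser_risk:.1%}")  # 6.8%
```

In other words, if 10 out of every 100 sedentary people would eventually develop dementia, roughly 7 out of every 100 frequent exercisers would.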
Recent pre-clinical results have shown that diet is also tied to brain health. A 2002 study revealed that rats fed a diet high in saturated fat and refined sugar for 2 years exhibited changes both in gene expression in the brain and in performance on a memory task (finding their way through a water maze). This fast-food-type diet decreased levels of brain-derived neurotrophic factor (BDNF), a versatile molecule that mediates brain cell formation, function, and survival. Both BDNF gene expression (mRNA) and BDNF protein production in the hippocampus, an area crucial for short-term memory, were significantly reduced in the animals fed the high-fat, refined-sugar diet compared to those on a low-fat, complex-carbohydrate diet. Although the experiment lasted 2 years, and the greatest effects were seen at the end, changes in gene expression appeared as little as 6 months after the rats began downing cheeseburgers. Even more striking, the rats showed a significant deficit on the water-maze memory task after only 3 months on the high-fat/sugar diet, which shows that the "McDiet" changed the animals' behavior.
Nevertheless, the research presented here has limitations. The studies that looked at the effects of exercise on dementia were conducted in relatively small, non-diverse human populations and were not completely controlled for other "good health" factors that tend to accompany exercise. For example, exercisers are much more likely to do other healthy things, such as eating right, quitting smoking, getting quality sleep, or maintaining a target weight. The fast-food diet study was well controlled, showing that the decreased BDNF was not related to hypertension, atherosclerosis, obesity, or changes in activity level -- but the results must be taken with a grain of salt, since the study was conducted in rodents, not humans.
So what does all of this mean? The idea of eating right and getting more exercise is nothing new. We've known for years that changing our health behaviors can stave off heart disease and potentially let us live longer. The studies mentioned here highlight the positive-feedback nature of our actions: behavior changes (diet and exercise) cause physiological and molecular changes in the body, which in turn alter another behavior (memory). This relationship tells us that our behavior choices don't just determine life or death; they also shape our quality of life. It's true that these results don't establish a direct link between diet/exercise and brain health, but rather a loose correlation that requires further study. But in my mind, it doesn't really matter exactly what keeps the brain healthy -- my point isn't that diet and exercise are a cure-all for disease, but that they are an extremely important part of an overall healthy lifestyle that will let us make the most of our golden years.