Saturday, July 26, 2014

A $650 Million Donation for Psychiatric Research - Misguided?

http://upload.wikimedia.org/wikipedia/commons/8/8a/MIT_Broad_Center.jpg 
The Broad Institute of MIT and Harvard

This was big news last week: late on Monday, the Broad Institute, a biomedical research center, announced a $650 million donation for psychiatric research from the Stanley Family Foundation — one of the largest private gifts ever for scientific research.

This news broke the same day that a new article in the journal Nature - Biological insights from 108 schizophrenia-associated genetic loci - reported the identification of 83 new genetic loci linked to schizophrenia and confirmed 25 previously identified ones, bringing the total to 108 schizophrenia-associated loci.

With 108 possible loci linked to schizophrenia, we still know little about the specific combinations of genes involved, their epigenetic triggers, the environmental factors (both physiological and relational) that may switch specific genes on or off, and a host of other variables.

For example, we know that physical and emotional neglect is more strongly linked to psychosis and schizophrenia than physical and sexual abuse is. For more background on a relational view, see A relational view of causality in normal and abnormal development (2002).

Anyway, the article below is about the generous donation given to the Broad Institute. It's wonderful that the Stanley Family Foundation gave such a huge sum in support of research; however, by giving it to an organization that focuses narrowly on psychiatric research (psychopharmacology and genetics) rather than on a wider bio-psycho-social model, the gift may end up reinforcing a limited view of mental illness.

A Spark for Stagnant Research

A $650 Million Donation for Psychiatric Research

By Carl Zimmer and Benedict Carey
July 21, 2014

“You’re talking to a guy who went from psychotic to normal with some pills,” said Jonathan Stanley, who was found to have bipolar disorder in the 1980s. The donation of a foundation started by his father is one of the largest private gifts ever for scientific research. Credit Max Reed for The New York Times
ONE DAY IN 1988, a college dropout named Jonathan Stanley was visiting New York City when he became convinced that government agents were closing in on him.

He bolted, and for three days and nights raced through the city streets and subway tunnels. His flight ended in a deli, where he climbed a plastic crate and stripped off his clothes. The police took him to a hospital, and he finally received effective treatment two years after getting a diagnosis of bipolar disorder.

“My son’s life was saved,” his father, Ted Stanley, said recently. When he himself was in college, he added, “those drugs didn’t exist; I would have had a nonfunctioning brain all the rest of my life.”

The older Mr. Stanley, 84, who earned a fortune selling collectibles, created a foundation to support psychiatric research. “I would like to purchase that happy ending for other people,” he said.

Late on Monday, the Broad Institute, a biomedical research center, announced a $650 million donation for psychiatric research from the Stanley Family Foundation — one of the largest private gifts ever for scientific research.

It comes at a time when basic research into mental illness is sputtering, and many drug makers have all but abandoned the search for new treatments.

Despite decades of costly research, experts have learned virtually nothing about the causes of psychiatric disorders and have developed no truly novel drug treatments in more than a quarter century.

Broad Institute officials hope that Mr. Stanley’s donation will change that, and they timed their announcement to coincide with the publication of the largest analysis to date on the genetics of schizophrenia.

The analysis, reported by the journal Nature on Monday, identified more than 100 regions of DNA associated with the disease. Many of them contain genes involved in just a few biological functions, like pumping calcium into neurons, that could help guide the search for treatments.

“For the first time, there’s a clear path forward,” said Eric Lander, the president of the Broad Institute.

Experts not affiliated with the institute or the new paper agreed that the news on both fronts was good, but characterized the research as a first step in a long process. “The signals they found are real signals, period, and that is encouraging,” said David B. Goldstein, a Duke University geneticist who has been critical of previous large-scale projects. “But at the same time, they give us no mechanistic insight, no targets for drug development. That will take a lot more work.”

Jonathan Stanley, now 48, cannot explain why he suddenly developed bipolar disorder at 19. All he knows is that his brain responded well to lithium. He was eventually able to return to college, complete law school and become a lawyer. “You’re talking to a guy who went from psychotic to normal with some pills,” he said.

When scientists began to discover psychiatric drugs like lithium in the mid-20th century, they did so mostly by accident, not out of an understanding of the biology of the diseases they hoped to cure. For many years, they worked backward, hoping that by figuring out the action of the drugs, they could understand the causes of the diseases. But they came up empty.

Some researchers argued that a better strategy would be to find the genes involved in psychiatric disorders. This approach would give them new molecular targets for drugs they could test.

Yet the staggering complexity of the brain has yielded few secrets. More than 80 percent of the roughly 20,000 genes in human DNA are active in the brain.

In the 1990s, many scientists argued that the best approach to find “mood genes” was top-down. They would identify promising genes based on their biological properties and then survey their variants in people with and without a diagnosis.

But this approach was something like trying to find a thief in a crowd, based on a hunch of what he or she might look like. The research was “pretty much completely useless,” Dr. Lander said. “It turns out we are terrible guessers.”

By the early 2000s, the ability to decode human DNA had vastly improved, and scientists could look across our complete complement of genetic material, known as the genome, comparing samples from ever-larger groups of people. Ted Stanley’s first donation to the Broad Institute — $100 million in 2007, to found the Stanley Center for Psychiatric Research — went to support precisely such research.

Yet these studies were disappointing, too, and many researchers thought they were a dead end. “We were saying, ‘Maybe it isn’t the right way to go,’ ” Dr. Lander said.

Soon, the Broad Institute joined forces with scores of other research groups to form a consortium that could pool tens of thousands of subjects for analysis. In 2011, the consortium reported five genetic markers associated with schizophrenia. The group added more people to its studies and found even more genetic links.

The new paper in Nature is a culmination of the effort to date. The consortium analyzed 37,000 people who had schizophrenia and 114,000 who did not. It found 83 regions of the genome linked to the disorder that had not been previously flagged, and confirmed 25 previously identified ones, bringing the total to 108.

Dr. Lander cautioned that each variant accounts for only a tiny portion of the risk of developing schizophrenia. “It shouldn’t be used for a risk predictor,” he said.

Still, Dr. Samuel Barondes, a professor of neurobiology and psychiatry at the University of California, San Francisco, who was not involved in the study, called the findings encouraging. Even though schizophrenia is a “diverse disorder, with a horribly complicated genetic basis,” he said, “it is possible to pick up a reliable genetic signal if you have enough people.”

Other research teams are making progress on other conditions, such as bipolar disorder and autism, and finding that some mutations are rare while others are common variants.

On Sunday, an international team of scientists reported a study in Nature Genetics in which they compared 466 autistic people to 2,580 others. They found that most of the genetic risk of autism involved common mutations.

But these studies of brain disorders are also revealing a deep complexity that could pose an obstacle to rapid progress to effective drugs.

For example, recent research has found that mutations in the very same gene can cause a wide range of brain disorders, including autism, schizophrenia and epilepsy. “We are implicating the exact same genes across really different neuropsychiatric disorders,” Dr. Goldstein said. “We have no idea at all about why that is, and the only way to find out is to do some hard biology — to find out not only which genes matter, but what about them matters.”

That will take time and will probably produce plenty of reversals and spurious predictions. “Expect no grand-slam home runs,” said Dr. Allen Frances, professor emeritus of psychiatry at Duke and author of “Saving Normal,” a critique of psychiatric diagnosis. “There will be lots of strikeouts and only occasional singles.”

The new study in Nature found that many risk variants clustered around specific body functions, like the immune system and calcium transmission in brain cells.

To understand their underlying biology, Broad researchers plan to grow neurons with mutations in the genes they have discovered, to see how they differ from normal cells. They will engineer mice with some of the mutations to see how their brains are affected. The scientists hope these experiments will lead them to hypotheses about the biology underlying psychiatric disorders — which they will test by giving mice drugs that target specific molecules in the brain.

These studies will be expensive, which is where the Stanley foundation comes in. Last year, after the death of his wife, Vada, Mr. Stanley, the founder of MBI, began considering what he would do with his fortune. He decided that his first gift to the Broad Institute was not enough.

“After I’m gone,” he said, “I just want the money to flow to them as it would if I was still alive.”
 

John Martin Fischer - How Does a Belief in Immortality Affect the Way We Live Now?

http://assets4.bigthink.com/system/idea_thumbnails/41071/headline/immortality_3.jpg?1321106328

Immortality seems to me to be morally indefensible. The Earth is now home to 7 billion people, so let's say we develop immortality in 10-15 years, when the population will be at 8 billion or more. None of those people would die, and yet people would keep reproducing, forever. Imagine the Earth with 15 billion people, or 25 billion. We have already overpopulated the planet. How would we feed all these people? Where would we get fresh water (as of now, 780 million people lack clean drinking water and 2.3 billion lack basic sanitation)? And what about the natural resources needed for housing, and so on?

Personally, I find the idea of immortality abhorrent. Where and how would we find purpose in life if we never died? Boredom would eventually set in, so the only choice would be suicide. Even in modern vampire stories, with their immortal creatures, many of the oldest vampires express a sense of exhaustion with existence, a kind of "living" malaise. The fact that this idea turns up even in our myths suggests that it is a commonly held intuition.

Anyway - this article sets out a philosophical argument for how the idea of immortality might affect how we live our lives today.

How Does a Belief in Immortality Affect the Way We Live Now?

By John Martin Fischer
July 8, 2014

image: Getty

I was asked to address the question, "How does a belief in immortality affect the way we live now?" I am going to break this into two separate questions that are related to (if not identical to) it. The first question is, “How would the recognition of extreme longevity or even living forever change the way we would behave (or should behave)?” Then I’ll turn to how a belief in an afterlife would (or should) affect our behavior.

First: imagine that you knew that you would live for a very, very long time. We can simplify and imagine that you know that you will live forever. How would or should this hypothetical supposition affect your behavior? Well, it depends! It depends at least on certain background assumptions about the conditions of your envisaged life. Let’s make the “rosy” assumptions that you are in good health, that your body is not deteriorating, that you are comfortable financially, that you have friends and loved ones who are also immortal (in the sense of living forever). These are, of course, big assumptions; but to ask a really big question, sometimes we have to make big assumptions.

Some would say that, even under these very optimistic assumptions, our lives would be totally different—and unpleasant or even unrecognizable as choiceworthy human lives. Various reasons for this curmudgeonly conclusion have been offered, and we’ll consider just a few. First, some have argued that life under such circumstances would be intolerably and relentlessly boring. The idea is that what keeps us from being bored are our “projects”, and eventually we would run out of projects in an indefinitely long (or even just a very long) life.

I just don’t think this is true. That is, I don’t accept the conclusion that we would run out of projects in a very long (even an infinitely long) life. Just consider, for starters, all of the scientific problems that remain to be solved. Focus, as a concrete starting point, on all of the diseases that plague human beings. The project of curing all the currently existing diseases would take a very, very long time. And even assuming we can, given enough time, cure all existing diseases, by that time many new diseases will have popped up, offering new challenges. I just don’t think that it is obvious that we will ever get to the point where we will have cured all diseases (and palliated all human pain, suffering, and distress—both physical and mental). Simply having lots of time—even infinite time—doesn’t seem to imply that all of these challenges will successfully be met.

And we have just focused on a relatively tiny portion of all of the human challenges—the health challenges. How about all of the other scientific and technological challenges? How long will it take to answer certain fundamental questions of physics and cosmology? Even when they have been answered, if they ever are, there would remain the problems of connecting the abstract theories with all manner of practical problems.

Think, just for another set of concrete examples, of all of the challenges we face in preserving our planet from further environmental degradation. These are multifaceted and daunting. They will keep us busy for a long, long time (if we have that long). They could keep us going for a very long time in an infinitely long life.

So far we have considered just a few (admittedly central and important) scientific challenges that would generate projects in an immortal life. There are more where they came from. And think of all of the other kinds of projects: athletic, artistic, social. Consider the projects of writing poetry or novels or creating lovely paintings or sculptures. Or reading and appreciating novels. Why suppose that these projects would run out? Even if you had an infinite amount of time, why suppose that you would exhaust all of the novels worth reading? Suppose you were to read all of the novels currently worth reading. That would take a very, very long time. But (as with the diseases above) by the time you were finished, there would certainly be a new set of novels worth reading (novels that had been written during your very long process of reading). And why suppose that you could not find challenge and engagement in writing novels, even after a million or a billion years? (Of course, all of one’s projects would have to be distributed appropriately—reading or writing or anything can be boring if pursued without a break!).

The challenges and associated projects discussed above might be called “other-directed” projects. But there are also “self-directed” projects, such as eating delicious food, drinking fine wines, listening to music, enjoying art and natural beauty, sex, prayer, and meditation. These are “self-directed” projects in the sense that they aim at or crucially involve pleasant or agreeable mental states of the individual whose project it is. Again, you would have to distribute these projects properly in a very long or even infinitely long life. But why would a life that contained at least some of these projects necessarily be boring? Why couldn’t these activities be part of an overall life that is engaging and worthwhile?

Assuming that we would still have projects—other-directed and/or self-directed—in a very long or infinitely long life, would we have any motivation to pursue the projects? Some have thought that, given an infinite amount of time, all our activities and projects would lack “urgency”. They have even suggested that we would not have any motivation to do anything insofar as “there would always be time”. This is kind of a procrastinator’s nightmare (or perhaps dream!).

But I don’t have much sympathy for the contention that we would have no motivation in an immortal life. Consider, for example, the motivation to avoid pain—that would still exist in an immortal life. Similarly for the motivation to address other forms of limitation or impairment. We care about how we feel now; if we are now in pain or impaired, we will want to address those issues in a timely way. If I am in significant pain now, it is hardly comforting to know that I have forever to live and so eventually my pain will subside.

Similarly with loneliness. If I am separated from someone I love or care about, or if I am just lonely now, I have reason to seek to reunite with the person or to find friendship, love, and companionship. The mere fact that I know that I have forever does not alleviate the suffering of loneliness now.

The curmudgeons about projects in an immortal life are too pessimistic. They are spoil-sports. They greatly underestimate the prospects for human engagement and fulfillment. They look at our projects as like books in a library; with infinite time, we can read all of the books. They forget that there will always be new books to read and even new perspectives to bring to the old books.

What about the second way of understanding our basic question? That is, what if we were to come to believe in immortality in an afterlife? How would (or should) this affect our behavior? Well, again, it depends. First, it depends on what conception of immortality we work with—a Buddhist or Hindu view of reincarnation? A Judeo-Christian conception of the afterlife in heaven or hell?

But let’s abstract away from details. In all plausible religious views, what matters crucially for your prospects after you die—your next life in the wheel of reincarnation or your place in heaven, hell, or perhaps purgatory—is the moral quality of your life here and now. That is, your prospects are enhanced by right action for the right reasons in this life. You need actually to care not just about yourself, but about others—you need to love others and to care about justice. If your actions manifest love of others and a dominant concern for justice, then you will be rewarded in the afterlife. It is key that you must act for the right reasons. And here it is important that the reason for your behavior must not be that it will enhance your prospects in the afterlife. You may of course understand and anticipate this fact. But it cannot be your reason for action. If it were, then your action would be motivated by self-interest and not morality. You would not be doing the right thing for the right reason. So there is a sense in which your behavior now should be focused on this world and the needs and interests of others here and now, even if one were to believe in an afterlife.

Discussion Questions:


1. Do you agree that you would not necessarily run out of other-directed projects in a very long life? An infinitely long life?

2. Do you agree that you could still have self-directed projects in a very long or infinitely long life? Or would such a life necessarily be boring?

3. Do you agree that, even if you believe in an afterlife, you should be concerned about your behavior and motivations here and now? Or do you think that you should focus more on the life to come?



Book - "Mind, Modernity, Madness: The Impact of Culture on Human Experience" by Liah Greenfeld

http://ecx.images-amazon.com/images/I/41gs8XWQihL._SY344_BO1,204,203,200_.jpg

Allan Young reviews Liah Greenfeld's final installment in her Nationalism trilogy, Mind, Modernity, Madness: The Impact of Culture on Human Experience (2013) for the summer issue of The Hedgehog Review.

This appears to be a very interesting book - although Young doesn't feel there is too much true innovation here. Here is a passage from the review below:


Greenfeld argues that culture is simultaneously a source of madness and a source of self-medication that attenuates the severity of madness. As pathogenic forces strengthen, she writes, self-medication grows equally more desperate and socially disruptive in an era of globalization:
“Paradoxically, the rate of severe (clinical) mental disturbance should, in general, be proportional to the possibility of engaging in ideologically motivated collective activism; that is, the rate of disturbance should necessarily be highest in individualistic nations, and higher in collectivistic civic nations than in ones organized on the basis of ethnicity. The most aggressive and xenophobic strains of nationalism—the worst kind for international comity—would be the best for the mental health of individual citizens in states where such virulence held sway.”
Based on the brief review (I have not yet read the book), it seems like the intellectual heir of Wilhelm Reich's The Mass Psychology of Fascism. If so, it is an important book.

Mind, Modernity, Madness: The Impact of Culture on Human Experience

Liah Greenfeld
Cambridge, MA: Harvard University Press, 2013.

Reprinted from The Hedgehog Review; 16.2 (Summer 2014). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.



Liah Greenfeld, a professor of sociology at Boston University, describes Mind, Modernity, Madness as the product of “a new—radically different—approach that has never been tried.” At 688 pages, it is a long book that ranges in its “interdisciplinarity” from the clinical epidemiology of bipolar depression to the historiography of romantic love in Shakespeare. But it has a clear, bold thesis: that the advent of madness is connected, as both cause and effect, to the rise of nations and nationalism.

More specifically, Greenfeld contends, the historical conditions that gave rise to the nation—a community of equals; a measure of individual autonomy, liberty, and mobility; and a declining acknowledgment of divine authority—make madness not only possible but inevitable. As the value of human life grows and becomes of paramount concern, self-invention and romantic love become popular ideals, and even common people are driven by ambition, aspiration, and the pursuit of happiness. “Modern culture,” Greenfeld writes, “leaves us free to decide what to be and to make ourselves. It is this cultural laxity that is anomie—the inability of a culture to provide the individuals within it with consistent guidance.”

The author’s evidence is historical and biographical. Her conceptual framework is sociological, inspired by Émile Durkheim’s 1897 book Suicide. Indeed, Greenfeld’s vision of modernity restates and broadens Durkheim’s view that social disintegration produces the anomie and alienation that can lead to self-destructive behavior and acts, including the taking of one’s own life. While Durkheim adduced higher rates of suicide in the anomic nations of later-nineteenth-century Protestant Europe, Greenfeld focuses on the worldwide epidemiology of schizophrenia, bipolar disorder, and major depression, which she regards as an overwhelming threat to Western civilization.

Greenfeld rejects the “constructivist” approach that she believes is prevalent “among Western social scientists—anthropologists, sociologists, and historians studying psychiatry—who conclude that madness is largely an invented problem… analogous to the equally false ‘social constructions’ of witchcraft and possession of other cultures, but dressed in a scientific garb and unjustifiably enjoying the authority of science in ours.” She derides the “poetic” excesses of Michel Foucault’s Madness and Civilization, but ignores constructivist conceptualizations of greater consequence, notably “idioms of distress” and “bio-looping,” which circulate freely within clinical psychiatry and can be found in the Diagnostic and Statistical Manual of Mental Disorders in appendixes on somatization syndromes and cultural psychiatry.

When Greenfeld accuses anthropologists of creating “false social constructions” of witchcraft and possession, and of repeating this mistake with Western madness, she means that anthropologists have misconstrued how culture, mind, consciousness, brain, and madness are connected. Culture, in her view, is an ideational, symbolic, non-material phenomenon. Human consciousness is an emergent phenomenon, “logically consistent with the biological and physical laws but autonomous,” and irreducible to organic reality. The human mind comprises or contains a form of collective consciousness she calls “culture in the brain.”

These conceptions, Greenfeld says, run counter to the dominant “dual mind/body view of reality,” which attributes causal primacy to the “material” (the central nervous system) over the “spiritual” (consciousness, mind, culture). Culture and consciousness, in this paradigm, are epiphenomena of the material world: causation proceeds from brain to mind via identifiable mechanisms. In other words, culture can disguise the material nature of madness but cannot interfere with it. Constructionists are said to share these conventions.

Greenfeld emphatically rejects the dualist paradigm, contending that culture can and does cause biologically real (material) diseases, including madness. She believes that her thesis, being both counterintuitive and empirically proven, has revolutionary implications for how we understand and address the increasing prevalence of madness in our current era and culture. Furthermore, she believes that her thesis enables her to advance additional, counterintuitive claims concerning the historical origins and epidemiology of madness.

Yet it is unclear to me whether Greenfeld’s thesis is truly revolutionary. The difficulty comes in her proposal that culture causes biologically real diseases. There are two ways to interpret this claim. She could mean that “culture in the brain” is a source of distressful dilemmas, contradictions, and emotions that precipitate chains of physiological, molecular, neurological, and anatomical effects; that these changes in turn undermine the homeostasis underpinning normal functioning; and that, as a result, a pathogenic loop is created and sustained. This interpretation seems consistent with the process Greenfeld is proposing, and it is consistent with what she says about clinical psychiatry and research. This is a credible thesis, but it is far from being counterintuitive or revolutionary. Indeed, it is the prevailing approach among anthropologists and other social scientists interested in mind, brain, and psychopathology.

But Greenfeld may have something far more original in mind: “So long as there remains the unresolved philosophical mind-body problem, no significant advance in human neuroscience and, therefore, psychiatry would be possible.… The first order of the business is, therefore, to escape the mind-body quagmire.” If, in this book, she has found a way out of this 400-year-old problem, however, it is not at all obvious to this reader what it is.

Such matters occupy only two chapters. The remaining 500 pages are devoted to “madness.” According to Greenfeld, the term was coined in England in the early modern period (fifteenth and sixteenth centuries), then spread to France and Germany. Between 1880 and 1900, “madness” bracketed the maladies we know today as schizophrenia and bipolar disorder. (In psychiatry, there is currently renewed interest in reconnecting the two disorders on a single diagnostic spectrum.) The author further describes the history of the term in separate chapters on Europe and America.

In a brief but provocative epilogue, Greenfeld argues that culture is simultaneously a source of madness and a source of self-medication that attenuates the severity of madness. As pathogenic forces strengthen, she writes, self-medication grows equally more desperate and socially disruptive in an era of globalization:

“Paradoxically, the rate of severe (clinical) mental disturbance should, in general, be proportional to the possibility of engaging in ideologically motivated collective activism; that is, the rate of disturbance should necessarily be highest in individualistic nations, and higher in collectivistic civic nations than in ones organized on the basis of ethnicity. The most aggressive and xenophobic strains of nationalism—the worst kind for international comity—would be the best for the mental health of individual citizens in states where such virulence held sway.”

Mind, Modernity, Madness is the final volume in Greenfeld’s trilogy on nationalism. It provides readers with a provocative commentary on the sociocultural origins and psychopathological consequences of modernity. And it is a splendid antidote to the reckless application of the term “madness,” by both pundits and politicians, to the policies and persons of America's political opponents and the excesses of their nationalisms.


~ Allan Young, professor of anthropology at McGill University and author of The Harmony of Illusions: Inventing Posttraumatic Stress Disorder (1995), is completing a book on the social brain, psychopathology, and myths of empathy.

Friday, July 25, 2014

Astrocyte Pathology in the Prefrontal Cortex as a New Cause of Cognitive Dysfunction?


New research indicates that astrocytes may play a role in cognitive dysfunction (at least in rats). Here is a brief description of astrocytes from Wikipedia:
Astrocytes, also known collectively as astroglia, are characteristic star-shaped glial cells in the brain and spinal cord. They are the most abundant cell of the human brain. They perform many functions, including biochemical support of endothelial cells that form the blood–brain barrier, provision of nutrients to the nervous tissue, maintenance of extracellular ion balance, and a role in the repair and scarring process of the brain and spinal cord following traumatic injuries.
Again, from Wikipedia, here is a list of the functions astrocytes perform in the brain:
Previously in medical science, the neuronal network was considered the only important one, and astrocytes were looked upon as gap fillers. More recently, the function of astrocytes has been reconsidered,[3] and they are now thought to play a number of active roles in the brain, including the secretion or absorption of neural transmitters and maintenance of the blood–brain barrier.[4] Following on this idea, the concept of a "tripartite synapse" has been proposed, referring to the tight relationship occurring at synapses among a presynaptic element, a postsynaptic element and a glial element.[5]
  • Structural: They are involved in the physical structuring of the brain. Astrocytes get their name because they are "star-shaped". They are the most abundant glial cells in the brain that are closely associated with neuronal synapses. They regulate the transmission of electrical impulses within the brain.
  • Glycogen fuel reserve buffer: Astrocytes contain glycogen and are capable of glycogenesis. The astrocytes next to neurons in the frontal cortex and hippocampus store and release glycogen. Thus, astrocytes can fuel neurons with glucose during periods of high glucose consumption and glucose shortage. Recent research suggests there may be a connection between this activity and exercise.[6]
  • Metabolic support: They provide neurons with nutrients such as lactate.
  • Blood–brain barrier: The astrocyte end-feet encircling endothelial cells were thought to aid in the maintenance of the blood–brain barrier, but recent research indicates that they do not play a substantial role; instead, it is the tight junctions and basal lamina of the cerebral endothelial cells that play the most substantial role in maintaining the barrier.[7] However, it has recently been shown that astrocyte activity is linked to blood flow in the brain, and that this is what is actually being measured in fMRI.[8][9]
  • Transmitter uptake and release: Astrocytes express plasma membrane transporters such as glutamate transporters for several neurotransmitters, including glutamate, ATP, and GABA. More recently, astrocytes were shown to release glutamate or ATP in a vesicular, Ca2+-dependent manner.[10] (This has been disputed for hippocampal astrocytes.)[11]
  • Regulation of ion concentration in the extracellular space: Astrocytes express potassium channels at a high density. When neurons are active, they release potassium, increasing the local extracellular concentration. Because astrocytes are highly permeable to potassium, they rapidly clear the excess accumulation in the extracellular space.[12] If this function is interfered with, the extracellular concentration of potassium will rise, leading to neuronal depolarization by the Goldman equation (reproduced for reference after this list). Abnormal accumulation of extracellular potassium is well known to result in epileptic neuronal activity.[13]
  • Modulation of synaptic transmission: In the supraoptic nucleus of the hypothalamus, rapid changes in astrocyte morphology have been shown to affect heterosynaptic transmission between neurons.[14] In the hippocampus, astrocytes suppress synaptic transmission by releasing ATP, which is hydrolyzed by ectonucleotidases to yield adenosine. Adenosine acts on neuronal adenosine receptors to inhibit synaptic transmission, thereby increasing the dynamic range available for LTP.[15]
  • Vasomodulation: Astrocytes may serve as intermediaries in neuronal regulation of blood flow.[16]
  • Promotion of the myelinating activity of oligodendrocytes: Electrical activity in neurons causes them to release ATP, which serves as an important stimulus for myelin to form. However, the ATP does not act directly on oligodendrocytes. Instead, it causes astrocytes to secrete the cytokine leukemia inhibitory factor (LIF), a regulatory protein that promotes the myelinating activity of oligodendrocytes. This suggests that astrocytes have an executive-coordinating role in the brain.[17]
  • Nervous system repair: Upon injury to nerve cells within the central nervous system, astrocytes fill up the space to form a glial scar, repairing the area and replacing the CNS cells that cannot regenerate.
  • Long-term potentiation: Scientists debate whether astrocytes integrate learning and memory in the hippocampus. It is known that glial cells are included in neuronal synapses, but many of the LTP studies are performed on slices, so scientists disagree on whether or not astrocytes have a direct role in modulating synaptic plasticity.
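For reference, here is the standard form of the Goldman equation (the Goldman-Hodgkin-Katz voltage equation) mentioned in the list above; the P terms are the membrane's permeabilities to each ion, and the bracketed terms are ion concentrations outside and inside the cell:

$$ V_m = \frac{RT}{F}\,\ln\!\left(\frac{P_{K}[K^+]_{out} + P_{Na}[Na^+]_{out} + P_{Cl}[Cl^-]_{in}}{P_{K}[K^+]_{in} + P_{Na}[Na^+]_{in} + P_{Cl}[Cl^-]_{out}}\right) $$

Because the resting potential is dominated by the potassium terms, a rise in extracellular K+ pushes the ratio inside the logarithm toward 1 and makes V_m less negative; that is the depolarization the excerpt refers to, and it is exactly what astrocytic potassium clearance normally prevents.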
This is an important addition to previous research demonstrating the role of astrocytes in developmental disorders:
  • Barker, AJ, Ullian, EM. (2008). New roles for astrocytes in developing synaptic circuits. Communicative & integrative biology; 1(2): 207–11. PMID 19513261
  • Sloan, SA, Barres, BA. (Mar 29, 2014). Mechanisms of astrocyte development and their contributions to neurodevelopmental disorders. Current opinion in neurobiology; 27C: 75–81. PMID 24694749.
Here is a summary of the new research, followed by the abstract for the article, which is, of course, hidden behind a paywall.

A new cause of mental disease?

Thursday 24 July 2014
Researched and Written by Catarina Amorim

Astrocytes, the cells that make up the background of the brain and support neurons, might be behind mental disorders such as depression and schizophrenia, according to new research by a Portuguese team from the ICVS at the University of Minho. The study, in Molecular Psychiatry, shows how a simple reduction of astrocytes in the prefrontal cortex (a region linked to cognition) can kill its neurons and lead to the cognitive deficits that characterise several mental diseases. Although malfunctioning astrocytes have been found in psychiatric patients before, it was not clear whether they were a cause or a consequence of the disease.

"This is the first time that cognitive deficits of a psychiatric illness can be mimicked by solely affecting astrocytes" - says the team leader, João Filipe Oliveira - "opening a whole new range of possibilities, both on the causes and potential treatments for these disorders." The research by Ana Raquel Lima, João Filipe Oliveira and colleagues is particularly significant when we look at the heavy burden in human suffering and financial cost of mental diseases. In the US and Europe about 1 in 4 adults are affected in every given year (this is about 26% of the populations), while depression alone uses almost 5% of the total world health budget. And a new player behind a disease offers also potential new and maybe more effective treatments.

So what are astrocytes? These star-shaped cells are part of the so-called "glial population", non-neuronal cells that form the brain's background and that for a long time were considered mere "housekeepers" for the real players, the neurons. Traditionally, brain function has been seen as the result of electrical impulses passing between neurons, transmitting the information necessary for all the extraordinary abilities of this brain of ours, from memory storage and motor control to personality quirks.

But astrocytes, even if believed to be "the help", have always been the subject of much curiosity, since it was claimed by some (and denied by others) that one of the few unique features of Einstein's brain was that its cerebral cortex contained larger and more complex astrocytes than those of "normal" individuals. Equally curious was the fact that these are the most numerous cells in the mammalian brain: keeping cells alive costs energy, which is always in short supply, and yet astrocytes were not even thought to be part of the main action of brain activity. Or so it was thought.

In fact, the last decade has seen our ideas about astrocytes (and glial cells in general) change radically; we now know they perform highly complex jobs, including several previously associated with neurons. They are, for example, important at synapses (the specialised structures where different neurons make contact and through which the electrical signal is transmitted), where astrocytes detect and modulate activity, effectively controlling the transmission of information in the brain.

Supporting their importance in the brain, several studies have shown that patients with mental diseases - such as depression, bipolar disorder and schizophrenia - have lower than normal astrocyte density in the brain, especially in the prefrontal cortex. This can be improved, though, with anti-psychotic drugs.

This not only supports the importance of astrocytes in normal brain function, but also suggests they could play a role in mental disorders. And in fact, in one study, killing astrocytes in the prefrontal cortex of rats seemed to cause depression-like behaviour. But even though faulty astrocytes and mental diseases were often seen together, it was not possible to be sure, at least in psychiatric patients, that these cells were behind the disorder.

It is against this background that Lima and colleagues, in the work now published, designed a simple but very effective experiment to understand what was happening.

They started by injecting the prefrontal cortex of rats with a toxin that specifically kills astrocytes in a very localized way, and then tested the animals' cognitive abilities, correlating these with changes in the animals' brain structure. The prefrontal cortex was chosen because it controls cognitive abilities such as planning, reasoning and problem solving, which are affected not only in the most common mental diseases but also in age-related neurodegenerative illnesses like Alzheimer's.

As expected, toxin-injected animals developed the cognitive deficits typical of mental disorders in which the prefrontal cortex is affected. But what was really interesting were the brain changes found: not only had the prefrontal cortex's astrocytes died from the toxin (as expected) but, as time passed, so did its neurons. Control animals injected with a toxin-free solution showed no changes, either in behaviour or in brain structure.

So even though faulty astrocytes have been found before in mental patients, the Portuguese researchers' results give robust support to the idea that astrocyte breakdown can be a primary cause of these disorders (and not a result of them), and also suggest how this occurs. "Until now, we have blamed the poorer performance of the prefrontal cortex in these diseases on the surrounding astrocyte pathology," says Oliveira, "but this study now supports the view that astrocytes, targeted in a pathological process, may actually lead to neurodegeneration in a specific brain region. Psychiatric disease can be mimicked by simply affecting astrocytes!"

This is a totally new perspective on how these diseases can develop, and consequently on how to treat them. For now, until other brain areas are tested, Oliveira's results are especially relevant for disorders such as depression, schizophrenia and bipolar disorder, which are known to involve both a loss of cognitive function and abnormalities in the astrocytes of the prefrontal cortex.

But Oliveira and his team's findings are also important in challenging the still too prevalent view of the brain as a simple network of neurons, clearly showing that we need to see it instead as an interdependent circuit of neurons and glial cells (in particular astrocytes), both in health and disease.

The new work also makes us wonder whether the claims about the importance of the astrocytes in Einstein's brain were so crazy after all...

* * * * *

Full Citation:
Lima, A, Sardinha, VM, Oliveira, AF, Reis, M, Mota, C, Silva, MA, Marques, F, Cerqueira, JJ, Pinto, L, Sousa, N and Oliveira, JF. (2014, Jul). Astrocyte pathology in the prefrontal cortex impairs the cognitive function of rats. Molecular Psychiatry; 19, 834-841. doi:10.1038/mp.2013.182

Astrocyte pathology in the prefrontal cortex impairs the cognitive function of rats

A Lima, V M Sardinha, A F Oliveira, M Reis, C Mota, M A Silva, F Marques, J J Cerqueira, L Pinto, N Sousa and J F Oliveira

Abstract

Interest in astroglial cells is rising due to recent findings supporting dynamic neuron–astrocyte interactions. There is increasing evidence of astrocytic dysfunction in several brain disorders such as depression, schizophrenia or bipolar disorder; importantly these pathologies are characterized by the involvement of the prefrontal cortex and by significant cognitive impairments. Here, to model astrocyte pathology, we injected animals with the astrocyte specific toxin L-α-aminoadipate (L-AA) in the medial prefrontal cortex (mPFC); a behavioral and structural characterization two and six days after the injection was performed. Behavioral data shows that the astrocyte pathology in the mPFC affects the attentional set-shifting, the working memory and the reversal learning functions. Histological analysis of brain sections of the L-AA-injected animals revealed a pronounced loss of astrocytes in the targeted region. Interestingly, analysis of neurons in the lesion sites showed a progressive neuronal loss that was accompanied with dendritic atrophy in the surviving neurons. These results suggest that the L-AA-induced astrocytic loss in the mPFC triggers subsequent neuronal damage leading to cognitive impairment in tasks depending on the integrity of this brain region. These findings are of relevance to better understand the pathophysiological mechanisms underlying disorders that involve astrocytic loss/dysfunction in the PFC.

Sharon K. Farber - Cults and the Mind-Body Connection


We tend to think of cults as exerting mind control over their members, a kind of psychological enslavement, and this is partially true. However, as Dr. Farber argues, the way to obtain control of the mind is through the senses - music, incense, chanting, drumming, dancing, touch. Mind control, as she shows below, can often begin with the body.


This comes from Sharon K. Farber, PhD and her Psychology Today blog, The Mind-Body Connection. The one place where I think she is off the rails is on meditation - her experience with her brother has seriously biased her objectivity.

Cults and the Mind-Body Connection

A Form of Soul Murder

Published on July 19, 2014 by Sharon K. Farber, Ph.D. in The Mind-Body Connection

I just returned from the International Cultic Studies Association's (ICSA) annual conference and wanted to tell you about soul murder, the term coined by psychoanalyst Leonard Shengold to describe the intentional attempt to stamp out or compromise the separate identity of another person. That is what destructive cults do. My interest in cults grew out of my shock many years ago upon discovering how deeply my brother was involved in Transcendental Meditation (TM), so deeply that he lost the ability to think for himself. Since then I have treated a number of patients who had been profoundly damaged by cults, and when I heard about ICSA some years ago, I joined and have since attended and presented on topics related to cult involvement, something that the mental health field tends to know little about. At these conferences I have met so many intelligent people who have been victims of cults, their families, researchers, and mental health professionals in the U.S. and abroad with expertise in treating those who have been involved in a cult. I also met what are today known as exit counselors, usually former cult members who left their cult themselves, and who will talk with individuals who agree to speak with them. (No kidnapping, as in the deprogramming of years ago.) ICSA even has a number of members who were born into a cult. Having high intelligence is no protection from becoming victimized by a cult.

My practice is two or three miles from Irvington and Tarrytown, in Westchester County, where the Rev. Sun Myung Moon, founder of the Unification Church (aka the Moonies) owned hundreds of acres. At least a few times a week I pass by Belvedere, a large estate where many followers live. Years ago, I imagined myself infiltrating the group but after reading accounts of how people can succumb to mind control, I decided against it. At the conference, I met a young man who discovered when he was around twelve that the Rev. Moon was his father.

The ICSA is a global network of people concerned about psychological manipulation and abuse in cultic or high-demand groups, alternative movements, and other environments. ICSA is not affiliated with any religious or commercial organizations. Its mission, as stated at its website icsahome.org, is to apply research and professional perspectives to help those who have been spiritually abused or otherwise harmed by psychological manipulation and high-demand groups, to educate the public, to promote and conduct research, and to support helping professionals interested in cults, related groups, and psychological manipulation. Hearing parents speak at the conference of their anguish after losing a child to a cult was heart-wrenching. And yet they gained some comfort and lessened their feeling of isolation by sharing their stories with others who understood.

Cult activity is far more common than you might imagine. Our attention was drawn to cults in the sixties and seventies, when Allen Ginsberg said that life should be ecstasy and went to India and Hindu culture in search of it. Many young people followed suit, questioning western values and embracing eastern thought. Indian clothing and Hindu practices became the rage, and we became accustomed to seeing young men in orange robes chanting their Hare Krishna mantra in airports and bus stations, where they sought recruits.
hare kṛiṣhṇa hare kṛiṣhṇa
kṛiṣhṇa kṛiṣhṇa hare hare
hare rāma hare rāma
rāma rāma hare hare
Their heads were shaved except for a small lock of hair in the back, and they had paint marks on their foreheads. These representatives of the International Society for Krishna Consciousness (ISKCON) became known as the Hare Krishnas. If anyone had told me years ago that I would develop a friendship with a man who had been deeply involved with this cult, I would not have believed it.

The first generation of cult members were young people who left home and school looking for meaning at a vulnerable period in their life. They were seduced into thinking they had found what they were looking for in such groups as the Unification Church, Children of God, The International Society for Krishna Consciousness, Scientology and others. As their numbers increased, the different groups and practices began to blur in the public eye.

When we think of cults today, we tend to think of the Hare Krishnas or other eastern meditation groups. But many Christian cults have evolved, such as the Jesus freaks, Children of God, The Way International, the Unification Church, and the Mormon Church. And today cults are not limited to religious groups but include EST, Scientology, yoga cults, psychotherapy cults, and philosophy cults such as Aesthetic Realism.

Just what is a cult? The word itself is controversial, because it used to be used to mean any religious group with unusual beliefs that deviated from the norm, what we might today consider a sect. Today, the term destructive cult is used to describe groups that use manipulative techniques and mind control to heighten suggestibility and subservience. They tend to isolate recruits from former friends and family in order to promote total dependence on the group. The aim is to advance the goals of the group’s leaders, which is to have total control over members.

Gaining total control of members is done by assaulting the minds of recruits. The mind is located in the brain and in certain hormones and enzymes that travel through the body, affecting our senses. It is through the senses, through seeing, hearing, tasting, smelling and touching, that we know about the world. Think of the body as a giant pharmaceutical factory that manufactures powerful, mind-altering chemicals which we can release by immersing ourselves in mood-altering activities or ingesting mood-altering substances. The medieval Christian mystics who starved and flagellated themselves knew this well. So do Turkey's Whirling Dervishes who, once a year, put on long white robes with full skirts, black cloaks, and tall conical red hats and twirl in unison to the sound of drums and flutes, faster and faster, whirling their way toward God and ecstasy.

Cults start seducing people with love-bombing, paying a great deal of attention to and being very affectionate with potential recruits, a very effective way of connecting with someone who is feeling lonely and isolated. Then they assault and overwhelm their senses, using various techniques to induce a dissociated state, an altered state of consciousness, a trance state in which mind and body are disconnected from each other. These techniques include sleep and food deprivation, drumming, chanting, lecturing on and on for hours, flashing lights, and spinning around in circles, all of which assault the senses and break down a person's ability to think. The cult uses mind control to fill the dissociated mind with its beliefs and magical thinking. A moment comes when the mind shuts down and seems to snap from this assault on the nervous system. Snapping may happen suddenly and abruptly, or it may be a slower, more gradual process of subtle changes, resulting in personality change.

Many cults promote meditation, at times for many hours a day. When TM first came on the scene in the sixties, most people thought of it as a benign practice of twenty minutes in the morning and evening, but many advanced TMers devote many hours a day to meditating. In fact, they may go into the dissociated meditational state without intending to do so, and may live largely in a dissociated state of consciousness.

Meditation is generally promoted as having many health benefits, and mindfulness meditation has been actively promoted in the past two decades. It is a western, non-sectarian, research-based form of meditation derived from a 2,500-year-old Buddhist practice called Vipassana. However, it is important to know that meditation of any kind is not for everyone. There are several studies indicating that as many as 55% of long-term meditators have shown adverse effects, including partial epileptic-type seizures, with adverse effects increasing with the length of practice. Meditation can produce anxiety, panic, confusion, depression, agitation, ongoing dissociation, hallucinations, tics, sweating, trembling, shivering, worsened interpersonal relations, psychotic breakdowns and suicidal tendencies in some people. Meditation is particularly dangerous for those with a history of schizophrenia.

The TM movement is known for ascribing positive qualities to all kinds of cult-induced psychopathology. A psychotic breakdown may be regarded as achieving cosmic consciousness, the key to enlightenment. TMers are indoctrinated to believe that if they spend thousands of dollars for a higher level of training, they would be able to levitate, also known as yogic flying. David Wants to Fly, a recently released film about a young man's interest in levitation, has scenes of levitation, some of which can be viewed on Youtube.com. What you see is not people flying, but people sitting in the lotus position within a dome-shaped structure known as a levitation dome, on a thick layer of foam rubber padding, bouncing and hopping around on their behinds. I have also heard that they strap on foam rubber "butt pads," which happen to be the number one bestselling accessory at Maharishi International University in Fairfield, Iowa. Apparently, they help someone bouncing around on his behind to bounce higher.

Radiance, the TM Ideal community where my brother lives, has, in addition to a community swimming pool, its own levitation dome. My brother said he banged into a wall while levitating and broke his good watch, which made him decide to switch to a cheaper Timex. I tried it myself. I just bounced around. You too can bounce around on your behind without spending lots of money to learn how to do it. You don't need to meditate. Just don't call it levitation. Although TM purported that members could levitate, they never allowed photographers or film makers to witness it, and for good reason.

The cult preys upon the tendency of many to rely on magical thinking, which reinforces the tendency to endow the leader with omnipotent and magical powers, much like the child's early mental representations of the parent, who at that time did control his universe. The member can readily come to believe that the leader can read his mind or hear conversations at a distance. Slowly, greater and greater irrational power is attributed to the leader. Because the cult leader tends to be a person with a sense of self-esteem so damaged that he requires the adoration, obedience, and subjugation of others to gain a sense of self-esteem and power, he cannot get enough of this. This is very much the same dynamic found in cases of domestic violence, when one spouse, usually the husband, tries to assert total control over the other, in effect a cult of one.

Some people in cults who cannot verbally express what they feel about what has been done to them express it through their bodies, harming themselves by cutting and burning themselves, starving their bodies or stuffing themselves with food when they can get their hands on it, or purging through vomiting. When they get this sick, the cult will eject them, because it feels no responsibility for getting them the help they need. It should not be a surprise to hear that many cults are openly against psychotherapy.

I hope this helps you understand a bit more about soul murder. The victims of soul murder remain in large part possessed by another, their souls in bondage to another. Shengold cites George Orwell’s 1984, in which O’Brien says to Winston Smith: “You will be hollow. We will squeeze you empty, and then we shall fill you with ourselves... Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.”

There is help available to those who have been victimized by a cult.

There is a chapter about cults, “Cult-Induced Ecstasies and Psychosis,” in my new book, Hungry for Ecstasy: Trauma, the Brain, and the Influence of the Sixties (2013). It is an expensive book but is available at a 30% discount if you email me at Sharonkfarber@gmail.com. Or you can request that your library obtain a copy.

Also, the ICSA website (icsahome.org) has much valuable information. If you join, you will receive their magazine and the Cultic Studies Journal, among other benefits.

In New York, there is the Cult Hotline and Clinic at the Jewish Board of Family and Children’s Services, telephone (212) 632-4640; Arnold Markowitz is the director. There are local ICSA support meetings in NYC, Philadelphia, and Boston. Bill and Lorna Goldberg, both licensed clinical social workers, run a free monthly support group in Englewood, NJ, for people whose lives have been affected by cults, both victims and families. See http://www.blgoldberg.com/

Or call (201) 894-8515. There is also help available in other areas of the country and in Europe.

Thursday, July 24, 2014

Robert Sapolsky - Dude, Where’s My Frontal Cortex?

An excellent new article from Robert Sapolsky published at the always interesting Nautilus. Sapolsky offers some insights into the seemingly incomprehensible functioning of the teenage brain.

Dude, Where’s My Frontal Cortex?

There’s a method to the madness of the teenage brain.

By Robert Sapolsky | Illustration by John Hendrix
July 24, 2014

http://static.nautil.us/3707_2ad9e5e943e43cad612a7996c12a8796.png

IN THE FOOTHILLS of the Sierra Mountains, a few hours east of San Francisco, are the Moaning Caverns, a cave system that begins, after a narrow, twisting descent of 30-some feet, with an abrupt 180-foot drop. The Park Service has found ancient human skeletons at the bottom of the drop. Native Americans living there at the time didn’t make human sacrifices. Instead, these explorers took one step too far in the gloom. The skeletons belonged to adolescents.

No surprises there. After all, adolescence is the time of life when someone is most likely to join a cult, kill, be killed, invent an art form, help overthrow a dictator, ethnically cleanse a village, care for the needy, transform physics, adopt a hideous fashion style, commit to God, and be convinced that all the forces of history have converged to make this moment the most consequential ever, fraught with peril and promise.

For all this we can thank the teenage brain. Some have argued adolescence is a cultural construct. In traditional cultures, there is typically a single qualitative transition to puberty. After that, the individual is a young adult. Yet the progression from birth to adulthood is not smoothly linear. The teenage brain is unique. It’s not merely an adult brain that is half-cooked or a child’s brain left unrefrigerated for too long. Its distinctiveness arises from a key region, the frontal cortex, not being fully developed. This largely explains the turbulence of adolescence. It also reflects an important evolutionary pressure.

The frontal cortex is the most recently evolved part of the human brain. It’s where the sensible mature stuff happens: long-term planning, executive function, impulse control, and emotional regulation. It’s what makes you do the right thing when it’s the harder thing to do. But its neurons are not fully wired up until your mid-20s. Why?

It’s a central tenet of genetics that the genome that we start off life with, back when we were just a fertilized egg, is passed on to every subsequent cell in the body. But if the frontal cortex is the last part of the brain to fully mature, it’s the brain region least shaped by that genome and most sculpted by experience.

Our success as primates, and as individuals, revolves around social intelligence and adapting to the subtleties and idiosyncrasies of the environment. This is the portfolio of the frontal cortex. So if it’s going to develop the ability to do this right, it’s going to have to be profoundly shaped and informed by every smidgen of experience along the way.

Like evolution itself, though, maturation seldom follows a straight and narrow path. Adolescence reminds us that life has many rivers to cross, some fraught with turbulence.

AROUND THE ONSET of adolescence, the frontal cortex is the only brain region that has not reached adult levels of gray matter, made up of neuronal cell bodies. It would seem logical that gray matter levels would increase thereafter. But no, over the course of adolescence, frontal cortical gray matter volume decreases.


This occurs because of one of the cleverest things that brains ever evolved. During fetal development, mammalian brains generate far more neurons than are found in the adult brain. Why? Because the fetal brain holds a dramatic competition. Winning neurons get to migrate to the correct location and form the optimal number of connections with other neurons. Neurons that miss the cut undergo “programmed cell death.” Neuronal overproduction followed by competitive pruning (a process that has been termed “Neural Darwinism”) allows more complex and optimized neural circuitry, a wonderful example of less being more.

The same plays out in the adolescent frontal cortex. At the beginning of adolescence, gray matter volume is greater than it is in adults, and subsequently declines, as less optimally connected neurons are pruned away. Within the frontal cortex, it is the evolutionarily oldest sub-regions that mature first; the spanking new dorsolateral prefrontal cortex, for example, does not even begin to lose gray matter volume until the end of adolescence. This delayed frontal cortical maturation means that adolescents aren’t at adult levels of expertise at various cognitive tasks, like recognizing irony or Theory of Mind—the ability to operate with the knowledge that someone else has different information than you do.

In an adult, the frontal cortex steadies the activity of parts of the limbic system, a brain region involved in emotion; in contrast, in the teenage brain, the limbic system is already going at full speed, while the frontal cortex is still trying to make sense of the assembly instructions. One result of this imbalance is that emotions are more intense. Stick people in a brain scanner and show them pictures of faces expressing strong emotions. In the adult, there is activation of a prime limbic structure, the amygdala; shortly after, there is activation of frontal cortical regions, which damp the amygdaloid response: “OK, calm down, it’s just a picture of an angry/sad/happy/scared face, not the real thing.”

But in the teenager, the frontal cortical response is muted, and the amygdala’s response is augmented. That means emotional highs are higher, and the lows are lower. This is shown in studies of limbic pathways that release dopamine, a neurotransmitter central to anticipation of reward and pleasure (cocaine, for example, works on this limbic dopamine system). Average dopamine levels in adolescents and adults do not differ; what differs are patterns of dopamine release. Put an adult in a brain scanner, give them a small reward, and there’s a small degree of activation of this dopamine circuitry. Medium reward, medium activation; big reward, big activation. Now do the same with a teenager. Medium reward, medium activation, same as in the adult. With a big reward, though, dopamine signaling increases far more than in an adult. With a small reward, dopamine signaling actually decreases. A small reward feels like a punishment. This is a neurochemical gyroscope that’s way off kilter.
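The pattern in that paragraph boils down to a tiny lookup table: adult responses scale roughly with reward size, while adolescent responses overshoot for big rewards and dip below baseline for small ones. The Python sketch below is purely illustrative; the numbers are invented to show the shape of the pattern, not measurements from the studies described.

```python
# Toy illustration of the reward-size / dopamine-activation pattern described above.
# All numbers are made up for illustration; they are not data from any study.

def dopamine_activation(reward, brain="adult"):
    """Return a rough, hypothetical activation level for a given reward size.

    reward: "small", "medium", or "big"
    brain:  "adult" or "teen"
    """
    adult = {"small": 1.0, "medium": 2.0, "big": 3.0}   # roughly proportional to reward
    teen  = {"small": 0.5, "medium": 2.0, "big": 5.0}   # lows lower, highs higher
    table = adult if brain == "adult" else teen
    return table[reward]

if __name__ == "__main__":
    for reward in ("small", "medium", "big"):
        print(reward,
              "adult:", dopamine_activation(reward, "adult"),
              "teen:",  dopamine_activation(reward, "teen"))
```

The only point of the sketch is the shape: identical responses to a medium reward, an amplified response to a big one, and a drop for a small one.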

The delayed maturation of the frontal cortex also helps explain the defining feature of adolescence, namely the weird predilection for bungee jumping. During risky decision-making, adolescents show less activation of some key sub-regions of the frontal cortex than do adults, and among adolescents, the less activity in these regions, the poorer the risk assessment.

And adolescents are bad at risk assessment in a particular way, shown by Sarah-Jayne Blakemore of University College London. Ask test subjects to estimate the likelihood of some event happening to them and then tell them the actual likelihood of it happening. The feedback can constitute good news, as subjects learn that a good event is more likely to occur than they thought. Conversely, the feedback can constitute bad news, as subjects learn an event can happen more often than they expected. Both adults and teens are likely to adjust their assessments when asked again about the likelihood of good news. But teens alone don’t get the picture about bad news. Researcher: What do you think the odds are of your having a car accident if you’re driving drunk? Adolescent: One chance in a gazillion. Researcher: Actually, the risk for people in general is 50 percent; now what do you think your chances are? Adolescent: Hey, we’re talking about me; one chance in a gazillion. This helps explain why adolescents have two to four times the rate of pathological gambling that adults do.
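One way to picture the asymmetry in Blakemore’s finding is as belief updating with different learning rates for good news and bad news. In the hypothetical sketch below, the adult updates at the same rate in both directions while the teen barely updates on bad news; the rates and probabilities are invented for illustration only and are not fitted to any data.

```python
# Toy sketch of asymmetric belief updating (the "good news / bad news" effect).
# Learning rates and risk figures are illustrative assumptions, not real values.

def update_estimate(estimate, actual, rate_good, rate_bad):
    """Move an estimated risk toward the actual risk, with separate learning rates
    for good news (risk lower than feared) and bad news (risk higher than expected)."""
    error = actual - estimate
    rate = rate_bad if error > 0 else rate_good   # error > 0 means bad news
    return estimate + rate * error

if __name__ == "__main__":
    actual_risk = 0.50                      # the figure quoted to the subject
    adult_estimate = teen_estimate = 0.01   # "one chance in a gazillion"
    adult_estimate = update_estimate(adult_estimate, actual_risk, rate_good=0.6, rate_bad=0.6)
    teen_estimate  = update_estimate(teen_estimate,  actual_risk, rate_good=0.6, rate_bad=0.05)
    print(f"adult, after bad-news feedback: {adult_estimate:.2f}")  # moves a lot
    print(f"teen,  after bad-news feedback: {teen_estimate:.2f}")   # barely budges
```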

So adolescents are lousy at risk assessment and take more risks. But there’s more to the story of those skeletons in Moaning Caverns. It’s not the case that adolescents and adults have an equal desire to do the same dumb-ass thing, and the sole difference is that the fully mature frontal cortex in the latter prevents them from doing so. Adolescents feel the allure of jumping off things. Middle-aged adults just recklessly cheat on their diets. Adolescents not only take more risks, they seek more novelty.



Adolescence is the time of life when we develop our tastes in music, food, and fashion, with openness to novelty declining thereafter. We’re not alone. When are lab rats most willing to try a novel food? During rodential adolescence. Among many social mammals, the adolescents of one of the sexes leave their natal group, avoiding inbreeding. Consider impalas, who live in a “harem structure,” a group of related females and offspring with one breeding male (and where the rest of the males knock around disconsolately in “bachelor herds”). When a young male hits puberty, he is driven out by the resident breeding male.

It’s different among primates. Take baboons. Two troops encounter each other on opposite sides of a stream. The males hoot and holler at each other in a manly fashion until they get bored and go back to foraging, ignoring the interlopers. An adolescent stands on the edge of the stream, riveted. “New baboons, a whole bunch of ’em!” Nervous and agitated, he runs five steps toward them, runs back four. He tentatively crosses the stream, sits on the very edge of the other bank, scampers back as soon as anyone so much as glances at him.

The next day he stays on the other side for an hour. Then an afternoon. Then a night, breaking the umbilical cord. He wasn’t pushed out of his home troop, as with impalas. He had reached the point where if he had to spend one more day with the same baboons he’d known his whole life, he was going to scream. The same thing happens with adolescent female chimps, who can’t get off the farm fast enough, heading off into the great unknown in the next valley. We primates don’t get driven out of our herds at adolescence. We get a mad craving for novelty.

These traits are exacerbated when adolescents are around peers. In one study, Laurence Steinberg of Temple University discovered that adolescents and adults, when left on their own, don’t differ in the risks they take in a driving simulator. Add peers egging them on and rates don’t budge in adults but become significantly higher in teens. When the study is carried out in a brain scanner, the presence of peers (egging them on by intercom) lessens frontal cortical activity and enhances activity in the limbic dopamine system in adolescents, but not in adults.

This teenage vulnerability to peer pressure is worsened by the fact that such pressure rarely takes the form of hesitant adolescents coerced into joining in the fun of committing random acts of kindness. Instead, pressure disproportionately takes the form of “deviance training,” increasing the likelihood of risky sexual behavior, poor health habits, substance abuse, and violence. As has been said, the greatest crime-fighting tool available to society is a 30th birthday.

One brain-imaging study reveals the neural depths of adolescent pain in not belonging. Put someone in a scanner to play a video game with two other individuals, and manipulate things so that the subject believes they are being ostracized. In adults, this social exclusion activates the amygdala along with other limbic regions associated with pain, disgust, anger, and sadness. But then the frontal cortex kicks in—“Come on, it’s a stupid game”—and the limbic structures quiet down. Do the same with an adolescent and the frontal cortex remains silent and that agonized limbic network of teenage angst wails.

The slowpoke frontal cortex is not the only explanation for teen behavior. Another factor comes into play that keeps that teen brain off balance, namely gonadal hormones like estrogen and progesterone in females, and testosterone in males. This helps explain why adolescence is more turbulent than childhood—the frontal cortex is immature at both ages, but the tsunamis of hormones haven’t started in pre-adolescents. Hormones have numerous effects on the function of both the limbic system and frontal cortex. Testosterone decreases the ability of the frontal cortex to communicate with and rein in the amygdala. Not surprisingly, landmarks of adolescent maturation in brain and behavior are less related to chronological age than to time since puberty.

The onset of puberty is not merely about the sudden onslaught of gonadal hormones. The defining feature of ovarian endocrine function is, of course, the oscillations of hormone release. In adolescent females, puberty does not arrive in full flower, so to speak, with one’s first period. Instead, for the first few years, only about half of cycles actually involve ovulation and the accompanying hormonal surges. So not only are there major fluctuations in gonadal hormone levels due to ovulation, but also higher-order fluctuations as to whether ovulation occurs. And fluctuations in hormones affect emotion and cognition. (While an adolescent male does not have the hormonal fluctuations of his female counterparts, it can’t be a great thing that his frontal cortex probably keeps getting hypoxic from all that priapic blood flow to his crotch.)

As adolescence dawns, the frontal cortex’s efficiency is diluted with superfluous connections failing to make the grade. The limbic system is fully online and dopamine is careening all over the place. Meanwhile, the brain is being marinated in the ebb and flow of gonadal hormones. No wonder the human species produces beings like Justin Bieber and Miley Cyrus, making a handy living catering to their constituency.



Party in the Brain: In adolescence, the frontal cortex is diluted with superfluous connections, dopamine is careening all over the place, and the brain is being marinated in the ebb and flow of hormones. That should explain Miley Cyrus. Credit Getty Images for MTV

BUT ADOLESCENCE isn’t always as dark as it’s made out to be. There’s a feature of adolescence that makes up for the stupid risk-taking and hideous fashion decisions. And that’s an adolescent’s frenzied, agitated, incandescent ability to feel someone else’s pain, to feel the pains of the entire world, to want to right all its wrongs. Adolescents are nature’s most wondrous example of empathy, where the forcefulness of feeling as the other can border on nearly being the other.

This intensity is at the intersection of so many facets of adolescence. With the highs higher and lows lower, the empathic pain scalds and the glow of having done the right thing makes it seem plausible that we are here for a purpose. Another factor is the openness to novelty. An open mind is a prerequisite for an open heart, and the adolescent hunger for the new readily presents opportunities to walk a mile in someone else’s shoes. And there is the egoism of adolescence. There was a period during my late adolescence when I hung out with Quakers. They’d often say, “All God has is thee.” This is a God of limited means, not just a God who needs the help of humans to right a wrong, but one who needs your help most of all. Egoism is tailor-made for adolescents. Throw in inexhaustible energy and the sense of omnipotence and it seems possible to make the world whole.

A few years ago, I saw a magnificent example of the empathy that a sluggish frontal cortex can produce in teenagers. My daughter is seriously into theater and at the time she was in a production of a superb, searing play about the Bosnian genocide, Stefanie Zadravec’s Honey Brown Eyes. She played a 12-year-old Bosnian girl for whom things don’t go so great, and whose life-or-death fate is ambiguous as the play ends.

Some high school kids had come to a performance as a group outing for an English class. About halfway through the play, my daughter’s character appears for the first time, cautiously emerging from a ventilation duct in her kitchen where she’d been hiding, unaware that the soldier who had just left the apartment after killing her mother was going to return. Up until that point, she had only been hinted at as a character. The soldier had his ethnic-cleansing to-do list of names of Bosnians in the building to kill, and kept demanding of the mother, “Where’s your daughter? It says you have a daughter.” “I don’t have a daughter,” the mother repeated up until her death. So as the girl begins to emerge from the ventilation duct, the realization sweeps through the audience: there is a daughter. As my daughter began to crawl out, the teenagers in the audience did something you’re not supposed to do in a theater, something no adult with a developed frontal cortex would do. After a moment of hushed silence, two or three voices called out, “No!” Another called, “Go back in, it’s not safe!,” another, “He’s coming back!” After the play, the teenagers clustered around my little girl when she came out of the stage door, hugging her, reassuring themselves that both she and her character were OK.

This is the picture of adolescents with their hearts on their sleeves, limbic systems going full blast, and their frontal cortices straining to catch up with some emotional self-regulation. When I see the best of my university students in that agitated, optimistic state, I always have the same thought: It used to be so much easier to be like this. Having this adult frontal cortex of mine probably enables me to do good in a more efficacious, detached way. The trouble, though, is the same detachment makes it so much easier to decide that it’s really not my problem.

SO WHAT is the adaptive advantage of human brain development evolving this way? Potentially, there is no advantage. Perhaps human frontal cortical maturation is delayed because it is the most challenging construction project the brain undertakes. In this view, wiring up something like the visual cortex can pretty much be wrapped up in the first year of life, but doing the same in the frontal cortex takes another quarter century. This seems unlikely. The frontal cortex has the same basic structure as the rest of the cortex and uses the same neurotransmitters and types of neurons. On a nuts-and-bolts level, its maturation is slower than necessary, suggesting that there has indeed been active selection for the delay, that there is something advantageous about it.

One possibility, of course, is the adolescent turbulence. If the frontal cortex wired up at the same speed as the rest of the brain, there’d be no Sturm und Drang, no emotional fireworks, no psychic acne. Just a simple transition from kids to fertile adults around age 12. One can imagine that something would then be lost—that developmental phase of antsy, itchy exploration and creativity that has been evolutionarily enriching. We might not have had that long line of pimply adolescent geniuses who worked away to invent fire, cave-painting, and the wheel.

Maybe. But this just-so story has to accommodate the fact that behavior doesn’t evolve for the good of the species, but for passing on copies of genes of individuals. And throughout history, for every teenager who scored big-time in the reproduction department thanks to some adolescent inventiveness, there’ve been far more who broke their necks from some adolescent imprudence.

No, I think that the genetic program of brain development has evolved to help free the frontal cortex from the straitjacket of genes. If the frontal cortex is the last part of the brain to fully mature, it is by definition the brain region least shaped by that genome and most sculpted by experience. With each passing day, the frontal cortex is more the creation of what life has thrown at you, and thus who you become.

A key theme in current neuroscience—the plasticity of the adult brain—underscores how this happens. As shown by a vast body of research, repeated stimulation of a synapse (the connection between two neurons) causes it to become more excitable, to pass on messages more readily; the synapse “remembers,” a likely building block of how memory works. Similarly, the dense arbor of neuronal processes that determines which neurons connect to each other can expand or contract, depending on experience. Spend a summer learning how to juggle and entire circuits in the motor cortex will remap. Despite a zillion years of neuroscience dogma, it is now clear that the adult brain makes new neurons in response to things like an enriched environment. All of these occur in adults of any age, but most readily in the young adult brain. And if the frontal cortex is unique in still developing at that age, that makes it the brain’s hotspot for plasticity.
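The synapse-that-remembers idea Sapolsky sketches is essentially the textbook Hebbian rule: connections whose two neurons are repeatedly active together get stronger. The Python sketch below simply shows that rule in action; the learning rate, starting weight, and activity values are invented for illustration and are not parameters from any study he cites.

```python
# Minimal Hebbian sketch: a synaptic weight grows with repeated co-activation.
# All numbers here are illustrative assumptions, not experimental values.

def hebbian_update(weight, pre, post, learning_rate=0.1):
    """Strengthen the synapse in proportion to joint pre- and post-synaptic activity."""
    return weight + learning_rate * pre * post

if __name__ == "__main__":
    w = 0.2  # initial synaptic strength
    for _ in range(10):                  # repeated stimulation of the same synapse
        w = hebbian_update(w, pre=1.0, post=1.0)
    print(f"weight after repeated stimulation: {w:.2f}")  # 1.20, up from 0.20
```

The point is only the direction of the change: the more often the connection is used, the more readily it passes on messages.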

Why is this important? One answer comes from recent studies of intelligence in education. Some educators stress that a student’s “emotional intelligence” or “social intelligence” (as measured various ways) is a better predictor of adult success and happiness than their IQ or SAT scores. It’s all about social memory rather than memory of vocabulary words, about emotional perspective-taking, impulse control, empathy, ability to work with others, self-regulation.

There’s a parallel in other primates, with their big, slow-maturing frontal cortexes. What makes a “successful” male baboon in a world of dominance interactions? Attaining a high rank is all about muscle, sharp canines, well-timed aggression. But once alpha status has been achieved, maintaining it is all about social smarts—which potential coalitions to form, which to stay away from, how to cow a rival through psychological intimidation, having sufficient impulse control to walk away from provocations, and avoiding displacing aggression onto everyone else when you’re having a bad hair day. This is the realm where doing the right thing is often the harder thing, and adult life is filled with consequential forks in the road where intelligence lights the way forward.

This is all worth keeping in mind the next time you find yourself being an adolescent, or dealing with one who has the volume up to 11 in every domain. Sure, adolescence has its down sides, but it has its abundant pluses—the inventiveness, the optimism, the empathy. Its biggest plus is that it allows the frontal cortex time to develop. There’s no other way we could navigate the ever-increasing complexity of our social world.


~ Robert Sapolsky is a professor of biology, neurology, and neurosurgery at Stanford University.