We keep asking ourselves “how?” and “why?” And, with authorities still trying to piece together evidence, the public has to make do with limited – and often incorrect – information.
First came reports that the shooter, Adam Lanza, might have Asperger’s. To my knowledge, no authoritative source has yet confirmed Lanza had a formal diagnosis of that or any other emotional, behavioral or developmental condition.
But that lack of evidence – as well as expert consensus that Asperger’s was extremely unlikely to have triggered a shooting rampage – didn’t stop an army of commentators from weighing in.
Now comes the speculation about whether Lanza might have a history of taking mood- or behavior-altering medication.
Don’t get me wrong here. I’m not blaming journalists, bloggers, pundits, Twitter users, or the general public for wondering whether Lanza might have been taking psychiatric meds.
In fact, it’s one of the first questions that came to my mind – even before I heard the reports of his possible Asperger’s.
With the explosion of mobile apps and websites such as PatientsLikeMe, which help people chart symptoms, medications and side effects, we’ve entered a new era of unprecedented medical self-monitoring.
Is this a good thing when it comes to psychiatric medications and mental health?
Last week, I featured a guest post from M., a reader from Texas who began taking Ritalin for ADHD when she was 12, then quit before college.
M. concluded in retrospect that taking Ritalin taught her she couldn’t rely on herself to control her behavior. Instead, she learned to look to others for feedback, which she thinks provoked her anxiety.
Today, I’m following up with the second half of M.’s medication story, about her experience starting Zoloft in her mid-20s to treat some of that residual anxiety. Read on to find out how she fared during a second stab at medication treatment.
In recent years, there has been a huge increase in the prescribing of psychiatric medication to treat aggression in children.
Specifically, atypical antipsychotic and mood stabilizing drugs, originally developed for schizophrenia and bipolar disorder in adults, are now routinely prescribed to treat the aggression that occurs in a variety of childhood psychiatric disorders.
Prescriptions for atypical antipsychotics increased sixfold between 1993 and 2002, and the majority were prescribed to treat non-psychotic aggression, according to a task force that recently published guidelines on how to treat aggression in kids.
But these drugs carry the risk of serious side effects, notably severe weight gain and metabolic changes that can lead to Type 2 diabetes. Critics, including many in the medical community, have said they are over-prescribed.
At the same time, we’re in the midst of a collective national hand-wringing over how to reduce childhood bullying. Might drugs that curb aggression be the answer?
On this blog and in my new book, Dosed: The Medication Generation Grows Up, I explore young people’s experiences with medication. And oftentimes, by exposing their ambivalence, even their resentment, toward their treatment from an early age, I end up implicitly questioning the value of early intervention for mental illness.
So in honor of the American Psychological Association’s Mental Health Month Blog Party Day, I want to address the question of whether I think early intervention is worth it.
Some new findings about disparities in the way young adults of different races use mental health services have been troubling me since I first read about them last week.
Whites who received “psychological or emotional counseling” as teenagers were more likely to be in treatment as young adults compared to their age peers who didn’t receive counseling, a study in Psychological Services found.
But for black young adults, the findings were reversed: Having received counseling as teens made them less likely to receive services as adults.