Archives for Cognition
“I never read fiction” is a point of pride for many people, along the lines of “I never watch TV.” The implication is that nonfiction is a higher calling: fiction is a frivolous pastime, while nonfiction is a serious education. This has been a push-pull throughout the history of the novel, especially since early novels tended toward the salacious or scandalous, more Danielle Steel than Ian McEwan. The poet Samuel Taylor Coleridge (1772–1834) stated his case thus: “I will run the risk of asserting that where the reading of novels prevails as a habit, it occasions in time the entire destruction of the powers of the mind: it is such an utter loss to the reader, that it is not so much to be called pass-time as kill-time. It…provokes no improvement of the intellect, but fills the mind with a mawkish and morbid sensibility, which is directly hostile to the cultivation, invigoration, and enlargement of the nobler powers of the understanding.” My brain must be a mawkish mess because I love a good novel (currently reading Ann Patchett’s State of Wonder, thumbs up). I love nonfiction too, but the escape and emotional charge novels provide have always been preferable to me (unless we’re talking narrative nonfiction, like Erik Larson’s engaging histories or a book I recently read in practically one gulp, The Big House: A Century in the Life of an American Summer Home). So I was gratified to read this New York Times story about the neuroscience of reading fiction.
Elisha Goldstein’s book, The Now Effect, has sent my brain spinning in yet another direction. The anecdote: A professor stood before a philosophy class holding an empty jar. As the students took their seats, she began filling the jar with golf balls. When they reached the top, she asked the students if the jar was full. They agreed that it was. The professor then took a bag of pebbles and poured them into the jar, and they made their way between the spaces of the golf balls. Again she asked the students if the jar was full, and they agreed that it was. But the professor had another trick up her sleeve. She brought out a bag of sand and proceeded to pour the grains into the jar, filling up more of the remaining space. Again the question came: “It’s full now, correct?” The answer was a resounding “Yes.” The professor then took a sip of her coffee and dumped the rest into the jar, filling up spaces that no one thought were there. The thought: how does our brain process negative space?
I’ve only just started reading the new book by fellow PsychCentral blogger Elisha Goldstein, and I’ve already found something useful. Goldstein is a psychologist in private practice, and his excellent blog is about mindfulness. His book, The Now Effect: How This Moment Can Change the Rest of Your Life, is a manual for learning mindfulness. The book is made up of short, quick-read chapters that leave you with lots to think about and try. “See, Touch, Go” is the chapter that twanged a note in my brain--one image, in particular. Goldstein describes the See, Touch, Go method in an anecdote, through the words of a dog trainer trying to help a family frustrated by their rambunctious rescue dog: “‘See, touch, go.’ When your mind begins to wander off onto all your worries and frustrations with this dog, see that your mind has wandered, touch the thought like you might softly touch your reflection in a pond, and then gently go back to focusing on the training we’ve discussed.” OK, so the dog trainer is beside the point. What got me is this: Touch the thought like you might softly touch your reflection in a pond.
“The Internet has ruined everything,” my husband likes to grumble. In some ways, he’s right. The Internet has laid waste to newspapers and threatens traditional publishing in all forms. It sucked the money out of the music industry. It’s killing off traditional bookstores--even the superstores that killed off the small independents. New technology has opened up forms of expression to people who had been blocked by gatekeepers, but at the same time it threatens to drag down the quality of that expression overall, precisely because those gatekeepers are gone. (If you saw some of the press releases I receive for self-published books, you would understand what I mean.) News operations struggle with the ever-increasing speed of the news cycle, trying to balance getting news out fast and getting it right. What I wonder now is what the speed of technology is doing to creativity. And because we are taught to “write what you know,” I will write about writing. Specifically, blogging.
This blog celebrated its first anniversary on January 1, so I am therefore compelled (it's the law) to reflect on the past year. Writing Real World Research has been fun and also a lot of work. I read a lot more research than I end up writing about. Academic writing is no easy read, and I am eternally grateful to those researchers who manage to slip a little joke in here and there. Some papers are so dense that even if the topic is compelling, my eyes cross and I can’t hack my way through them. I have no one to blame but myself--I decided to focus this blog on research. Sometimes I hate myself for choosing a theme that so often forces me in way over my head. Still, one of the perks of being a writer is that I get paid for finding out stuff I want to know. Reading and writing about research has taught me all kinds of useful things which, as the blog title suggests, I can take into the real world. So to reflect on the past year, here is some of the stuff I learned writing Real World Research in 2011 that has been most useful to me.
New research finds a small but significant correlation between social anxiety and the ability to recognize faces. Yes. Oh yes. I don’t have severe social anxiety, but I do have some, and this gave me an aha! moment about it. I have a terrible time remembering faces. Even famous people. I recognize George Clooney, easy. Matt Damon? Not so much. Meryl Streep, easy. Charlize Theron? Not so much. Put me at a large party and I spend a lot of time pretending I remember people who remember me. People tend to be hurt and offended when you don’t remember meeting them, and I don’t blame them. If you remind me where or how we met, I might remember (although my memory is crappy in many ways, so maybe not). Every party is a minefield of failing to recognize people I don't know well. And this is not just a problem at parties. I didn’t recognize a neighbor the other day, and what’s worse, I took a guess at who she was and guessed wrong. Ugh, ugh, ugh. I never made the connection between my anxiety about parties and facial recognition, but this information fits with the satisfying click of a puzzle piece set in place.
The other day I learned that I’ve been walking around for the better part of a decade with a dislocated toe. I knew something was wrong. I’d had it X-rayed, and the doctor said it looked like I’d jammed my toe somehow (true) and had developed some form of arthritis. I can’t remember the name. He gave me a prescription I never filled. I was not ready for a lifelong commitment, and figured that’s just age: you get arthritis, you learn to live with it. I don’t know how painful arthritis is supposed to be, but this was extremely painful. I got rid of shoes that hurt too much, and was more than once brought to tears in the aisle of DSW just from trying on a shoe that hit my toe wrong. I often had random, near-blinding stabs of pain, even when I was lying in bed. The weight of the bedclothes could be painful. I’ve recently developed plantar fasciitis from walking wrong. Then, the other night, I turned in the kitchen and had a sudden stab of pain in my toe. But even as I was still staggering dizzily while it throbbed, my first thought was: It’s fixed.
A friend learning her way around her new iPad wonders if learning really is different as we get older. And what’s the deal with that? The short answer is yes, our ability to learn does change as we age. We get slower. Our working memory capacity diminishes--that is, you can’t throw too much stuff at us at once. As a rule, it takes older people longer to learn things than it does young people. And older people might never get as good at new stuff as younger people can, no matter how long they study. Hm, yeah, that’s no fun. I read that in an article discussing evolutionary theory, which also gave me this cheering thought about the allocation of psychological resources: “In childhood, the primary allocation is directed toward growth; during adulthood, the predominant allocation is toward maintenance and recovery (resilience). In old age, more and more resources are directed toward regulation or management of loss.” The older you get, the more of a bummer evolutionary theory can be. So let us skip, instead, over to educational psychology, and an article titled “Age-related differences in the relation between motivation to learn and transfer of training in adult continuing education.” This article argues, through a literature review and a re-crunching of statistics, that motivation is key to learning, and that older adults are just as motivated to learn as younger ones.
Last night, my yoga and meditation teacher mentioned her surprise at how much easier meditation gets over time. She no longer has to work nearly as hard as she once did, she said, to reach a meditative state. And, she said, it's much easier than it once was to keep intrusive thoughts and daydreams at bay while she meditates. “I don’t know why,” she concluded, with some wonder in her voice. Coincidentally, I’d just spent much of the day reading about this very thing, in order to write this post. People who study the brain talk about something called the default-mode network (DMN), which is where our brain tends to go when we’re not making it do something else. The DMN correlates with the parts of the brain that activate when we’re thinking about ourselves—the medial prefrontal and posterior cingulate cortices, if you want to get technical about it. And our DMN does not always have our best interests at heart.
I’ve recently started listening to audio books. The idea never appealed to me much because I’ve never liked being read to. Reading is a solitary experience for me, and being read to always seemed a little icky, though I couldn’t tell you why. Certainly being read to has a venerable history. At one time, all writing was meant to be read aloud, since few people could read. And reading aloud was family entertainment in the pre-radio, pre-TV days. And, of course, reading to children is both cozy and the first step toward their literacy. So it's not like listening to books is anything new. But downloadable audio books are increasingly popular (though the growing popularity of ebooks is the headline news in publishing). Fans of audio books even have their own magazine. The first audio book I listened to was Bossypants, which is read by Tina Fey herself. Now I’m listening to Never Let Me Go, by Kazuo Ishiguro, which is beautifully read by Rosalyn Landor, who strikes a tone as wistful as the book and conveys changes of character with just the slightest change in her voice. Narration, I realize, is an art form unto itself. But I’m still not sure how I feel about the audio book. It might be seducing me, but I worry about whether I’m having the experience of the book the author originally intended. Do we lose something of a novel when we don’t see the words spelled out in front of us? Is the medium integral to the message?