Can I have your attention please?

A small, highly unscientific poll (conducted by yours truly) suggests that when psychologists are asked what attention is, they will direct you to the 19th-century psychologist and philosopher, William James. In his now-canonical 1890 book, The Principles of Psychology, James offers a definition that is shockingly simple: “Everyone knows what attention is.”

I’m holding out though; this is the very short answer and it feels incomplete. James goes on:
“It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought … It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.”

James is certainly not the only person to give thought — academic or otherwise — to attention. A great many intellectuals have trodden along James’ original tracks, and more recently, attention has become a topic of growing popular concern.

From James’ original question, let’s fast-forward over a century to the summer of 2008. That year, The Atlantic’s July/August issue posed the question, “Is Google Making Us Stupid?” The accompanying article by Nicholas Carr considered the ways in which the Internet might be changing our brains — for the worse. In particular, Carr posited that our ability to pay attention might be at stake. A similar, and perhaps more sombre read was Maggie Jackson’s book published in the same year, Distraction: The Erosion of Attention and the Coming Dark Age.

Carr and Jackson are just two of a growing number of voices concerned about what new technologies and the Internet in particular are doing to our minds and attention spans. That’s not to say there’s a consensus on the topic, however; some writers claim that these concerns ignore the potential value of distraction. Jonah Lehrer, for instance, cites studies linking distractedness with creative achievement. But amid this intellectual tumult over whether computers are sapping our attention spans, it is worthwhile to return to James’ original question: what is attention in the first place?

Jay Pratt is the undergraduate chair of U of T’s psychology department, and he conducts research on visual attention. Today, researchers recognize that the topic of attention is too big to study as a whole, and as Pratt explains, there are subfields of study like visual and auditory attention. According to Pratt, “At its highest level, attention is maybe the allocation of mental resources. It’s the brain putting its energy and its resources towards certain processes.” But for Pratt, that’s the big picture, and things aren’t so simple up close. As he points out, the brain performs many attention-based processes at once, and this makes it difficult to model attention as a single quantity that simply increases or decreases. In fact, this is an old and oversimplified way of talking about attention that dates back to the 1950s and ‘60s during the rise of cognitive psychology. In this period, some researchers thought of attention as a pool that could be filled or emptied, grown or shrunk.


The balancing act between the environment and our own will steers attention. Elements in the environment can capture our attention, but we can also willingly shift it to items we wish to bring into mental focus.

Two different brain networks carry out these two functions. The posterior network controls what some call reflexive attention — for example, when a glint of light or something else in the environment catches our eye and our attention. This is an evolutionarily older part of the brain and is critical to survival in many animals, human or otherwise. Of course, the fact that you notice a flash of light out of a window might not matter much, but it’s the same system that tells you when an oncoming car is about to hit you.

In contrast, the anterior network controls volitional attention, the attempts we make to focus on something. Susanne Ferber, a cognitive neuroscience researcher at U of T, explains that these networks are mostly located in the cerebral cortex (the wrinkly, grayish outer layer you probably think of when you picture what the brain looks like), particularly the frontal and parietal lobes. The cortex is so important, Ferber explains, that if a person’s right parietal lobe is damaged, he or she is completely unable to pay attention to the left side of his or her field of vision — each hemisphere of the brain links up to the opposite side of the body. This is a condition called spatial neglect: those who suffer from it cannot perceive, process, or interact with one side of space.

Today, experimental psychology contributes greatly to the way we think about attention, but it doesn’t make up the entire picture. Mathematical models such as the one developed by John Senders, Professor Emeritus of industrial engineering at U of T, exist alongside current psychological ones. Senders began working on his own model of attention in the mid-1950s. As part of a project with the United States Air Force, he worked on determining which parts of the cockpit pilots were focusing their visual attention on.

When asked what attention is, Senders jokes that nobody knows — except himself. His current work involves building a mathematical model of attention and draws on information theory, control theory, and queuing theory, various branches of mathematics developed in the first half of the 20th century. Senders says that looking mathematically at how we distribute our attention over different possible targets can help us determine the probability that we will look at a given target.

For instance, if you are driving along and see a sign indicating that you are near a school, you’re more likely to look for children on the road. This may seem obvious, but it suggests something important: we may be paying attention to some things because of calculations conducted in the brain.
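Senders’ actual models are far richer than anything that fits in a few lines, but the core idea can be loosely sketched in code. One classical assumption (from the sampling theorem he drew on) is that a signal of bandwidth W must be checked roughly 2W times per second to keep up with it, so an instrument’s share of a pilot’s glances should be proportional to how fast it changes. The instrument names and bandwidth numbers below are invented for illustration only; this is a toy sketch, not Senders’ published model.

```python
# Toy sketch: allocate visual attention across cockpit instruments in
# proportion to how quickly each instrument's reading changes.

def attention_shares(bandwidths_hz):
    """Return each instrument's share of glances.

    By the sampling theorem, a signal of bandwidth W needs about 2*W
    samples per second, so shares are proportional to 2*W (the factor
    of 2 cancels, but is kept to mirror the sampling-rate reasoning).
    """
    rates = [2 * w for w in bandwidths_hz]  # required looks per second
    total = sum(rates)
    return [r / total for r in rates]

# Hypothetical instruments: airspeed changes fastest, fuel gauge slowest.
instruments = {"airspeed": 0.50, "altimeter": 0.25, "fuel": 0.05}
shares = attention_shares(list(instruments.values()))
for name, share in zip(instruments, shares):
    print(f"{name}: {share:.0%} of glances")
```

On these made-up numbers, the model predicts the pilot spends most glances on the fast-changing airspeed indicator and only an occasional look at the slow fuel gauge — which is the flavour of prediction Senders could test against where pilots actually looked.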

This would make the brain something like a robot, programmed to direct itself based on what it knows about certain targets and what they usually indicate. This is simply a model, and it’s difficult to tell whether the brain is doing calculations or whether the mathematics just offer a good heuristic.

But why do we place so much value on attention in the first place? Paul North, an intellectual historian in the Germanic languages and literature department at Yale University, says that attention is tied to theological roots that persist even now.


According to North, this theological bent originated with Aristotle, who thought an attentive mind was important for getting at the essence of the world. Similarly, for Augustine, the attentive mind was a way of imitating the mind of God. God can comprehend everything at once, but humans can use attention to separate the elements of our environment and situate them in the world.

By the 18th and 19th centuries, the philosophical and medical institutions of the time increasingly emphasized the importance of attention. The problem, as North explains, is that attention carries some pretty ancient baggage. Namely, to be less attentive is not just to be unable to focus, but to be further from God.

Today, attention is considered in parallel with our evolving technologies, which Maggie Jackson suggests are the very cause for our eroding ability to pay attention.

But not everyone feels the same way. Rhonda McEwen, who studies new media at the iSchool at U of T’s Faculty of Information, criticizes this popular view. She contends that we’re thinking about new media and technologies the wrong way. “We like to think of [the allure of media and technology] as a pull… But it’s more of a push.”

In other words, we aren’t thinking about distraction in the right way. We aren’t getting worse at paying attention to other things, we’re just getting better at devoting hours and hours to smartphones, laptops, and the other new wonders that emerge from Mountain View, Cupertino, and beyond. We’re not being distracted by these technologies; we’re just always paying some sort of attention to them.

According to McEwen, one of the questions that remains is whether the Internet and other new technologies are changing us or whether they represent something we’ve developed as a preferred alternative for interacting, communicating, and learning. As McEwen emphasizes, we should be very cautious about demonizing technology: doing so prevents us from reaping the rewards that technology yields and gets us nowhere.

Even so, the concerns of skeptics like Carr and Jackson aren’t groundless. People may in fact be focusing less on certain things and more on others. But the conception of attention they work with seems to be somewhat outmoded. We may be facing problems with our ability to pay attention, and they may well even be serious. But whether we want to be optimists or pessimists about how our minds might be changing, we need to be more attentive to attention.

Confessions of a depressed first year

Being a teenager with clinical depression can be a challenge. Combine that with the stresses of your first year in university, a totally new environment, high expectations, and a particularly bad case of anxiety disorder — and you’re probably ready to snap. Living with depression has its challenges at any age, but it’s much more acute with a major lifestyle change.

I was diagnosed at fifteen. The verdict: serotonin deficiency. Interpretation: there’s a chemical lacking in my brain that makes me depressed. Simple? Sounds like it, but the reality is quite different.

Although I am on medication that has greatly improved my outlook on life, I still have trouble explaining to my friends why I feel sad — “why can’t you just cheer up?” The reality is I don’t know why. An even worse question arose on one occasion: “So are you crazy?” Define crazy.
Everyone has different ideas about what mental illness is and is not. To complicate things further, a number of mental disorders have been identified by the medical field only in the past ten years. It isn’t black and white. It isn’t as simple as being deaf or visually impaired — conditions for which coping strategies have been developed and refined over many years.


Mental illness is like trying to find your way through a dark room. It varies greatly and every case is different. If you saw me walking down the street, you probably wouldn’t guess I had severe depression and anxiety disorder. On the outside, I’m your average teenager: good grades, friends, hobbies and a loving family. You’d never guess that I suffer from panic attacks, moodiness and frequent bouts of sadness.

Luckily at university, there is support for those suffering from mental illness. In the past, I’ve been told to hide my illness from people because of the social stigma that goes along with depression. “Don’t talk to her, she’s crazy,” or “Don’t upset her, she’s already sad. She has depression.” The reality is, I don’t want anyone’s pity. But I do want support.

I wish it was as simple as telling somebody I’m blind — because people would understand that it isn’t my fault, it’s not something I did or didn’t do. It’s an illness just like any other.

At U of T, people are far more supportive of those with mental illnesses than at the average high school. There are many groups that you can join for support, or if you just need someone to talk to. These groups are discreet. They need to be, unfortunately, due to the continuing social stigma attached to mental illness.

Depression is a part of me but it doesn’t define who I am as a person. Aside from the usual stresses of an average university student, my life includes mood swings, crying jags, and panicky moments. It makes life difficult, but not unmanageable. With properly dosed medication and more importantly, the support of friends, family, and non-judgemental support groups, I hope to find the university experience one of the best in my life.

How to perform a lobotomy




Insert metal rod through eye socket.


Prod until cured.

India, land of the wise?

There are only two ways to offend my Indian sensibilities. First, you can point out that my English is “very good.” English is the only language I speak with any degree of competence, so if my English weren’t “very good,” I wouldn’t be able to get myself understood, ever.

The second way is to tell me that you’ve “always wanted to go to India” to find yourself, discover its ancient wisdom, or some variation thereof. Let’s get one thing straight: Indian “wise” men are no wiser than any others, they’re just better at marketing.

I blame The Beatles. Sure, “enlightenment tourism,” as Professor Ritu Birla, Director of the Centre for South Asian Studies at U of T, calls it, has been around since before the Fab Four made their pilgrimage to my country in the 1960s. The idea of the mystical and ancient wisdom of the East goes back to the Orientalist school of thought that portrayed India as an exotic, romantic place of enlightenment.


India is not exotic and it’s not particularly romantic (even the Taj Mahal has a limited “wow” factor from up close). I’d argue that the country’s failure to deal with the more than half of its population living below the poverty line suggests that it’s not that institutionally wise either. There’s no storehouse of ancient wisdom that gets passed from generation to generation of Indians — or if there is, I certainly missed out when it was my turn!

Maybe you don’t think all Indians are inherently wise — that would be racial or national stereotyping after all. But the gurus and wise men that you’re going to India to seek out aren’t really much wiser than the average Indian either; they’re just good PR people. Salesman, marketer, entrepreneur — these labels fit the modern guru better than “spiritual teacher.” I’m a spiritual skeptic as a general rule, so I’m not going to pretend that my bias against the idea of some greater power or knowledge doesn’t play a role here.

Enlightenment tourism is an industry in and of itself in India. Foreigners swarm cities like Jaipur and Jodhpur seeking opium to lead them to “enlightenment” and a guru to show them the way. But they’re also just ordinary tourists to the hotels, restaurants, and other businesses that benefit from their presence. They’ll treat you well, because if there’s one thing Indians are known for, it’s hospitality. These places are no different from, say, Niagara Falls. Both use their natural resources or the reputation of their culture and geography to make money from tourists.

Here’s another thing: if enlightenment is possible (and a lot of people genuinely think it is), then I don’t see why you have to go to India to attain it. Sure, the “East” first named moksha and nirvana, and created a framework for explaining the need for release from the cares of everyday life. Yet nowhere in the canon of “Oriental” religions is there a requirement to “find” yourself under a tree somewhere in rural India. If you’re searching for enlightenment, you’re just as likely to get there in a subway car in Toronto as in a village on the South Asian subcontinent.

You may find India exotic, but I’ve lived there for all but two years of my life, so forgive me if the magic is a little lost on me. It’s not that I don’t feel attached to the land of my birth, but separation from it hasn’t changed my ability to look at it clearly and critically.

So do me a favour: if you want to find yourself, take the money you’re planning to use to make that trip to India and send it to an NGO working to raise the living standards of the people there. Then go outside and do something for the people of your own community — you may find that they are wiser than you think.

Brain Tidbits

Mirror neurons were discovered accidentally at the University of Parma in Italy. They activate when you perform a specific action and when you see another person perform that same action. Early experiments using monkeys showed that the same neurons activated when the monkey picked up a piece of food and when he or she saw an experimenter pick up the food.

A small percentage of people don’t need iTunes or Windows Media Player to see visualizations of their music. Sound-colour synesthesia makes people “see” music as lines on a screen in front of them. The lines, which move like those on an oscilloscope, have varying colours, height, width, and depth.

A study by Kyle Steenland and James Deddens found that NBA teams did four points better when travelling west to east compared to east to west. The study, entitled “Effect of travel and rest on performance of professional basketball players,” suggested that lack of physical recovery time, rather than disruption of circadian rhythms (jetlag), contributed to decreased performance.

Male weightlifters, like models, have to care about their appearance. The overwhelming desire to be large, however, can be a symptom of a body dysmorphic disorder called muscle dysmorphia. First labeled “reverse anorexia” or “bigorexia,” the disorder is characterized by the pathological obsession with building muscle.

Alice in Wonderland syndrome (AIWS) is a neurological disorder that causes sufferers to experience visual distortions analogous to those experienced by Alice in Lewis Carroll’s novel. Symptoms of AIWS include object size distortions, such as perceiving objects as being much smaller (micropsia) or much bigger (macropsia).

The earthworm has an exceptionally small brain in proportion to the rest of its body. The brain (or cerebral ganglion) is actually a nerve bundle found at the front of the worm. Responsible for sensing light, among other environmental conditions, the brain is a relatively useless organ — so much so that, were it removed, the worm’s behaviour would appear unchanged.

A study by Richard Stephens has shown that swearing can get people through intense bouts of pain. Stephens made participants put their hands in buckets of cold water. They were divided into a swearing-accepted group and swearing-prohibited group. Members of the swearing group endured the discomfort longer than those in the group forced to keep their dialogue rated G.

MRI: How it works


How we learn to make sounds

Hold a tissue or an unfolded napkin in front of your lips so that it moves when you breathe, and then say “peak.” If you’re a native speaker of one of several varieties of English, including Canadian English, a distinct burst of air should follow your P, causing the tissue to flutter. Now, say “speak.” This time around, the tissue should stay more or less still; it’s the same deal for the P in “leap.”

In linguistics, the name for that puff of air following P is “aspiration,” and in some languages, it makes an absolute difference. In those languages, P, like in “speak,” and P with the puff of air, like in “peak,” are as distinct as P and B are in English — Thai and Hindi speakers, I’m talking about you! In English, however, P’s are P’s whether they are aspirated or not, which (if you’re interested in language) reveals something totally fascinating about sound representation in the mind.


In the language acquisition process, humans learn to group some types of sounds categorically, so that measurably distinct sounds like a “puffy P” and the regular kind of P can be considered the same thing. Experiment after experiment shows that newborns are sensitive to all kinds of subtle linguistic contrasts like aspiration, but as those newborns become infants, their brains start to categorize sounds and focus only on the sound distinctions found in the primary surrounding language(s). So when it comes to picking up relevant sound contrasts and forming sound categories, the brain sure knows its stuff!

What’s on your mind?
