A small, highly unscientific poll (conducted by yours truly) suggests that when psychologists are asked what attention is, they will direct you to the 19th-century psychologist and philosopher William James. In his now-canonical 1890 book, The Principles of Psychology, the definition is shockingly simple: “Everyone knows what attention is.”

I’m holding out, though; that very short answer feels incomplete. James goes on:
“It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought … It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.”

James is certainly not the only person to give thought — academic or otherwise — to attention. A great many intellectuals have trodden along James’ original tracks, and more recently, attention has become a topic of growing popular concern.

From James’ original question, let’s fast-forward over a century to the summer of 2008. That year, The Atlantic’s July/August issue posed the question, “Is Google Making Us Stupid?” The accompanying article by Nicholas Carr considered the ways in which the Internet might be changing our brains — for the worse. In particular, Carr posited that our ability to pay attention might be at stake. A similar, and perhaps more sombre, read was Maggie Jackson’s book published the same year, Distraction: The Erosion of Attention and the Coming Dark Age.

Carr and Jackson are just two of a growing number of voices concerned about what new technologies, and the Internet in particular, are doing to our minds and attention spans. That’s not to say there’s a consensus on the topic, however; some writers claim that these concerns ignore the potential value of distraction. Jonah Lehrer, for instance, cites studies linking distractedness with creative achievement. But amid this intellectual tumult over whether computers are sapping our attention spans, it is worthwhile to return to James’ original question: what is attention in the first place?

Jay Pratt is the undergraduate chair of U of T’s psychology department, and he conducts research on visual attention.
Today, researchers recognize that the topic of attention is too big to study as a whole, and as Pratt explains, there are subfields of study like visual and auditory attention. According to Pratt, “At its highest level, attention is maybe the allocation of mental resources. It’s the brain putting its energy and its resources towards certain processes.” But for Pratt, that’s the big picture, and things aren’t so simple up close. As he points out, the brain performs many attention-based processes at once, which makes it difficult to model attention as a single unit that can increase or decrease. In fact, that is an old and oversimplified way of talking about attention, dating back to the 1950s and ’60s and the rise of cognitive psychology. In this period, some researchers thought of attention as a pool that could be filled or emptied, grown or shrunk.

The balancing act between the environment and our own will steers attention. Elements in the environment can capture our attention, but we can also willingly shift it to items we wish to bring into mental focus. Two different brain networks carry out these two functions. The posterior network controls what some call reflexive attention — for example, when a glint of light or something else in the environment catches our eye and our attention. This is an evolutionarily older part of the brain and is critical to survival in many animals, human or otherwise. Of course, the fact that you notice a flash of light out of a window might not matter much, but it’s the same system that tells you when an oncoming car is about to hit you.

In contrast, the anterior network controls volitional attention, the attempts we make to focus on something.
Susanne Ferber, a cognitive neuroscience researcher at U of T, explains that these networks are mostly located in the cerebral cortex (the wrinkly, grayish outer layer you probably think of when you picture what the brain looks like), particularly the frontal and parietal lobes. The cortex is so important, Ferber explains, that if a person’s right parietal lobe is damaged, he or she is completely unable to pay attention to the left side of his or her field of vision — each hemisphere of the brain links up to the opposite side of the body. This is a condition called spatial neglect: those who suffer from it cannot perceive, process, or interact with one side of space.

Today, experimental psychology contributes greatly to the way we think about attention, but it doesn’t make up the entire picture. Mathematical models, such as the one developed by John Senders, Professor Emeritus of industrial engineering at U of T, exist alongside current psychological ones. Senders began working on his own model of attention in the mid-1950s. As part of a project with the United States Air Force, he worked on determining which parts of the cockpit pilots were focusing their visual attention on. When asked what attention is, Senders jokes that nobody knows — except himself. His current work involves building a mathematical model of attention and draws on information theory, control theory, and queuing theory, branches of mathematics developed in the first half of the 20th century. Senders says that looking mathematically at how we distribute our attention over different possible targets can help us determine the probability that we will look at a given target. For instance, if you are driving along and see a sign indicating that you are near a school, you’re more likely to look for children on the road.
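To make the flavour of such a model concrete, here is a toy sketch in Python. It is loosely inspired by Senders’ classic instrument-monitoring idea — that a display whose signal changes quickly must be glanced at more often, so faster-changing displays claim a larger share of visual attention (with the sampling theorem setting a lower bound of twice the signal’s bandwidth). The display names and bandwidth numbers below are invented for illustration; this is not Senders’ actual model, only a simplified caricature of the reasoning.

```python
# Toy, bandwidth-proportional model of visual attention allocation.
# Assumption: each display carries a signal with a known bandwidth (Hz),
# and the share of glances it receives is proportional to that bandwidth.

def attention_shares(bandwidths_hz):
    """Fraction of glances each display receives, proportional to bandwidth."""
    total = sum(bandwidths_hz.values())
    return {name: w / total for name, w in bandwidths_hz.items()}

def min_sample_rate(bandwidth_hz):
    """Sampling-theorem lower bound on how often a display must be checked."""
    return 2.0 * bandwidth_hz

# Hypothetical cockpit displays with invented signal bandwidths (Hz):
displays = {"airspeed": 0.32, "altimeter": 0.08, "heading": 0.16}

shares = attention_shares(displays)
for name, share in shares.items():
    print(f"{name}: {share:.0%} of glances, "
          f"checked at least {min_sample_rate(displays[name]):.2f} times/s")
```

On these made-up numbers, the fast-moving airspeed indicator soaks up a majority of the glances — the model’s way of saying that attention follows where uncertainty accumulates fastest.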
This may seem obvious, but it suggests something important: we may be paying attention to some things because of calculations conducted in the brain. This would make the brain something like a robot, programmed to direct itself based on what it knows about certain targets and what they usually indicate. This is simply a model, and it’s difficult to tell whether the brain is really doing calculations or whether the mathematics just offer a good heuristic.

But why do we place so much value on attention in the first place? Paul North, an intellectual historian in the Germanic languages and literature department at Yale University, says that attention is tied to theological roots that persist even now. According to North, this theological bent originated with Aristotle, who thought an attentive mind was important for getting at the essence of the world. Similarly, for Augustine, the attentive mind was a way of imitating the mind of God. God can comprehend everything at once, but humans can use attention to separate the elements of our environment and situate them in the world. By the 18th and 19th centuries, the philosophical and medical institutions of the time increasingly emphasized the importance of attention. The problem, as North explains, is that attention carries some pretty ancient baggage. Namely, to be less attentive is not just to be unable to focus, but to be further from God.

Today, attention is considered in parallel with our evolving technologies, which Maggie Jackson suggests are the very cause of our eroding ability to pay attention. But not everyone feels the same way. Rhonda McEwen, who studies new media at the iSchool at U of T’s Faculty of Information, criticizes this popular view. She contends that we’re thinking about new media and technologies the wrong way. “We like to think of [the allure of media and technology] as a pull… But it’s more of a push.” In other words, we aren’t thinking about distraction in the right way.
We aren’t getting worse at paying attention to other things; we’re just getting better at devoting hours and hours to smartphones, laptops, and the other new wonders that emerge from Mountain View, Cupertino, and beyond. We’re not being distracted by these technologies; we’re just always paying some sort of attention to them.

According to McEwen, one of the questions that remains is whether the Internet and other new technologies are changing us, or whether they represent something we’ve developed as a preferred alternative for interacting, communicating, and learning. As McEwen emphasizes, we should be very cautious about demonizing technology: doing so prevents us from reaping the rewards that technology yields and gets us nowhere.

Even so, the concerns of skeptics like Carr and Jackson aren’t groundless. People may in fact be focusing less on certain things and more on others. But the conception of attention they work with seems somewhat outmoded. We may be facing problems with our ability to pay attention, and they may well be serious. But whether we want to be optimists or pessimists about how our minds might be changing, we need to be more attentive to attention.