
WITHpod Special: Hear a Chapter of Chris’ New Book: ‘The Siren’s Call’

Why Is This Happening?


This week, we’re sharing the first chapter of Chris’ new book, “The Sirens’ Call: How Attention Became the World’s Most Endangered Resource.”

Jan. 30, 2025, 10:56 PM EST
By MS NOW

Hi, WITHpod listeners! It’s an exciting day and we have a special treat for you. Today, we’re sharing the first chapter of Chris’ new book, “The Sirens’ Call: How Attention Became the World’s Most Endangered Resource.”

From the book description:

We all feel it—the distraction, the loss of focus, the addictive focus on the wrong things for too long. We bump into the zombies on their phones in the street, and sometimes they’re us. We stare in pity at the four people at the table in the restaurant, all on their phones, and then we feel the buzz in our pocket. Something has changed utterly: for most of human history, the boundary between public and private has been clear, at least in theory. Now, as Chris Hayes writes, “With the help of a few tech firms, we basically tore it down in about a decade.” Hayes argues that we are in the midst of an epoch-defining transition whose only parallel is what happened to labor in the nineteenth century: attention has become a commodified resource extracted from us, and from which we are increasingly alienated. The Sirens’ Call is the big-picture vision we urgently need to offer clarity and guidance.

Audio excerpted courtesy of Penguin Random House Audio from The Sirens’ Call by Chris Hayes, read by Chris Hayes. © 2025 Christopher Hayes, ℗ 2025 Penguin Random House, LLC. All rights reserved.

Note: This is a rough transcript. Please excuse any typos.

Chris Hayes: Hello, and welcome to “Why Is This Happening?” with me, your host, Chris Hayes.

Well, today’s a big day for me, a day I have been looking forward to for a while. And I’m really excited about what we have for you, the listener.

The book that I have been working on for the last few years, “The Sirens’ Call: How Attention Became the World’s Most Endangered Resource,” is out today. You can get it wherever books are sold. And also — and this is crucial — you can get it from your local library. And if they don’t have it, you can go to your local library and you can say, I would love for you to order this book.

And in most cases, they will. Also, there are apps for your local library to get audiobooks if you want to listen to me read the book, which I think a lot of people who are WITHpod listeners might be interested in.

And, on that note, we have a special, exclusive treat for you today: the first chapter of the book. I read the audiobook, as I have for my last two books, and I love doing it. Obviously, I talk for a living, so it’s something I’m pretty comfortable with, as you probably know.

And so I read the audiobook. I really enjoyed reading this audiobook. I actually wrote the book in a kind of, I don’t know, fairly voicy oral register, if that makes sense, I think influenced by the now 12 years of cable news I have been doing.

So, today, in your feed right now, after I stop talking, the first chapter of “The Sirens’ Call” read by yours truly. Again, go to sirenscallbook.com if you want to order the book. And there’s all sorts of places, including independent booksellers. And also talk to your library if you don’t want to order the book, but would like to listen or read it nonetheless.

So, without further ado, “The Sirens’ Call.”



Let us begin with a story from Odysseus’s journey. In book twelve of the Odyssey, our hero is about to depart the island of the goddess Circe when she gives him some crucial advice about how to navigate the perils of the next leg of his voyage. “Pay attention,” she instructs him sternly:

First you will come to the Sirens who enchant all who come near them. If any one unwarily draws in too close and hears the singing of the Sirens, his wife and children will never welcome him home again, for they sit in a green field and warble him to death with the sweetness of their song. There is a great heap of dead men’s bones lying all around, with the flesh still rotting off them.

Odysseus listens as Circe provides him with a plan: stuff wax in the ears of your crew, she says, so they cannot hear the Sirens, and have them bind you to the mast of the ship until you have sailed safely past. Odysseus follows the plan to a tee. Sure enough, when the Sirens’ song hits his ears, he motions to his men to loosen him so that he can follow it. But as instructed, his crew ignores him until the ship is out of earshot.

This image is one of the most potent in the Western canon: Odysseus lashed to the mast, struggling against the bonds that he himself submitted to, knowing this was all in store. It has come down to us through the centuries as a metaphor for many things. Sin and virtue. The temptations of the flesh and the willpower to resist them. The addict who throws his pills down the toilet in preparation for the cravings to come, then begs for more drugs. It’s an image that illustrates the Freudian struggle between the ego and the id: what we want and what we know we should not, cannot have.

Whenever I’ve encountered a visual representation of the Sirens, they are always, for lack of a better word, hot. Seductive. From Shakespeare to Ralph Ellison and down through literature, the Sirens are most often a metaphor for female sexual allure. In James Joyce’s Ulysses, Bloom describes the man who has taken up with Bloom’s wife as “falling a victim to her siren charms and forgetting home ties.”

Given this, it is a bit odd to reconcile the original meaning of the word with how we use it today, to describe the intrusive wail of the device atop ambulances and cop cars. But there’s a connection there, a profound one, and it’s the guiding insight for this book and central to understanding life in the twenty-first century.

Stand on a street corner in any city on earth long enough, and you will hear an emergency vehicle whiz past. When you travel to a foreign land, that sound stands out as part of the sensory texture of the foreignness you’re experiencing. Because no matter where you are, its call is at once familiar and foreign. The foreignness comes from the fact that in different countries the siren sounds slightly different— elongated, or two-toned, or distinctly pitched. But even if you’ve never encountered it before, you instantly understand its purpose. Amidst a language you may not speak and food you’ve never tried, the siren is universal. It exists to grab our attention, and it succeeds.

The siren as we know it now was invented in 1799 by Scottish polymath John Robison. He was one of those Enlightenment figures who dabbled in everything from philosophy to engineering, and he originally intended the device as a form of musical instrument, though that didn’t take. What we think of as the siren didn’t reach its current form and function until the late nineteenth century. In the 1880s, a French engineer and inventor who had created electric (and therefore mostly silent) boats utilized electric-powered sirens that worked to prevent boating accidents. (He even had a boat called La Sirène.) In relatively short order, the technology made its way to land vehicles like fire trucks, replacing the loud bells they’d formerly used to clear the way.

The Sirens of lore and the sirens of the urban streetscape both compel our attention against our will. And that experience, having our mind captured by that intrusive wail, is now our permanent state, our lot in life. We are never free of the sirens’ call.

Attention is the substance of life. Every moment we are awake we are paying attention to something, whether through our affirmative choice or because something or someone has compelled it. Ultimately, these instants of attention accrue into a life. “My experience,” as William James wrote in The Principles of Psychology in 1890, “is what I agree to attend to.” Increasingly it feels as if our experience is something we don’t fully agree to, and the ubiquity of that sensation represents a kind of rupture. Our dominion over our own minds has been punctured. Our inner lives have been transformed in utterly unprecedented fashion. That’s true in just about every country and culture on earth.

In the morning I sit on the couch with my precious younger daughter. She is six years old, and her sweet soft breath is on my cheek as she cuddles up with a book, asking me to read to her before we walk to school. Her attention is uncorrupted and pure. There is nothing in this life that is better. And yet I feel the instinct, almost physical, to look at the little attention box sitting in my pocket. I let it pass with a small amount of effort. But it pulses there like Gollum’s ring.

My ability to reject its little tug means I’m still alive, a whole human self. In the shame-ridden moments when I succumb, though, I wonder what exactly I am or have become. I keep coming back to James’s phrase “what I agree to attend to” because that word “agree” in his formulation carries enormous weight. Even if the demand for our attention comes from outside us, James believed that we ultimately controlled where we put it, that in “agreeing” to attend to something we offered our consent. James was rather obsessed with the question of free will, whether we in fact had it and how it worked. To him, “effort of attention”—deciding where to direct our thoughts—was “the essential phenomenon of will.” It was one and the same. No wonder I feel alienated from myself when the attention box in my pocket compels me seemingly against my own volition.

The ambulance siren can be a nuisance in a loud, crowded city streetscape, but at least it compels our attention for a socially useful purpose. The Sirens of Greek myth compel our attention to speed our own death. What Odysseus was doing with the wax and the mast was actively trying to manage his own attention. As dramatic as that Homeric passage is, it’s also, for us in the attention age, almost mundane. Because to live at this moment in the world, both online and off, is to find oneself endlessly wriggling on the mast, fighting for control of our very being against the ceaseless siren calls of the people and devices and corporations and malevolent actors trying to trap it.

That’s basically the world we’ve built for our minds. Well, maybe not “we,” per se. Our agency in the construction of the business and institutions of the attention age is a matter of considerable debate.

The combination of our deepest biological instincts and the iterative genius of global capitalism means we are subject to an endless process of experimentation, whereby some of the largest corporations in the history of humanity spend billions to find out what we crave and how much of that they can sell us. From inside our own being, attention is what constitutes our very self, but from the perspective of entities outside of us, attention is like gold in a stream, oil in a rock.

My professional life requires me to be particularly consumed by these questions, but I think we all feel this to some degree, don’t we? The alienating experience of being divided and distracted in spite of ourselves, to be here but not present. I bet you could spend day and night in any city or town canvassing strangers and not find a single one who told you they felt like their attention span was too long, that they were too focused, who wished they had more distractions, or spent more time looking at screens. Like traffic, our phones are now the source of universal complaint, a way to strike up a conversation in a barber shop or grocery line. What began as small voices at the margins warning us that the tech titans were offering us a Faustian bargain has coalesced into something approaching an emerging consensus: things are bad, and the technologies we all use every day are the cause. The phones are warbling us to death.

But before we simply accept this at face value and move on with our inquiry, it’s worth poking a bit at this quickly forming conventional wisdom. I mean, don’t we always go through this cycle? Don’t people always feel that things are wrong and that it’s because of kids these days? Or the new technology (printing press, steam engine, et cetera) has been our ruin?

In Plato’s Phaedrus, Socrates goes on a long rant—half persuasive and half ludicrous—about the peril posed by the new technology of… writing: “If men learn [the art of writing],” Socrates warns, “it will implant forgetfulness in their souls: They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder.”

It seems safe to say in hindsight that writing was a pretty big net positive for human development, even if one of the greatest thinkers of all time worried about it the same way contemporaries fret over video games. Indeed, it often feels that for all the legitimate criticism of social media and the experience of ubiquitous screens and connectivity, a kind of familiar neurotic hysteria undergirds the dire warnings. An entire subgenre of parenting advice books and blocking software now exists to manage “screen time” and the mortal peril introduced by our devices into the brain development of children; the broader cultural conversation has taken on all the overdetermined ferocity of a moral panic. In 2009, the Daily Mail alerted its readers to “How using Facebook could raise your risk of cancer.” The New York Post warned that screens are “digital heroin” that turn kids into “psychotic junkies.” “Teens on social media go from dumb to dangerous,” CBS cautioned. And The Atlantic was just one of many to ask the question: “Have smartphones destroyed a generation?” In 2024, social psychologist Jonathan Haidt published The Anxious Generation, which argues that ubiquitous access to smartphones has consigned an entire generation of teens and children to unprecedented levels of depression, anxiety, and self-harm. While some scholars who studied the issue criticized Haidt’s polemic for being overcooked, it was a runaway bestseller, and parents and schools across the country organized efforts to keep phones out of schools, as the book urged.

Some of the most grave and chilling descriptions of the effects of the attention age come from the workers who have engineered it. The hit Netflix documentary The Social Dilemma relies heavily on former Silicon Valley figures like whistleblower and former Google employee Tristan Harris to warn of the insidious nature of the apps mining our attention. Sean Parker, the creator of Napster and one of Facebook’s earliest investors, describes himself as a “conscientious objector” when it comes to social media: “God only knows what it’s doing to our children’s brains,” he has said. He is very much not alone. A New York Times Magazine article from 2018 tracks what the author calls the “dark consensus about screens and kids” among the Silicon Valley workers who themselves helped engineer the very products they now bar their own children from using. “I am convinced,” one former Facebook employee told The New York Times in 2018, that “the devil lives in our phones and is wreaking havoc on our children.”

I’m inclined to agree, but also find myself shrinking more than a little at how much the conversation around the evils of our phones sounds like a classic moral panic. Sociologist Stanley Cohen first coined the term “moral panic” in his 1972 book Folk Devils and Moral Panics, a study of the hysteria that surrounded different kinds of youth culture, particularly the Mods and Rockers in the UK in the 1960s. “Societies appear to be subject, every now and then, to periods of moral panic,” Cohen writes. Some group or cultural trend “emerges to become defined as a threat to societal values and interests; its nature is presented in a stylized and stereotypical fashion by the mass media; the moral barricades are manned by editors, bishops, politicians and other right-thinking people; socially accredited experts pronounce their diagnoses and solutions.”

We can also see this familiar pattern when the target is a new technology rather than a cultural trend or group: excitement and wonder that quickly turn to dread and panic. The cheap printing technology of the late nineteenth century that gave rise to paperbacks and dime-store novels occasioned one critic to decry the genre publisher for “poisoning society . . . with his smutty stories and impure example . . . a moral ulcer, a plague spot, a leper, who ought to be treated as were the lepers of old, who were banished from society and commanded to cry ‘Unclean,’ as a warning to save others from the pestilence.” In 1929, as radio rose to become a dominant form of media in the country, The New York Times asked, “Do Radio Noises Cause Illness?” and informed its readers that there was “general agreement among doctors and scientific men that the coming of the radio has produced a great many illnesses, particularly caused by nervous troubles. The human system requires repose and cannot be kept up at the jazz rate forever.”

The brilliant illustrator Randall Munroe, creator of the webcomic xkcd, captures much of this in a timeline called “The Pace of Modern Life” chronicling the anxiety of contemporary critics about the development of industrial modernity, particularly the speed of communication and proliferation of easily accessible information and its impact on our minds. He starts with the Sunday Magazine in 1871 mourning the fact that the “art of letter-writing is fast dying out. . . . We fire off a multitude of rapid and short notes, instead of sitting down to have a good talk over a real sheet of paper.” He then quotes an 1894 politician decrying the shrinking attention spans: instead of reading, people were content with a “summary of the summary” and were “dipping into . . . many subjects and gathering information in a . . . superficial form” and thus losing “the habit of settling down to great works.” And my personal favorite, a 1907 note in the Journal of Education that laments the new “modern family gathering, silent around the fire, each individual with his head buried in his favorite magazine.”

All of this now seems amusingly hyperbolic, but there are two different ways to think about these consistent warnings and bouts of mourning for what modernity has taken from us. One way is to view it all as quaint: there will always be some set of people who will freak out about the effects of any new technology or media, and over time those people will find out that everything is fine, that the rise of, say, magazines, of all things, doesn’t rot children’s brains or destroy the fabric of family life.

But I don’t think that’s right. Rather, I think these complaints and concerns about accelerating technology and media are broadly correct. When writing was new, it really did pose a threat to all kinds of cherished older forms of thinking and communicating. Same too with the printing press and mass literacy, and then radio and television. And it is when a technology is newest, when it’s hottest to the touch, that it burns most intensely.

The very experience of what we call modernity is the experience of a world whose pace of life, scope of information, and sources of stimulus with a claim on our attention are always increasing. At each point up this curve, the ascent induces vertigo. When Henry David Thoreau escaped to Walden Pond in the summer of 1845, it was as a refuge from this precise experience, the invasive omnipresence of modernity and the way it can cloud a person’s faculties. Of our so-called modern improvements, he writes, “There is an illusion about them; there is not always a positive advance . . . Our inventions are wont to be pretty toys, which distract our attention from serious things.”

To achieve clarity about what it means to be human in this specific era, it’s necessary at each moment to ask what’s new and what’s not, what’s being driven by some novel technology or innovation and what’s inherent in human society itself. For example, it’s not a new phenomenon for masses of people to believe things that aren’t true. People didn’t need Facebook “disinformation” for witch trials and pogroms, but there’s also no question that frictionless, instant global communication acts as an accelerant. Also not new: our desires to occupy our minds when idle. Look at pictures of streetcar commuters of the early twentieth century and you’ll see cars packed with men in suits and hats, every last one reading the newspaper, their noses buried in them as surely as modern commuters are buried in their phones. But there’s also no question that the relationship we have to our phones is fundamentally different in kind than the relationship those streetcar commuters had to their newspapers.

In his book on the attention economy, Stolen Focus, writer Johann Hari gets into a bit of this debate with Nir Eyal (author of Hooked: How to Build Habit-Forming Products). Eyal makes the case that the freak-outs about social media are today’s version of the mid-twentieth-century moral panic over comic books, which got so heated there were a series of high-profile Senate hearings into what comic books were doing to America’s youth. All the grave warnings about phones and social media are, he contends, “literally verbatim, from the 1950s about the comic-book debate,” when people “went to the Senate and told the senators that comic books are turning children into addicted, hijacked [zombies]—literally, it’s the same stuff. Today, we think of comic books as so innocuous.”

In the end it turned out comic books weren’t worth the worry, which is why the panic looks silly in retrospect. But that’s another key question, isn’t it? Along with the question of what is and is not new, there’s also the deeper question of what is and is not harmful. It is easy to conflate the two. When tobacco use first exploded in Europe there were those who rang the alarm bells. As early as 1604, England’s King James decried the new habit as “lothsome to the eye, hatefull to the Nose, harmefull to the braine, daungerous to the Lungs, and in the blacke stinking fume thereof, neerest resembling the horrible Stigian smoke of the pit that is bottomelesse.” As hysterical and prudish as that must have sounded at that time, it was 100 percent correct. When I recently watched the incredible Peter Jackson documentary about the Beatles’ Let It Be sessions, the sheer number of cigarettes being inhaled in every recording session was both distracting and unsettling. In 1969, when the Beatles were recording what would become their final released album, there was already substantial research demonstrating that cigarettes were dangerous. It would be another thirty years until culture and law and regulation turned decisively against smoking and the practice started to decline and disappear from most public spaces.

One wonders sometimes if fifty years from now, people will look at footage from our age, with everyone constantly thumbing through their phones, the way I look at Ringo Starr chain-smoking. Stop doing that! It’s gonna kill you! In fact, the surgeon general of the United States has called for social media to come with a mandatory mental health warning label like the ones on cigarette packs. In response, researchers who study teen mental health have pushed back, saying the research just doesn’t justify such a drastic step. The debate over our digital lives, at least as it’s been reflected in the discourse, basically comes down to this: Is the development of a global, ubiquitous, chronically connected social media world more like comic books or cigarettes?

What I want to argue here is that the scale of transformation we’re experiencing is far more vast and more intimate than even the most panicked critics have understood. In other words: the problem with the main thrust of the current critiques of the attention economy and the scourge of social media is that (with some notable exceptions) they don’t actually go far enough. The rhetoric of moral condemnation undersells the level of transformation we’re experiencing. As tempting as it is to say the problem is the phones, they are as much symptom as cause, the natural conclusion of a set of forces transforming the texture of our lives. The attention economy isn’t like a bad new drug being pushed onto the populace, an addictive intoxicant with massive negative effects, or even a disruptive new form of media with broad social implications. It’s something more profound and different altogether. My contention is that the defining feature of this age is that the most important resource—our attention—is also the very thing that makes us human. Unlike land, coal, or capital, which exist outside of us, the chief resource of this age is embedded in our psyches. Extracting it requires cracking into our minds.

We all intuitively grasp the value of attention, at least internally, because what we pay attention to constitutes our inner lives. When it is taken from us, we feel the loss. But attention is also supremely valuable externally, out in the world. It is the foundation for nearly all we do, from the relationships we build to the way we act as workers, consumers, and citizens.

To illustrate, let’s do a little thought experiment. Let’s say tomorrow you decide to run for local office. After you google around to learn what paperwork you need to file, how many signatures you need, and what the deadlines are, you’ll have to figure out two main things: how to raise money, and how to let voters know who you are. You’ll probably start with your social network for both tasks: neighbors, friends, and relatives. You might host events, stand out on a street corner, go to local farmers’ markets or bowling leagues or subway platforms to shake hands and introduce yourself. You’ll need a staff, a message, campaign signs, positions on the issues, and on and on. But in all cases, what you need to win is other people’s attention. It is necessary for anything else that happens in a successful campaign.

Or let’s say you want to start a business. During the pandemic you developed a specialty chocolate chip cookie recipe with a hint of habanero chiles for heat, and everyone who tries it loves it. You’re going to have a bunch of logistical challenges that will keep you very busy— how to incorporate, acquire the right equipment, maybe secure a business loan. But ultimately, you’re going to end up in the same place as a political campaign: How do you let people know that you have cookies to sell? How do you get people’s attention? Answering this question is the foundation for a shockingly wide array of modern human endeavors—from getting a job to finding a date.

Attention is a kind of resource: it has value and if you can seize it you seize that value. This has been true for a very long time. Charismatic leaders and demagogues, showmen, preachers, great salespeople, marketers, advertisers, holy men and women who rallied disciples, all have used the power of attention to accrue wealth and power. What has changed is attention’s relative importance. Those who successfully extract it command fortunes, win elections, and topple regimes. The battle to control what we pay attention to at any given instant structures everything from our inner life (who and what we listen to, how and when we are present to those we love) to our collective public lives (which pressing matters of social concern are debated and legislated, which are neglected; which deaths are loudly mourned, which ones are quietly forgotten). Every single aspect of human life across the broadest categories of human organization is being reoriented around the pursuit of attention.

How did it get this way? Toward the end of the twentieth century, many wealthy nations began moving from an industrial, manufacturing economy to a digital one. In 1961, six of the ten largest US companies by assets were oil companies. The assets these companies controlled— fossil fuels—were the single most valuable resource in the postwar global order. Alongside fossil fuel companies were car companies like Ford Motor and industrial behemoths like DuPont.

Today, Forbes’ list of the largest US companies is dominated by banks and tech firms: Microsoft; Apple; Google’s parent, Alphabet; Meta; and Amazon. The central locus of economic activity has moved from those firms that manipulate atoms to those that manipulate bits. We tend to think of the rise of this new form of economic production as being dependent on information and data. “Data is the new oil” has become a kind of mantra of the age; those who control large stores of information are the power brokers of our time. This view is not completely wrong; information is vitally important. But it crucially misstates what’s both so distinct and so alienating about the era we’ve entered. Information is the opposite of a scarce resource: it is everywhere and there is always more of it. It is generative. It is copyable. Multiple entities can have the same information. Think for a moment about your personal data, information about who you are and what you like. Maybe there are half a dozen firms that have it or maybe there are a hundred, or maybe a thousand, and while it might have some effect on you in terms of which advertising you get, you don’t really know and functionally it doesn’t really matter.

But if someone has your attention, you know it. It can’t be in multiple places at once, the way information can.

If I put a picnic table in my backyard and my neighbor steals my idea by putting a picnic table in his own backyard, that doesn’t change my experience very much. But if my neighbor steals my picnic table, well, then, he’s made my life a lot worse. The brilliant legal scholar Lawrence Lessig uses that example to illustrate the difference between intellectual property and physical property, but it’s also a good way to think about the difference between information and attention. Information is the idea of the picnic table; attention is the actual picnic table.

I’m going to discuss the relationship between information and attention a lot more over the course of this book, but for our purposes here at the start, the axiom I want to drive home is that information is infinite and attention is limited. And value derives from scarcity, which is why attention is so valuable.

So if we return to the largest corporations of our time, the list is dominated not by information companies but, more accurately, by finance and attention companies. Apple is the company most singularly responsible for inaugurating the attention age with its 2007 introduction of the iPhone. Microsoft runs the operating system on which hundreds of millions of people spend their attention all day long, along with another attention magnet, the Xbox gaming console. Alphabet runs YouTube, as well as the internet’s largest advertising network, which profits from our attention. Meta and the Chinese social media company Tencent (which makes WeChat, the largest social network in China) similarly convert eyeballs into cash.

Amazon is also on the list of largest companies and is the world’s largest online retailer outside China, but even to call Amazon a “retailer” misstates the source of its market power. Amazon is an attention and logistics company, and the products it sells are an afterthought. You see this anytime you search for a product on Amazon and are confronted with dozens of nearly identical versions, all produced by companies you’ve often never heard of, in places you couldn’t name, primarily competing for the attentional space at the top of the search results, attentional space that Amazon owns. In many cases, Amazon has seen which products dominate that attentional space and then started producing them itself, cutting out the middleman.

Amazon is the most extreme example of how in the attention age, even the sale of stuff to consumers has more to do with getting their attention than making the stuff itself. The basic model of industrial age advertising was that a firm developed a product or service and then sought to advertise and market it, to capture people’s attention as a means of introducing them to the firm’s wares. But there’s another model, also present from the early days of the industrial age, which is the snake oil and supplements model. In the snake oil model, the attention and marketing are the most important part of the enterprise—capturing the imagination of customers—and the product is an afterthought, in fact often outright fraudulent.

As global incomes rise, and the variety of consumer choices expands accordingly, attentional competition becomes ever more ferocious. We’re seeing the relative emphasis between these two models shift rapidly. In so many instances, the ability to grab the attention of the consumer is more important than the actual product or service offered.

At the dawn of this era of globalization, the post–Cold War expansion of global capital and the rapid reduction of trade barriers, Naomi Klein published her instant classic, No Logo. In it, she argued that the new version of capitalism, which increasingly outsourced production to China and the Global South, meant that the relationship between product and brand had grown more and more attenuated. The brand now stood as the dominant feature of the product: not the shoe itself but the little swoosh stitched onto it. “Since many of today’s best-known manufacturers no longer produce products and advertise them,” she writes, “but rather buy products and ‘brand’ them, these companies are forever on the prowl for creative new ways to build and strengthen their brand images. What these companies produced primarily were not things but images of their brands. Their real work lay not in manufacturing but in marketing.”

This is so ubiquitous in contemporary capitalism that we don’t even notice it. And in the attention age, it’s been taken to its logical conclusion. We have our habits, which are remarkably durable—the dish soap we like, the toilet paper, the dog food, the toothpaste, and on and on up to the brand of cars we tend to buy. And we tell ourselves, to the extent we tell ourselves anything, that this loyalty is to the product. Occasionally, though, some catastrophe or disruption will remind us just how undifferentiated the actual “products” behind the brands are.

In 2007, a Canadian pet food company called Menu Foods was subject to a recall after some of its food was contaminated by a chemical called melamine; the food was causing illness and even death in the cats and dogs eating it. This was bad enough. Even worse was that Menu Foods had had a hand in producing pet food in its contaminated Chinese factory for just about every single pet food brand in the US. Nearly all of the big recognizable food conglomerate brands, from Colgate-Palmolive to Procter & Gamble, were using Menu Foods, not to mention nearly all of the “generic” brand pet food available at Safeway and Kroger and other stores.

In other words, whatever pet food you were buying, you were getting a pretty damn similar product. “In total, the Menu Foods recall covered products that had been retailed under a phenomenal 150 different names,” Barry Lynn writes in his book about modern monopoly capitalism. “Perhaps even more disturbing, especially for those pet owners who had been spending their dollars on a premium product, was that the recall revealed that high-end, expensive brands like Iams and Hill’s Pet Nutrition Science Diet rolled off the exact same Menu Foods packing lines as the cans that were wrapped in labels bearing such names as Supervalu and Price Chopper.”

What is a brand? At the most basic, perceptual level, it is simply a set of markings—physical identifiers like the swoosh or the three stripes—that the consumer notices. It grabs your attention. That’s all. A brand is its own kind of siren. If you hear its call amidst the background noise, see its lights flashing against the backdrop of the grocery store aisle, it’s done its job.

Though she wasn’t quite putting it in these terms, what Klein identified is the process by which the attention economy eats the real economy. The bulk of the value of the enterprise of, say, Nike is in its central attentional holding—its instantly recognizable swoosh—not in the technical know-how or factors of production (supply chains, factories, access to labor) that industrial firms of an earlier era would view as their central source of value.

It is not just commercial life that is driven by the extraction of attention. Increasingly, social life, public life, and political life are dominated by it as well. In the nineteenth and twentieth centuries, wage labor and urbanization utterly transformed the contest for attention in politics. As democracy spread across rapidly industrializing Europe, a recognizably modern mass public took shape. Public opinion mattered more than ever, and “what the public thought” was largely determined by which issues people paid attention to and which they didn’t, which candidates they recognized and which remained strangers.

On top of that, as society grew orders of magnitude more complex, the sheer number of issues presenting themselves with a claim to a citizen’s attention exploded as well. In 1925, the critic Walter Lippmann pointed out that the duties citizens inherited in the twentieth century were overwhelming even for the most educated and informed people like the author himself. “My sympathies are with [the citizen],” Lippmann wrote, “for I believe that he has been saddled with an impossible task and that he is asked to practice an unattainable ideal. I find it so myself for, although public business is my main interest and I give most of my time to watching it, I cannot find time to do what is expected of me in the theory of democracy; that is, to know what is going on and to have an opinion worth expressing on every question which confronts a self-governing community.”

It was the same year Lippmann published these words in his book The Phantom Public that Europe watched the rise of charismatic fascist dictator Benito Mussolini, who unburdened the Italian citizenry from the onerous labor they’d been tasked with by offering instead a cult of personality. “Under . . . Fascism there appears for the first time in Europe a type of man who does not want to give reasons or to be right, but simply shows himself resolved to impose his opinions,” wrote Spanish intellectual José Ortega y Gasset in The Revolt of the Masses. “Here I see the most palpable manifestation of the new mentality of the masses, due to their having decided to rule society without the capacity for doing so.”

The experience of charismatic demagogues and genocidal world war in the twentieth century left an entire generation of intellectuals to wonder how compatible mass media and mass democracy truly were. Though they didn’t necessarily conceptualize it in these terms, they were wrestling with the ability of mass media—sometimes, but not always, in the hands of tyrants—to successfully monopolize attention, and therefore control of a nation: Did the presence of mass media itself extinguish the individual conscience that made human decency possible? “It is not an exaggeration,” wrote Pope Pius XII in 1950, “to say that the future of modern society and the stability of its inner life depend in large part on the maintenance of an equilibrium between the strength of the techniques of communication and the capacity of the individual’s own reaction.”

The TV age spawned dire warnings, from Marshall McLuhan to Neil Postman, that the broad narcotic effect of the new device was making the public stupider, duller, and less capable of self-governance. “Americans no longer talk to each other, they entertain each other,” Postman wrote. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”

But all of that was a prologue for the attention age. Attention has never been more in demand, more contested, and more important than it is now.

Unlike, say, oil, a chemical compound buried in the earth, attention cannot be separated from who we are and what it means to be alive. In fact, attention is the most fundamental human need. The newborn of our species is utterly helpless. It can survive only with attention—that is, if some other human attends to it. That attention will not itself sustain an infant, but it is the necessary precondition to all care. If you neglect a child, it will perish. We are built and formed by attention; destroyed by neglect. This is our shared and inescapable human fate. Now our deepest neurological structures, human evolutionary inheritances, and social impulses are in a habitat designed to prey upon, to cultivate, distort, or destroy that which most fundamentally makes us human.

Which lives we protect and how depends ultimately on which deaths we pay attention to: if ten passenger planes went down tomorrow, for example, all airlines would be grounded. But during the Covid years we came to tolerate an equivalent death toll on a random winter Wednesday. If Al Qaeda sent roving bands of hit squads into nursing homes to live stream the murder of seniors, our societal response would be, I think it’s fair to say, significantly more strenuous, heated, and focused than our collective response to deaths on this scale from an invisible virus happening behind closed doors and away from camera phones.

In fact, the greatest civilizational challenge humans face or have ever faced, the warming of the planet from human activity, has proven so difficult to solve in large part because it evades our attentional faculties. “It’s always been a problem,” legendary writer and climate activist Bill McKibben once told me, “that the most dangerous thing on the planet [CO2] is invisible, odorless, tasteless, and doesn’t actually do anything to you directly.” At least not until it’s too late.

I know just how capricious public attention can be. For more than a decade, I have hosted an hour-long cable TV show on MSNBC. My background is in print journalism, but in my role as TV anchor my primary job is to sustain a baseline level of viewer attention that keeps the show on the air. This is my first professional obligation, prior to any other higher-level calculations.

It is from this perilous position, which feels as if it can collapse at any time, that you develop an almost sensory perception of the attentional vibrations of the audience, the way a surfer learns the timing of a wave—when it swells, when it crashes.

When, to take one fairly recent example, Russia invaded Ukraine, we went into breaking coverage and took no commercial breaks. Then after a few days, we started taking breaks. After several weeks, we started doing other stories mixed in with the war in Ukraine, but we were still leading with the Ukraine story. Then about a month in we started to sometimes lead with other stories. At forty-five days from the invasion’s start, we would occasionally do a show that didn’t have the war in it at all. A year later we’d go weeks or months without covering the war. The attention was all burned up.

What I’m describing is not particular to the war in Ukraine. It describes the natural life cycle of any news event. Depending on how big a deal the event is, that cycle can be quite short—breaking coverage for a few hours, and then things move on—or it can last several months.

The dynamics of how big these cycles are, how long they last, what kind of event qualifies to kick them off, are extremely difficult questions that I’m going to spend quite a bit of time exploring. But if there’s one thing I’ve learned from my last decade on air it’s that they are fundamentally driven by and respond to mass attention.

This may seem obvious, but it’s actually a somewhat controversial claim. There’s a notion many people have—one I encounter often, in fact—that the contours of news cycles and mass attention are essentially directed from above. The corporate media decides what people should pay attention to, and then—using its bag of tricks to trap our attention, like, for instance, the BREAKING NEWS banners—it directs attention toward those stories.

Though it’s closely associated with linguist and social critic Noam Chomsky, this isn’t just a left-wing critique. Or at least not anymore. Those in America who distrusted the Ukrainian government and NATO, or outright supported the Russian invasion, saw a great and nefarious conspiracy in the sudden attention devoted to Ukraine’s desperate condition. Fox News hosts called the US warnings of an imminent Russian invasion a “ruse,” and asked if it was an attempt to “take everybody’s attention away from what Hillary Clinton did and what we know to be a complete hoax over this Russia investigation?” Tucker Carlson told his audience, “The morning that Russia invaded Ukraine, you may have been talking about a lot of different things. Covid, or crime, or the southern border. Not anymore. Much to the relief of the White House, all of those topics have been forgotten, maybe forever.”

I can say from hard-won experience that covering the news, particularly large mass enterprises like cable news, involves chasing audience attention much more than leading it. For the most part, those who work inside the attention industry are stalked by fear that people will stop paying attention, that our tricks won’t work, that we’ll be ignored. This fear creates all kinds of negative effects and behaviors, not the least of which is a kind of herding instinct. But the core insight, one that comes from trial and error, is that attention is hard to direct, hard to attract, and hard to control. People whose jobs depend on capturing it know this more than they know anything else.

When I got my own TV show, I imagined it as something akin to the experience of first-time car ownership. I could drive wherever I wanted to drive, and while I would have to obey the law, I just had to figure out where I wanted to go and push the pedal. I could cover whatever I thought was most important, whenever I wanted, for as long as I wanted.

I learned quickly that it doesn’t work like that.

A cable news show is powered by attention. It has no internal combustion engine to make it go. Yes, you can cover whatever you desire night after night, but if no one watches it, the show will be canceled. This is what almost happened to me.

After a lot of trial and error, I now view audience attention as something like the wind that powers a sailboat. It’s a real phenomenon, independent of the boat, and you can successfully sail only if you harness it. You don’t turn the boat into the wind, but you also don’t simply allow the wind to set your course. You figure out where you want to go (in the case of my show, what I think is important for people to know), you identify which way the wind is blowing, and then, using your skills and the tools of the boat, you tack back and forth to manage to arrive at your destination using that wind power.

This experience gives me a certain perspective on how attention functions, I think. Every waking moment of my work life revolves around answering the question of how we capture attention. And it just so happens that constant pursuit of others’ attention is no longer just for professionals like myself. Indeed it has been democratized to include every teen with a phone.

This rearrangement of social and economic conditions around the pursuit of attention is, I’m going to argue, a transformation as profound as the dawn of industrial capitalism and the creation of wage labor as the central form of human toil. Attention now exists as a commodity in the same way labor did in the early years of industrial capitalism. What had previously been regarded as human effort was converted into a commodity with a price. People had always “worked” in one way or another, but that work was now embedded in a complicated system that converted the work into a market good. This transition from “work” to “labor” was, for many, both punishing and strange. The worker, Karl Marx observed in Economic and Philosophic Manuscripts of 1844, “does not feel content but unhappy, does not develop freely his physical and mental energy but mortifies his body and ruins his mind. The worker therefore only feels himself outside his work, and in his work feels outside himself.”

This was the fundamental insight of Marx’s theory of labor and alienation: that a social system had been erected to coercively extract something from people that had previously, in a deep sense, been theirs. Even today, those words feel fresh. The sense of dislocation and being outside oneself. The inability we feel, even amidst what is ostensibly boundless choice and freedom—What do you want to watch tonight, babe?—to “develop freely” our mental energy. The trapped quality of the worker caught in a system he did not construct and from which he cannot extricate himself.

The epochal shift to industrial capitalism required what Marx described as the commodification of labor. Labor—what we do with our bodies and minds, the product of our effort and exertion—is quite an alienating thing to have turned into a market commodity. The transmutation of what had always been “work” or “things humans did for specific purposes” into “labor” as a category of activity with a price required an entire transformation of the structure of society and the daily experience of human life.

Indeed, in order to extract labor from a person, you need to compensate them through wages or coerce them into it or use violence—such as the overseer’s whip—to force it out of them. All these methods have been used. But the extraction of our attention happens in a different way. People can be forced to work in all kinds of cruel and oppressive ways, but they cannot be forced to do it purely through the manipulation of their preconscious faculties. If someone puts a gun to your head and tells you to dig a ditch, you know you are being coerced. If someone fires a gun in the air, your attention will instantly shift to the sound even before you can fully grasp what’s happening. Attention can be extracted from us at the purely sensory level, before our conscious will even gets to weigh in. In fact, this is how a siren functions.

Centering attention as a resource and understanding both its existential primacy and its increasing social, political, and economic domination is the key to understanding a lot of disparate aspects of twenty-first-century life. Attention is prior to other aspects of speech and communication we associate with power—persuasion, argumentation, information. Before you can persuade you must capture attention: “Friends, Romans, countrymen, lend me your ears!” Before you inform, insult, seduce, or anything else, you must make sure that your voice doesn’t end up in the muted background static that is 99.9 percent of speech directed our way. Public discourse is now a war of all against all for attention. Commerce is a war for attention. Social life is a war for attention. Parenting is a war for attention. And we are all feeling battle weary.

This book is an attempt at finding peace.



Thanks, everybody, for listening.

I should note at the top — I realized I should have said this earlier — I am on book tour. I’m going to be going to lots of places. The first week, I’ll be in Washington, D.C., on January 29; in Boston, Massachusetts, on January 31 at Harvard Book Store; and at Temple Emanu-El Streicker here in New York City on February 1.

And then I go out to the West Coast. I will be in Palo Alto and San Francisco on February 3, Town Hall Seattle February 4, San Diego February 5. All of this information is at sirenscallbook.com. You can buy tickets to the events there.

I would also love to hear from you. We were thinking that, if people read the book and are interested in talking or writing about it, we might do some kind of book club here for the WITHpod listeners. So write to WITHPod@Gmail.com if you’re interested in that.

Get in touch with us using the hashtag #WITHPod. You, of course, can follow me across, well, a bunch of places, Threads, Bluesky and what was — used to be called Twitter. My handle is ChrisLHayes.

Be sure to hear new episodes every Tuesday. “Why Is This Happening?” is presented by MSNBC and NBC News, produced by Doni Holloway and Brendan O’Melia, engineered by Bob Mallory and featuring music by Eddie Cooper. Aisha Turner is the executive producer of MSNBC Audio.

You can see more of our work, including links to things we mentioned here, by going to NBCNews.com/WhyIsThisHappening.


© 2025 Versant Media, LLC