Last week, the CEO of Getty Images pushed all AI-generated pictures off the platform, but the real reason he did it isn’t the reason he shared. Center strategic advisor Brad Berens explains.

Images of fake faces generated with artificial intelligence.

The great Twentieth Century polymath Herbert A. Simon had a trenchant observation about decisions: Before we make a decision — and I’m wildly paraphrasing here — we need to make a decision about the decision that we need to make.

That might sound alarmingly like a Russian nesting doll exercise, or a science fiction cliché where an infinite loop appears, we fall into it, and then we’re trapped forever like Lazarus in that original Star Trek episode, but it’s pretty simple. The prior decision concerns how much effort we’re going to put into the decision. What are the stakes? Are we going to optimize or satisfice?

“Satisfice” is Simon’s neologism for “make satisfactory.” Readers who took high school Latin already know that “satisfactory” comes from the Latin words for “to make enough,” as in “good enough but not great.”

Most of us already know what “optimize” means, although it’s now used so often in bloodless business discourse that it has lost some of its original zest, the sense that “this decision is worth thinking hard about, worth the mental sweat that results in the best decision.”

We satisfice most of our daily decisions both because we have to make a lot of them, and also because most of our decisions just aren’t that important. Moreover, we have a limited amount of decision-making energy each day; if we squander it on what shoes to wear, then we’ll find ourselves making crappy decisions later.

What does any of this have to do with AI-generated images?

I’m so glad you asked.  (more)

With polarization at a dangerous high, is the United States facing a second civil war? Center director Jeffrey Cole digs in.

By Jeffrey Cole

The January 6, 2021 Capitol attack.

In just the past three months, political tensions have moved perilously close to the breaking point. The Red and Blue visions of what America should be have become even more divided. The Supreme Court’s Dobbs decision eliminated the constitutional right to abortion. For now, a woman can exercise a right in California or New York that could lead, in the South, to a felony conviction or to a forced delivery of a pregnancy (under all circumstances). The lawful search of Mar-a-Lago to recover classified documents has already put the Department of Justice and the FBI on the enemies list for some Americans.

And, in the past week, to score political gains and photo ops in conservative media, the governors of Florida and Texas have transported (at state expense) migrants crossing the border to deep Blue places like Martha’s Vineyard and the front of the Vice President’s residence in Washington, D.C. in order to “own the libs!”

Both sides are now openly frightened about what could happen if Donald Trump is indicted. One side demands it. The other considers it the product of a personal vendetta and a broad overreach of Federal power. This is yet another example of looking at the same thing and seeing something completely different based on which tribe we belong to.

The country will not face the possibility of an indictment and its fallout for another two months, due to the precedent of not making a decision with such enormous political reverberations within two months of a national election. Had James Comey, director of the FBI at the time, honored such a moratorium in 2016, there might not be a former President Donald Trump.

An indictment of a former president (common in Europe, Israel, and much of the world) might well be the thing that fully unleashes our pent-up anger and leads to violence. If the Proud Boys have been “standing by,” then this is why.

Looking back years from now, we might see January 6, 2021, as the dress rehearsal for what may be coming.  (more)

New forms of live media experiences are cropping up at different scales, says Center strategic advisor Brad Berens, but what are the differences between live and on-demand anyway, and why does it matter?

Since the arrival of the VCR and DVR, home video, VOD, podcasting, and streaming, the last few decades have shifted our media consumption from live shared experiences in real time (synchronous) to on-demand experiences on our own time (asynchronous).

While this shift to on-demand makes what we watch, when we watch, and where we watch more convenient, it also causes the decline of shared, real-time media experiences. This decline has other costs like a loss of what Russian philosopher Mikhail Bakhtin called “eventness” (sobytiinost’) and what science writer Daniel Goleman calls emotional looping.

I suspect that another cost is the increased polarization that plagues the world: without shared, real-time experiences, how can we understand that other people can see the same things differently than we do?

These are high stakes, so it has been gratifying over the last few years to watch the growth and evolution of new live experiences.  (more)

A price sticker triggers a trip down memory lane for Center strategic advisor Brad Berens…and an explanation about how Experience Stacks can help us to understand how physical objects function in our memories.

One of my ongoing topics in this column is what I call Experience Stacks, which are the improvisational things that customers do over time with and around the things that companies make.

Experience Stacks are hard to talk about because we know their general shape but not their specific contents.

It’s a bit like the famous pointillist painting, “A Sunday Afternoon on the Island of La Grande Jatte” by Georges Seurat. The closer you get to the canvas, the less sense the picture makes as it dissolves into tiny dots: to see the whole you need to stand back.

Experience Stacks are also hard to talk about because they apply idiosyncratic external context to a shared experience. These applications overlap with fan references (e.g. Easter Eggs in DVDs) or broadly recognizable cultural references, but they aren’t the same.

In earlier columns, I’ve mostly used Experience Stacks to talk about media experiences (movies, television, social media), but the idea of an Experience Stack is also a useful tool when talking about physical objects.  (more)

A study suggests that inoculating internet users against misinformation might be more successful than fact checking later, but Center strategic advisor Brad Berens is not optimistic that this will help much in the fight for truth in journalism.

A new study in the journal Science Advances, “Psychological inoculation improves resilience against misinformation on social media,” suggests that “prebunking” online misinformation is more effective, cheaper, and more scalable than trying to debunk misinformation via fact checking once people have already seen it. Prebunking tries to inoculate audiences ahead of time, like a vaccine, while debunking fights misinformation after exposure, like antibiotics.

The study provoked a small flurry of articles and posts.

The study finds that people who watch videos explaining various misinformation techniques (emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks) are less susceptible to those techniques when they later encounter them.

This seems like happy news because it means that if we can flood the internet with witty videos explaining how to spot misinformation, then we can stop misinformation from spreading.

I am not so optimistic for two reasons.  (more)

…if you don’t want to get much sleep. Although the privacy issues are paramount, says Center strategic advisor Brad Berens, another problem with companies compiling vast amounts of information about us is that we don’t know what they know.

Companies spying on Americans for our entertainment and their profit is nothing new.

How else can we understand Candid Camera, the show that for nearly 70 years put people in embarrassing situations, filmed those situations, and then broadcast the footage on TV?

One of the most popular things I’ve ever written online was a single sentence piece called “You Are Where You Live.” It featured a link to a site where you could enter your zip code and get the disturbingly accurate Claritas/Prizm profile for your neighborhood. This was when I was the digital editor at EarthLink, the ISP, and ran a weekly newsletter called eLink.

“My TiVo Thinks I’m Gay.” Back in the early 2000s, a story circulated about a straight guy who liked a show that straight guys don’t typically like. His TiVo (remember TiVo?) concluded that he was gay and started recording gay-themed programming. The guy then started recording more straight-seeming content (sports, military history) in an effort to change the TiVo’s mind. This became a plot in a couple of sitcoms. (Here’s a WSJ summary from 2002.)

Relatedly, for years the Amazon algorithm must have thought that I suffer from Sybil-like dissociative identity disorder because my entire family uses the same Amazon Prime account, which is under my name. This also used to be the case with Netflix, but now the streaming service has profiles for different users.

In 2012, Charles Duhigg reported a New York Times story about how an angry father confronted the manager of his local Target because his teen daughter had started receiving coupons for first-trimester pregnancy products. It turned out that the daughter really was pregnant but hadn’t shared the news with Dad. Target had data scientists tracking purchases (like unscented skin lotion) closely associated with early pregnancy and then sending coupons. What’s unsettling is that Target didn’t stop doing this sort of thing after the incident: instead, the retailer started hiding pregnancy-related coupons inside a “nothing to see here” mass of other coupons.  (more)

The Gray Lady blew it when it decided to review Jared Kushner’s new memoir, no matter how scathing the review. Your correspondent also blew it by posting about the review.

By Brad Berens

When people learn that I’m an atheist, often the first thing they say is, “oh, so you don’t believe in God?”

“No,” I push back gently. “That’s not what atheism means.”

Defining atheism as “you don’t believe in God” keeps God at the center of the conversation. It reinscribes God’s importance, defying God’s authority but accepting it as the thing to defy. This is the territory of literary characters like Milton’s Satan and Marlowe’s Dr. Faustus who try to rebel against God and fail.

In contrast, what atheism really means is “I am skeptical about the value of belief in a supreme being.” It asks, what does the notion of a supreme being get people in the first place? Are the benefits worth the cost either to the individual or to society? Atheism puts aside the questions around the actual existence of a supreme being because there’s no scientific evidence either way. “Absence of evidence is not evidence of absence,” as the saying goes, but neither is it evidence of presence.

Stepping away from the unanswerable (scientifically speaking) question of a supreme being’s existence allows us to ask different questions.

The difference between the conventional definition of atheism and the one I’m describing here is like the difference between immoral and amoral. An immoral person does things that he or she believes to be wrong (and feels guilty about it). An amoral person doesn’t let moral questions get in the way of doing things (and feels no guilt).

Don’t get me wrong: there is no connection between amorality and atheism besides the Latin prefix “a.” I’m a deeply moral person—I believe in right and wrong in the world and strive to do right things. I am also an atheist.

Ideally, we’d have separate words for the two things I’m talking about: anti-deity (for the people who reject God’s authority but accept God’s existence) and atheism (for the people like me who question the benefit for individuals and society of a belief in a supreme being).

By now, you’re probably wondering what any of this has to do with The New York Times, which is fair.  (more)

We heard rumors last week that the retail giant wants to add a streaming service subscription to Walmart+, its rival to Amazon Prime. Walmart needs to think bigger.

By Brad Berens

Note: I wrote and first published the following column on Sunday, August 14, before the rumors came true the following day: Walmart had signed an agreement with Paramount. You can find a review of that news here. However, nothing about the news changes my argument that Walmart is missing a bigger opportunity, which is the topic of what follows.
__________

We humans are cognitive misers, which is a term psychologists use to describe how we think as little as possible about the things we don’t want to think about. When psychologist Daniel Kahneman talks about System 1 and System 2 in his famous book Thinking, Fast and Slow, nimble System 1 is the cognitive miser side of the mind. It uses shortcuts to get at the fastest answer rather than the best answer to a question.

I suspect that only about 10% of advertising depends on System 2, the lazy-but-methodical, data-driven side of the mind that sifts data, builds pros and cons lists, and makes good decisions but not fast ones. That first 10% (System 2) hacks its way into our awareness, and then the remaining 90% (System 1) works to automate that awareness into reflexive, routine use and purchase. Brands, as I’ve discussed before, exist to excuse people from thinking.

Amazon is unparalleled at creating no-brainer value propositions. Amazon Prime encourages us not to think about the $139 annual subscription cost: the service provides so much value (two-day shipping, a video service, a music service, free books and magazines and videogames, discounts at Whole Foods) that it repels thinking like a forcefield. Plus, for just about any individual purchase you know that—even if it’s not the best price—Amazon will have a fair price, and you don’t have to worry that it will take too long for your purchase to get there. (more)

How we experience the work of movie stars is different from how we experience the work of actors, says Center strategic advisor Brad Berens, and that difference also helps to understand what we lose when we spend a lot of time on Facebook.

The job of an actor and the job of a movie star are similar — they overlap — but they are not the same. The actor helps to tell a convincing story, and we forget that it’s an actor pretending to be somebody else. The movie star never lets us forget that we’ve seen that star pretending to be other people at other times.

Sam Rockwell is a terrific actor: for years I did not recognize him from role to role, only to sit up in my seat (“That guy??”) as the end credits unspooled. Tom Cruise is a movie star. Yes, he is also a talented actor who has been nominated for and won many awards, but you never forget that you’re watching Tom Cruise… even when he is buried in makeup and a fat suit like he was as Hollywood agent Les Grossman in Tropic Thunder.

Actors and movie stars create different kinds of immersion that ask us, the viewers, to perform different kinds of cognitive work. With the actor, we are immersed in the story. With the movie star, we move back and forth between being immersed in the story and being immersed in the telling of that story.

Another way of putting this is that the Experience Stack an actor helps us to create is different than the Experience Stack that a movie star helps us to create.  (more)

Data from the Center’s COVID Reset Project suggests that long after the virus disappears, Americans will still be wrestling with the psychic damage of the last few years. Center director Jeffrey Cole explains.

How long does it take before the painful memories of a national trauma fade away?

Long after we settle the sticky issues around return to work (probably with most of us returning to the office most of the time), the mental toll of living through the COVID pandemic (even if we never lost a loved one or caught it ourselves) will linger.

Decades from now, today’s teenagers will recount for their grandchildren the year or more we moved our lives inside, afraid of contact with anyone except those living with us. They will revisit the pain of what we have all lived through.

The Center’s COVID Reset Project shows that the toll on our mental health will be, by far, the most important legacy of COVID. It will cast the longest shadow.  (more)

What are Experience Stacks? And why is it important, asks Center strategic advisor Brad Berens, for businesses and customers in a wide range of industries to understand them?

Many companies refer to their selection and arrangement of software and hardware as a “Tech Stack” that focuses on the creation, management, production, and tracking of business activities.

On the reception side, we can think of the activities that people do over time with and around the things companies make and sell as an Experience Stack.

“Over time” are two important words in that last sentence because Experience Stacks sit between Customer Experience and Brand.

Customer Experience is about in-the-moment usability with a focus on whether or not the customer buys something. Brand is about a synthesis of the rational and non-rational reflexes a user or customer gradually builds up about a product or service. Brands exist to save people from having to think.

In contrast, Experience Stacks organize and connect different moments of thinking. Although synthesis occurs over time, the individual moments stay active in a thinker’s memory. Different thinkers have different, although overlapping, Experience Stacks around the same products.  (more)

Digital technologies crowd out our analog ability to make connections. That’s a problem, says Center strategic advisor Brad Berens, since analogical thinking is what makes us human.

In the middle of the night, Sting’s song “Moon over Bourbon Street” went through my head. I hadn’t thought of it in years, maybe decades. I love Sting, but I hadn’t listened to his music recently. Why did this song wake me up?

Some context: I had opened my eyes to cloudy skies. I was in a sleeping bag, no tent, on a warm summer night. This was day two of a glorious, five-day white water rafting trip on the lower Salmon River in Idaho.

We were off the grid: no electricity, no internet. I bought a solar charger for our phones so that we could continue to use the cameras, but the phones didn’t connect to anything.

“Moon over Bourbon Street” is from Sting’s 1985 debut solo album The Dream of the Blue Turtles. It’s a dramatic monologue told from the point of view of a predator who might be a werewolf, but it’s ambiguous. The song was not the album’s biggest hit (that was “If You Love Somebody Set Them Free”), and there was no reason for it to be in my mind.

Or was there?  (more)

Employers and employees are at loggerheads about whether and how often to come back to the office, and the situation is likely to get worse. Center Director Jeffrey Cole explains.

Five of the seven sectors of life that we track in our COVID Reset Project (communities, shopping, learning, travel and entertainment) are sorted out and resolved as we come out of the pandemic (and we think we are indeed coming out, no matter what anyone at the CDC might have to say about it).

This time: the workplace

Far from being close to settled, a war is brewing over returning to the workplace, with different parties already drawing battle lines. Elon Musk, who is never a good barometer of normal behavior but a good bellwether for change, previewed the war when he said to Tesla workers reluctant to fully return to the office, “Anyone who wishes to do remote work must be in the office for a minimum (and I mean *minimum*) of 40 hours per week or depart Tesla.”

Few workers are ready now, or perhaps ever, to return full-time to the office. The Center’s work in the early days of COVID showed that only 10% wanted pre-pandemic work schedules, while 30% never wanted to return to the workplace. A few months into the pandemic was enough to convince 60% of workers that they would like a hybrid future—coming into the office only some of the time.

Over two years without a workforce in the office was enough to convince most employers that they want everyone back. Some bosses want to walk the hallways and see the troops. Some believe that less work, of worse quality, gets done at home.

This proposition needs to be rigorously evaluated.  (more)

Now that the Tesla CEO is riding off into the sunset, says Center strategic advisor Brad Berens, the social media company needs to skip the protracted court battle and focus on what’s important.

On Friday, Elon Musk made official his desire to wiggle out of his Twitter acquisition.

Many readers kindly and gratifyingly reached out or posted saying “Brad, you called this one!” Why? On April 17, a few days after Musk announced his bid to buy Twitter, I argued that he wasn’t serious. It wasn’t a narcissistic, adolescent bid for attention: it was a savvy earned media play to sell cars. Then, on May 1 after the board accepted Musk’s offer, I doubled down saying he still wasn’t serious.

Even if the market hadn’t tanked and Twitter’s stock price hadn’t dropped to $36.81 (losing nearly half its value from a year ago), Musk still wouldn’t want Twitter. There is no bid to renegotiate: he has never wanted to own the company.

Since Friday, pearl-clutching pundits have busied themselves speculating about how much Musk will have to pay to get out of this. Will it be the $1B kill fee articulated in the bid? But wait! He waived diligence. Will he have to pay the full $44B and still acquire the company? Or pay the difference between his offer price and Twitter’s market cap today? I’m confident that the answers are “maybe” on the first and “heck, no” on the other two.

But who cares?

The important question is not “how much will Elon pay?” but “what should Twitter do next?”  (more)

Five sectors of human life have been changed permanently by the pandemic. Center Director Jeffrey Cole digs into what’s different forever.

As millions of Americans celebrated Independence Day by pouring onto airplanes and heading to beaches, parks, sporting events, and concerts, we also celebrated independence from the COVID pandemic.

This pandemic is over — whether it really is or not. It doesn’t matter what science or common sense say, even as cases and hospitalizations are climbing again.

We need COVID to be over.

At the Center, we believe this is a once-in-a-lifetime disruption (hopefully something worse doesn’t come along). Many years in the future, today’s first graders will regale their grandchildren with stories of scrubbing mail, wearing masks, and quarantining at home for months at a time.

Two months into the pandemic, we launched the COVID Reset Project that tracks seven sectors of life to understand how they may or may not permanently change.  (more)

In 2011, my near-future science fiction novel Redcrosse came out. The action was set in 2023, which is just a few short months from now. How clear was my vision?

By Brad Berens

Last week at a film festival, I was trapped in an endless concessions queue that (bonus!) doubled as an internet dead zone. After I had exhausted small talk with my fellow prisoners (“Wow, long line.” “This is going to take a while…” “Yeah…”), I dug around in the Kindle app on my phone for something to read.

I alighted on Redcrosse, the near-future science fiction dystopia I wrote that came out in 2011. It had been some years since I visited that world and that place in my head. In the intervening time two things happened: first, I could read it without focusing only on the things that I’d change. Second, the story of Redcrosse starts on April 27, 2023, which is just 10 months from now. Gulp. Yikes. Zoinks. But wait…

I had a scorecard! I could see how right and wrong I’d been in my predictions about where life in this country was headed. Hence, this column.  (more)

In 2022, Non-Fungible Tokens (NFTs) are just the latest speculative craze, but if you combine them with other trends and squint, says Center strategic advisor Brad Berens, then you can see a much different future.

I’m an NFT skeptic. They seem like digital litter—cybernetic landfill that will clutter the e-commons like plastic bags blowing across a public park. This skepticism is unusual for me. I’m usually an early adopter, as the elephants’ graveyard of once-exciting, now-vanished tech in my garage will attest.

Blockchain makes sense when it comes to smart contracts, although thinking about cryptocurrencies gives me migraines. But NFTs? Do we really need to keep more things forever? The internet is already written in indelible laundry pen. Even the name “Non-Fungible Token” seems like an idea that can’t quite lurch into focus.

The use case for NFTs today is weak, but some people felt that way about smartphones back around 2007. “I have a perfectly good feature phone. What do I need with all those app things?” Skeptics had to play around, live inside the tech in order to discover things like the endlessly useful flashlight app or built-in kitchen timer that make life easier, let alone having an entertainment center, telecommunications hub, and movie studio that all fit in a pocket.

So when it comes to NFTs I try to bear in mind one of my favorite quotes about the future. It’s from Bill Gates and the 1996 edition of his book, The Road Ahead: “People often overestimate what will happen in the next two years and underestimate what will happen in ten.”  (more)

Sometimes the cost of not showing something horrific is too high. Would gun legislation move faster if people could see the violence of Uvalde and other mass shootings? Center director Jeffrey Cole explores.

The news coming out of Uvalde, Texas in late May was horrific. Nineteen elementary school children and two teachers were murdered by a lone, crazed gunman. Only by entering the deep recesses of a diseased soul of unimaginable evil could we understand how someone could tell a 10-year-old “it is time to die,” then look into their eyes and pull the trigger.

In the aftermath of the tragedy, reporters told the stories of the twenty-one human beings who did not make it out of the classroom. Earlier in the day, several of the fourth graders had been honored at a ceremony for academic distinction. Parents shared heartbreaking details of the lives of the children they would never see again.

From the same reporters we learned of the damage that an AR-15 assault weapon does to a human body. It is manufactured not just to stop someone but to inflict irreparable damage, making recovery nearly impossible. It tears the body apart so much that the families could not make visual identification of the remains.

Instead, the authorities had to rely on DNA and clothing to identify loved ones. The images we conjured in our minds were so devastating that they led to the first bipartisan discussion of even minor revisions to gun laws and policies in twenty years.  (more)

What makes things special, memorable, satisfying, says Center strategic advisor Brad Berens, often has less to do with the things themselves than with the context where we experience them.

Some mysteries are eternal.

If the Coyote can afford all those expensive items sold by the Acme Company, then why doesn’t he just visit a desert KFC to eat plumper poultry than the scrawny Roadrunner? Why doesn’t Charlie Brown ask somebody more trustworthy to hold the football? Why does Mr. Darcy jump in that river? Why can’t a genius Professor who can make internal combustion engines out of bamboo fix a boat to escape from Gilligan’s Island? And why do AirBNB hosts always have thimble-sized coffee mugs in their kitchens?

That last one gets me every time.

I prefer coffee mugs large enough to double as hot tubs. More times than I care to confess, when staying at an AirBNB, I’ve dashed to a nearby store that sells adequately-sized mugs and purchased one so that the rate of my early morning coffee intake need not change.  (more)

A new campaign by Check My Ads to get advertisers to stop supporting the conservative news network, reports Center strategic advisor Brad Berens, prompted an entirely inadequate response.

On Thursday, the folks at Check My Ads received widespread coverage about their new campaign to stop advertisers from supporting Fox News. The three Check My Ads founders—Claire Atkin, Nandini Jammi, and Mikel Ellcessor—believe that Fox News has created and disseminated disinformation about the 2020 Presidential Election and other topics. (You can see representative coverage here and here.)

Regardless of whether or not you agree with Check My Ads, the statement Fox News put out in response was inadequate.

The Check My Ads Campaign

What’s unusual about Check My Ads is that they’ve gone after online advertising exchanges instead of advertisers themselves or their advertising agencies.  (more)

With most streamers already accepting advertising and Netflix joining by the end of this year, what does that mean for programming? Will creativity be stifled to satisfy the demands of advertisers? Center director Jeffrey Cole digs in.

The party’s over.

Imagine The Sopranos, Game of Thrones, or even the recent four-hour tribute to George Carlin appearing on broadcast television. All were on HBO. Carlin’s most famous routine is “the seven words you can never say on television.” Even The Marvelous Mrs. Maisel or Ted Lasso could not be shown on network television without so many edits for language or nudity that they would be unrecognizable to fans.

We pay up to $20 a month to watch content that is not censored and only available on pay-cable or streaming. The fact that it is not interrupted by ads is important, but it is the lack of a nanny deciding what we are allowed to see and hear that has really driven the rise of pay channels.

All that is about to change. (more)

Lots of walled gardens and videogame platforms are now touting themselves as part of the metaverse, says Center strategic advisor Brad Berens, but there’s an easy way to tell if it’s true. Plus, revisiting Neil Postman’s “Amusing Ourselves to Death” in our digital age.

Two shorter (although slightly connected) main stories this week…

1. Revisiting Neil Postman’s “Amusing Ourselves to Death”

If you subscribe to Audible, then you should know that terrific audio content comes as part of the subscription—originals, podcasts you can’t get anywhere else, and audiobooks. Right now, one of the included-with-subscription audiobooks is Neil Postman’s 1985 masterpiece, Amusing Ourselves to Death: Public Discourse in the Age of Show Business. (You can find the paper/ebook version here.)

Descriptions like “prescient” or “ahead of its time” don’t capture the importance of Postman’s thinking. Re-reading or listening to the book today helps to explain Fox News, Trump, and our hyper-polarized society.

Postman’s argument is that, as the USA moved from a print-based culture to a television-based culture, American attitudes toward news shifted from a desire for information to a desire for entertainment. Americans had been afraid of the world turning out like the dystopian dictatorship of Orwell’s 1984, but it had really turned out like the self-medicating society of Aldous Huxley’s Brave New World.  (more)

A handshake, says Center strategic advisor Brad Berens, is worth a thousand Zoom calls. This has implications for going back to the office, building corporate culture, and democracy.

You’re on a short elevator ride with one other person. Neither of you speaks, but you get a lot of information.

Does the other person politely keep a distance? Make momentary eye contact? If you’re a woman and the other person is a man, does he look at parts of your body in a creepy way?

If you’re a black guy, do you see the other person uncomfortably pull a bag closer or shift a package to the other side, thinking that you don’t notice the racism?

If you can smell the other person, then is it because she or he just went running? OK, the person is health conscious. Can you smell perfume or cologne? OK, the person is going out. If the person smells bad, then your Spidey sense tingles: am I trapped in an elevator with somebody who isn’t stable?

None of that information comes through on Zoom.  (more)

The recent cryptocurrency downfall eerily echoes the bursting of the dot-com bubble two decades ago — two moments when disruption failed or at least appears to have failed. Why do many cheer when they believe disruption has failed?  Center Director Jeffrey Cole explains.

“Thank God I never invested in Crypto!”

“See, it was always a scam. Now the laws of nature have been restored.”

“Those who thought they would get rich quick have gotten what they deserved!”

“Now it’s back to business as usual.”

“Maybe the name Crypto.com Arena will be removed, and it will be Staples Center again.”

It has not been a good year for investors. Even Amazon and Apple — and especially Netflix and Facebook — are down, more than just an adjustment.

But the year has been particularly harsh for those who invested in cryptocurrencies like Bitcoin, whether they saw them as extraordinary investments or as a new, digital way of moving money around that would eventually become a currency.

This month, cryptocurrencies lost over $200 billion in one day. By some estimates the world of crypto has lost over $3 trillion from its highest point. These are investments that can grow by 90% one day and be down 80% the next day. Perhaps this is just a blip on their way to economic dominance.

But that’s unlikely.  (more)

The latest phase of the digital revolution, says Center strategic advisor Brad Berens, is a Read/Write/Own structure where more culture creators can join a new Artistic Middle Class…maybe.

Calling something “Web3” makes it sound like everybody agrees on what it means. That’s not the case: we’re at the start of our Web3 journey.

It might be more accurate to call it Web3.001.

There are different shapes of Web3, including DeFi (Decentralized Finance), Cryptocurrencies, Decentralized Autonomous Organizations (DAOs), and digital goods like Non-Fungible Tokens (NFTs). This boosterish PDF from Andreessen Horowitz, a Venture Capital (VC) firm that has invested heavily in Web3, is useful if a bit uncritical.

In this column, I’m focusing on creators—artists—and how they might and might not use Web3 to make a living.

Web3 in Context

It’s important not to talk about trends in isolation because trends tend to collide. Here is my current, most optimistic model for how Web3 fits with other trends. (more)

Two smaller stories this time from the Center’s strategic advisor Brad Berens.

The Fragile Glory of “Star Trek: Strange New Worlds”

I’m a nerd. A big nerd. Across many directions. (Just ask my kids.) One of my biggest and longest-term nerdy interests is Star Trek. For just one piece of proof, in the 1980s I went to a design-your-own t-shirt store called “Chicken Shirt” in my home town (Encino, California) and created a t-shirt that said “Trekkie” on the back.* (This was from the heart but—shocker!—did not win me cool kid points at school.)

You can imagine my joy when Star Trek: Strange New Worlds premiered two Thursdays ago on Paramount+.

The show is fantastic!

It’s a prequel to and stylistically hearkens back to The Original Series (TOS, from the 1960s) with self-contained adventures rather than season-long arcs, although the character development arcs do extend from episode to episode. Strange New Worlds has the breakthrough racial and gender diversity of the original (although it’s strangely devoid of LGBTQ+ characters, unlike its sibling show, Discovery, and why are there no Jewish characters?), and shares the original’s “we can build a better future” optimism.

We really need that now.  (more)

So much information comes at us all day, every day, that it’s a wonder we ever get anything done. Center strategic advisor Brad Berens describes a collection of apps, products, and services to help you manage the torrent.

Recently, in Distraction Audits & Why to Do One, I discussed how information and attention are inversely proportional. Or, as the great 20th Century polymath Herbert Simon put it, “a wealth of information creates a poverty of attention.” The earlier issue was about throttling back distractions. This week’s issue is about managing the super-soaker of information squirting at your face all day, every day.

(Note: I first wrote about the suite of applications, services, products and gadgets I use to keep my head above water in 2012, then updated it in 2015, but so much has changed that it’s time for an update.)

Here are my “Change Your Life” productivity apps and how I use them.

This is a long piece, but, unlike my usual, it’s skimmable. I’ve divvied up the apps into sections, alphabetized within each section:  (more)

Three years ago, Center director Jeffrey Cole predicted that after a charmed decade, Netflix, faced with rising competition, might be navigating treacherous waters. Cole’s prescription for what the streaming giant needed to do in 2019 is even more relevant today.

Until two weeks ago, Netflix had led a charmed life. It successfully transformed itself from a DVD-by-mail competitor to Blockbuster Video (which turned down the opportunity to buy Netflix for $50 million) into the first superstar streaming service.

For years, Netflix had complete access to the best product from all the Hollywood studios. Its unparalleled success in attracting subscribers across the globe then gave it massive budgets to create original programming exclusively for Netflix.

Disney, realizing that by selling its content to Netflix it was creating a competitive monster, decided to stop making its content available outside the company. The other studios followed suit. Soon, it became clear they would start their own streaming services. Still, Netflix had a budget of $20 billion (this year) to fund its own content; Netflix grew like a prairie weed when it had almost no competition for the consumer’s wallet.  (more)

Although the Twitter board accepting Musk’s acquisition offer seems to settle the issue about the Tesla founder’s true motives, Center strategic advisor Brad Berens suggests there’s a lot more to this story under the surface.

Two weeks ago in Musk, Trump, Twitter, and New Media Math, I argued that Elon Musk doesn’t really want to buy Twitter: he just wants to use the earned media to help him sell more Teslas.

Then, on Monday, to my surprise the Twitter board accepted Musk’s $44 Billion offer, for which Musk had arranged the financing. I thought, “Boy, did I call that one wrong,” and contemplated sending out a mid-week, “mea maxima culpa” special issue.

But by Wednesday I was back on the fence. That was the day Musk criticized Twitter’s chief legal officer, Vijaya Gadde, over her handling of the Hunter Biden story. This is just one of many of Musk’s tweets criticizing the company that he is trying to acquire.

This violates the terms of the takeover agreement filed with the SEC, which includes: “the Equity Investor shall be permitted to issue Tweets about the Merger or the transactions contemplated hereby so long as such Tweets do not disparage the Company or any of its Representatives.”  (more)

Biometrics aren’t new, but a fresh payment technology turns Amazon into a competitor to Apple Pay, Google Pay, Square, Venmo, the Cash App, PayPal and others.  Center strategic advisor Brad Berens speculates: can the Bank of Amazon be far behind?

If you live in Austin and love experiencing the sharpest edge of technology, then head to the Whole Foods at Arbor Trails. There you can use a new service called Amazon One to pay for your groceries simply by putting your palm on a scanner. Here’s an excerpt from a fascinating piece in the April 19 edition of Progressive Grocer:

Customer enrollment in the Amazon One service takes less than a minute, which involves linking credit/debit card info and creating palm signatures for one or both palms. A palm signature is created when a customer holds their palm over the Amazon One device, allowing the technology to evaluate multiple aspects of the palm. With no two palms alike, vision technology analyzes all aspects to select the most distinct identifiers on a palm to create a unique palm signature.

Once they’re enrolled and done shopping, customers come to the checkout counter or point of sale, hover their hand over the Amazon One device for about a second or so, and the card linked to their palm will be charged for their purchase. Customers don’t have to worry about fumbling with their wallets and handbags anymore to pull out credit cards at checkout counters.

Some things worth noting:

The service is called Amazon One, not Whole Foods One. Amazon always has at least two reasons for everything it does (I call this “the two-strategy strategy”), so expect the service to roll out first to other Amazon brick-and-mortar retail environments and then as a service that other businesses can use.

However, caveat emptor: when other businesses do enable Amazon One, then they’ll be sharing some of their purchase data with Amazon, which might not work out so well for the other businesses.  (more)

Since Will Smith slapped Chris Rock at The Oscars, everybody has had an opinion about Smith’s action, but what about ABC’s decision to censor Smith’s post-slap profanity? Center founder Jeffrey Cole weighs in.

What did he say?

The adult audience viewing the 94th Oscar Broadcast on March 27 could tell that something had happened when Will Smith got out of his seat and approached Chris Rock as he was about to list the nominees for best documentary feature film.

Smith appeared to slap the comedian. It was difficult to tell if it was a staged bit or if something completely unprecedented had occurred. After Smith returned to his seat he was shouting at Rock, but it was impossible to hear what Smith said because ABC bleeped out the entire exchange.

Had the viewers at home been able to hear what those in the Dolby Theatre heard (especially Lupita Nyong’o, sitting next to Smith and his wife Jada Pinkett Smith), they would have known something intense and unpleasant had happened. Watching from afar, we could see the stunned look of Nyong’o and know it was not good.

If the Smith-Rock verbal altercation had not been completely censored, viewers would have quickly realized that the slap was not comical. It was a physical assault. They also would have seen a side of Will Smith that might have been a once-in-his-lifetime outburst or a reflection of a deeper anger running through his personality that he had kept hidden until that moment.

In short, without bleeping, the audience would have understood what it saw. Instead, viewers had to wait for others to interpret it for them or go to YouTube or other places on the internet to see what they missed on the broadcast.  (more)

Looking at the Tesla CEO’s offer to buy Twitter through the lens of AQ (Attention Quotient), says Center strategic advisor Brad Berens, makes Elon Musk’s real motives clear.

It’s a good thing for the commonwealth that Elon Musk was born in South Africa; that fact bars him from seeking the U.S. presidency. Otherwise, it’s a sure bet that he’d run as a third-party candidate in 2024. He’d win, too. Musk understands the media better than all but one other person.

That one other person is Donald Trump.

Don’t get me wrong: I am not a Trump fan. I loathe the man and think that he was the worst president in the history of this great country. But I recognize that Trump is the greatest marketer the world has seen since the early Catholic Church (in its world domination days before the Protestant breakaways).

Musk is a better person than Trump across nearly all criteria:

  • Trump inflates his net worth (he’s a fake billionaire); Musk is the world’s richest individual
  • Trump’s products are terrible and fail; Musk makes best-in-class products (disclosure: I drive a Tesla 3; it’s the greatest car I’ve ever had)
  • Trump makes the world smaller and worse; Musk creates technologies that make the world bigger and better, even planning to take humanity to Mars
  • Trump famously doesn’t read; Musk reads voraciously

Where the two men are the same is when it comes to playing the media like Joshua Bell plays his Stradivarius: they are virtuosos.

In a column a few years ago, I posited a new metric for attention called AQ, for “Attention Quotient.”

If IQ (Intelligence Quotient) measures raw intellect and EQ (Emotional Quotient) measures self-awareness and empathy, then AQ measures how much raw attention an individual can pull towards him or herself.  (more)

Typically, people explain new subscription models as a way of stabilizing monthly revenue. But as Center strategic advisor Brad Berens suggests, subscriptions also point to a massive failure in advertising.

Two recent articles caught my eye about a new vogue for subscriptions for products that are typically transactional.

The first, “Apple Is Working on a Hardware Subscription Service for iPhones” (Bloomberg, March 24th), has a misleading title because the planned service actually covers all Apple hardware and software.

In the March 29 episode of The Pivot Podcast (to which I am devoted), Kara Swisher and Scott Galloway talked about the customer experience side of an Apple subscription in what I can only call twin arias of self-involved entitlement that nonetheless had savvy observations (it starts at 27:00… I laughed and thought at the same time, a neat trick).

The second article’s title is accurate: “Airlines, Restaurant Chains Join the Subscription Bandwagon” (Wall Street Journal, March 30th). The subscriptions covered are odd. (more)

Although 50% of Americans want a new job, that’s only one part of a bigger story. Center founder Jeffrey Cole explains the broader context.

As we come out of the greatest disruption of our lifetime, half of us want to find another job, and half of those in a different industry. The toll the pandemic has taken on our lives is producing a mountain (actually, an entire mountain range) of statistics. None is more compelling than the 50% that make up what social scientists are calling “The Great Resignation.”

The last time we witnessed such a seismic shift in the employment market — with massive job openings and transitions — was in World War II when millions of soldiers went off to Europe and Asia. To fill those jobs, millions of women entered the workplace for the first time.

It should come as no surprise that we have not all come through COVID equally.  (more)

What do Tinder, Free-Range Kids, Wattpad, CoComelon, and the movie business all have to do with each other, asks Center strategic advisor Brad Berens.

The world seems more dangerous today than it ever has before, but study after study shows that we’re safer now. Hans Rosling’s Factfulness, Matt Ridley’s The Rational Optimist, and Steven Pinker’s The Better Angels of Our Nature are three books that dig into this.

In part, life feels more dangerous today because we have so much information about bad things that happen via the news and social media, both of which are incentivized to lead with what agitates us. Don’t get me wrong: horrible things happen to people all over the world all the time, like Russian atrocities in Ukraine. For most people reading this piece, though, things outside your front door aren’t deadly.

Life also feels more dangerous because as individuals we have less practice taking everyday risks today than we did a few years ago.  (more)

From Zoom to cannabis, from telemedicine to leisure travel, Center director Jeffrey Cole analyzes the industries coming out of the pandemic stronger than ever.

This is the third of a three-part series about the winners and losers as we start to emerge from the COVID pandemic.

In Part 1, I covered COVID losers: cash, Uber and Lyft, in-person shopping, commercial real estate, business travel, and mental health.

Fortunately for all of us, there are more winners than losers, so I started reviewing the winners last time in Part 2, talking about Amazon, Disney, Netflix, and Labor.

Let’s continue with more winners.

(more)

The ecommerce and grocery store giant, predicts Center strategic advisor Brad Berens, needs expertise that only America’s favorite coffee shop can provide.

I’m not usually one for predictions with due dates. I see the trends, where the dominos are falling, but spotting precisely when a trend will happen is harder. This time, though, I’ll go out on a limb because two events this week have combined to make me think that Amazon will buy Starbucks within the next two years.  (more)

With infection levels dropping and the end of COVID seeming to be here, what companies were big winners because of how they reacted to the pandemic? Center director Jeffrey Cole digs in.

Last time, we looked at the industries and companies that lost ground during the COVID Pandemic. This week we look at the winners, those who emerged stronger than ever before. We often resist finding new ways of doing things until it’s necessary. During COVID lockdown, it became absolutely necessary.

Trends accelerated, some by years. A few well-placed and nimble companies benefited.

Amazon

How do you find the right superlatives?

In the first nine months of 2020, Amazon saw its earnings increase 70%. No big deal for a startup experiencing explosive growth, but at the beginning of 2020 Amazon was already a trillion-dollar behemoth. The stock price — $1,785 on March 13, 2020 — almost doubled to a record $3,401 in a little over five months!

You can’t make these numbers up!  (more)

February 19 — The Center for the Digital Future has released the 2021 Digital Future Project, the longest-running study of Americans and their behavior and views about computers and mobile technology, internet use and trust, and the effects of social media.

The report continues the Center’s work as one of the first research studies to explore the impact of digital technology on internet users in the United States. The Center was the first to develop a longitudinal panel study of these issues, beginning in 2000.

For more on the 2021 Digital Future Project and to download the report, click here.

From the Center’s 2021 Digital Future Project

Infographic by Michael Bronstein.

See all of the Center’s infographics here.

From the Center’s 2021 Digital Future Project

Infographic by Ani Tookoian.

See all of the Center’s infographics here.

October 8 — More Americans rely on CNN as their primary information source about COVID-19 than other cable outlets, and Anderson Cooper is trusted by more Americans than other cable commentators, a study by the USC Center for the Digital Future (CDF) has found.

The CDF study also reports extreme differences in views about cable news channels and commentators based on political viewpoint of the respondents.

CNN’s popularity declines, but still leads as cable news source

The CDF study, conducted twice since the pandemic began (April and June), found CNN continues to be the primary source for pandemic news for the largest percentage of Americans – 40% in the June study, down from 49% in April. Fox News held steady with 33% reporting the network as the primary source about the pandemic, the same as in April. The popularity of MSNBC grew in the June study – now 24% of Americans, up from 14% in April.  (more)

October 1, 2020 — In spite of the stress from COVID-19 and stay-at-home restrictions, many Americans continue to say the relationships with their spouses and children have improved during the pandemic, a study by the USC Center for the Digital Future (CDF) has found.

The CDF study, conducted twice since the pandemic began, found in its first survey in April that large percentages of Americans say that relationships at home are better since the pandemic began – and those percentages increased during the Center’s second study in June.  (more)

September 24, 2020 — After more than six months of living in a pandemic, large percentages of Americans continue to indulge in unhealthy lifestyle habits, including overeating, and increased use of alcohol and marijuana – all while many are exercising less, according to a study of the cultural impact of COVID-19 conducted by the USC Center for the Digital Future (CDF).

The CDF study, conducted twice since the pandemic began, found in its first project in April that indulging had increased while exercising declined; the behavior persisted into the Center’s second study in June.  (more)

September 16, 2020 — Six months into the most severe global pandemic in more than a century, are Americans complying with basic precautions to avoid infection and spread of the coronavirus? And will they be vaccinated when a proven treatment for COVID-19 is released?

For many Americans, the answers are no.

A study of the social impact of COVID-19 by the Center for the Digital Future found that while large numbers of Americans do indeed use recommended precautions against infection and spread of the disease, alarmingly high percentages do not participate in these safety programs, and one-fifth will refuse to receive a vaccine.

Do you wear a mask and participate in social distancing?

The Center’s study found many people – but not everyone – take precautions to avoid infection with the coronavirus.

Eighty-three percent of Americans said they participate in social distancing. However, only 77% said they wear a mask.  (more)

September 9, 2020 — A growing number of college students like their online instruction during the COVID-19 pandemic, but many want reduced tuition if their education is online and not in person, reported the second study on the social and cultural impact of the coronavirus conducted by the USC Center for the Digital Future.

The Center’s study found an increase in college students who reported satisfaction with their instruction on the internet: 43% of college students say they like remote learning better than in-class instruction — up from 34% in the Center’s first study in April. While a majority of college students in the current study (52%) say they prefer in-person classroom learning, that number was down from 63% in April.

Fewer students say their teachers are good at adapting their courses for online instruction — now 46%, down from 51% in April. A slightly smaller percentage say they learn less online than in person – 52% in the current study, down marginally from 54% in April.

How do students feel about the online learning environment? A majority of college students in the current study (54%) say they have to work harder when learning online, down slightly from 56% in April. Although a large percentage of college students say their online instruction makes them feel more isolated from their learning community (55%), that number was down from 61% reported in April.  (more)

September 2, 2020 — Increased levels of loneliness and anxiety reported early in the COVID-19 pandemic have declined in recent months, but about one-third of Americans say they are more depressed since the pandemic began, according to a study by the USC Center for the Digital Future.

The second study of the social and cultural impact of the coronavirus conducted by the Center also found two-thirds of Americans who reported increased anxiety are concerned about the future of the world – higher percentages than those who reported being anxious about their own health, politics, their jobs, or safety.

Anxiety and loneliness drop

The study reported 32% of Americans say they are feeling more lonely since the beginning of the pandemic, down from 37% reported in April. Forty-nine percent say they are feeling more anxious, down from 62% reported in April.

However, more than one-third say they are more depressed: 35% of Americans say they are somewhat or much more depressed since the beginning of the pandemic.

Nearly twice as many women (11%) compared to men (6%) say they are much more depressed since the pandemic began.  (more)

August 26, 2020 — Almost all Americans want to change their work life when the COVID-19 pandemic ends, with large percentages ready to shift to a permanent home office, according to a study by the USC Center for the Digital Future.

The study found that working from home during the pandemic has created unique opportunities as well as unprecedented challenges for millions of Americans, including reduced visits to an office, increased working from home, or not going to a traditional office at all.

The study found:

Many Americans want to change their careers and work from home – 42% want working from home to be permanent, while 25% disagree.

More than one-quarter could adapt all of their job to working from home — For many, working from home could be a permanent reality: 26% could adapt all of their job to work from home, 22% most of their work, 17% some, 9% a little, and 26% none.

Work after the pandemic — More than one-third of employees anticipate they will work more from home when the pandemic is over (38% would work more from home, 43% the same, 18% less).  (more)

August 19, 2020 — In spite of efforts to re-open the nation’s economy during the COVID-19 pandemic, most Americans are not comfortable resuming daily life outside the home, and one-quarter say they will do nothing in public until a vaccine is available, reports a study by the USC Center for the Digital Future.

Low percentages of Americans are ready to return to public activities

The study found that other than grocery shopping, most people are uncomfortable doing anything outside their homes right now. For example, only 41% are willing to see a doctor for a non-urgent appointment, and 39% would shop in a retail store.

Even fewer said they would dine in a restaurant (25%), stay in a hotel (19%), use public transportation (14%), go to a movie or play (11%), travel by plane or train (11%), or go to a live sports event or concert (8%).

One-quarter say they will wait for a vaccine before doing anything in public.  (more)

Study finds reliance on Trump drops; public support of government response to the coronavirus declines

August 5, 2020 — A growing number of Americans say federal, state, and local governments are doing a poor job of responding to COVID-19, and Anthony Fauci continues to be the nation’s most relied-upon source of information about the coronavirus, reports a new study by the USC Center for the Digital Future.

Fauci still #1 source for pandemic information; Trump slumps

The Center’s second survey of the social impact of the coronavirus, conducted during the fourth week of June as follow-up to an initial study in April, found more Americans (44%) rely on Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, than any other individual for information about the pandemic.

After Fauci, respondents rely most on New York governor Andrew Cuomo (19%), CNN medical correspondent Sanjay Gupta or their own mayor (16%), and Coronavirus Response Coordinator Deborah Birx (15%).

The June survey found President Donald Trump is relied on by 12% of Americans for pandemic information, down from 20% in the Center’s survey in April. In the June survey, 29% of conservatives and 2% of liberals said they rely on Trump. The largest level of reliance on Trump was 40% of those who identify themselves as very conservative.  (more)

The coronavirus pandemic has produced the biggest disruption of our generation, but it could have been So. Much. Worse. Jeffrey Cole explores what our pandemic experience would have been if — like George Bailey in “It’s a Wonderful Life” — the internet had never been born.

The internet won!

In the middle of March, with little warning or preparation, we moved our lives online.

What couldn’t be transferred online simply stopped. Movie theaters, concert halls, and theme parks closed. Baseball and basketball suspended their seasons, and it is still not clear if they will resume, or if the football and hockey seasons will ever start.

Almost all dining in restaurants stopped, and many people were hesitant to eat even at the outside tables of the few restaurants that stayed open. Travel came to a standstill, with airlines barely operating and hotels facing little occupancy. Cruise ships will not see passengers for a very long time.

If it couldn’t happen on the internet, it didn’t happen. If it could, it did.  (more)

The coronavirus pandemic has accelerated many trends that already existed, teaching us to stream more and forcing us to reconsider how much we need offices or stores. But as Center director Jeffrey Cole describes, one environment that has resisted evolutionary pressure is college.

This column focuses on how the coronavirus pandemic and the move to learning online have affected the lives of traditional college students under the age of 25 who live on or near campus. A later column will look at less traditional students, who may be older, attend part-time or work full-time, and may not live near campus.

It’s the middle of June: all across America families are celebrating high school and college graduations. This year it’s very different. Graduates of the class of 2020 will be forever remembered as the Covid Grads, finishing school during the pandemic.

In 2020, there were no proms, grad nights or graduation ceremonies. Some graduated on Zoom—celebrating virtually with their classmates—while others stood in the street in front of their homes while friends and families drove by honking from a safe distance.

Most college students living away, whether they were seniors or not, finished the last months of the school year by packing up their belongings and moving back in with Mom and Dad.

As bad as it was for those finishing the school year, it will be even worse for the high school grads starting their first year of college in the fall. Forget summer travel before starting college. Forget freshman orientation as they get introduced to their living arrangements on a new campus.

The only travel in their future will be to their parents’ basement, where they will not need an orientation. All of the experiences of moving to college, making friends and meeting roommates, regulating their own hours and behaviors, and sitting in a classroom soaking up knowledge may have to be deferred for a semester or more.  (more)

How much of our lives can we squeeze through Zoom and other videoconference services? A recent funeral marked out a boundary.

By Brad Berens

When a family member dies, the script is clear: you scramble the jets, cancel your appointments, lean on a friend to watch the dog, and get there. For me, that means getting to Los Angeles from Portland.

My aunt, Marlene Meyer, my mother’s sister, died on May 15th. She was 86, vibrant, still working as an insurance agent days before her death, not ready to die. Our family wasn’t ready either. We do not know if she had contracted Coronavirus — a maddening ambiguity — but we do know that Coronavirus changed her decline, death, and funeral.

I’ve lived in Oregon since 2009, always aware that the biggest challenge of being far from where I grew up and where my first family still lives would be moments like these.

The script is clear, but Coronavirus changed the script.  (more)

May 21, 2020 — While many Americans agree that the coronavirus is changing life at home on an unprecedented scale, men and women report significant differences in their views and behavior, according to the first comprehensive study of the social and cultural impact of the pandemic conducted by the Center for the Digital Future at USC Annenberg and the Interactive Advertising Bureau (IAB).

“We are seeing many differences between how men and women are experiencing life during the pandemic – especially in their levels of concern about the effects of the coronavirus, what they miss, and what they enjoy,” said Jeffrey Cole, director of the Center for the Digital Future.

The overall findings released April 29 in the Center’s study, “The Coronavirus Disruption Project: Living and Coping During the Pandemic,” revealed many changes in views and behavior – both positive and negative – reported by Americans since the pandemic and safer-at-home restrictions began.  (For an overview of key issues found in the study, go here.)

Looking more closely at the study’s findings about life at home reveals some sharp differences between men and women and how they are experiencing the pandemic.  (more)

(For comprehensive material about the Coronavirus Disruption Project Study — reports, PDFs, and releases — go here.)

Going to work: a commute of ten miles or ten feet?

Data from the Center’s new Coronavirus Disruption Project suggests that many Americans will never go back to daily commutes to work in offices, and as Center director Jeffrey Cole explains, that’s not a bad thing, either.

The phrase “going to work” has taken on an entirely new meaning.

Two months ago, most of us had never heard of Zoom. Now, for those who are working at home during the Coronavirus pandemic, Zoom is a way of life.

Zoom has moved into the rarefied atmosphere of the tiny list of companies whose brands have become verbs: Google, Xerox, Uber. The invitation is not, “do you want to join me in a Zoom Meeting,” but rather, “let’s Zoom.”

The latest unemployment figures, the highest since the Great Depression, show that about 15% of Americans are unemployed. Other than essential workers (health care, delivery, police, supermarkets), the rest of us have moved much (if not all) of our jobs online. We made this move in a matter of days without preparation. Many of us did it without any prior experience doing our jobs online.

Data from the Center’s new study with the Interactive Advertising Bureau, “The Coronavirus Disruption Project: Living and Coping During the Pandemic,” shows that moving our work lives online has been a success — particularly compared to other activities we have been compelled to move online, such as school work.  (more)

April 29, 2020 – Americans coping with the coronavirus are reporting changes in their lives occurring in days that previously took months or years, a wide-ranging study of life during the pandemic conducted by the USC Center for the Digital Future and the Interactive Advertising Bureau has found.

The study shows Americans report many concerns about their lives as well as increased loneliness and anxiety since the outset of the coronavirus pandemic, but they also describe strengthened relationships and enjoying the benefits of working at home.

Titled “The Coronavirus Disruption Project: How We are Living and Coping During the Pandemic,” the study also found significant percentages of Americans who had never previously banked online or bought from internet sources have now been pushed into the online experience because of the pandemic.

“We are exploring the biggest disruption of our lives,” said Jeffrey Cole, director of the Center for the Digital Future in the USC Annenberg School for Communication and Journalism. “Daily life is far more disrupted by the pandemic than after 9/11 or the beginning of World War II, and anxiety is at levels only seen after Pearl Harbor and the Great Depression.

“Yet in spite of the upheaval,” Cole said, “we also found that Americans have positive views about their relationships and hope for how their lives will proceed after the pandemic ends.”  (more)

From the Center’s Future of Health Care Study.

Infographic by Kelsey Dempsey.

See all of the Center’s infographics here.

How quickly should one reply to a personal message received online? What is the appropriate length of time? And has the perceived appropriate length changed over the years?

We have asked this question in our Digital Future Survey since 2012… (more)

The Center has published the tenth edition of the World Internet Project report, a collaboration between the Center for the Digital Future and partner organizations in countries worldwide.

The 47-page study explores views and behavior about internet use and non-use, devices for internet access, years online, user proficiency, reasons for not going online, politics and the internet, freedom of expression online, media reliability, online security and personal privacy, and activities on the internet.

Download the tenth World Internet Project Report here.

Center director Jeffrey Cole explores transformation of the media for the keynote address at the leadership meeting of the Interactive Advertising Bureau.

View the video here.