How one Batman cartoon from 1992 demonstrates the strategic value of looking for what makes your business impossible to compare to the competition.   Center strategic advisor Brad Berens explores.

The most important question in business isn’t “how does your offering compare to the competition?” Instead, the question to ask is “how is your offering incomparable?” What is it about your business that the competition simply doesn’t have and cannot duplicate?

If the answer to that question is “nothing,” then your product is a commodity and your customers will construct consideration sets within which they’ll make arbitrary decisions.

At the moment, Tesla’s incomparability in the EV market comes from its vast network of Superchargers, which is why I find it shortsighted that Tesla will open the Superchargers to other EVs in 2024.

Often, incomparability comes from context. This is easiest to see with narrative products, but the phenomenon exists with other offerings. Amazon Prime, for example, is strategically incomparable: subscribers get so much for it that keeping the service is a hard-to-think-about no-brainer.

One helpful illustration of incomparability comes from a 1992 Batman cartoon. . .  (more)

The world’s most valuable company won’t buy Disney anytime soon, but there’s a giant caveat. Plus, what else will the e-commerce giant do with Amazon One, its new biometric payment platform?  Center strategic advisor Brad Berens weighs in.

Being a futurist can be glum when other writers breathlessly announce new-to-them ideas that I’ve been talking about for years while missing the broader implications. This happened twice in the last few days.

#1: Will Apple Buy Disney? (Answer: no time soon, but…)

A spate of writers (e.g. here, here, and here) have predicted that Apple will buy the Walt Disney Company. Argh! I started writing about this in 2018, having talked about it on keynote stages even earlier. Today, with Disney CEO Bob Iger needing to turn Disney’s streaming business profitable, with even more studio consolidation inevitable, and with Apple having entered the entertainment biz, acquiring Disney would instantly make Apple the biggest entertainment company on the planet.

On top of Disney’s immense content library and theme parks ripe for hardware and software integration, every TV show and movie set in the present or near future would become an opportunity for product placement (as in Shrinking on Apple TV+, where key emotional moments happen via iPhones).

The missed broader implication? I still think this will happen someday, but in the short term, Apple really only wants ESPN.  (more)

The organization on the other side of the negotiating table from the striking writers and actors is the AMPTP.  But how, asks Center strategic advisor Brad Berens, can one organization represent studios with such divergent interests?

The Alliance of Motion Picture and Television Producers (AMPTP) is a mysterious Hollywood trade association. Its website contains no list of member organizations: just some technical documents, a contact form, and links to a collection of terse press releases.

The association’s president, Carol Lombardini, has a page on the AMPTP site that you can find via diligent Googling, but there is no way to navigate to that page directly, the copyright date at the bottom of the page is 2010, and most of the links on that page are broken. The website lights are on but nobody’s home.

AMPTP is the organization on the other side of the negotiating table from the Writers Guild and the Actors Guild, both of which are on strike. In contrast to the AMPTP site, the guilds’ websites overflow with useful information.

AMPTP’s reticence is important because the interests of its different members (the ones we know about) don’t seem to align, which is one reason why the negotiations aren’t going anywhere.

A Wikipedia page has a partial list of AMPTP members: Amazon, Apple, Comcast, Disney, Netflix, Paramount, Sony, and Warner Bros. Discovery (WBD).

Put simply, it’s ridiculous to talk about Amazon, Apple, Comcast, and Sony in the same breath as Disney, Netflix, Paramount, and WBD.  (more)

The algorithmic rabbit holes of Facebook, Instagram, and Snap can isolate us.  Center strategic advisor Brad Berens asks: Could we use both today’s and tomorrow’s digital technologies to connect In Real Life (IRL)?

Credit: Jumbo.com

Both in its etymology and in our common practice, “social media” is oxymoronic, like “jumbo shrimp.” The Latin root socius means ally (or similar words like comrade) and is a shared root for society; media is the plural of medium, which means in the middle or between. In an oddly literal, etymological way, therefore, social media stands between people and prevents them from making contact.

You see it on every subway, bus, and plane, in every coffee shop and restaurant, on park benches, in checkout lines: people stare into their phones, checking Facebook, Snap, Instagram, and that-thing-that-used-to-be-called Twitter, and do not interact with the physical beings around them.

I love public transportation and take it whenever I can. Riding the subway when I visit New York, I often gaze out the window instead of into a screen. Sometimes people look uneasy—in a “did somebody fart?” way—because we’re so accustomed to our elective isolation.

The difference between social media contact and In Real Life (IRL) contact is like the difference between chewing cherry-flavored gum and eating cherries. The gum is available on demand, requires no work, always tastes the same, and has neither pit nor nutrients. Cherries are only around during the warmer months. They vary in taste and quality, can stain your shirt if you’re not careful, have fiber and sugar, and watch out for that pit. Gum is easier but cherries are better.  (more)

The ingredients for Digital Transformation, says Center strategic advisor Brad Berens, are institutional pain, inflection points, and tools lying around.  But you also have to get the people part right.

Credit: Gerd Altmann from Pixabay

Usually, when we talk about Digital Transformation, we focus on the physical: paper and ink newspapers dissolve and go online; albums become cassettes then CDs then MP3s then streams; physical books vanish into Kindles; videotapes become DVDs then Blu-Rays then streaming services; a taxi service that you call to explain where you are turns into a smartphone app that already knows where you are.

But Digital Transformation is really about people, not things. You can put anything into the Transmogrifier from Calvin and Hobbes, but if you can’t change people’s behavior then you’re just wasting a cardboard box.   (more)

Christopher Nolan’s Oppenheimer is three hours long. So is Martin Scorsese’s Killers of the Flower Moon. Those are among the longest. What’s going on? Center director Jeffrey Cole digs in.

(Note: You don’t have to read my last column about movie length to understand this one; but click here if you’d like to check it out.)

Let’s start with Martin Scorsese.

There isn’t a more talented or respected filmmaker. He started directing films in the late 1960s. At 80, he is as productive as ever.

As Scorsese built his reputation, his films got longer. His first film, Who’s That Knocking at My Door (1967), was 90 minutes. His second, Boxcar Bertha (1972), was shorter at 87 minutes.

Those lengths seem tiny compared to his last four films. The Wolf of Wall Street (2013) was 180 minutes. Silence (2016) came in at a relatively short 161 minutes. The Irishman (2019) is his longest film at 209 minutes. This year’s Killers of the Flower Moon is almost as long at 206 minutes.

Scorsese finished The Irishman before the COVID pandemic; it was intended for movie theaters. When all theaters closed, it was sold to Netflix, where it became a major prestige film. The web was full of guides sharing—depending on whether you wanted to watch it in one or more parts—the best points to break for the bathroom or until the next day. During COVID, we didn’t care that The Irishman was three and a half hours! With a pause button and the ability to come back whenever you want, there are different rules for length in streaming.  (more)

There have always been long movies in Hollywood. But, as Center director Jeffrey Cole points out, all of a sudden many more are getting even longer.

How do you know when a movie is too long?

Harry Cohn knew.

The legendary and much-feared head of Columbia Pictures in the 30s and 40s shared his secret: when sitting alone in a projection room, he could always tell when a film was too long and not good “if my fanny squirms…it’s as simple as that.”

That led to Herman Mankiewicz, the great co-writer of Citizen Kane, grumbling, “Imagine, the whole world wired to Harry Cohn’s ass!”

A lot of fannies should be squirming in movie theaters these days.  (more)

Center strategic advisor Brad Berens explores the Hollywood strikes versus what makes experiences special in the first place.

Credit: ufcw770

Bad products can yield positive experiences, but we don’t have useful tools to describe the difference because we tend to focus more on products than the context in which people use and experience those products.

This is what I mean by overfocusing. It’s not only true of bad experiences. However, the gap between a product and the experience of that product can be helpfully vivid with great experiences around bad (or even mediocre) products.

Here are some examples… (more)

Center strategic advisor Brad Berens describes how an aural invasion led to a meditation on what makes experiences memorable.

What are the ingredients for a memorable experience? One recent event has some clues.

My scintillating compadre in nerdery, Benjamin Karney, and I have been friends since we were eight. A few days ago, we had a chance to catch up while he was on a long commute and I was working from home. I don’t get great signal on my iPhone in my home office, so I grabbed my trusty Bose noise-cancelling, over-the-ear headphones and wandered up our long driveway towards the gigantic climbing structure in our front yard that my kids loved when they were little. These days, it mainly supports my phone calls with friends, particularly on beautiful Oregon summer days.

When we work to create memorable experiences, we typically focus on the first ingredient (the experience itself). After the isolation of COVID lockdown, we’re more attuned to the value of shared experience, both in real time/different places and especially real time/same place. The hardest ingredient and the one that gets the least attention is comparison, the connection to the past that then leans into the future.

As I left the house, I felt something momentarily prick at my left ear. A few steps later, it happened again. I whipped the headphones off to glance at the left cup. Nothing. I had just replaced the cushions on the headphones, so as Ben and I were chatting a tiny background part of me wondered if I’d screwed up somehow and caused a short circuit. I sat on one of the swings. We kept chatting as he drove and I swung.

It happened again. “Hold on a sec.” I took the headphones off, popped my iPhone onto speaker mode, and said to Ben, “there’s something wrong with my headphones.”

The next thing Ben heard was a loud scream of “Aaaah!”

“What happened?” Ben asked…   (more)

Colliding Trends: as the Supreme Court changed college admissions, Chief Justice Roberts argued that personal essays will be more important.  But, asks Center strategic advisor Brad Berens, are applicants learning to write in the age of ChatGPT?

Affirmative Action rally at Harvard. Credit: Majxuh.

When I give sharpest-edge trend keynotes, I often use the phrase “colliding trends” to describe how I approach peering into the future. As a species, we overfocus on head-to-head comparisons and fail to see threats coming from our blind spots.

At first, trends run parallel, so we focus on our direct competitors. “I’m Canon, and I sell a cute little digital camera called the Elph; my competitors are Olympus, Sony, and Kodak” (this was in the late 1990s). When we do this for too long, we dodge urgent, colliding trends questions like, “what happens to the Elph if the cameras inside smartphones get really good?”

Outside of photography enthusiasts, how many people carry an independent digital camera in their pockets these days?

Last week, the Supreme Court issued two decisions against Affirmative Action admissions policies in higher education. One part of The New York Times coverage caught my eye from this colliding trends perspective… (more)

The newest product from the world’s most valuable company will transform the world, but not, says Center strategic advisor Brad Berens, for quite some time.

I’ve been hesitant to write about Apple’s new “spatial computing” device: the thing that the world’s most valuable company doesn’t want to call an Augmented Reality (AR) headset. However, since there are things that I haven’t seen others talk about, here goes…

Who is the real competition?

It’s a distraction to compare the Vision Pro to the Meta/Oculus Rift, Microsoft’s HoloLens, HTC’s Vive, or any of the other Heads Up Display (HUD) rigs, whether those devices are fully immersive Virtual Reality (VR) or two-worlds AR (or MR, XR, or any of the other spoonfuls of alphabet soup).

The Vision Pro isn’t competing with other HUDs: it’s competing with all the other hardware in your life:

  • Smartphone
  • Laptop/Desktop
  • Television
  • Stereo System
  • Game Console

If you add up the costs of those five appliances, then suddenly $3,500 for the Vision Pro doesn’t cause the same level of sticker shock.  (more)

Cole recognized for “pioneering digital strategies important to the security of the nation,” and “leadership in technology and emerging media.” 

Center director Jeffrey I. Cole, who has tracked the history of media and the future of digital technology for more than 40 years, has received the Ellis Island Medal of Honor for professional contributions to American society.

Cole received the Medal of Honor at the organization’s annual awards ceremony on Ellis Island on May 13.

Under Cole’s leadership, the Center’s World Internet Project is the longest-running longitudinal study of the effects of digital technology on Americans and in partner countries around the world. The Center has conducted deep examinations of entertainment, sports media, transportation, and banking to identify where the next wave of disruption will occur.  (more)

Nineteen years before COVID, the 9/11 attacks gave Center strategic advisor Brad Berens a sneak peek at how quickly we can change if the right three ingredients are in place.

“Turn on the television.” 6:00am. Urgency in my brother-in-law’s voice cut through the grogginess as I held the bedside phone to my ear and clutched for my glasses. Our six-month-old daughter had slept poorly, and so had we. I was in no mood for jokes.

“What the hell?” I croaked.

“The World Trade Center has been attacked,” he said. “It’s all over the news. Turn on the television.”

Baby in our laps, jaws hanging in shock, we watched CNN. From 3,000 miles away in Encino, we saw the Twin Towers collapse after hijacked jets hit them, people falling to their deaths.

Organizations are too busy with today’s known problems to change in anticipation of tomorrow’s mysteries, so it takes a meteor strike or the like to make people willing to try something new.

It was my first digital transformation and still the scariest.

On 9/11 and in the days thereafter, I had a sneak peek at how quickly we could change everything about how we worked.  (more)

As Center strategic advisor Brad Berens points out, people decide with their hearts and justify with their heads. Knowing that, and knowing what to do with it, will make you a more effective communicator. Also, a lesson that Berens’s old boss, Rick Parkhill, taught him about persuasion.

The word “persuasion” gets a bad rap because it sounds like a con job where the persuader pulls one over on an unsuspecting mark. P.T. Barnum famously had a sign that read “This Way to the Great Egress!” that fooled patrons into going through a door to see another exhibit, “the egress,” only to find themselves outside. They had to pay again to get back in.

That’s not persuasion. That’s just lying.

The most important aspect of persuasion is empathy. You have to know about the other person’s life, job, likes, dislikes. The more you know, the easier it is to co-create a reality that includes things that you both want.

Usually, businesspeople talk about friction as a bad thing, but with persuasion friction is your friend. The more embodied and immersive a conversation you have when you’re trying to convince somebody to do something, the more likely you are to get their attention, and attention is the first step towards persuasion.

It sounds like I’m only talking about one-to-one persuasion, but I’m not. Advertising works by taking categories of things that lots of people want and then constructing narratives around how their products might fit into those desires. The more friction in the environment where somebody experiences a narrative, the more likely they are to engage.

You are, for example, more likely to buy a souvenir t-shirt at a concert than to order one online later because you want to remember the concert and be able to show off that you were there when you wear the t-shirt. Disneyland is a massive exercise in immersive persuasion that people pay big bucks to visit. It’s not one-to-one.  (more)

We’ve never watched more content and less broadcast television. The death of a once-dominant medium, long predicted, is finally at hand. Center director Jeffrey Cole elaborates.

When I was growing up, the arrival of September, in addition to the beginning of autumn, cooler temperatures and getting dark earlier, meant three things: one was not so good, one didn’t matter much, and the third was exciting. The not-so-good annual September event was going back to school; the one that didn’t matter much was GM, Ford and Chrysler introducing their new cars. The exciting one was the premiere of the new network television shows.

Broadcast television is in a devastating downward spiral. It has been this way since the 1990s and each innovation makes it worse. People under the age of thirty do not even think about network television (except for big sports). The audience is getting older and older, leading to an inevitable conclusion.

It’s hard to remember how powerful the three broadcast networks—ABC, CBS, and NBC—were. Much later, after they were already significantly weakened by cable (the first of many technological and social trends that would diminish network importance and profitability), Fox arrived as a fourth network.

More than anything else, far more than movies or literature, the broadcast networks defined our culture. Nothing else came close.  (more)

By now, says Center strategic advisor Brad Berens, it is common knowledge that programs like ChatGPT say things that just aren’t true, but why do we believe the lies so readily? The answer is F.A.B.S. 

Most people writing about generative AI (ChatGPT, DALL-E, Bard) focus on what the AIs can do, which is understandable since these algorithms are still new. With ChatGPT, our conversational aperture widened to include worries about how kids might cheat on their schoolwork.

Then the aperture widened again because ChatGPT suffers from “hallucinations” (the euphemism AI boosters use for “lying”): it makes stuff up and then presents its fictions so confidently that people accept them as fact.

It’s that last bit that interests me. While I care that ChatGPT is, to put it mildly, an unreliable narrator, what I really want to know is why we humans accept algorithmic misinformation so easily.  (more)

Center Director Jeffrey Cole analyzes five emerging issues for theaters breathing sighs of relief after hits like Top Gun: Maverick and Super Mario Bros.

Movie theaters won.

If ever they were going to disappear, it was during COVID when all the theaters shut down, one major chain declared bankruptcy, we got out of the habit of leaving the home to see films, and even our grandparents learned how to stream.

The theaters survived disruption. The audiences won, too.

Something magical would have been lost if motion pictures had only survived on television sets (even large ones) at home.

Before the pandemic, studios and other producers wanted to experiment with different ways to release films. They wanted to try shorter than 90-day windows between theaters and the home, sell directly on the internet without a streaming service, sell directly to streamers (some of which they owned) as part of subscriptions, and sell via streamers with an extra fee (around $30). They also wanted the biggest experiment of all: releasing to the theater and the home on the same day (Day and Date) in order to understand whether the two environments could co-exist.

Nothing less than the survival of movie theaters, the locale of our first dates and an essential part of our culture, was at stake.

All of the experiments failed.

Led by Tom Cruise and Top Gun: Maverick, followed by Spider-Man: No Way Home, Avatar: The Way of Water, and, most recently, Super Mario Bros., people have gone back to the theater.

Movie theaters are here to stay. There cannot be billion-dollar movies without theaters. As COVID ended, the movie house reclaimed its place at the apex of the film distribution chain. Movies are not the same on a television at home, even a very big one. The majesty of a great movie comes from watching it in a theater with other people.  (more)


Center director Jeffrey Cole joins Llewellyn King and Adam Clayton Powell III on White House Chronicle in a discussion of the future of communications in a post-pandemic world. (more)

Watching the series finale of “Star Trek: Picard” was a lonely exercise because most of the value of experiences comes from sharing them.  Center strategic advisor Brad Berens explains.

Regret seldom punctuates my day-to-day life, but if I had Mr. Peabody’s Wayback Machine handy I would jump back a few days and then schlep up to Seattle or down to L.A. for the IMAX Live Experience of the final two episodes of Star Trek: Picard.

Picard’s third season and particularly Thursday’s finale were an exciting and nostalgia-filled bye-bye to the cast of Star Trek: The Next Generation (TNG). I’m not writing a review of the good and bad bits because plenty of other folks have done so. Instead, I’m digging into my regret.

I didn’t go to the IMAX thing because I’m busy and because the Q&A with the cast was prerecorded, so I didn’t feel I could justify the time. That was a mistake because I wound up watching the finale solo, very late on Thursday night after La Profesora and our son had gone to bed.

It was lonely, lacking eventness—that special quality that comes from sharing experiences with other people in real time. On the other hand, I knew that if I waited until the next day much of what was unquantifiable about that night, when the finale was new, would drain away and turn something special into an obligation.  (more)

When the Twitter owner and Tesla CEO wrongly labeled NPR as “state-affiliated media,” NPR fought back in a powerful way, but it’s only a start. What, asks Center strategic advisor Brad Berens, needs to happen next?

Money has no morals. Money erodes morals because it washes away context and specificity in favor of interchangeability. Think about the differences among giving a friend a birthday gift that is…

1. An exchangeable thing from a local store
2. A gift card from a local store
3. Something from Amazon with a gift receipt
4. An Amazon gift card
5. Cash, a check, or a gift card from VISA or MasterCard.

The first two options pull double duty: they convey to your friend that you remembered the birthday and also invest in your community because they support a local store. You’re also saying to your friend, “hey, if you don’t already know this local store you should wander in because there’s a lot of cool stuff in there, and I know you well enough to think that you’ll dig it.”

In contrast, while the last three options are still kind, they lack the additional dimensions of investing in your community and testing your knowledge of your friend’s taste in a way that says, “I see you.” These options neither create nor exploit context. Instead, they put a small, generalized burst of economic power into the world: “go buy yourself something special.” The paradox of that gesture is that specialness comes from the intimacy of the gift from a friend—the context that cash lacks.

The difference between the words moral and immoral is contextual, geographic: it depends on where you stand.

Generally, you only accuse other people of immorality. You would only call yourself immoral if you broke your own code. “I’m a vegetarian who just ate a steak: that was immoral.” Even then, you’d probably only call that particular act immoral rather than condemn your entire being because, as Stephen Covey observed, “We judge ourselves by our intentions and others by their behavior.”

If you have typical, mid-twentieth century values around gender roles and relationships, then you might think that sex before marriage is immoral. On the other hand, if you have feminist views about those typical gender roles, then you might think the roles themselves are immoral. Context matters with morality.

Amorality is different. If you are amoral, then you are not interested in context. Everything is the same to an amoral person except what makes them feel good in the moment or advantages them in the near term.

It is therefore no surprise that one of the richest individuals on the planet, Twitter-owner Elon Musk, is amoral. Having that much money has pulled him away from any grounding context.  (more)

A new book explains how we got to our age of giant culture companies shaking down artists, why it matters to the rest of us, and what we can do about it.   Center strategic advisor Brad Berens explains.

It baffles me that Rebecca Giblin and Cory Doctorow’s book Chokepoint Capitalism isn’t on top of The New York Times bestseller list. It’s an important book about how big companies have captured most of the value in the creator economy, leaving artists, musicians, and writers of all sorts struggling to make ends meet.

There are also clear harms to the rest of us, the audiences who want to enjoy all forms of art but whose options are shrinking while prices skyrocket. Plus, artists are only one set of workers watching big companies extract disproportionate value. As the authors make clear, chokepoint capitalism is also suffocating gig workers of all kinds, suppliers, and professionals.

None of this is news to anybody who has been paying attention, nor is it news to the creators themselves, but Giblin and Doctorow pull open cultural curtains to let blazing sunlight into deliberately darkened rooms in ways that are bracing and appalling.

They tell arresting stories, like how the Recording Industry Association of America paid a Congressional staffer to surreptitiously insert four words into a law between when it was written by legislators and when the President signed it. This change stole millions of dollars from musicians by turning some recordings into work-for-hire that therefore did not earn royalties. It took a concerted effort on the part of musicians to expose this theft and get the amendment rolled back. The congressional staffer who perpetrated what the authors call a “dead-of-night maneuver,” Mitch Glazier, later became the RIAA’s CEO and Chair.  (more)

Apple is preparing to cannibalize its most profitable product, the iPhone. It’s a pattern, says Center strategic advisor Brad Berens, we’ve seen before. Remember the Newton?

This week in The New York Times, we learned about uncharacteristic debate in the executive ranks at Apple around the long-awaited Augmented Reality (AR) glasses that the company will release in June. Some execs worry that there is no market for these glasses, that the price point is too high at $3,000, that there aren’t enough uses for them, and that they’ll gather dust like Meta’s Oculus Rift Virtual Reality (VR) headsets.

Apple CEO Tim Cook, however, is all in on AR: “you’ll wonder how you lived your life without augmented reality, just like today you wonder: How did people like me grow up without the internet?”

Cook is right and the skeptics are wrong. What’s going on is that the skeptics are confusing temporary market realities (have you noticed the economy is a little shaky?) and technical challenges (mapping meat space and digital space onto each other is hard; our current battery technology is Flintstonian) with inevitable technology and behavior trends.

The lack of massive uptake on VR is not a bellwether for AR: that’s the wrong comparison. VR anemia is more like the brief vogue for 3DTV that was all the rage at CES a decade ago and then faded. 3DTV isn’t different enough from 2DTV for most folks to bother upgrading, and the work you have to do to enjoy 3DTV (wear glasses, sit in a particular place) is a drag. Likewise, VR can be amazing, but you need to have space to move around blindfolded and not be paranoid that some halfwit family member is going to play a practical joke on you while you are, in essence, wearing a paper bag over your head with an inviting target on your back.

AR is different: it’s not just another world (VR), it’s also a smart window into the real world.  (more)

The CEO of OpenAI is on a path that few have ever walked. As Center director Jeffrey Cole points out, he could end up as the most loved or the most hated person in the world. Or both.

Sam Altman better get ready. He is about to get very rich and, more importantly, very famous.

There is nothing that could ever prepare him for what is about to happen.

Altman is one of the initial investors and now CEO of the most talked-about company on Earth: OpenAI, a leader in artificial intelligence and creator of ChatGPT. Already the recipient of multi-billion-dollar investments (including $10 billion from Microsoft, which also integrated ChatGPT into Bing and its Office suite), OpenAI has burst into the public consciousness faster than any other company in history. In a few short months, it grew to 100 million users.

That’s just the beginning.

AI has been talked about in laboratories and science fiction for generations. Skeptics have argued that artificial intelligence has a great potential… and always will. Those days are over. It is here. We are about to discover its potential. For better or worse. Probably both.  (more)

“Paying attention,” a common metaphor, is misleading because there are different sorts of attention, and the relationship among them isn’t reducible to numbers.

By Brad Berens

If you’re in the Attention Business—and whether you’re selling movies, cars, toothpaste, whoopee cushions, sex toys, health insurance, a ride hailing service, or a new ointment for that embarrassing rash, every business is in the attention business—then understanding that there’s more than one sort of attention is a good first step towards getting the right sorts of attention for your business.

Attention creates experience. In The Principles of Psychology (1890), William James wrote, “my experience is what I agree to attend to.”

Context amplifies and changes experience so that two people with different contexts can experience the same moment in distinct ways. Here are two examples from entertainment, but the phenomenon holds true across the experience of many products.  (more)

In the A.I. revolution, figuring out the contours of our human intelligence has never been more important. Who, asks Center strategic advisor Brad Berens, is best equipped to do this work?

I give keynote addresses all over the planet about digital transformation and sharpest-edged technology trends. One of my themes is that anything that can be digital will be digital. The counterintuitive corollary to this is that the value of what you’re up to will come from the bits that can’t be digitized—the leftovers.

This is like grade school division: you learn that eight goes into 60 only so many times (seven), and you’ve got four left over. That’s the remainder. When you get to junior high and learn about decimals, the leftovers acquire an illusion of tidiness: instead of 60÷8=7 r4 you get 60÷8=7.5. This disappears the remainder, but remainders are important because what doesn’t fit into tidy equations is what we call culture.
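
For the code-minded, here is a minimal Python sketch of that same contrast, using nothing beyond the arithmetic in the paragraph above:

  # Integer division keeps the leftover visible: 60 ÷ 8 = 7 r4
  quotient, remainder = divmod(60, 8)
  print(quotient, remainder)  # prints: 7 4

  # Decimal division tidies the remainder away: 60 ÷ 8 = 7.5
  print(60 / 8)  # prints: 7.5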

The quantification of life started with the Industrial Revolution, sped up with the Digital Revolution, and now the next revolution, Artificial Intelligence, is hitting the gas — whoops! I meant to say, “the acceleration pedal on the EV.”

Sure, more people are making more culture more of the time than ever before, but the experience of that culture takes up less geography in our lives. We get most of it on the six-inch-by-three-inch piece of glass in our hands. Algorithms decide what we see, and, as I’ve written previously, those algorithms are increasingly able to create things that look like culture.

Figuring out the remainder—what can’t be digitized, what only humans can do in an age of evermore powerful A.I.—is the project of the humanities.  (more)

Like “The Mandalorian” or “Loki,” the Paramount+ series “Star Trek: Picard” is unlikely ever to attract new viewers. But, asks Center strategic advisor Brad Berens, is that a problem?

Credit: Paramount+

Here’s a newsflash to no one who has met me.  I’m a nerd.

One piece of evidence from a vast jigsaw puzzle of nerdery: Friday night, I watched the third episode of the third and final season of Star Trek: Picard on Paramount+.


I loved it. However, it would be unthinkable for me not to love it.

I’ve been watching Star Trek in all its incarnations since I was a teenager, starting with reruns of the original series after school on Channel 5 in Los Angeles. I took my grandfather to see Star Trek II: The Wrath of Khan in Westwood, California in 1982 because Papa was a Trek fan, too.

A few years later at university, I gathered with other fans for the collective nerdgasm that was the premiere of Star Trek: The Next Generation (TNG, 1987). In the 1990s, my patient wife, La Profesora, watched Star Trek: DS9 and Star Trek: Voyager with me in our courting days, and more recently she watched the first two seasons of Star Trek: Discovery. (She dropped out in Season 3. Fair.)

Picard brings Patrick Stewart (82) back to his iconic role as retired Admiral Jean-Luc Picard, former captain of the Starship Enterprise, a role he played across seven seasons on television and in four movies, ending with Star Trek: Nemesis in 2002. The first season of Picard came out in 2020, 18 years after Nemesis. I enjoyed Season 1, thought Season 2 was meh, and think that Season 3 is a different animal altogether.  (more)

Fifty-nine years ago, the Supreme Court made it exceedingly difficult to win a defamation case against the media. A plaintiff had to prove the media deliberately falsified the facts and did so with malice. Coming out of the 2020 election, we now see a case that threads that needle. Center Director Jeffrey Cole digs in.

It has always been an easy call.

The press should be, now and forever, free of any restrictions, impediments, or laws that hinder its ability to cover news. Its rights should always be placed in a preferred position when balanced against any others. Any law, tax benefit, or effort supporting the stability or growth of news, whatever the political flavor, is a good thing. John Stuart Mill’s 19th century argument for the free marketplace of ideas is more vital today than ever.

Equally clear is that the disappearance of newspapers and the weakening of broadcast and cable news is a bad thing. Had a strong local newspaper not folded on Long Island, New York, George Santos would not be sitting in Congress today. His obvious lies would have been quickly detected by a paper able to focus on one local campaign rather than a national newspaper like The New York Times reporting on hundreds of political contests.

The Supreme Court was correct in its 1964 landmark New York Times v. Sullivan decision that made it almost impossible to win a defamation case against the news media. That’s as it should be. Anything else would create so much fear of retribution that the news media could not do its job. Then, just like the citizens of Santos’ congressional district, we would all pay the price.

I always root for the media.  (more)

Older sci-fi can help us see the difference between where we are as a culture and where we thought we’d be. As Center strategic advisor Brad Berens points out, a look back at Isaac Asimov’s 1940s robot stories can help us make sense of AI today.

Image by pch.vector on Freepik

Some science fiction is a potpourri of lasers and explosions and aliens popping out, but the better sort asks what if? We’ve been at science fiction for a while, so looking back at older science fiction to see what the creator got right and wrong can be illuminating. Retro Futures help us to measure the distance between where we are and where we thought we’d be.

Along these lines, a recent article on Yahoo! Finance caught my eye. “AI needs a set of rules—for its own good, and for ours” by Emilia David is a reprint from Business Insider that discusses how lawmakers and industry folks alike agree that AI needs strong regulations.

We need AI rules because too many weird things have been happening with ChatGPT, the new Bing, and other algorithms. AIs make things up without realizing it (because they haven’t been programmed to understand the difference between fact and fantasy) and then confidently share their “hallucinations” (this is, believe it or not, a technical term) with human users who believe the lies because they seem plausible.

Folks agree that something has to be done, but nobody knows what to do.  (more)

Pundits panicked last week when oddball chats with the new Bing pushed back the frontiers of weirdness, but, asks Center strategic advisor Brad Berens, were those conversations a fair test in the first place?

Last week, columnists and analysts took to their fainting couches (limply dragging their laptops with them) and described surreal and disturbing conversations with Microsoft’s new chat-driven Bing search engine, which is powered by OpenAI’s ChatGPT technology.

On Wednesday, Ben Thompson of the influential Stratechery blog said, “I had the most surprising and mind-blowing computer experience of my life today” chatting with Bing and Bing’s rapidly-spawned alter egos, Sydney and Venom. As Thompson’s session with Bing extended, the A.I.’s responses got weird.

Eventually, and I think this is telling, it was Bing/Sydney/Venom rather than Thompson that backed away from the conversation. (more)

Center director Jeffrey Cole shares a tale of two TV ratings behemoths.

While I was growing up, the two most popular television programs of most years were the Super Bowl in January or February and the Oscars in March or April. By the end of the first quarter of the year, it was all downhill for television ratings.

These were the two shows that amassed enormous and—more importantly—live audiences. Even as VCRs and DVRs came along, you didn’t want to watch the Super Bowl or Oscars later. You had to watch them in the moment, live. Both programs, more than any other, year in and year out, provided fodder for arguments and discussion at the office water cooler the next morning.

The Super Bowl has held up and prospered. The Oscars. . .not so much!

Now, there is only one program that captures the whole nation’s attention at the same time. The Super Bowl is just about the only collective experience the whole country shares.  (more)

What makes people believe nonsense for which there is no evidence?  Center strategic advisor Brad Berens digs in.

As I first wrote a year ago, you can see the elements of persuasion in a simple quadrant.

Mostly, people decide with their hearts and then justify with their heads. They’re also more keen to avoid loss than to pursue gain. Knowing where your argument sits on the quadrant can help you to be more persuasive.

This time, I want to flip to the other side and dig into why people abandon their heads altogether to believe nonsense.

I got to this topic by way of an email my friend Robert Moskowitz sent a few days ago about conspiracy theories:

It occurs to me that people who subscribe to conspiracy theories are only partially affirming that their theory is correct. An equal or perhaps larger part of the basis for their belief may be their desire to deny the reality that they don’t like.

Hence, it’s sensible to believe that reptile alien shapeshifters are running our planet because it’s uncomfortable to acknowledge that legitimately elected human leaders are making the decisions that seem so wrong-headed to the conspiracy theory believer.

This idea was so interesting that I asked Robert if he minded me digging into it here, and he kindly said go right ahead.  (more)

After disrupting how people get information, as well as the entire media ecosystem, Google now finds itself about to be disrupted by a new search engine: ChatGPT.  Center director Jeffrey Cole explains.

It was a “Code Red” followed by an urgent call for “all hands on deck,” or at least for the two most important pairs of hands.

Twenty-four years ago, two Stanford Computer Science Ph.D. students in a Silicon Valley garage developed a new algorithm that seemed to read the minds of those who posed an inquiry to their new search engine, Google.

That algorithm led to a cascade of disruption, beginning with Microsoft’s business models, then spreading to the advertising industry, newspapers and magazines, encyclopedias, yellow pages, and far more, redistributing tens of billions of dollars.

Google became so defensive about the devastation its new search methods wreaked across the economic landscape that it adopted “don’t be evil” as its corporate motto. All this devastation was not personal or directed at any one place: it was simply collateral damage along the path to progress.

Anyone believing in karma will be smiling now that Google itself is in the bullseye of disruption.  (more)

Forty-two years ago, a murder mystery predicted digital twins and deepfakes.  Center strategic advisor Brad Berens asks: what did this howlingly bad movie get right and wrong?

Writing science fiction is a what if? exercise that tells us a lot about the moment when the writer first posed the question. Looking at where those predictions went awry can help us to understand where we are today and also where we’re headed.

In the 1940s, thinking about the spread of totalitarianism, George Orwell asked himself “what if this all goes horribly wrong?” and wrote 1984. In the 1960s, thinking that things were improving, Gene Roddenberry asked himself, “what if this all goes terrifically right?” and created Star Trek. In the 1980s, Margaret Atwood asked herself, “what if Roe v. Wade isn’t as settled as we think?” and wrote The Handmaid’s Tale, which proved disturbingly prescient.

At a smaller, tech-focused scale, Michael Crichton’s 1981 movie Looker asked, “what if we could create completely accurate digital copies of people and then use those copies instead of the originals?” Today, we call those technologies digital twins (usually this refers to products, not people) and deepfakes (this has a malicious connotation with things like revenge porn).

Nobody should watch this terrible movie. You don’t have to take my word for its epic badness. Just watch the trailer.  (more)

The country’s largest ecommerce company ended a program that donated 0.5% of eligible purchases to charities customers selected. Center strategic advisor Brad Berens says this might have surprising negative consequences for Amazon’s brand.

This week, Amazon announced that it was ending its “AmazonSmile” program that enabled customers to support charities with most purchases. The program will end on February 20th.

I hope Amazon changes its mind both because it’s the right thing to do and also because it would be better for Amazon’s brand.

What happened

Amazon’s stated reason for ending Smile is that the program didn’t have the impact the company wanted. Though the program has funneled roughly half a billion dollars to charities since it launched in 2013, Amazon said:

“After almost a decade, the program has not grown to create the impact that we had originally hoped,” the company said. “With so many eligible organizations—more than 1 million globally—our ability to have an impact was often spread too thin.”

The average donation to charities was less than $230, Amazon said.

Amazon will continue to invest in areas where it can “make meaningful change,” such as assisting with natural disaster relief, affordable housing initiatives and community assistance programs, the company said.

The timing and optics are odd.  (more)

As generative AI makes first-pass creation faster and easier, Center strategic advisor Brad Berens says an unintended consequence is that humans may become less able to make great things.

Let me start by stipulating that generative AI (ChatGPT, DALL-E) will change how we do what we do, taking the heavy lifting off much human endeavor. This will be true whether it’s creating a PowerPoint presentation, finding an image to illustrate your point, writing the first draft of a memo, coming up with an initial radiological diagnosis, designing a new building, or just about anything else we can imagine that isn’t purely hands on (like scrambling eggs). (Dig into my last few columns for details.)

But where does this leave craft?

Any human endeavor starts with innocent ignorance of your quality, moves through grim “oh dear lord no” self-consciousness about the dispiritingly low quality of what you’re making, settles into a journeyman’s “ah, I’m starting to get it now,” and then, if you’re lucky, arrives at mastery. It isn’t just true of art: it’s true for anything humans do. You don’t get to make the good stuff without making a whole lot of crap along the way.  (more)

Center strategic advisor Brad Berens describes what this week’s Consumer Electronics Show has to do with the death of cursive writing in American schools, how to break down the elements of disruption, and more.

I spent the week leading tours of the automotive hall at CES with my friends at StoryTech. (My favorite exhibit was the quietly transformative What3Words.)

As we explored new Electric Vehicles (EVs), charging technologies, autonomous vehicles, and more, I found myself thinking again and again about Thomas Kuhn’s notion of paradigm shifts in his classic book The Structure of Scientific Revolutions.

A paradigm shift occurs when there’s enough disconfirming data to make you discard the mental model you’ve been using. In the early 1500s, Copernicus realized that the Earth-centered model of astronomy he had been taught didn’t fit the data, which led him to a sun-centered model and a better understanding of the world. Einstein realized that Newtonian physics didn’t fully account for energy, which led to nuclear power.

One important and misunderstood thing about paradigm shifts is that you can only see them in the rear-view mirror once your thinking has already changed.  (more)

Many thinkers end each year with a cluster of predictions for the next year. Center strategic advisor Brad Berens has just one—and it’s more of a prayer than a prediction—about trust.

The pressing question of our age isn’t new. The Marx Brothers asked it in Duck Soup (1933): “who ya gonna believe, me or your own eyes?”

In a recent Los Angeles Times article (thanks, Dad, for sharing it), movie director Scott Mann talked about seamless AI-powered film editing that can replace F-bombs with “freaking” or translate dialog into another language without awkward out-of-sync lips. “You can’t tell what’s real and what’s not,” Mann said, “which is the whole thing.” In that same article, computer science professor Robert Wahl observed, “We can no longer fully trust what we see.”

Both Mann and Wahl get things backwards: we’ve never been able to tell what’s real, and we’ve always trusted what we see. We are especially credulous with new technologies. We saw it with social media in the 2016 election cycle (e.g., Cambridge Analytica) and we’re seeing it now with AI.  (more)

Elon Musk’s right-wing posts on his Twitter platform have sent the stock of Tesla, the public company where Musk is CEO, plummeting. Center strategic advisor Brad Berens discusses this with two thought leaders: Lana McGilvray of Purpose and Peter Horan of Horan MediaTech Advisors.

This column is a bit unusual. Instead of just me sharing my thoughts, you’ll get a ranging conversation I had with two friends and advisors: Lana McGilvray, founder/CEO of Purpose Worldwide, and Peter Horan, founder at Horan MediaTech Advisors.

Background: On Tuesday, December 13, Peter shared this article from Inside EVs about recent research from YouGov and Morning Consult, each arguing that Tesla is now partisan because CEO Elon Musk also owns Twitter and has been posting right-wing content. As a result, Tesla is becoming less popular with liberals at the same time that it’s becoming more popular with conservatives. That article was sparked by this earlier Wall Street Journal article on Tesla and partisanship. A subsequent WSJ article reveals that Tesla’s investors are concerned about both Musk’s split focus between Tesla and Twitter and also the plummeting value of Tesla’s market cap.

Peter’s question concerned when brands like Tesla started to become politically partisan, which we kicked around via text. Early in our back and forth, I invited Lana to join the conversation, whereupon we all pivoted into an asynchronous conversation in a Google Doc.

I’ve lightly edited and reorganized the results into this interview. — BB  (more)

Most dystopian fantasies concern monsters we can see conquering us. But with new technologies, asks Center strategic advisor Brad Berens, will we even know if we’ve been conquered?

You can tell a lot about a culture by its dystopias: its fantasies of fear. When you have dueling fantasies, you can tell even more by what they agree on and what they miss.

Our most common dystopias are still analog: an enemy takes over, and we fight or we don’t, but everybody knows who the good guys and bad guys are. The classic example of this is George Orwell’s 1984 (it came out in 1949), which shows how a regime takes over both the world and the minds of its citizens through constant surveillance: “Big Brother is Watching You.” Margaret Atwood’s The Handmaid’s Tale (1985) is similar, but in Atwood’s Gilead a male regime enslaves women.

On the surface, it looks like the technological versions of these dystopias are digital, but they’re still analog: in The Terminator movies (starting in 1984), Skynet is a machine regime seeking to eradicate humans. In The Matrix movies (starting in 1999), the machine regime uses humans as Duracell batteries to power their VR world. In both, there is still a clear dichotomy: good guys versus bad guys.

The real digital dystopia happens when you don’t know you’ve been conquered and you don’t know that any alternatives to captivity exist.  (more)

The new Netflix series about the daughter from The Addams Family going to a Hogwarts-style high school doesn’t ignore the earlier versions of the story: it embraces them — which, says Center strategic advisor Brad Berens, is part of why it succeeds.

One difference between artificial intelligence and the human kind (at least for now) is that AI is amazing at pattern recognition while humans are terrific at pattern forging.

We can’t help ourselves. We use analogies to understand the world around us, asking of any new experience, “OK, what is this like?” and then using a rapid compare-and-contrast to figure out how the new thing is similar to but also distinct from the old thing. It’s not just inside our heads where this sort of forging happens: businesses can channel that pattern forging energy for their customers or (with entertainment) audiences.

The delightful new Netflix show Wednesday, starring Jenna Ortega as Wednesday Addams, is a master class in this sort of channeling. It elegantly deploys references to 85 years’ worth of other versions of The Addams Family stories, as well as references to other horror and teen supernatural television (Sabrina the Teenage Witch, Buffy the Vampire Slayer) and movies (Harry Potter, Twilight).

For the viewer who recognizes these references, there’s an extra level of cognition that sparks, amplifying the overall experience.  (more)

As CNN and FOX News pivot in the face of turbulent times, four other cable news organizations are waiting to grab big audiences: One may disappear, one is waiting to be noticed, one is happy where it is, and one faces a big opportunity. Center Director Jeffrey Cole explains.

In the forty-year history of 24/7 cable news networks, the ground is shifting like never before.

At the two biggest news networks, CNN and FOX News, there is profound change that may disorient and confound their audiences, forcing them to the left or right.

CNN, under strict budget cutbacks from its new owner, is forsaking its slightly left position and racing toward the middle. For the first time, it is fact-checking President Biden.

FOX News (and all the Murdoch properties), which built its programming over the past six years around Donald Trump, has abandoned the former President and embraced Florida Governor Ron DeSantis. The day after Trump recently announced his third presidential bid, Murdoch’s New York Post covered the story with a small banner across the bottom of page one: “Florida Man Makes Announcement (p.26).” Ouch!

While these two channels, which I discussed at length in my last column, make up more than 50% of the audience for around-the-clock news, there are four other players. At least two stand to benefit, perhaps significantly, from the makeovers at the two behemoths.  (more)

After a surprise-filled mid-term election, CNN is moving to the right and FOX News is moving away from Trump. What’s going on? What about the other channels? Center Director Jeffrey Cole digs into big changes in cable news.

What a roller coaster!

Last week began with us expecting the party in power to lose a large number of seats in the midterm elections, as history shows almost always happens. The only question—in an overused metaphor—was whether it would be a red wave or a red tsunami. Even worse was the fear that democracy would unravel as “don’t confuse me with the facts” election deniers and ignorant candidates won their elections.

The week finished on a different note. Although troubled by inflation and crime, Americans could see past these transitory issues and voted to protect our constitutional democracy. As a nation we might not know much about civics, but we can discern a real threat when it is upon us.

The ground has shifted in our politics as the Republican party debates its future direction. At the same time, the Democrats are mulling over whether to retire their leader: a surprisingly successful President who is also now the oldest man ever to occupy the White House. If re-elected, at the end of his second term Joe Biden would smash the record by eight years.

It is not just the ground under the political parties that has shifted.  (more)

As media continues to fragment in the face of changes in legislation and technology, where will new big audiences come from? Center strategic advisor Brad Berens explores.

Recently I explored how changes in legislation and technology are signaling the end of cheap digital scale for media. (Don’t worry: you don’t have to read that issue to understand this one.)

If I’m right that digital scale will only get less massive and more expensive, then where will people go to find new big audiences to spread the word about whatever they’re selling?

In other words, where are the frontiers of scale?

One recent frontier of scale is Retail Media, which you experience every time you go to a retail website (like Walgreens, Best Buy, or Target) and see ads while you’re shopping… not just ads for things that the retailer sells but also ads for other things like movies and cars. Advertisers love retail media because retailers know a lot about their customers and can match ads to customer interest (we hope in a non-creepy way).

App stores, like the one you use to download or buy new apps on your smartphone, make up another frontier of scale because the stores have massive and identifiable audiences, just like retailers. Already, app developers pay to promote their apps in app stores, but I suspect that over time we’ll see non-app, non-digital products advertising in app stores. This is a big opportunity for the 2024 election cycle.  (more)

A call from your burglar alarm company turns out to be something else entirely. Center strategic advisor Brad Berens explains what you can do to protect yourself from clever criminals.

One of the most popular things I’ve ever written is “Beware the Words with Friends Scammers” about how predators were targeting lonely older women who played this online equivalent of Scrabble.

Here’s another scam to watch out for: the “Middle of the Night” call.

We were having dinner with my parents when my Dad mentioned an odd call from their burglar alarm company that day. The phone rang at 2:00am, and when Dad picked it up, the person on the other end, talking quickly, said that she was calling from the alarm company about a medical emergency.

“No, we’re fine,” Dad said.

The caller then asked for the address, which Dad provided. “No, that’s not the right address,” the caller said. “We’re calling about another address” (a mile or so away). Then, the caller asked for the alarm abort code, which Dad provided. The caller then apologized for the mistake and ended the call.

The number on Caller ID was the number of the alarm company.

Dad went back to sleep. Remember: this was at 2:00am, when most people are groggy rather than suspicious.

When I heard this story, something about it had a wrong shape: as a practitioner and researcher I’ve done a lot of work on phishing schemes and other scams; plus, I personally get dozens of scam texts a week with increasingly creative angles of approach. I’d never heard of a burglar alarm company calling about a medical emergency, and it sounded like Dad had given more information than he’d received.  (more)

It’s fine to look for answers, but often you don’t find them. Instead, says Center strategic advisor Brad Berens, if you’re lucky, you wind up with better questions. WITDO is one of them.

One of my first corporate gigs was as the digital editor at EarthLink, an early dial-up Internet Service Provider (ISP). We scrutinized every move that AOL and Microsoft made. They were our closest competitors selling identical dial-up products. AOL was the biggest; when AOL raised prices so did Microsoft and EarthLink. It was an apples-to-apples world, and we were all selling apples.

We kinda/sorta saw broadband coming (DSL and Cable), but we thought our members would value their @earthlink.net email addresses and the EarthLink software package enough to stick around and buy their broadband through us. This did not happen. Broadband was selling customers a five-course gourmet meal while we were still selling… apples.

We no longer live in a world where apples-to-apples comparisons usually make sense. The problem is that business thinkers love to overfocus, breaking problems down into fruit-on-fruit comparisons even though doing so misses crucial context.

One powerful way to dodge this problem is to get in the habit of asking different questions, questions that change the context of your endeavor and let you focus on the things that matter rather than the things that happen to be in front of you. This is the difference between focusing on what’s urgent versus what’s important, a.k.a. the Eisenhower Matrix.

Here are three of my favorite change-the-conversation questions (more):

More important than who owns Twitter is whether anybody can create a massive new social networking service. Also, asks Center strategic advisor Brad Berens, what would a non-profit version of Twitter—let’s call it Quack—look like?

As I wrote last time, I’m taking a break from the endless hand-wringing around Elon Musk’s acquisition of Twitter. The more interesting question is whether anybody can do anything to prop up a massively scaled social networking service in an age of ever-increasing fragmentation.

In other words, is this the end of cheap scale?

By “cheap scale” I really mean cheap digital scale where a service can quickly and profitably reach billions of people (users, audience members, customers, consumers) using hardware that people already own (smartphones, tablets, computers) and bandwidth that they already buy.  (more)

The funeral business can prey on a family’s darkest moments, but the internet has begun to change this. Center founder Jeffrey Cole explains.

Disruption even finds you in death.

Ben Franklin’s quip that “in this world nothing can be said to be certain, except death and taxes” aside, if ever there was an industry that would seem immune to disruptive change, it would be the funeral business.

Yet, the funeral industry is experiencing a tidal wave of disruption that it does not like one bit. As usual, it is the customer rather than the established business who has the most to gain.

Many (not all, but many) mortuaries take advantage of grief-stricken families in their time of need.

When a loved one dies—particularly if that death was unexpected—the family is in no frame of mind to make considered financial decisions. This is the moment when the funeral business pounces, framing an expensive series of decisions—how much they spend on the casket, the burial plot, the funeral—as a reflection of how much they loved the family member who just died.

For many families, a funeral is the third most expensive financial transaction they will ever make after buying a house or a car. But we do not approach funeral expenditures the same way. It would seem ghoulish or inappropriate to visit five funeral homes (as we might visit car dealers), get prices, and then negotiate with the first by saying, “another funeral home said they could do it for less.”

At least funeral directors (we hope) do not try to close a deal by asking, “what would it take to get your loved one into a casket today?”  (more)

Nothing crazy has happened in the hours since the world’s wealthiest individual took control of the world’s oddest social media company, says Center strategic advisor Brad Berens, and nothing crazy is likely to happen.

A bunch of things—some nutty and some not—happened once Elon Musk’s acquisition of Twitter closed on Thursday:

  • Musk fired the CEO and a bunch of other executives.
  • Musk walked into Twitter HQ carrying a sink and tweeted a video of it with the caption “let that sink in.”
  • Late night hosts dunked on this mild dad joke in ways that were way less funny than the not-terribly-funny dad joke itself. (BTW, if I’d made that joke, my kids’ eyes would have rolled so hard you would have thought somebody was playing Ping Pong.)
  • Musk, the free speech absolutist, tweeted a letter to advertisers in which he said that he would absolutely moderate content on Twitter. This is because advertising is currently Twitter’s sole source of revenue.
  • Advertisers and media agencies adopted a wait-and-see attitude, except for General Motors which paused its advertising… in part, perhaps, because Musk owns Tesla, a rival car company.
  • Musk also followed Facebook’s lead and formed a “content moderation council” that would make the tough decisions about who to kick off the platform. “No major content decisions or account reinstatements will happen before that council convenes,” he said in a tweet.
  • He did this because hate speech exploded across Twitter right after the deal closed.
  • I saw a tweet allegedly from the 45th president saying, “I’m back! Thanks, Elon!” But it was obviously a newly created account belonging to somebody else. Twitter deactivated the account. But the folks I was with when this happened didn’t know to look at the Twitter username and almost freaked out.
  • A bunch of pundits and people I follow on social media wrung their hands, deactivated their accounts, and acted as if we had reached the end of days.

We can still measure how long Musk has been in control of Twitter in hours. It’s way too soon to fret about what is going to happen.  (more)

As Center strategic advisor Brad Berens explains, Max Fisher’s new book The Chaos Machine shows the downside of what happens when companies pursue growth at all costs.

In her 1963 book about the trial of Adolf Eichmann, one of the chief architects of the Nazi murder of six million Jews during the Second World War, Hannah Arendt coined the phrase “the banality of evil” to describe Eichmann’s failure to see the consequences of his actions because he was just doing his job.

I’m always hesitant to compare any group to the Nazis except for the Neo-Nazis who embrace the label. However, as I finished reading Max Fisher’s excellent new book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, Arendt’s phrase about the banality of evil kept coming to mind.

To be clear, I am not comparing Facebook, Instagram, Snap, TikTok, Twitter and YouTube to the Third Reich. But I am comparing the indifference of those companies to how their pursuit of profits has destroyed lives to the indifference of Eichmann in Jerusalem.

Fisher is a talented reporter for The New York Times. For years, he has been on the ground chasing the stories that make up the book’s main argument: social media companies use sophisticated software (algorithms) that deliberately provoke extreme emotional reactions in users. The algorithms do this by putting agitating content in front of users even though that content is often misinformation or disinformation. More agitated users spend more time on social media and spread the lies, which means that the social media companies can sell more ads and make more money.

Facts move slower than lies, so corrections or comments or warnings to think first and share later don’t do any good.
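
To see the incentive loop Fisher describes in miniature, here is a toy sketch in Python of a feed ranker that orders posts purely by predicted engagement. Every name, score, and weight below is invented for illustration; no platform’s real ranking system is this simple. The telling detail the sketch preserves is that accuracy never enters the objective at all.

```python
# Toy illustration of engagement-first ranking; all values are invented.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    outrage: float   # 0.0 (calm) to 1.0 (maximally agitating)
    accuracy: float  # 0.0 (misinformation) to 1.0 (verified)

def predicted_engagement(post: Post) -> float:
    # Agitating content reliably drives more time on site and more shares,
    # so it scores higher. Note that post.accuracy is never consulted.
    return 0.3 + 0.7 * post.outrage

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=predicted_engagement, reverse=True)

feed = rank_feed([
    Post("Calm, factual explainer", outrage=0.1, accuracy=0.9),
    Post("Enraging, misleading rumor", outrage=0.9, accuracy=0.2),
])
print(feed[0].text)  # the misleading rumor ranks first
```

Optimize that objective at scale and a neutral-sounding “engagement metric” becomes the agitation machine the book documents: more agitation, more time on the platform, more ads sold.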

What are the consequences that the social media companies ignore?  (more)

Two recent developments in the world of comic books have lessons for all businesses in the age of digital transformation.  Center strategic advisor Brad Berens explains.

From the “Big Story You Haven’t Noticed” department: this month, two things happened in the world of comic books that combine to make a huge inflection point. My friend Peter Horan calls this sort of thing a “meteor strike” where “the expected only wounds you; the unexpected kills you.”

First, GlobalComix and Ox Eye Media announced a new joint venture: an on-demand comics printing service called GC Press:

GlobalComix has announced it will partner with Source Point Press parent company Ox Eye Media starting in 2023 to form GC Press, an on-demand comics printing service. This will be the first of its kind in the comics industry, allowing readers to purchase any comic in physical form (regardless of retail availability) and have it shipped directly to them, anywhere in the world.

Readers can’t use GC Press to buy Print-On-Demand (POD) issues of big-publisher characters like Spider-Man, Batman, or Hellboy (although I am amused by “The Scintillating Spider-Squirrel,” which seems more homage than parody). Instead, this is a niche service that enables niche writers to monetize their work via POD.

Second, the already ridiculously named DC Universe Infinite digital comics service got an even more ludicrous name for its new top-level subscription: DC Universe Infinite Ultra. (What’s next? DC Universe Infinite Super Bat Ultra Plus?) For $99.99 per year, readers get digital access to DC comics one month after they hit comic book shops and the bookstores and newsstands that still carry comics. Prior to this, $74.99 per year bought digital access only to DC comics that were at least six months old.

So what’s the problem? Why do these two things make a meteor strike?  (more)

Center director Jeffrey Cole explores the transformation of the media in his keynote address at the leadership meeting of the Interactive Advertising Bureau.

View the video here.