MLB’s existential dilemma — why sharing the wealth for the Greater Good can save the game

MLB’s brightest star, Shohei Ohtani of the Los Angeles Dodgers

As a longtime listener to the Dan Patrick radio show, I was incensed a couple years ago when I heard Dan say that Major League Baseball is no longer a national sport.

“It’s more of a regional sport today,” he said.

Dan followed up by saying that certain cities — St. Louis, Los Angeles, New York come to mind — have large fanbases, but that doesn’t translate into national interest in the game.

Plus, at that time baseball had no player with a national or global presence like, say, a LeBron James or Patrick Mahomes.

After I got over my initial righteous indignation, I came around to what Dan was saying about MLB. National ratings have slumped badly over the past couple of decades as young fans have put their focus on the NFL and NBA.

I couldn’t think of a single player who could command the attention of fans nationwide like LeBron or Mahomes. Shohei Ohtani may be the closest baseball player to a true global superstar.

I’ve written about this before, but my sports passion has always been with baseball, first as a Little Leaguer and later as a fan of the St. Louis Cardinals and Texas Rangers.

Still, baseball, with its slow pace and not-made-for-TV presence — you can’t see all the players at once — has clearly been surpassed by the NFL and NBA.

So, when ESPN announced it would opt out of its MLB rights deal after the 2025 season, I was disappointed but not surprised. ESPN has been struggling with its viewership, too, and it is much more focused on the NFL and NBA.

I was puzzled at how MLB Commissioner Rob Manfred planned to replace the ESPN revenue shared by all teams. What network would want to pay hundreds of millions to broadcast baseball and create surrounding programming?

MLB commissioner Rob Manfred

Turns out, Manfred DOES have a plan, according to a lengthy and comprehensive Wall Street Journal article that outlined the commissioner’s proposed scenario — one that appears to be a long shot.

Said the Journal:

“Manfred’s model would require teams to cede control of their local rights to the league office so that MLB could sell them collectively as a unified streaming package. Viewers would be able to purchase the games of teams they want to see without the blackouts that have long vexed devotees who actually live near where their favorite team plays.

“No cable subscription would be required. Revenue would be distributed among all teams, like it already is for national deals with Fox and Warner Bros. Discovery.

“The change that we’re talking about,” Manfred said in an interview, “is the only rational response to where the media market is today.”

There’s a huge problem with that plan.

MLB teams don’t share their local revenue with their baseball counterparts. Teams in Los Angeles, New York, Boston and Chicago all generate massive amounts of revenue through their local TV rights and are reluctant to give up any of that revenue for the Greater Good.

According to the WSJ, MLB teams lean on their local broadcast revenue more heavily than their NFL and NBA counterparts. Those sports have much larger national TV deals, and share the revenue across the league.

More from the WSJ:

“Cubs president Crane Kenney said in a recent interview at the team’s spring training facility last week in Mesa, Ariz., that his team would be willing to go along with a new TV model — as long as it accounts for his organization’s status as one of baseball’s highest-revenue teams.

“Treat us fairly,” Kenney said, “and we’re in.”

There’s little incentive for the big players to share their local broadcast revenue with their MLB brothers, unless they truly are concerned with the overall national decline of interest in the game. If a few teams folded, that might get their attention.

However, I can’t see the big market teams sharing their wealth with their small market counterparts — even if it helps sustain the sport.

This is 2025 America. Who does anything for the Greater Good?

Apple draws the line on altered reality in photos

The Wall Street Journal’s Joanna Stern takes a selfie with Apple software chief Craig Federighi

If you’ve ever been fooled by a photo that had something added — or eliminated — you should watch this fascinating video interview by Wall Street Journal tech reporter Joanna Stern with Apple Inc.’s software chief Craig Federighi. The interview focused on Apple Intelligence, which is Apple’s version of artificial intelligence.

Near the end of the 25-minute interview, Stern raises her iPhone and takes a selfie of herself and Federighi as they are seated across from each other at the company’s Apple Park headquarters in Cupertino, Calif.

Then it got really interesting.

Stern showed the photo to Federighi and, using Apple’s most recent photo editing software, quickly edited out a water bottle and a microphone that the photo had captured.

She edited the photo with the intention of showing how easy it is to remove unwanted objects from photos, then asked Federighi about Apple’s approach to allowing users to alter reality in their photos, or even add objects or people who weren’t there.

Federighi’s thoughtful answer about Apple’s decisions on limiting AI use in its photo software intrigued me.

“There were a lot of debates internally, ‘do we want to make it easy to remove that water bottle or microphone’ because that water bottle was there when you took that photo,” he said. “The demand from people to clean up what seem like extraneous details in a photo that don’t fundamentally change the meaning of what happened has been very, very high. So we were willing to take that small step.”

However, the company ensured that if a photo was altered, that alteration was reflected in the photo’s metadata. And Federighi said Apple drew a line at further editing that alters the reality of photos.

“We are concerned that the great history of photography and how people view photographic content as something that you can rely on, that is indicative of reality …” Federighi said. “And our products, our phones are used a lot, and it’s important to us that we help convey accurate information, not fantasy … we make sure that if you do remove a little detail in a photo, we update the metadata on the photo so you can go back and check that this is an altered photo.”

It’s clear that Apple has given this subject a lot of thought and is working to distance itself and its software from ‘deepfakes’ that seem to be showing up everywhere. Just check your Facebook feed.

Here’s a link to an article in Info Security Magazine that lists the top 10 deepfakes from 2022.

That debate over editing photos took me back to my days as a reporter and editor at The Oklahoman in the 1980s and 1990s, a time well before digital photos and software let you easily alter the reality of a picture.

However, I recall quite a debate at the paper over whether drinks in the hands of people at a party should be edited out, either by cropping or by an artist’s retouching.

So, editing photos has been an issue for decades.

And that led me to contact Doug Hoke, The Oklahoman’s current photo manager who worked at the paper all through the pre-digital age of the ’80s and ’90s.

Doug Hoke from his profile image on Facebook.

Doug is one of my favorite photographers, with a long history of shooting great photos. His work was regularly featured in Sports Illustrated in the pre-digital days.

I asked Doug if my memory was correct and altered photos were an issue back in the day. Here’s what he said:

“Way back when if Gaylord (the publisher) didn’t want something in the paper, it wasn’t there,” he said. “The airbrushing of photos was originally done to help with the reproduction, as coarse screens and letter press technique left much to be desired. That evolved into the removal of items, like cocktail drinks, (or) the adding of details like clothing, lengthening hems, adding material to swimsuits, closing up v-necks, etc.

“When the digital age hit, the ease that photos could be altered called for new guidelines for photography. What is the common practice now is no pixels should be added or removed, except by cropping, and cleaning up dust spots on the chip. Toning and adjusting contrast should only be to help reproduce the image as accurately as possible.”

Doug said he supports Apple’s limits to digital editing that distorts the reality of photos.

“When Apple first announced that they would only allow small details to be removed, I applauded them,” he said. “Craig is correct that photography is based in reality, and I firmly believe that the photos should remain as untouched as possible. You may think that water bottle is in the way, but future generations will look at these details with amazement. Think of old photos you look at, you study every detail in the photo to get a better sense of history. If we remove all those details now, no one will ever see them.”

There’s a distinction between a photograph and a photo illustration, Doug said. Or there once was.

“The line between photograph and illustration has been blurred and will never be the same,” he said. “Publications try to hold onto the strict guidelines of what is a photo and what is an illustration but the public probably doesn’t really care. I don’t think the general public has a strong grasp of reality anymore. Games, TikTok, IG, X, whatever they look at. If they think an image is cool they like it without giving any thought to whether it is accurate or not.

“We have had to reject several ‘photos’ that were obviously enhanced by AI, mostly portraits. Accepting photos from unknown sources will be a huge lift in the near future as AI will just continue to get better. Really glad Apple took a stand and said just because we can doesn’t mean we should.”

Did you catch what Doug said? The public is suffering from both ignorance and apathy about whether a photo has been altered.

But we should be concerned. Thank you, Apple, for taking a stand.

A short thread on Threads

I’m here today to write about the new social media platform, Threads. But first I have to talk about Twitter, because without the bird app, I’m pretty sure there would not be a Threads.

Back in the spring of 2008, my friend Russ Florence invited me to connect with him on Twitter, a social media platform that debuted in 2006. I was in the final year of my career as a reporter at The Oklahoman.

So, I signed up on the app and followed Russ as my lone Twitter connection.

As a Twitter newbie, I didn’t realize there was a big Twitter world out there with lots of potential accounts to follow. I loved following Russ and his personal tweets like the one from the day his dishwasher quit on him.

But one day I happened to look at Russ’s profile and saw he was following scores of other Twitter accounts. So I clicked on his follow list. It opened a new world to me because there were so many news and technology sources that I didn’t realize existed until that moment.

I followed a couple dozen right off the bat, and my interest in Twitter grew exponentially.

What I loved about it was being able to follow big national media sources like the New York Times and NPR, or more local sites like The Oklahoman and Tulsa World. Plus there were sports accounts like ESPN, and eventually MLB, the NBA and the NFL. I got instant alerts any time there was breaking news, in sports or otherwise.

Plus there was a growing number of Oklahomans joining every day, providing local perspectives.

I enjoyed Twitter immensely because, until Donald Trump started opining 30 times a day in the run-up to the 2016 election, there were few of what I call the Crazy Uncles on Twitter that you frequently find on Facebook. It was upbeat and fun.

Fast forward to 2022.

Billionaire Elon Musk completed his purchase of Twitter in October, and it’s all been downhill from there. Musk encouraged less-than-objective news sources to begin posting on Twitter. He appealed to the type of voices like podcaster Joe Rogan, who broadcast and repeat misinformation. Trolls blossomed. New rules were imposed that limited the number of tweets a subscriber could view on a daily basis.

With all that roiling long-time Twitter subscribers, along comes Threads, owned by Meta and launched through Instagram. I heard about it and signed up on Day 2. By the end of the week (last week!) I read that 110 million accounts had been opened.

Threads looks suspiciously like Twitter in that you can comment, like and repost items with or without your own commentary. In fact, Twitter has threatened to sue Meta over the copycat status of Threads.

The downside I’ve seen so far is that you can’t set up lists that contain just the accounts — Threaders? — or topics you want to see, and posts aren’t presented in chronological order. And there’s no Threads site set up for Mac or (I assume) Windows computer users — it’s all mobile-based so far.

But I’ve read those features are coming soon. Read this article from the Wall Street Journal if you want to know more about Threads.

So, here’s my dilemma and that of millions of other long-time Twitter users. Many — including me — have made their living posting items on behalf of employers to Twitter accounts that are well established and have many followers. Many thought leaders still post regularly to Twitter, although you can find many of the same folks over on Threads.

Instead of just dropping my Twitter account, I’m hanging on, checking both Threads and my Twitter feed on a fairly regular basis.

Until further notice, I’ll be tweeting and threading simultaneously. I welcome followers on both.

Twitter: @James_Stafford
Threads: @jimstafford

Below is a sample Threads post. Seem familiar?


A salute to 1971, the coolest year, from a cool kid wannabe

From the cool year of 1971, a cool kid wannabe peers out from his high school yearbook

I stumbled upon a Wall Street Journal article the other day that outlined what a watershed year 1971 was in many, many ways. (You can read it here with a WSJ subscription.) 

It was the year that Nixon and Kissinger reached out to China and opened the U.S. to an important trading partner that had previously been seen only as an archenemy.

It was the beginning of the end of AT&T’s monopoly of the nation’s telecommunications industry, with an FCC ruling that opened the door to a second long-distance calling provider.

It was the end of the link that tied the U.S. dollar to the value of gold, opening the way to what are known as “floating exchange rates.”

Walt Disney World opened in 1971, as did a little coffee business known as Starbucks and the Nasdaq trading market. The 26th Amendment, which gave 18-year-olds the right to vote, was ratified. Intel introduced the 4004, considered the first “computer on a chip,” launching a wave of technology innovation that continues today.

The Journal article pointed out that all of these events happened in a single year exactly 50 years ago.

Then it hit me. I graduated high school in 1971, which means I’ve been out of high school for half a century.

The thought almost brought me to tears as I was hit by a wave of nostalgia.

I’m not nostalgic for my high school class, because I never, ever sat at the cool kids table. I was a cool kid wannabe, but never made the cut.

I was mostly invisible to my classmates at Southside High School in Fort Smith, Ark.

So, why did this article hit me so hard? I think it’s because I had never really given any thought to how many years had passed since Graduation Day in 1971.

And how I’ve lived sort of my own version of Forrest Gump’s life in the intervening 50 years, still trying to be one of the cool kids and never quite making it.

But I’m proud of the newspaper career I pursued for more than 30 of those years, a career that brought me to OKC where I would meet the woman who became my wife, the kids we raised, yada, yada, yada.

Enough of that.

Just know that 1971 was a really, really cool year. I’m proud that it’s the year of my high school graduation.

Even if I wasn’t one of the cool kids.