A while back, I wrote a blog article about “The Fairness Doctrine.” After the January 6th siege of the U.S. Capitol, many people began wondering whether this policy, originally enacted by the Federal Communications Commission (FCC) in 1949 but eliminated under President Ronald Reagan, should be reinstated.
To review, this doctrine required broadcast licensees to present controversial issues of public importance, and to present those issues in a manner that was honest, equitable, fair and balanced.
In other words, broadcasters were supposed to not only uncover what the people in their broadcast service area should be aware of, but also to present both sides of the issue.
The Fairness Doctrine applied only to radio and television licensees and to no other form of media. Even if it were still in place today, it wouldn’t apply to Facebook, Snapchat, Twitter, Instagram or any other form of non-broadcast communication. The problem with social media is that what we read, see, and hear is all controlled by algorithms.
The Challenge of Controlling Algorithms
Unlike most innovations that human beings have designed, algorithms are not static and easily defined. You can’t say that one algorithm is good and another is evil. They are like living organisms, in that they can learn, adapt and change over time.
Cornell University online-behavior scholar J. Nathan Matias put it this way:
“If you buy a car from Pennsylvania and drive it to Connecticut, you know that it will work the same way in both places. And when someone else takes the driver’s seat, the engine is going to do what it always did.”
An algorithm, by contrast, changes with each human behavior it comes in contact with, and that’s what makes trying to regulate it, from a government standpoint, such a challenge.
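The contrast with Matias’s car can be made concrete. Here is a minimal, purely hypothetical sketch (the class and topic names are invented for illustration) of why two identical copies of an adaptive algorithm stop behaving identically: each one reshapes itself around the behavior of its own user.

```python
# Hypothetical sketch: two identical copies of a simple feed-ranking
# algorithm diverge as each learns from a different user's clicks.

class FeedRanker:
    def __init__(self, topics):
        # Every copy starts identical: equal weight for every topic.
        self.weights = {t: 1.0 for t in topics}

    def record_click(self, topic):
        # Each interaction nudges the ranker toward what this user clicks.
        self.weights[topic] *= 1.5

    def rank(self):
        # Order topics by learned weight, highest first.
        return sorted(self.weights, key=self.weights.get, reverse=True)

# Two users take delivery of the very same "engine"...
alice = FeedRanker(["news", "sports", "music"])
bob = FeedRanker(["news", "sports", "music"])

# ...but their behavior reshapes it into two different machines.
for _ in range(3):
    alice.record_click("music")
    bob.record_click("news")

print(alice.rank()[0])  # music rises to the top for Alice
print(bob.rank()[0])    # news rises to the top for Bob
```

A regulator inspecting the code on day one would see two identical programs; a week later, no two deployed copies behave alike, which is the regulatory challenge the paragraph above describes.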
Broadcast radio and television were unknowns when they appeared, and government was challenged to regulate them. It used as its model the regulations that had been developed to oversee America’s railroads. In fact, that’s where the concept of requiring radio and TV stations to operate in the “public interest, convenience and/or necessity” comes from. It’s also why no one has ever been exactly sure what this phrase actually means when it comes to broadcast regulation.
Closing the Barn Door
The old saying “It’s too late to close the barn door once the horse is gone” captures the problem facing regulators trying to bring fairness to today’s internet-dominated world.
The European Union’s first attempt at regulating Google Shopping demonstrated how the slow-moving wheels of justice are no match for the high-speed technology of today. By the time regulators issued their decision, the technology in question had become irrelevant.
20th Century Solutions Don’t Work on 21st Century Problems
We all learned in school how America’s Justice Department, and in some cases individual states, broke up monopolies in oil and the railroads. Historically, what government was trying to do was break up price-setting cartels and lower prices for consumers. But with entities like Facebook and Google, no one pays to use the service; it’s free!
Promising Technology or Dystopian Reality?
When commercial radio was born a hundred years ago, it was greeted with the same exuberance as the internet: people thought radio would connect people, end wars and bring about world peace.
Then American radio gave a voice to Father Charles Coughlin, a Detroit priest who eventually turned against American democracy itself through his nationwide radio broadcasts, opening the door for the FCC’s Fairness Doctrine.
A Collaborative Solution
Perhaps media in the 21st century, shaped by algorithms that act like living organisms, should be regulated the same way we protect our environment.
As an example, how would you go about improving a polluted river?
“To improve the ecology around a river, it isn’t enough to simply regulate companies’ pollution. Nor will it help to just break up the polluting companies. You need to think about how the river is used by citizens—what sort of residential buildings are constructed along the banks, what is transported up and down the river—and the fish that swim in the water. Fishermen, yachtsmen, ecologists, property developers, and area residents all need a say. Apply that metaphor to the online world: Politicians, citizen-scientists, activists, and ordinary people will all have to work together to co-govern a technology whose impact is dependent on everyone’s behavior, and that will be as integral to our lives and our economies as rivers once were to the emergence of early civilizations.”-Anne Applebaum and Peter Pomerantsev, The Atlantic, “How to Put Out Democracy’s Dumpster Fire”
Now you know why bringing back “The Fairness Doctrine” will not work in a communications world controlled by algorithms.
We need to think differently.
Albert Einstein said it best:
“We cannot solve our problems with the same thinking we used when we created them.”