Fake news is hard to come by

This comprehensive analysis by the Pew Research Center is a fascinating account of the complexity of the fake news situation. They queried more than 1,000 experts for their opinion: will the situation improve or not? The experts are divided, 51% to 49%.

The analysis goes deep into the drivers, dependencies and history of the topic. It touches on political, economic, social, technological and legal facts and trends, painting a very comprehensive picture of a complex topic with a long history full of conflicting interests.

Overall, fake news seems to be here to stay, as there is ample supply of and demand for it. We should be able to control it better, but it remains to be seen whether the measures we implement are effective.

It is a very long read. Below is my summary with some additions. Quotations are from the Pew Research article.

Fact: The dividing line between true and fake has never been binary

The space between the full truth and a complete lie is occupied by opinions, propaganda, omissions, exaggerations, deception, urban legends, faulty analysis and mere mistakes. Fact-checking sites like Snopes approach this spectrum methodically with a rating system.

The abundance and diversity of information at our disposal do not only increase “the ability to get people to believe the wrong thing [but also gives them] the ability to get people to doubt the right thing.”

History: Fake news is not new

It’s an old problem that was previously called propaganda, for example. “Misinformation and fake news will exist as long as humans do; they have existed ever since language was invented.”

The media is secondary to that. Even before the internet and social media, fake news existed in newspapers. “When the television became popular, people also believed everything on TV was true. It’s how people choose to react and access to information and news that’s important, not the mechanisms that distribute them.”

Conflict: truth vs free speech

“There is always a fight between ‘truth’ and free speech.” “The best cure for ‘offensive’ speech is MORE speech.”

This holds a dilemma for democratic societies: “There’s a fundamental conflict between anonymity and control of public speech, and the countries that don’t value anonymous speech domestically are still free to weaponise it internationally, whereas the countries that do value anonymous speech must make it available to all, [or] else fail to uphold their own principle.” This may put totalitarian regimes at an advantage. China is the place to watch for that at the moment.

Politics: information control is part of the power game

The powerful have always used their influence to shape public opinion, not only by faking news but also by suppressing it. They will continue to do that.

“Big political players have just learned how to play this game. I don’t think they will put much effort into eliminating it.” “There are a lot of rich and unethical people, politicians, non-state actors and state actors who are strongly incentivised to get fake information out there to serve their selfish purposes.” “Information is a source of power and thus a source of contemporary warfare.”

It's like the arms race, terrorism, or IT security. It's a battle between good and bad: "those who want to falsify information and those who want to produce accurate information, the former will always have an advantage." The villain can act with focus in the dark, while the hero has to patrol every single street all the time.

Economy: fake news is good for business

“It is not in the interests of either the media or the internet giants who propagate information, nor of governments, to create a climate in which information cannot be manipulated for political, social or economic gain.”

“There are strong economic forces incentivising the creation and spread of fake news. In the digital realm, attention is currency. It’s good for democracy to stop the spread of misinformation, but it’s bad for business.” For example, fake news articles are often used to pull traffic to websites that make money with advertising.

Technology: part of the problem and solution

Technology is part of the solution but also part of the problem as it allows fake news to be generated and spread cheaply and wider than ever before.

Fake news was to be expected as a consequence of the rise of social media. Every technology risks encountering a social backlash as it is adopted more widely. This leads to public and political debate and, eventually, regulation.

“Our information environment has been immeasurably improved by the democratisation of the means of publication since the creation of the web nearly 25 years ago. We are now seeing the downsides of that transformation, with bad actors manipulating the new freedoms for antisocial purposes, but techniques for managing and mitigating those harms will improve, creating potential for freer, but well-governed, information environments in the 2020s.”

The ability of technology to counter fake news is limited. “It is nearly impossible to implement solutions at scale – the attack surface is too large to be defended successfully.” “It is too easy to create fake facts, too labour-intensive to check and too easy to fool checking algorithms.”

Technical solutions limit privacy, free speech, and anonymity

There are also important side effects that need to be considered. “The most-effective tech solutions to misinformation will endanger people’s dwindling privacy options, and they are likely to limit free speech and remove the ability for people to be anonymous online.” “Any requirement for authenticated identities would take away the public’s highly valued free-speech rights and allow major powers to control the information environment.” “Increased censorship and mass surveillance will tend to create official ‘truths’ in various parts of the world.”

Social: if you cannot regulate the sender, educate the receiver

“Tech can’t win the battle.” “Relying on algorithms and automated measures will result in various unwanted consequences.” “Misinformation is not like a plumbing problem you fix. It is a social condition, like crime, that you must constantly monitor and adjust to.” “Unless we equip people with media literacy and critical-thinking skills, the spread of misinformation will prevail.”

“Human beings are losing their capability to question and to refuse. Young people are growing into a world where those skills are not being taught.”

“Information is only as reliable as the people who are receiving it.” “The responsibility rests with the individual to deal with misinformation responsibly and choose reliable sources.” You cannot prevent someone from trying to manipulate you. But you can spot it and respond appropriately. There are still enough reliable sources out there. But they need our backing as “high-quality journalism has been decimated due to changes in the attention economy.”

Despite all education, information overflow and growing complexity continue to make this an intellectually challenging task that not everybody is skilled and prepared to take on.

“The average man or woman in America today has less knowledge of the underpinnings of his or her daily life than they did 50 or a hundred years ago. There has been a tremendous insertion of complex systems into many aspects of how we live in the decades since World War II, fuelled by a tremendous growth in knowledge in general. Even among highly intelligent people, there is a significant growth in personal specialisation in order to trim the boundaries of expected expertise to manageable levels. Among educated people, we have learned mechanisms for coping with complexity. A growing fraction of the population has neither the skills nor the native intelligence to master growing complexity. Educated or not, no one wants to be a dummy – all the wrong connotations. So ignorance breeds frustration, which breeds acting out, which breeds antisocial and pathological behaviour, such as the disinformation.” Fake news and populism are closely connected. A new social, digital divide may develop between the people who can and who cannot manage this new complexity intellectually as well as financially.

There is also a deeply human aspect to what information we prefer. "Misinformation is a two-way street. Producers have an easy publishing platform to reach wide audiences, and those audiences are flocking to the sources. The audiences typically are looking for information that fits their belief systems, so it is a really tough problem."

“People on systems like Facebook are increasingly forming into ‘echo chambers’ of those who think alike. They will keep unfriending those who don’t, and passing on rumours and fake news that agrees with their point of view.”

“People don’t want to BE informed, they want to FEEL informed.”
Roger Ailes

Solutions: What seems possible

“There will be mechanisms for flagging suspicious content and providers and then apps and plugins for people to see the ‘trust rating’ for a piece of content, an outlet or even an IP address. Perhaps people can even install filters so that, when they’re doing searches, hits that don’t meet a certain trust threshold will not appear on the list.” “The future will attach credibility to the source of any information. The more a given source is attributed to ‘fake news,’ the lower it will sit in the credibility tree.”
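The trust-threshold idea quoted above can be illustrated with a minimal sketch. Everything here is invented for illustration: the source names, the scores, and the threshold are hypothetical, not part of any real rating system.

```python
# Hypothetical sketch: filtering search hits by a per-source trust score.
# Sources, scores, and the threshold are made up for illustration only.

TRUST_SCORES = {
    "reliable-news.example": 0.9,
    "tabloid.example": 0.4,
    "known-fake.example": 0.1,
}

def filter_results(results, threshold=0.5):
    """Drop search hits whose source falls below the trust threshold.

    Unknown sources default to a score of 0.0, i.e. they are filtered out.
    """
    return [
        hit for hit in results
        if TRUST_SCORES.get(hit["source"], 0.0) >= threshold
    ]

hits = [
    {"title": "Budget report", "source": "reliable-news.example"},
    {"title": "Shocking claim", "source": "known-fake.example"},
]
print(filter_results(hits))
```

Note the design choice of defaulting unknown sources to zero trust: it mirrors the quoted proposal that hits below the threshold simply "will not appear on the list", but it also shows how such a filter could silently suppress new, legitimate sources.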

“Regulatory remedies could include software liability law, required identities, unbundling of social networks like Facebook” which may hamper and limit the spread of fake news.

“There are no technological solutions that correct for the dominance of Facebook and Google in our lives. These incumbents are locked into monopoly power over our information ecosystem, and as they drain advertising money from all other low-cost commercial media, they impoverish the public sphere.”

“Legal options include reversing the notion that providers of content services over the internet are mere conduits without responsibility for the content.”

“In order to reduce the spread of fake news, we must de-incentivise it financially. If an article bursts into the collective consciousness and is later proven to be fake, the sites that control or host that content could refuse to distribute advertising revenue to the entity that created or published it. This would require a system of delayed advertising revenue distribution where ad funds are held until the article is proven as accurate or not. A lot of fake news is created by a few people, and removing their incentive could stop much of the news postings.”
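The delayed-revenue proposal above amounts to an escrow scheme: hold ad funds per article, then release or forfeit them once the article is verified. A minimal sketch follows; the class and method names are hypothetical, not an existing system.

```python
# Hypothetical sketch of delayed advertising-revenue distribution:
# ad funds are held in escrow until an article is fact-checked.

class AdRevenueEscrow:
    def __init__(self):
        self.held = {}       # article_id -> amount currently held
        self.paid_out = {}   # article_id -> amount released to publisher

    def accrue(self, article_id, amount):
        """Hold ad revenue instead of paying it out immediately."""
        self.held[article_id] = self.held.get(article_id, 0.0) + amount

    def resolve(self, article_id, verified_accurate):
        """Release held funds if the article checks out; forfeit otherwise."""
        amount = self.held.pop(article_id, 0.0)
        if verified_accurate:
            self.paid_out[article_id] = amount
            return amount
        return 0.0

escrow = AdRevenueEscrow()
escrow.accrue("article-42", 100.0)
print(escrow.resolve("article-42", verified_accurate=False))  # 0.0
```

The sketch leaves open the hard part the quote glosses over: who performs the verification, and on what timeline the funds are held.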

“Systems like blockchain are a start, but in some ways, analogue systems (e.g., scanned voting ballots) can be more resilient to outside influence than digital solutions such as increased encryption.”

There is also hope in the young generation: "As people born into the internet age move into positions of authority, they'll be better able to distil and discern fake news than those of us who remember an age of trusted gatekeepers. They'll be part of the immune system. It's not that the environment will get better; it's that those younger will be better fitted to survive it." Contrary to this hope, a recent Stanford University study found that the young have difficulty distinguishing news from ads.