Most mornings, I can be found listening to “The Daily,” a podcast from The New York Times.
The podcast takes a subject that’s currently in the news and dissects it, usually with journalists from the Times and various subject-matter experts. It’s always full of interesting information, and it’s certainly an intellectually stimulating start to my day.
The Feb. 26 episode was about Russian interference in the upcoming presidential election.
According to the FBI, Russian operatives are – again – actively interfering in our elections, and, this time, they’re playing both sides.
Sure, the Russians are running interference for President Donald J. Trump – as they did in 2016 – but they’re also toying with the Democratic primary process in favor of the current frontrunner, Sen. Bernie Sanders of Vermont.
(As a side note, I’ll mention here that the president does not believe the accusations against the Russians and, at an early February rally, called them “idiotic.” Remember, it’s the U.S. intelligence community making the accusations. The intelligence community is headed by a director of national intelligence, who is appointed by the president.)
One of the podcast’s guests was David E. Sanger, a national security correspondent for the Times. Sanger said the Russians are placing disinformation – false information intended to mislead – on popular Internet forums like Reddit in the hope that Americans will believe the false information and share it with their family and friends.
According to Sanger, this tactic is not new.
“In the Cold War, the Soviets called (Americans sharing disinformation) ‘useful idiots’ because they unintentionally picked up a Russian theme, a piece of disinformation, and repeated it until it became organic,” he said.
Sanger said there are two theories about the end goals of the Russian interference.
The first is to “have the country screaming at each other.” A chaotic U.S., polarized and lost in heated political debate, is good for Russia.
“It makes the United States look like a place that can’t get its act together. It makes democracy look like a chaotic form of governance that can’t really be trusted to make progress,” said Sanger.
The second theory, according to the correspondent, is “that the Russians really do favor Trump, and they think that Bernie Sanders is the most beatable Democrat.”
Both theories are alarming, and I don’t want to be a victim of Russian interference. I don’t want you to be, either.
That’s why I’m urging you to be smarter about what you read – and share – on the Internet.
Just recently, I ran across a Facebook post claiming to show what Americans would pay in taxes in a Sanders administration. The numbers were terrifying; for my tax bracket, my income taxes would sharply increase.
A quick Google search showed me that the post was inaccurate, but the scary thing was the number of my friends who were actively sharing the post without doing any investigation. The post had thousands of “likes” and even more shares.
I don’t know if the post was Russian propaganda or good old dirty American politics, but it certainly raises red flags.
In the 2016 presidential election, Russian bots spammed the Internet with lies about then-Democratic frontrunner Hillary Clinton. The posts ranged from claims about Clinton’s health to her “involvement” with a child prostitution ring in a Washington, D.C., pizza joint.
The claims were effective, according to experts. Michael V. Hayden, a former director of the CIA, said the 2016 disinformation campaign was “the most successful covert influence operation in history,” and Nate Silver, a statistician, wrote in February 2018 that “…Russian interference tactics were consistent with the reasons Clinton lost.”
I won’t judge here whether that’s true, but I will say it’s a scary thought.
We must constantly be on guard, and we must realize the truth, although it may be politically inconvenient for some: the Russians are not our friends, and they’re actively interfering in the most sacred part of our democracy.
We can’t expect social media giants like Facebook and Twitter to police every post, and we must be able to judge items for ourselves. An easy way to spot disinformation, according to the Poynter Institute’s Daniel Funke, is to look carefully at the posts that make you react.
“If a piece of information makes you feel scared, angry, disproportionately upset, or even smug, then it’s worth doing additional checks or slowing down before re-sharing it,” he said.
An easy way to verify a piece of information is a quick Google search. Often, disinformation has already been debunked by Snopes or PolitiFact. You should always consider the source of a post, too, and check whether it comes from a reputable outlet.
If someone you know is actively sharing disinformation, it may be time to engage with that person and explain what’s going on. Back your claim up with proof, and try to open a dialogue. It may not always be easy, and it may not work, but you can say you did your part.
Most importantly, build up your internal system of red flags. If something seems fishy to you, do some investigating before sharing.
It’s your civic duty, and it really matters. In fact, the future of our democracy may rest upon it.