Five years ago, the news story that we're going to discuss in the following paragraphs likely would have struck you as impossibly far-fetched.
Today? Not so much — although the story did stop us in our tracks when we noticed it this week in The Oregonian.
As an outbreak of measles continues in the Vancouver area, and slowly works its way south, the debate over vaccines continues unabated: Hundreds of vaccination opponents gathered last weekend at the Washington state Capitol in Olympia to protest a bill that would restrict personal exemptions to vaccines for school-age children.
You can expect this issue to surface in the Oregon Legislature as well: Rep. Mitch Greenlick, D-Portland, said he's asked for a bill that would eliminate a provision of Oregon law allowing parents to forgo vaccinations for their kids for religious or philosophical reasons.
It's fair to say that people opposed to vaccinations are suspicious of the broad scientific consensus on the safety and efficacy of vaccines. The internet and various social media platforms offer plenty of spurious information that these opponents can use to back up their suspicions.
And it turns out, according to a study published last year in the American Journal of Public Health, some of that disinformation is provided courtesy of Russian social-media trolls backed by the government of Vladimir Putin.
The study took a detailed look at one particular Twitter hashtag — #VaccinateUS — "designed to promote discord using vaccination as a political wedge issue."
The study continued: "#VaccinateUS tweets were uniquely identified with Russian troll accounts linked to the Internet Research Agency — a company backed by the Russian government specializing in online influence operations. Thus, health communications have become 'weaponized': Public health issues, such as vaccination, are included in attempts to spread misinformation and disinformation by foreign powers."
(As you might recall, the Internet Research Agency has been indicted by Robert Mueller's special counsel office for its role in 2016 election interference.)
The study pinpointed one technique frequently used by Russian trolls: "Accounts masquerading as legitimate users create false equivalency, eroding public consensus on vaccination."
Of course, agents of the Russian government aren't the only ones spreading health disinformation. The study also looked at the role of "bots" (accounts that automate content promotion) in promulgating false information about vaccination. The study found that the bots aimed not so much to sow discord as to disseminate anti-vaccination (and generally false) messages. And those messages bounce around for years in the vast echo chambers of social media: "Despite significant potential to enable dissemination of factual information, social media are frequently abused to spread harmful health content, including unverified and erroneous information about vaccines," the study concluded.
This disinformation represents a vexing new challenge for public health officials, and the stakes are high: The study notes that "recent resurgences of measles, mumps and pertussis and increased mortality from vaccine-preventable diseases such as influenza and viral pneumonia underscore the importance of combating online misinformation about vaccines."
But it's not at all clear yet how health officials can best combat bot-driven content: Merely increasing the flow of accurate vaccination information runs the risk of feeding the trolls and bots by giving them new information to twist and warp. More research on the topic is needed, the study said.
In the meantime, the message is relatively clear for the rest of us: Don't blindly accept material you see on the internet or on social media. Ask questions. Check out primary sources, wherever possible. Look for verification.
Near the end of the study, its authors inserted a bit of puckish humor: "Antivaccine content may increase the risks of infection by both computer and biological viruses," they wrote. That's nicely put. We'll leave it at that. (mm)