I knew the job was dangerous when I took it ...
This later (May 2020) MIT Technology Review item provides more detailed quantification of the heightened use of bots to spread disinformation and misinformation about COVID-19 this year.

Social Media Is Full of Bots Spreading COVID-19 Anxiety. Don't Fall For It
FULL STORY: https://www.sciencealert.com/bots-are-causing-anxiety-by-spreading-coronavirus-misinformation

Nearly half of Twitter accounts pushing to reopen America may be bots
FULL STORY: https://www.technologyreview.com/20...-bot-twitter-accounts-push-to-reopen-america/
There has been a huge upswell of Twitter bot activity since the start of the coronavirus pandemic, amplifying medical disinformation and the push to reopen America.
Kathleen M. Carley and her team at Carnegie Mellon University’s Center for Informed Democracy & Social Cybersecurity have been tracking bots and influence campaigns for a long time. Across US and foreign elections, natural disasters, and other politicized events, the level of bot involvement is normally between 10 and 20%, she says.
But in a new study, the researchers have found that bots may account for between 45 and 60% of Twitter accounts discussing covid-19. Many of those accounts were created in February and have since been spreading and amplifying misinformation, including false medical advice, conspiracy theories about the origin of the virus, and pushes to end stay-at-home orders and reopen America. ...
They follow well-worn patterns of coordinated influence campaigns, and their strategy is already working: since the beginning of the crisis, the researchers have observed a greater polarization in Twitter discourse around the topic.
A number of factors could account for this surge. The global nature of the pandemic means a larger swath of actors is motivated to capitalize on the crisis as a way to advance their political agendas. Disinformation is also now more coordinated in general, with more firms available for hire to create such influence campaigns. ...
To perform their most recent analysis, the researchers studied more than 200 million tweets discussing coronavirus or covid-19 since January. They used machine-learning and network analysis techniques to identify which accounts were spreading disinformation and which were most likely bots or cyborgs (accounts run jointly by bots and humans).
The system looks for 16 different maneuvers that disinformation accounts can perform, including “bridging” between two groups (connecting two online communities), “backing” an individual (following the account to increase the person’s level of perceived influence), and “nuking” a group (actions that lead to an online community being dismantled).
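To make the "bridging" maneuver concrete, here is a minimal illustrative sketch, not the researchers' actual system: it flags an account that interacts with members of two otherwise separate communities. The community names, account names, and the `min_links` threshold are all hypothetical.

```python
# Hypothetical example communities (sets of account IDs).
communities = {
    "community_x": {"a1", "a2", "a3"},
    "community_y": {"b1", "b2", "b3"},
}

# Who each account interacts with (retweets, replies, follows).
interactions = {
    "bridge_bot": {"a1", "a2", "b1", "b2"},
    "a1": {"a2", "a3"},
    "b1": {"b2"},
}

def is_bridging(account, communities, interactions, min_links=2):
    """Flag an account that interacts with at least min_links
    members of two or more distinct communities."""
    linked = interactions.get(account, set())
    touched = sum(
        1 for members in communities.values()
        if len(linked & members) >= min_links
    )
    return touched >= 2

print(is_bridging("bridge_bot", communities, interactions))  # True
print(is_bridging("a1", communities, interactions))          # False
```

A real detector would work on a large interaction graph and learned community boundaries; the point here is only the shape of the signal: one account connecting groups that otherwise share few links.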
Through the analysis, they identified more than 100 types of inaccurate covid-19 stories and found that not only were bots gaining traction and accumulating followers, but they accounted for 82% of the top 50 and 62% of the top 1,000 influential retweeters. The influence of each account was calculated to reflect the number of followers it reached as well as the number of followers its followers reached. ...
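The influence calculation described above (followers reached plus followers of those followers) can be sketched as a simple two-hop reach count. This is an assumption-laden illustration, not the study's actual formula; the account names and follower lists are invented.

```python
# Hypothetical follower lists: account -> list of its followers.
followers = {
    "botA": ["u1", "u2", "u3"],
    "u1": ["u4", "u5"],
    "u2": [],
    "u3": ["u6"],
}

def two_hop_reach(account, followers):
    """Count direct followers plus each follower's own followers."""
    direct = followers.get(account, [])
    reach = len(direct)
    for f in direct:
        reach += len(followers.get(f, []))
    return reach

print(two_hop_reach("botA", followers))  # 3 direct + (2 + 0 + 1) = 6
```

Ranking accounts by a metric like this is one way the "top 50" and "top 1,000" influential retweeters mentioned above could be ordered.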