How to stop pesky ‘conspiracy theories’ (information they don’t like) from gaining traction online has become the number one preoccupation of social media overlords. Now they may have a solution.
Researchers at the University of Sheffield in the UK have developed a ‘precognition’ system that they say can predict who will spread ‘disinformation’ online before they even do it.
The ‘artificial intelligence-based algorithm’ can “accurately predict (79.7 per cent) which Twitter users are likely to share content from unreliable news sources before they actually do it,” a press release states.
The algorithm works by first identifying which Twitter users “mostly tweet about politics or religion,” compared to those who just share selfies and meaningless tripe.
So the algorithm leaves serious tweeters alone, and the tripe is eliminated, right? No, the other way around.
Another factor the algorithm takes into account is that “Twitter users who share disinformation use impolite language more frequently than users who share reliable news sources.”
The research involved grouping Twitter users into two categories, “those who have shared unreliable news sources and those who only share stories from reliable news sources.”
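The general setup described — labelling users by the sources they share, then training a classifier on their word usage — can be sketched in miniature. What follows is a hypothetical illustration only, not the Sheffield team's actual model; the training examples are invented, and a simple Naive Bayes bag-of-words classifier stands in for whatever method the researchers used.

```python
# Hypothetical sketch of a two-class, bag-of-words user classifier.
# NOT the Sheffield researchers' model: toy data, toy method.
from collections import Counter
import math

# Invented training data: (text, label), label 1 = 'shared unreliable sources'.
train = [
    ("the government and the media are lying", 1),
    ("liberal media covers up the truth", 1),
    ("government corruption exposed again", 1),
    ("lovely sunset at the beach today", 0),
    ("my cat did the funniest thing", 0),
    ("great coffee with friends this morning", 0),
]

def train_nb(data, alpha=1.0):
    """Multinomial Naive Bayes with Laplace smoothing."""
    counts = {0: Counter(), 1: Counter()}  # word counts per class
    docs = {0: 0, 1: 0}                    # document counts per class
    for text, label in data:
        docs[label] += 1
        counts[label].update(text.split())
    vocab = set(counts[0]) | set(counts[1])
    totals = {c: sum(counts[c].values()) for c in (0, 1)}
    priors = {c: docs[c] / len(data) for c in (0, 1)}
    return counts, totals, priors, vocab, alpha

def predict(model, text):
    counts, totals, priors, vocab, alpha = model
    scores = {}
    for c in (0, 1):
        score = math.log(priors[c])
        for w in text.split():
            if w in vocab:  # smoothed per-class word likelihood
                score += math.log((counts[c][w] + alpha) /
                                  (totals[c] + alpha * len(vocab)))
        scores[c] = score
    return max(scores, key=scores.get)

model = train_nb(train)
print(predict(model, "the media and the government"))  # words typical of class 1
print(predict(model, "coffee with my cat"))            # words typical of class 0
```

The point of the sketch is that everything hinges on the training labels: whichever sources the labellers deem ‘unreliable’ determines which vocabulary the classifier learns to flag.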
There is no explanation of what makes a news source ‘reliable’ and what makes one ‘unreliable’; that was seemingly just decided by the University personnel. You can probably guess how that went.
The researchers also say that those who spread ‘disinformation’ more often use words like ‘liberal’, ‘government’, ‘media’, and often mention ‘Islam’.
You can see where this is going now.
The researchers say their findings “could help governments and social media companies… design more effective models for tackling the spread of disinformation.”
In other words, this will help Facebook and Twitter censor people spreading information they don’t want on their platforms.
“Social media has become the primary platform for spreading disinformation, which is having a huge impact on society and can influence people’s judgement of what is happening in the world around them,” said Dr Nikos Aletras, a Lecturer in Natural Language Processing at the University of Sheffield.
Censoring people with opinions you don’t like is also a very effective way of ‘influencing’ society, as dictators, despots and enemies of free speech throughout history have discovered.
By Summit News