This report is the second part of a new research project by HOPE not hate that aims to develop our understanding of how conspiracy theory Telegram chats could help drive radicalisation and the spread of far-right narratives.
Conspiracy theory chats on Telegram are spaces that propagate hate and call for violence. Our report analysed more than 1.7 million messages posted over just under three years to understand how conspiracy theory communities in Britain function today. We find that conspiracy theories are often grounded in racist and far-right worldviews, and that members threaten and justify violence against their enemies.
Using a deep learning-based classifier, we identified 16,555 messages that justify violence; just over 14% of these are direct threats targeting a specific person or community.
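To illustrate how a classifier of this kind might be applied to a large corpus of chat messages, the sketch below shows one plausible approach using the Hugging Face transformers library. The model checkpoint, label names and confidence threshold are illustrative assumptions for the sketch; they are not the classifier, labels or thresholds actually used in the report.

```python
# Minimal sketch, assuming a transformer classifier fine-tuned on messages
# labelled as "justifies_violence", "direct_threat" or "neither".
# The checkpoint path, labels and threshold below are hypothetical.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="./violence-justification-model",  # hypothetical local checkpoint
)

def flag_messages(messages, threshold=0.9):
    """Return messages whose predicted label indicates violent content."""
    flagged = []
    for msg in messages:
        result = classifier(msg, truncation=True)[0]  # {"label": ..., "score": ...}
        if result["label"] != "neither" and result["score"] >= threshold:
            flagged.append({"text": msg, **result})
    return flagged

# Example usage on a toy batch of messages.
sample = ["an innocuous message", "another message"]
print(flag_messages(sample))
```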
The report looks at individuals commonly targeted by conspiracy theory chats and analyses what motivates this interest. These targets, who often belong to minority groups or hold opinions that conspiracy theorists oppose, find themselves on the receiving end of coordinated disinformation campaigns. We also identify a number of prominent conspiracy theorists, as well as far-right profiles, whose content has significant influence in these chats and often helps direct the community towards different targets.
The outsized influence of these specific sources of information shows how conspiracy theorist communities can be weaponised not only to spread false narratives but also to create an ‘us versus them’ mentality. This dynamic is dangerous: it not only misinforms but also legitimises hostility towards scapegoats. It underscores the need for vigilance and critical engagement with information, to prevent the erosion of trust and the escalation of tensions in our communities.