The recent riots in several towns and cities across the UK have been linked to false information spread by a website called Channel3Now. The site falsely named a 17-year-old as the attacker in the Southport stabbings and wrongly suggested he was an asylum seeker. This misinformation, combined with other false claims, fuelled violence targeting mosques and Muslim communities.
The BBC investigated Channel3Now and found it to be a commercial operation that aggregates crime news to make money on social media. The site has been accused of spreading disinformation but denies any affiliation with the Russian state; its management admitted to the error in naming the attacker, claiming it was unintentional.
The BBC tracked down individuals linked to Channel3Now, including a hockey player from Nova Scotia and a journalist from Pakistan. The website’s editor-in-chief, Kevin, based in Houston, Texas, defended the site’s operations and claimed that the false information was the result of a mistake by their UK-based team.
The false claims shared by Channel3Now were amplified by social media accounts, including those promoting conspiracy theories and far-right ideas. Some of these accounts have paid for blue ticks to gain greater prominence, and they have been able to profit from spreading misinformation on platforms like X.
While there have been calls for social media companies to act against disinformation, the UK's Online Safety Act does not currently legislate against it. Responsibility for addressing the issue therefore lies with the platforms themselves, but tracking down individuals who spread false information can be difficult, especially when they are based abroad.
As the investigation into Channel3Now and its impact on the recent riots continues, the role of social media in spreading misinformation and inciting violence remains a pressing issue that requires attention and action.