Biznab
โ–ถโ—Ž๐•fin

X commits to UK crackdown on illegal hate and terror content under Ofcom pressure

X has agreed to new commitments with UK regulator Ofcom to combat illegal hate and terrorist content, including faster takedowns and access restrictions. The platform will assess 85% of reported hate and terror content within 48 hours and submit quarterly reports to Ofcom.

Biznab Editor

British online safety regulator Ofcom has secured new commitments from X, the social media platform formerly known as Twitter, to better protect UK users from illegal hate speech and terrorist content. The agreement, announced today, requires X to take specific actions to identify and remove such material more quickly, following months of regulatory pressure. Ofcom emphasized that these measures are legally binding and will be closely monitored.

Under the terms of the deal, X will restrict UK access to accounts that are reported for posting illegal terrorist content and determined to be operated by UK terror groups. The platform also commits to assessing "at least 85 percent" of user reports of terrorist and hate content within 48 hours, a significant acceleration from previous response times, which often stretched to several days or longer.

X has also agreed to collaborate with external experts to improve its reporting systems for illegal hate and terrorist content. The company will submit quarterly performance data to Ofcom over the next 12 months, allowing the regulator to track compliance and effectiveness. Failure to meet these commitments could result in enforcement actions, including fines or other penalties under the UK's Online Safety Act.

The agreement comes as part of Ofcom's broader efforts to enforce the Online Safety Act, which imposes a duty of care on platforms to protect users from illegal content. X has faced criticism for reducing content moderation teams and reinstating accounts previously banned for hate speech under Elon Musk's ownership. Ofcom has been particularly concerned about the platform's handling of terrorist propaganda and hate speech targeting minority groups.

Compared to other social media platforms, X has lagged in implementing robust content moderation tools. For example, Meta's Facebook and Instagram use automated systems to detect and remove terror content, while YouTube employs a combination of AI and human reviewers. X's reliance on user reports and manual review has been slower, leading to complaints from advocacy groups. The new commitments aim to bring X closer to industry standards.

For UK users, the changes mean that reported hate speech and terrorist content should be removed or restricted more swiftly, reducing exposure to harmful material. The commitments apply only to users in the UK, not globally, as X continues to face different regulatory standards in other jurisdictions. There is no financial penalty attached to the agreement, but non-compliance could lead to formal enforcement by Ofcom.

Ofcom has not disclosed specific timelines for when the new measures will take full effect, but X is expected to begin implementation immediately. The regulator will publish its first assessment of X's compliance in the coming months. The agreement may also influence how X approaches content regulation in other countries, particularly the European Union under the Digital Services Act. However, X has not commented on whether it plans to extend similar commitments beyond the UK.
