The U.K. government will unveil on Thursday its long-awaited Online Safety Bill, which seeks to curb online harms by significantly increasing the responsibilities of Big Tech firms to monitor content posted on their platforms.
The core concept of the Online Safety Bill is the imposition of a new online duty of care on platforms, requiring the removal of illegal content. For “high-risk, high-reach” services, this duty will extend to material that is lawful but harmful.
The government has revised the bill since the first draft was published in May 2021, adding provisions to tackle child pornography, scams and fake advertising. The most controversial feature of the bill, however, is that executives from tech companies like Google or Meta could be held personally liable, including the possibility of jail time, if they fail to comply with requests from Ofcom, the communications regulator overseeing the implementation of the bill.
The bill represents a departure from the self-regulation approach that Big Tech, and in particular social media firms, have enjoyed for many years. The legislation seeks to impose obligations on firms to identify illegal and harmful content and remove it from their platforms. “Tech firms haven’t been held to account when harm, abuse and criminal behavior have run riot on their platform,” said Culture Secretary Nadine Dorries.
Despite these new requirements and obligations, the bill doesn’t specify how Big Tech firms should assess the risks posed by different types of legal harm or how they should address them in a consistent manner. Many of these details will be set out via secondary legislation, which typically receives less scrutiny from members of Parliament than the original bill.
Ofcom, the media and telecommunications regulator, will have the power to audit the algorithms that determine what consumers see in their search results and social media feeds. The regulator will also have enforcement powers and will be in charge of implementing codes of practice. The government itself may retain significant delegated powers for the Secretary of State.
The power the government may hold over the implementation of the bill raises concerns about political interference. Experts involved in the initial discussion of the bill told PYMNTS, “Such codes are often developed in cooperation with the objects of regulation and can respond flexibly and quickly to emerging issues (such as the targeting of videos at minors). Codes of practice or conduct, however, also tend to have weak statutory underpinnings and are not readily susceptible to public scrutiny.”
While the government strongly supports the bill, privacy groups argue that it may not live up to expectations. “The fact that the bill keeps changing its content after four years of debate should tell everyone that it is a mess, and likely to be a bitter disappointment in practice,” said Jim Killock, Open Rights Group executive director.
Opponents argue that the new rules would expose platforms such as Meta and Google to significant liabilities. The bill could also clash with European data protection rules and deter companies from investing in the U.K.
The bill will likely be introduced in the U.K. Parliament this week, but it will need the approval of both the House of Commons and the House of Lords before it becomes law. If Parliament approves the bill, it could become law by the end of the year.
Read More: New UK Provision to Hold Big Tech Responsible for Paid Ads From Fraudsters
The U.K. Online Safety Bill is among the most far-reaching pieces of legislation on content moderation to date. Europe recently approved the Digital Services Act (DSA), which will hold Big Tech companies accountable for illegal content posted on their platforms — they will be required to put mechanisms in place to ensure such content is removed in a timely fashion.
In the U.S., small steps in this direction have been taken, but the reach of the proposed legislation still falls far short of its European counterparts. Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) introduced the Kids Online Safety Act, aimed at holding social media platforms responsible for harm they cause to children.
Read More: US Lawmakers Propose Bill To Impose Platform Content Moderation