Google and Facebook owner Meta have both pledged to permit only registered financial firms to advertise promotions on their sites, the head of the U.K. Financial Conduct Authority (FCA) said.
Speaking at a City Week conference in London, FCA CEO Nikhil Rathi also called on Twitter to make its position on the matter clear, Bloomberg reported Tuesday (April 26).
“We look forward to seeing them deliver,” Rathi said of Meta and Google, “and await clearer plans from Twitter and others.”
According to Bloomberg, the FCA is aiming to be more assertive under Rathi’s stewardship, even in areas where it lacks the specific authority to enact change. That means pressuring some of the world’s largest companies and issuing a number of warnings about online scams as consumers increasingly fall victim to fraud tied to online ads.
Read more: New UK Provision to Hold Big Tech Responsible for Paid Ads From Fraudsters
Last month, the U.K. introduced new provisions to its Online Safety Bill that will require platforms like Meta, TikTok and Twitter, as well as search engines such as Google, to keep paid fraudulent adverts from appearing on their sites.
“We want to protect people from online scams and have heard the calls to strengthen our new internet safety laws. These changes to the upcoming Online Safety bill will help stop fraudsters conning people out of their hard-earned cash using fake online adverts,” said U.K. Culture Secretary Nadine Dorries in a statement.
The current draft of the Online Safety Bill already requires Big Tech to protect users from fraud committed by other users. The revised law will put the focus on fraudulent paid advertisements, whether they are controlled by the platform itself or by an advertising intermediary.
See also: Will Elon Musk’s New Twitter Be Foe or Friend With EU Regulators?
This is happening as Elon Musk prepares to acquire Twitter and the European Union is set to introduce the Digital Services Act (DSA), a new law that will greatly increase Big Tech companies’ responsibility for the content on their platforms.
Among the new requirements are annual risk assessments aimed at reducing the risks associated with the dissemination of illegal and harmful content.