Ofcom, the UK’s communications regulator, is expanding its online safety team by hiring experts from big tech companies such as Meta, Microsoft, and Google. The move is part of its preparations to enforce the Online Safety Act, which became law in October 2023.
The Financial Times reported that the regulator has established a new team of nearly 350 personnel focused on online safety, including recent hires from senior positions at Meta, Microsoft, and Google. Ofcom also plans to recruit an additional 100 staff this year.
Selected Ofcom Recruits from Big Tech
In June 2022, Jessica Zucker, a former Meta and Microsoft employee, joined Ofcom as the Head of Online Safety Policy. Zucker has seen a growing interest in roles at Ofcom, a trend she attributes to layoffs in the tech industry over the past two years.
Zucker commented:
“Those still motivated by online safety and proportionality see Ofcom as the alternative. You could do it for one company, or you can do it for an entire industry.”
Almudena Lara, who joined Ofcom from Google and is now Policy Director for Child Safety, continues to focus on children’s safety online. She explained her motivation for the move:
“Most in the tech sector have the right motivations, but the reality is that it’s very hard sometimes to prioritize user safety.”
Other recent additions to Ofcom from major tech companies include:
- Caitlin Muir, previously with Meta, now Video-Sharing Platform (VSP) Regime Programme Manager at Ofcom.
- Madhu Ramankutty, a former Meta employee, now a Principal in Online Safety Policy.
- Anis Chebbi, formerly with X (previously known as Twitter), now an Online Safety Policy Manager.
Overview of the Online Safety Act’s Provisions and Impact
The Online Safety Act was hotly debated in Parliament, particularly after pleas from parents whose teenagers died by suicide following exposure to harmful online content. They called for swift action and greater accountability from companies.
The Act sets out rules for online businesses and service providers in the UK, mainly to keep users, especially children, safe. Here’s what it involves:
- Who It Affects: The Act applies to online services such as websites, apps, and platforms where users can create, share, or interact with content, as well as search services. It covers businesses and individuals of any size or location, provided they have a significant number of UK users or target the UK market.
- Getting Ready and Following the Rules: These services must identify and manage risks from illegal and harmful online content, particularly content affecting children. The rules take effect in stages, with the first set due by the end of 2024. Services need to assess and mitigate risks, document their safety policies, let users report illegal content, and balance safety with the rights to freedom of expression and privacy.
- Help and Guidelines: The Act provides guidelines and practices to help services understand and reduce online risks. This includes how to assess risks, implement safety steps, and comply with legal responsibilities.
- Rules and Fines: Ofcom will oversee compliance, offering guidance but also imposing fines for breaches. Penalties can reach £18 million or 10% of qualifying worldwide revenue, whichever is greater, and in the most serious cases courts can order measures that disrupt a company’s business (the penalty cap is illustrated in the sketch after this list).
- Illegal Content and Child Safety: The Act focuses on a range of illegal content and emphasizes protecting children from harmful materials like pornography, violence, bullying, and drug use. Services are required to evaluate risks and put in place specific safety measures for children.
- Different Rules for Some Services: A few services will be placed in special categories based on their user numbers and features. These will have extra responsibilities, such as reporting on their safety practices, giving users more control, and protecting news content.
- Balancing Rights: When putting in safety measures, services must consider and respect users’ privacy and freedom of speech. Ofcom will include these aspects in their guidelines and codes of practice.
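For illustration only, the penalty cap mentioned above reduces to a simple “greater of” rule. The Python sketch below renders that arithmetic; the function and variable names are hypothetical, and this is not an official calculator, just a reading of the cap as described in the Act.

```python
# Illustrative sketch of the Online Safety Act's maximum-penalty rule:
# fines are capped at £18 million or 10% of qualifying worldwide revenue,
# whichever is greater. All names here are illustrative, not official.

GBP_FIXED_CAP = 18_000_000  # £18 million fixed component of the cap
REVENUE_SHARE = 0.10        # 10% of qualifying worldwide revenue

def max_penalty(qualifying_worldwide_revenue: float) -> float:
    """Return the maximum fine Ofcom could levy under the Act's cap."""
    return max(GBP_FIXED_CAP, REVENUE_SHARE * qualifying_worldwide_revenue)

# Example: a company with £5bn in qualifying worldwide revenue faces a
# cap of £500m, since 10% of revenue exceeds the £18m fixed component.
print(f"£{max_penalty(5_000_000_000):,.0f}")  # £500,000,000
```

For smaller firms the £18 million figure dominates: a company with £50 million in revenue would face a cap of £18 million, not £5 million.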
Financial and Legal Challenges in Implementing the Act
Ofcom expects implementing the Online Safety Act to cost about £166 million by April 2025, of which £56 million had already been spent by April this year. The regulator plans to recover these costs through fees charged to regulated companies.
Ofcom also anticipates having to defend many of its decisions in court, as tech companies are likely to challenge ambiguous parts of the Act to obtain legal clarity. These disputes will test Ofcom’s ability to stand up to the well-resourced legal teams of global tech firms.
Suzanne Carter, Director of Enforcement at Ofcom, said:
“We are fully prepared to take risky cases in terms of our own legal exposure. We will be up against some big companies; there could be a very hostile environment here.”