Online Safety in the world of GenAI, with Richard Wronka, Director in Ofcom's Online Safety Policy Team
After years of debate, the UK’s landmark Online Safety Act finally became law in 2023, ushering in a new era of digital regulation. This was followed by new codes of practice around illegal content (in December 2024) and protecting children (in April this year).
Meanwhile, the pace of tech progress has only accelerated - in particular the GenAI wave, which has hugely increased the scale and variety of risks people can face online. How can Ofcom - the regulator responsible for overseeing and enforcing the Act - stay on top of this technological shift, without throttling the businesses we need to drive innovation and growth? And what do the new rules mean for startups trying to build fast and stay compliant?
We sat down with Richard Wronka, a Director in Ofcom’s Online Safety Policy team, to find out.
FORM: Richard, tell us about your role at Ofcom and what your team does.
RW: I’m at Ofcom, the UK's communications regulator. Recently, we've been tasked by Parliament with setting up new rules to protect people from illegal content online and to protect under-18s from harmful content. I'm a supervision director at Ofcom, and it's the role of my team to lead Ofcom's engagement with the services we regulate under the Online Safety Act.
At the moment, our focus is on explaining the new online safety rules to industry; working with online services to understand how they protect their users; assessing how effective those approaches are; and then driving improvements where we think that's necessary. And, of course, as a regulator we have enforcement powers that we can lean on when needed, but our approach is very much to work constructively with all services who are willing to comply with the new rules.
FORM: We've had a pretty long process of developing rules on online safety. Where are we now and what should startups and investors be aware of?
RW: Yes, the new set of regulations has been in the works for some time now - the online safety laws have been making their way through Parliament, while Ofcom has simultaneously been consulting on our approach to implementing those laws. Specifically, back in December last year, we published our codes of practice around illegal content, and in April this year, we published our codes of practice around protecting children. And there is more to come as we implement this quite complex new set of regulations.
Our message to startups, investors and all online services is that now is the time to act. Services are already expected to have conducted a risk assessment covering illegal content, and services with child users are expected to have conducted a children’s risk assessment by the end of July. Those are the main action points at this stage.
FORM: Regulation is often seen as at odds with startup growth and innovation. What's your case to the sceptics in the startup community who think that the new rules are only going to slow things down?
RW: Yeah, we hear that. My starting point for the sceptics would be that safety is a core part of every business's customer value proposition. That is: no users want unsafe experiences. Nobody wants to be confronted by scammers when they're online; no parent wants their child stumbling across porn.
We see the Online Safety regulations as an opportunity to create a mindset and culture where safety is baked in from the start - rather than late in the day, in the aftermath of some event or tragedy. In the long run, we hope this will make safety processes both more effective and less costly for all market participants. Increasingly, we’re hearing from investors that the robustness of safety processes and the compliance plans that sit alongside them are becoming significant considerations when making decisions. So the market is shifting.
At the same time, the arrival of regulation is creating specific opportunities for innovation. Take the players in safety tech providing age assurance services - I’ve already heard that our focus on protecting children online has created a seismic shift in attitudes of online porn services to age assurance, and in turn that is creating a huge opportunity for those age assurance providers.
All of that said, I acknowledge the concerns and Ofcom does have a responsibility to act proportionately and to really carefully consider the impact of our work on innovation, investment and competition. That's a responsibility we take very seriously, in particular through the impact assessment work that we undertake whenever we're developing new rules.
FORM: The rules catch a huge number of players, from the very big to the very small. How does Ofcom think about dealing with such a breadth of services?
RW: Yes, we estimate that over 100,000 services are now in scope of the Online Safety Act - and I think the sheer scale and diversity of services we regulate are the most challenging features of our role.
It's also something we think very carefully about in terms of how our rules apply to services of different sizes and risk levels. For instance, some of our rules will only apply to large services or to services with particular risks in certain harm areas. At the other end of the spectrum, we've invested a lot of time and energy in making our rules as accessible as possible, especially to startups who might not have dedicated compliance teams or professionals poring over the detail of what is quite a complex regulatory system.
Just one example of how we’re trying to engage the full breadth of players caught by the new rules: we're holding an industry conference online on the 4th of June (link to be added) and that's going to be focused on our new rules on protecting children. And it's open to anyone in industry who'd like to attend.
FORM: Many startups may not have been aware of the new rules until reading this - what can they do to check their exposure and work out what they need to do?
RW: The legal duties around illegal content are already fully in force and the duties around protecting children will be fully in force by the end of July, so companies who have child users will need to be thinking about their children's risk assessment right now.
To help with that, we've developed a number of guides and tools, all of which are available through the Ofcom website. That includes a regulation checker to help services work out in the first instance whether they are in scope of the Online Safety Act. For services that are in scope of the Act, we've launched a tool to guide them through their risk assessment duties and the other steps they need to take to keep their users safe. That tool has been co-designed with the help of nearly 50 companies, most of whom are SMEs - so hopefully it’s a good place to start.
FORM: It's hard to regulate disruptive technology because the fastest moving spaces - like GenAI - make progress daily. How should we think about online safety rules keeping pace with such rapid change?
RW: That's completely true, and GenAI is a perfect case study. GenAI has huge potential for consumers, society and industry, but we also know that it can be used by bad actors. Sadly, it has spawned entire services dedicated to harming people, as the recent spike in “nudify” apps shows.
GenAI probably wasn't top of mind for many policymakers when the online safety laws were being developed, but the Online Safety Act and Ofcom's implementation of the new rules are largely tech neutral, and in particular, the risk assessment duties in the act will go a long way to ensuring that GenAI services are safe for users.
At the same time, we understand it’s our role to swiftly clarify how the rules apply to a new technology. In response to emerging concerns about the safety risks arising from GenAI and chatbots, we published an open letter (link to be added) in the autumn of last year. This was intended to help GenAI services understand whether they were covered by the Online Safety Act and, if they were, to summarise what they needed to do. That clarification took days, not months or years.
FORM: Businesses are increasingly having to manage disparate online safety regimes in different markets. How does Ofcom work to reduce the burden for those players?
RW: We completely appreciate that most, if not all, of the online services we touch have a global dimension, and while a few jurisdictions have recently implemented laws that are similar to ours, others have taken a different approach.
That obviously makes things tougher for companies, particularly when the national laws themselves can be quite complex. So we try to align our asks of industry wherever that's possible, within the constraints of national laws. Ofcom has been a driving force behind the Global Online Safety Regulators Network, which is the forum we use to formalise relationships with the likes of our counterparts in Ireland and Australia - countries that have relevant, similar regimes.
We also see something of an opportunity for global leadership here for the UK. If we get it right, with a proportionate regime that also delivers good safety outcomes, then we're really optimistic that other countries will follow, and that'll make it easier for businesses everywhere.