Speaking at the VLV Autumn Conference, she said expectations needed to be set about the scope of the government's upcoming Online Safety Bill.
The pending legislation, which has proved controversial, is expected to place a duty of care on big technology firms towards their users. It would also require them to take action against illegal content and, most controversially, legal but harmful content, or face severe financial penalties.
She said the regulator was working “very constructively” on the legislation with the Department for Digital, Culture, Media and Sport (DCMS), but added: “We do need to set expectations, we’re not going to be able to manage everything and that is just the nature of the beast... I always say it’s a bit like whack-a-mole where, with the broadcasting code, you can whack every mole. But online there are far too many moles to whack.”
“What I do think we’re going to do is to make a big difference and we’re going to measure that so I am hopeful… but we’re not going to be able to wipe everything that’s illegal or harmful off the internet.”
She added that Ofcom has a good working relationship with the government and DCMS, saying: “All of us are working very constructively to achieve a very effective Bill, so I am optimistic about that.”
“We have quite a bit of water to go under the bridge but the Ofcom board is working really hard on that.
“We are working with DCMS and Government and they are listening to us.”
Andy Burrows, head of child safety online policy at the NSPCC, said: “If the Online Safety Bill is to be judged a success, it needs to prevent abuse and end the current whack-a-mole approach platforms take to harmful content.
“To do this the legislation needs to be significantly strengthened and compel platforms to work together to stop abuse spreading across different sites and apps before children come to harm.
“This means changing the culture at the top of firms.
“The Bill should put a legal duty on every social media platform to name a senior manager who is responsible for children’s safety and give Ofcom the power to hold them criminally liable for failure.”
However, the legislation has faced criticism from some lawyers and campaign groups. For instance, campaign group Big Brother Watch has released a report, titled The State of Free Speech Online, which dissects the Online Safety Bill and warns that it poses a serious threat to online free speech in the UK.
Big Brother Watch’s Legal and Policy Officer, Mark Johnson, said: “The Bill does absolutely nothing to help police deal with real crime online but rather, focuses the lens on ordinary people’s conversations. The Bill will force social media companies to suppress lawful content which is controversial, counter-cultural or offensive.”
He added: “These new rules will leave us with two tiers of speech, where speech that’s permitted on the street is not allowed online. This framework for control of lawful speech will do untold damage to free expression that may be impossible to reverse.”
He suggested the government “should remove powers over lawful speech from the Bill altogether.”
Likewise, barrister Francis Hoar criticises the legislation in his 50-page report, In Protection of Freedom of Speech: “Disturbingly, the Online Safety Bill imposes upon websites a duty to remove content that could ‘harm’ not just children but adults: harm caused by words and (quite possibly) ideas. A state that assumes responsibility for protecting its subjects (the word ‘citizen’ does not seem apt) from what it determines is harmful is a dangerous edifice indeed.”
“The UK government appears to have formed the view...that it is responsible for the safety of every scared and vulnerable individual from risks that were once treated as part of living,” he said.
Dr Anna Loutfi, one of the report’s contributors, told InPublishing: “My understanding is that Ofcom’s Maggie Carver continues to emphasise the purpose of the Bill with respect to protecting children from harmful content online, but is reticent to address concerns about censorship.
“The key question continues to be: what is ‘harm’? How is it defined; is it adequately defined in this Bill? The draft legislation casually equates ‘harm’ and ‘misinformation’; the latter is presented as a ‘real danger’ to society, which is under threat from unnamed ‘hostile actors’. Will the Bill effectively hand power over to tech giants to define real information as opposed to misinformation?”
She added: “Pornography undoubtedly presents huge risks for children and young people who currently have unlimited access to it - but porn is a wholly different class of thing to ‘misinformation’, and to conflate the two can only, again, encourage abuse of power by giving politically unaccountable entities the right to decide on what is harmful to society and what is not.”