The proposals will give Ofcom the power to fine tech giants up to £18m or 10% of their annual global turnover.
Committee chair Damian Collins said: “The era of self-regulation for Big Tech has come to an end. The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.”
Collins told CNBC: “We’re bringing in a lot more offences onto the face of the bill that deal with things like promoting self-harm, racial abuse and other forms of abuse.” He explained that Ofcom should have the power to tell tech companies what minimum standards are required of them and to fine them should they breach those standards.
“These would be quite big changes to the way this [bill] works,” Collins said. “The core thing is it’s taking existing offences and law and applying them online and that the regulator has the legal power to do that.”
Collins has also argued that tech giants should name the individuals who are responsible for online safety.
“The company has to say this is the group of people that is responsible,” he said. “There’s no excuse.”
In a statement posted on his website in August, Collins explained how the “new draft law proposes to put a legal framework around hate speech and harmful content and empower an independent regulator to hold the tech giants to account.”
He wrote, “Creating a legal framework around what constitutes harmful content online is extremely challenging, not least because freedom of speech is at the heart of our democracy and must be protected.
“The new joint parliamentary committee is charged with examining this Bill line by line to make sure it is fit for purpose and can both tackle harmful content as well as preserving people’s right to freedom of expression.”
His comments echo those of Prime Minister Boris Johnson, who told MPs earlier this year that the government needed to adopt a tougher stance towards tech giants.
“It’s time the online giants realised that they simply cannot think of themselves as neutral pieces of infrastructure,” he told the Liaison Committee. “They are publishers and they have responsibility for what appears on their systems and the online harms bill is designed to give effect to that distinction.”
The Online Safety Bill is not without its critics, however, particularly among civil rights groups and lawyers.
Big Brother Watch, for instance, has released a report titled The State of Free Speech Online, dissecting the Bill and warning of the threats it poses to free speech in the UK.
The campaign group’s Legal and Policy Officer, Mark Johnson, said: “The Bill does absolutely nothing to help police deal with real crime online but rather, focuses the lens on ordinary people’s conversations. The Bill will force social media companies to suppress lawful content which is controversial, counter-cultural or offensive.”
He added, “These new rules will leave us with two tiers of speech, where speech that’s permitted on the street is not allowed online. This framework for control of lawful speech will do untold damage to free expression that may be impossible to reverse.”
He suggests the government “should remove powers over lawful speech from the Bill altogether.”
Similarly, barrister Francis Hoar, in a 50-page legal analysis titled In Protection of Freedom of Speech, criticises the Bill for imposing “upon websites a duty to remove content that could ‘harm’ not just children but adults: harm caused by words and (quite possibly) ideas.”
He argues that a government “that assumes responsibility for protecting its subjects…from what it determines is harmful is a dangerous edifice indeed.”
“The UK government,” he writes, “appears to have formed the view…that it is responsible for the safety of every scared and vulnerable individual from risks that were once treated as part of living.”