Social media companies are facing mounting pressure to reform and accept sweeping new regulation following a string of high-profile reports criticising the industry.

The reports come ahead of an expected Government white paper on online harms, due to be published before the end of March, which could include plans to introduce regulation on how social networks operate and the duty of care they have to their users.

Here is a look at the wide range of reports and experts calling for tighter rules on social media.

– What have politicians said?

The latest findings from MPs urge that social media addiction be classed as a disease. The All-Party Parliamentary Group (APPG) on Social Media and Young People’s Mental Health and Wellbeing said social networks such as Facebook and Twitter should be regulated by Ofcom and forced to adhere to a statutory code of conduct.

This is just one of several reports published in the last eight weeks with a similar message. Earlier this month, the House of Lords Communications Committee said a Digital Authority should be created to oversee the regulation of internet companies by the likes of Ofcom, as well as impose a “duty of care” on the firms.

In February, the House of Commons Digital, Culture, Media and Sport (DCMS) Committee labelled Facebook and others “digital gangsters” in its own major report, calling for the Government to bring forward legislation that would set out ethics guidelines for what was and was not acceptable on social media.

It said that when breaches of these guidelines occurred, an independent regulator should have the power to launch legal proceedings against the companies responsible and issue large fines.

Also in February, the Cairncross Review – commissioned by the Prime Minister to examine the future of the UK news industry – said regulation was needed to ensure quality news content was hosted by the likes of Google and Facebook.

This was then supported by a European Commission study published at the end of last month, which said Facebook, Google and Twitter were all failing to provide enough detail on the steps they had taken to cut disinformation on their platforms, in line with a new Code of Practice they had each signed up to.

– Who else has spoken out against the firms?

Children’s charity the NSPCC backed the suggestion of large fines for companies that fail to remove harmful content that is a risk to younger users.

It said named directors should be held personally liable for upholding a legal duty of care to children, with those found in breach banned from other directorial roles. It also announced support for fines of up to 4% of global turnover in the event of breaches.

Earlier this month the charity also published figures which suggested that the number of children targeted for grooming and abuse on Instagram had tripled in just 18 months between April 2017 and September 2018.

NSPCC chief executive Peter Wanless accused social media firms of “10 years of failed self-regulation”.

Ian Russell, the father of teenager Molly Russell, who was found to have viewed content linked to self-harm and suicide on Instagram before taking her own life in 2017, has also suggested the time for social media self-regulation has passed.

“Up until now they have chosen their own course. Governments have allowed social media platforms to be self-regulated, but remember this really is a matter of life and death and it’s getting worse,” he said earlier this month.

“Now is the time for the UK Government to bring effective internet regulation, with strong sanctions as back-up.

“Now is the time for the UK to lead the world in making the online world a safer place, especially for the young.”

– What have social media firms said in response?

Sir Nick Clegg, the former Liberal Democrat deputy prime minister and Facebook’s new head of global affairs, has said the site is willing to work with governments on regulation.

“I think it’s perfectly legitimate for governments to say, ‘Look, we want to take our responsibility as governments or legislators to set boundaries’,” he said.

“I don’t think it is sustainable for tech companies to just say, ‘No, we don’t like all regulation’.”

The Internet Watch Foundation (IWF) has also offered some support, warning the Government against “rushing into knee-jerk regulation” which could harm victims of online sex abuse through “unintended consequences”.

Instead, the foundation urged the Government to work with social media companies on a regulatory framework, rather than imposing one on them.

– What happens next?

The Government has confirmed its white paper on online harms will be published by the end of the month, and digital minister Margot James has said it will set out “clear expectations” for companies on how they should keep their users safe.

“We in Government have a duty to act in a way which will compel, as well as encourage, companies to put the protection of children and the security of users at the heart of their corporate culture,” she said.

“What’s unacceptable offline has to have the same unacceptability and the sanctions and the force of law behind it online, just as it does offline.”