Seeking the "right regulation" of digital services: Lords' Communications Committee articulates its vision

11 March 2019

In 2018, Mark Zuckerberg, Facebook's CEO, told the US Senate Judiciary Committee that the question was no longer whether there should be regulation of the internet, but what is the right regulation.

The government's White Paper on 'Online harms' is highly anticipated: the Secretary of State for Digital, Culture, Media and Sport has indicated that it will set out legislative measures to ensure that social media platforms protect their users from a range of online harms, including online hate speech. In advance of its publication, on Saturday 9 March 2019 the House of Lords' Communications Committee published its report 'Regulating in a digital world'. We doubt that Mr Zuckerberg will think that this is one of the occasions when the Europeans have got it all right…

Overall, the Committee's recommendations appear to go further than what we anticipate the government will put forward in its White Paper, and include a number of high-level aims which show little apparent consideration of how they might be implemented or of the impact implementation would have on the provision of services or the experience of users. We wait to see whether the government adopts any of the Committee's recommendations as part of its proposals, and comment on some of the key recommendations below.

In summary, relying on the Cambridge Analytica scandal, online hate speech directed toward female MPs, the suicide of Molly Russell, and media reports of the sexual activities of the celebrity referred to as 'PJS', as well as what it perceived as challenges posed to competition, media pluralism, privacy, consumer protection, and common decency, the Committee sets out a radical vision for governing the online space. Drawing heavily on principles contained within the GDPR, its key proposals include:

  • the creation of a new over-arching super-regulator, the Orwellian-sounding Digital Authority, to oversee and instruct existing regulators in the area;

  • the regulation of user services, including search engines, marketplaces, social media, gaming and content-sharing platforms, using a system of principles-based regulation;

  • the withdrawal of the safe-harbour provisions in the E-Commerce Directive, albeit stopping short of the imposition of strict liability;

  • increased transparency obligations toward users, regulators, and the public, with clearer terms and conditions and what is effectively an enhanced subject access regime in relation to the use of algorithms and the behavioural data created;

  • enhanced regulation of algorithms by the Information Commissioner's Office, which should create a code of conduct;

  • a requirement for maximum privacy and safety settings to be imposed by default;

  • a stricter competition law regime, to focus on the impact of the accumulation of data with a new public interest test; 

  • an enhanced right of data portability, to be enforced by way of the regulation of interoperability;

  • the imposition of a duty of care on online services which host and curate content that can openly be uploaded and accessed by the public, to be upheld by a regulator with powers of enforcement;

  • greater investment in moderation by online services, with Ofcom to adjudicate upon appeals;

  • regulation of online services' compliance with their terms of use and the power to impose fines for failure to comply;

  • increased powers for the ICO, CMA and Ofcom; and

  • the imposition of a classification system for websites, similar to that operated by the BBFC, consistent with the platform's age policy.

Many of the proposals present challenges to established principles and lack detail as to their practical implementation.

While the proposals are intended to ensure that unlawful conduct is treated consistently whether it takes place online or offline, and there is a professed commitment not to stifle free speech or lead to unjustified censorship, they extend far beyond the regulation of what is unlawful and trespass on what is deemed to be harmful or anti-social. The proposals would thereby impose more stringent restrictions on the online space than would apply in other forums for public discourse, potentially threatening an undue restriction on freedom of expression. The lack of any attempt to articulate what constitutes an 'online harm' also serves to increase the risk of mission creep. It is objectionable to put online service providers in the position of legal adjudicators, with the threat of sanction if they are deemed not to be delivering in that role in the desired manner.

While any attempt to achieve a comprehensive system of regulation with clarity around the powers of regulators may be welcomed by some, the imposition of such a broad range of new controls in one fell swoop would present significant challenges, and risks having a disproportionate impact which may be difficult to reverse.

By proposing to regulate the terms and conditions of user services, apparently without seeking to set minimum standards, the Committee risks subjecting the most responsible platforms to the greatest regulation, since it is their own terms and conditions which would be enforced against them.

Background

In January 2018, the Government published its Digital Charter, a rolling programme of work to establish "norms and rules" online guided by the principles that: (i) the internet should be free, open and accessible; (ii) people should understand the rules that apply to them when they are online; (iii) personal data should be respected and used appropriately; (iv) protections should be in place to help keep people safe online, especially children; (v) the same rights that people have offline must be protected online; and, (vi) the social and economic benefits brought by new technologies should be fairly shared. The Charter identified the government's priorities as including protecting people from harmful content and behaviour, the legal liability of online platforms, and data and artificial intelligence.

The government subsequently published its response to the Internet Safety Strategy green paper, which reiterated its commitment to the principles that what is unacceptable offline should be unacceptable online, that all users should be empowered to manage online risks and stay safe, and that tech companies have a responsibility to their users and for the content they host. Despite Google, Twitter and Facebook stating that they would work with the government to establish a social media code of practice and transparency reporting, the government announced that new laws would be created to "make sure the UK is the safest place in the world to be online" and committed to publishing the forthcoming online harms White Paper, which it is anticipated will address a number of topics covered by the Committee's report, including age verification for social media companies.

Regulation

The Lords' Committee posited that existing law and regulation affecting the provision and use of digital services was piecemeal and inadequate, being overseen by the Information Commissioner's Office, Ofcom and the Competition and Markets Authority, and shaped by, inter alia, the GDPR, the Data Protection Act 2018, the law of misuse of private information and breach of confidence, the E-Commerce Directive, the Computer Misuse Act 1990 and the Malicious Communications Act 1988.

The Lords Committee proposes the creation of an over-arching super-regulator, the 'Digital Authority', which would not only co-ordinate non-statutory organisations and existing regulators but would also have powers to oversee and instruct the latter.

In tandem, a new joint select committee is proposed, the remit of which would cover all matters related to the digital world and which would specifically oversee the Digital Authority, "to create a strong role for Parliament in the regulation of the digital world".

It is proposed that the Digital Authority and all other regulators would be governed by a commitment to 10 key principles, many of which appear to be drawn from the obligations imposed under the General Data Protection Regulation and the Data Protection Act 2018:

  1. Parity

  2. Accountability

  3. Transparency

  4. Openness

  5. Privacy

  6. Ethical design

  7. Recognition of childhood

  8. Respect for human rights and equality

  9. Education and awareness raising

  10. Democratic accountability, proportionality and evidence-based approach

The principle of parity was illustrated with the example that social media platforms should face the same obligations in relation to the imposition of age-based access restrictions as providers of online pornography. Such an approach would appear to fail to take account of the risk associated with any given online service and instead result in the imposition of the strongest restrictions on all services. It is not clear, therefore, how the Committee envisages the principle of parity would interact with that of proportionality.

Liability of social media platforms

The E-Commerce Directive 2000/31/EC affords protections to online intermediaries: it recognises that platforms ought not to be required proactively to monitor content, that information society services which act as a "mere conduit" or are simply involved in "caching" content ought not to be held responsible for unlawful content, and that services "hosting" unlawful content ought not to be held responsible for content of which they are not on notice or which, once on notice, they have acted expeditiously to remove. In a bold rejection of those protections, the Lords Committee considered that the hosting and curation of content which can be uploaded and accessed by the public meant that a notice and takedown model was no longer appropriate. The Committee recommends revising or replacing the protections, but rejected the imposition of strict liability. It is not clear what the Committee envisages would be an appropriate standard, but it may be that specific timescales for responding to notice and takedown requests, coupled with its proposals for the enforcement of terms and conditions, would be sufficient. The Committee referenced the Australian model in this regard, and the powers of the Office of the e-Safety Commissioner to resolve complaints.

Obligations of social media platforms

Arguing that the moderation processes employed by social media platforms "are unacceptably opaque and slow", the Lords Committee recommends that online services hosting user-generated content "should be subject to a statutory duty of care and that Ofcom should have responsibility for enforcing this duty of care, particularly in respect of children and the vulnerable in society", which should incorporate moderation services and an obligation to achieve safety by design. The Committee did not accept the evidence calling for external adjudication of complaints or even judicial review of online moderation. The Committee does not seek to articulate the scope of the duty. By way of comparison, in February the Children's Commissioner published a draft statutory duty of care, proposed to be applicable to any online service provider, which would require the provider to "take all reasonable and proportionate care to protect [anyone under the age of 18] from any reasonably foreseeable Harm", defined as "a detrimental impact on the physical, mental, psychological, educational or emotional health, development or wellbeing" of children, and under which liability for the acts of third parties could only be avoided if the provider has done "all it reasonably can to prevent Harm". The factors by which the discharge of that duty would be judged, such as the speed of responding to complaints (legitimate or otherwise), are not proposed to be limited in their application to children, and would therefore have the effect of imposing wider obligations vis-à-vis all users of the service regardless of impact. Imposing a duty of care on the provision and operation of online services would significantly extend the circumstances in which such a duty has been imposed by law, catching not only the acts of online service providers but also their omissions, and imposing liability on a blanket basis regardless of whether that would appear fair and just in the circumstances of a given case.

Competition

Concerned about the impact of the creation of data monopolies and the consequences for consumer protection, and (perhaps surprisingly) comparing online service providers to utility providers, the Committee recommended that the consumer welfare test be broadened to move away from a focus on consumption and price, and that a public interest test be applied to data-driven mergers. One could envisage that such a test could encapsulate privacy, protection of democracy and media pluralism issues, and could even lead to conditions under these heads being imposed on any approved merger.

Algorithms

Despite the ability of users to request information regarding whether their personal data has been processed by automated means and the logic behind such processing, the design and transparency of algorithms were of particular concern to the Committee.

In an example of differentiation between acceptable conduct online and offline, the Committee disapproved of the use of technology to take advantage of psychological insights to manipulate user behaviour, for example to encourage time spent using a service. While psychological insights have long been a tool of the retail sector, for example, and even of the government itself through David Cameron's 'nudge unit', the Committee suggested that ethical design requires that "individuals should not be manipulated but free to use the internet purposefully". The Committee recommended not only that the ICO should produce a code of conduct on the design and use of algorithms, potentially working with the Centre for Data Ethics and Innovation to establish a kitemark scheme, but also that it should have powers of audit, supported by sanctions.

The Committee also recommended that greater transparency around the use of algorithms and the data generated be achieved by requiring service providers to publish information about the data being generated and its use, as well as by affording users an enhanced right of subject access. The Committee proposed that the former apply to both data controllers and data processors, although it would seem appropriate to apply any such obligation only to data controllers.

Terms and conditions

The transparency, fairness and age appropriateness of terms and conditions were also a key focus for the Committee. Given what it considered to be the imbalance of power between users and service providers, the Committee suggested that terms and conditions should be subject to regulatory oversight, with any service provider which breached its own terms of service being subject to enforcement action. This would give service providers little incentive to offer gold-standard service commitments, for fear of being penalised for failing to meet them, and could result in a lower common standard.

Conclusion

While many of the Committee's proposals are likely to be lauded in some quarters, the practicality of designing and implementing them, and the impact they would have on the majority of users and the provision of services, means that they warrant at least further scrutiny, if not revision or rejection, if the government is to achieve the "right regulation".
