‘Legal and illegal’ speech for content-sharing platforms.

Batting for shared responsibility, tech giant Google has stated that governments need to draw clear lines between ‘legal and illegal’ speech for content-sharing platforms, as a lack of clear definitions could lead to ‘arbitrary’ enforcement that limits access to legitimate information. Google said content-sharing platforms are already working on developing content policies that set baseline expectations for users and articulate a clear basis for the removal of content, as well as for the suspension or closure of accounts.

“But it’s also important for governments to draw clear lines between legal and illegal speech, based on evidence of harm and consistent with norms of democratic accountability and international human rights. Without clear definitions, there is a risk of arbitrary or opaque enforcement that limits access to legitimate information,” Google’s senior vice president of global affairs, Kent Walker, said in a recent blog post.

Noting that tackling this issue is a ‘shared responsibility’, he said many laws already govern online content, covering areas such as consumer protection, defamation and privacy.

Walker said Google is part of the ‘Christchurch Call’, under which governments and tech companies have come together to commit to eliminating terrorist and violent extremist content online.

In March, a terrorist attack on two mosques in Christchurch, New Zealand, was live-streamed. The live stream of the attack, which left 51 people dead and 50 injured, was viewed some 4,000 times before being removed.

“As with any new information technology, societies and cultures are developing new social norms, institutions, and laws to address new challenges and opportunities. We look forward to contributing to that extraordinarily important project,” he added.

Walker said Google has been using digital tools and human reviewers to identify and stop a range of online abuse, from ‘get rich quick’ schemes to disinformation to child sexual abuse material. The company also responds promptly to valid notices of specific illegal content.

He highlighted that it is important for oversight frameworks to recognise the different purposes and functions of different services.

Citing an example, he said rules that make sense for social networks and video-sharing platforms may not be appropriate for search engines, enterprise services, file storage and communication tools.

“Different types of content may likewise call for different approaches,” he said.

Walker also stated that international coordination should strive to align on broad principles and practices.

“While there is broad international consensus on issues like child sexual abuse imagery, in other areas individual countries will make their own choices about the limits of permissible speech, and one country should not be able to impose its content restrictions on another,” he said.
