TOPIC: SHOULD THE GOVERNMENT RE-EVALUATE ITS ROLE IN THE REGULATION OF SOCIAL MEDIA PLATFORMS?

THE CONTEXT: The recent arrest of an Alt News journalist, and the many arrests across the country before it, raise questions about whether content posted on social media conforms to the community standards these platforms claim to enforce. Although many of these cases are controversial, some experts see these developments as the beginning of more elaborate legislation and greater government control over this dynamic new-media sector. In this article, we analyse whether these platforms should be regulated by the government.

RECENT CASES OF ARRESTS FOR SOCIAL MEDIA POSTS

  • In June 2022, an Alt News journalist was arrested for a controversial post.
  • A tailor in Udaipur was beheaded over a controversial post about the Prophet Mohammad.
  • In June 2022, 18 persons from Saharanpur were arrested for social media posts.
  • A man was arrested in Rajasthan’s Alwar for posting objectionable content with communal overtones on social media
  • In Chhattisgarh, a 34-year-old was booked under Section 67(A) of the Information Technology Act, which deals with online obscenity.
  • In Assam, two persons were arrested for allegedly putting up derogatory social media posts.
  • In Kerala, 149 cases were registered against several people for allegedly posting objectionable remarks.
  • The cybercrime sleuths of Rachakonda arrested three persons for posting obscene content through fake social media accounts to harass women known to them and a girl.

All these cases raise the question of whether the government should regulate online content or give it a free hand.

CHALLENGES POSED BY SOCIAL MEDIA IN RECENT TIMES

  • Extreme content: It is one of the major drivers of extremism in society. The recent beheading of a tailor in Udaipur shows how extreme content affects society.
  • Hate Speech: In India, legal provisions around hate speech have been previously misused to target marginalized communities and dissenting voices. Numerous hate speech cases have been brought against individuals for posts they made on social networking websites.
  • Rumours/fake news: In recent cases of mob lynching, rumours and fake news about cattle theft or child kidnapping were a major cause. Misinformation about the COVID-19 pandemic has also been on the rise in India since January 2020.
  • Data piracy: The Cambridge Analytica case in 2018 and recent bank frauds have raised concerns about data safety.
  • Pornography: Pornography, and child pornography in particular, is a serious issue.
  • Objectionable content: Objectionable content in different shows, especially in web series, has posed a serious challenge to society.

WHAT IS THE PRESENT REGULATION FOR ONLINE CONTENT?

There is no specific legislation in India that deals with social media, and in most cases it is self-regulated. Still, there are several provisions in the existing cyber laws that can be used to seek redress for the violation of rights in cyberspace, on the Internet and on social media. The relevant legislation and provisions are enumerated below:

1. The Information Technology Act, 2000

  • Sections 65, 66, 66A, 66C, 66D, 66E, 66F, 67, 67A and 67B contain punishments for computer-related offences which can also be committed through social media.
  • Section 69 of the Act grants power to the Central or a State Government to issue directions for monitoring of any information.
  • Section 69A grants power to issue directions to block public access to any information.
  • Section 69B grants power to authorize any agency to monitor and collect traffic data or information for cyber security.
  • Section 79 provides for the liability of intermediaries. An intermediary is not liable for any third-party information, data or communication link made available or hosted by it if:
  • its function is limited to providing access to a communication system over which such information is transmitted, stored or hosted;
  • it does not initiate the transmission, select the receiver or select or modify the information contained in the transmission; and
  • it observes due diligence and other guidelines prescribed by the Central Government while discharging its duties.
  • This protection is lost if the intermediary has conspired, abetted, aided or induced, by threats, promises or otherwise, the commission of the unlawful act, or if it fails to expeditiously remove or disable access to the material used to commit the unlawful act upon receiving actual knowledge or on being notified by the government.
  • If an intermediary fails to assist or comply with directions, or intentionally contravenes the provisions of Sections 69, 69A or 69B, it is liable to punishment.
  • Section 43A provides that where a body corporate possessing, dealing with or handling any sensitive personal data is negligent in implementing reasonable security practices and thereby causes wrongful loss or gain to any person, it is liable to pay damages by way of compensation.
  • Section 70B provides for an agency of the government to be appointed by the Central Government called the Indian Computer Emergency Response Team, which shall serve as the national agency for performing functions relating to cyber security.

2. The Information Technology Rules, 2009: Procedure and Safeguards for Interception, Monitoring and Decryption of Information- The interception, monitoring or decryption of information under Section 69 shall be carried out only under an order issued by the competent authority.

3. The Information Technology Rules, 2009: Procedure and Safeguards for Blocking for Access of Information by Public- These rules were framed by the Central Government in exercise of its powers under Section 87(2), and lay down the procedure and safeguards for blocking public access to information under Section 69A.

4. The Information Technology Rules, 2009: Procedure and Safeguard for Monitoring and Collecting Traffic Data or Information-

  • Directions for monitoring: The competent authority (the Secretary of the Government of India in the Department of Information Technology) may issue directions for monitoring for purposes related to cyber security.

5. The Information Technology Rules, 2011: Intermediaries Guidelines- The intermediary must publish its rules and regulations, privacy policy and user agreement, and clearly inform users of the kinds of content they must not host, display, upload or share.

6. The Information Technology Rules, 2011: Reasonable Security Practices and Procedures and Sensitive Personal Data or Information- The disclosure of sensitive personal data or information by a body corporate to any third party shall require prior permission from the provider of such information.

Ø  Code of Ethics for Social Media: At the time of the 2019 general election, social media platforms and the Internet and Mobile Association of India submitted the “Voluntary Code of Ethics for the General Election 2019” to the Election Commission of India.

Ø  The companies agreed to create a high-priority dedicated reporting mechanism for the ECI and appoint dedicated teams during the period of General Elections to take expeditious action on any reported violations.

WHY IS THERE A NEED TO REVIEW THE PRESENT REGULATIONS?

1. The challenges posed by Internet activism

The power of the Internet is precisely the reason that governments want to regulate it.

The fears of governments about the Internet:

  • National security (instructions on bomb-making, illegal drug production, terrorist activities);
  • Protection of minors (abusive forms of marketing, violence, pornography);
  • Protection of human dignity (incitement to racial hatred or racial discrimination);
  • Economic security (fraud, instructions on pirating credit cards);
  • Information security (malicious hacking);
  • Protection of privacy (unauthorized communication of personal data, electronic harassment);
  • Protection of reputation (libel, unlawful comparative advertising);
  • Intellectual property (unauthorized distribution of copyrighted works, software or music).

2. Misuse of Section 66A: Section 66A was inserted through an amendment to the Act in 2008. It provides punishment for sending offensive messages through communication services.

The issue:

  • There is an inherent inconsistency between the phraseology of Section 66A and Article 19 (1) (a) of the Constitution, which guarantees freedom of speech and expression to every citizen.
  • Under Article 19(2), restrictions on freedom of speech and expression are reasonable if they pertain to any of the listed grounds, such as sovereignty and integrity of India, security of the state, friendly relations with foreign states, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence.
  • But under Section 66A, restrictions were placed on freedom of speech and expression on several grounds beyond those mentioned in the Constitution, and several incidents in the recent past bear testimony to the fear that the provision could be hugely misused.
  • The Supreme Court found this to be an arbitrary, disproportionate and unreasonable restriction on the right to free speech. The court also said that speech online should enjoy the same level of constitutional protection as speech offline.

GOVERNMENT ATTEMPTS AT ONLINE REGULATION

On April 4, 2018, the government issued an order seeking to establish content regulations for the Internet, modelled on the ones currently applicable to traditional media like print and television.

Major Timeline:

  • April 25, 2018: The Ministry of Information and Broadcasting, Government of India, posted a tender online for the creation of a ‘Social Media Communications Hub’. As per this tender, the selected company would be required to monitor Twitter, YouTube, LinkedIn, internet forums and even e-mail in order to analyze sentiment and identify “fake news”.
  • June 16, 2018: The government was planning to get help from social media platforms, including WhatsApp and Facebook, to filter out fake text messages and videos.
  • July 3, 2018: The Ministry of Electronics and Information Technology accused WhatsApp of allowing the circulation of irresponsible and explosive messages.
  • August 3, 2018: The central government withdrew the proposal to create a ‘Social Media Communications Hub’ following mainstream media unrest and the filing of a plea before the Supreme Court.
  • August 21, 2018: The IT Minister urged Whatsapp to create a mechanism through which the source of fake news could be traced.
  • December 24, 2018: The Ministry of Electronics and Information Technology prepared the draft Information Technology (Intermediary Guidelines) Rules, 2018 to replace the rules notified in 2011 under the Information Technology Act, 2000. Intermediary refers to platforms such as Facebook and Twitter. There were five major requirements for intermediaries:
  • Intermediaries would be required to enable tracing of the originator of information on their platforms, as may be required by government agencies.
  • Any platform with more than five million users in India would be required to register a company and have a permanent registered office in India.
  • Platforms would be required to preserve information for at least 180 days for investigation purposes.
  • The platforms would be required to “deploy technology-based automated tools” for “proactively identifying and removing or disabling public access to unlawful information or content.”
  • Platforms would be required to inform their users on a monthly basis about the platform’s rules and regulations and warn of immediate termination of access in case of violation.
  • January 7, 2019: The I&B Minister called for self-regulation by social media platforms to deal with fake news.

INFORMATION TECHNOLOGY (INTERMEDIARY GUIDELINES AND DIGITAL MEDIA ETHICS CODE) RULES 2021

  • To protect users from incorrect takedowns and account suspensions by social media platforms, the need was felt to institute effective grievance redressal mechanisms (GRM).
  • In India, before May 2021, GRMs of social media platforms, if any, were designed as per the concerned platform’s terms of service. There was no standardization, in terms of resolution and timelines, in the design of these GRMs.
  • If one were to make a complaint, the process would typically consist of filling out an online form, which would usually elicit an automated response.
  • The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules (or IT Rules), 2021, streamlined this by bringing in uniformity.
  • Social media platforms now have to appoint a “grievance officer” before whom a user may file a complaint.
  • The grievance officer is required to acknowledge the complaint within 24 hours and resolve it within 15 days (a minimal timeline sketch follows this list). If unsatisfied with the officer’s order, the user may approach the High Court or the Supreme Court.
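
To illustrate these timelines, here is a minimal sketch in Python; it assumes a hypothetical complaint record with the field names shown (they are illustrative and not drawn from any platform's actual system) and simply checks the 24-hour acknowledgement and 15-day resolution windows prescribed by the Rules.

    from datetime import datetime, timedelta

    # Windows prescribed for the grievance officer under the IT Rules, 2021
    ACK_WINDOW = timedelta(hours=24)      # acknowledge within 24 hours
    RESOLVE_WINDOW = timedelta(days=15)   # resolve within 15 days

    def grievance_status(filed_at, acknowledged_at=None, resolved_at=None, now=None):
        """Report which deadlines a hypothetical complaint has met or missed."""
        now = now or datetime.utcnow()
        ack_deadline = filed_at + ACK_WINDOW
        resolve_deadline = filed_at + RESOLVE_WINDOW
        return {
            "acknowledged_in_time": acknowledged_at is not None and acknowledged_at <= ack_deadline,
            "acknowledgement_overdue": acknowledged_at is None and now > ack_deadline,
            "resolved_in_time": resolved_at is not None and resolved_at <= resolve_deadline,
            "resolution_overdue": resolved_at is None and now > resolve_deadline,
        }

    # Example: a complaint filed two days ago, acknowledged after six hours, not yet resolved
    filed = datetime(2022, 7, 1, 10, 0)
    print(grievance_status(filed,
                           acknowledged_at=filed + timedelta(hours=6),
                           now=filed + timedelta(days=2)))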

After that, attention turned to an appellate mechanism above the grievance officer:

  • Accessing the writ jurisdiction of the courts can be a time- and cost-intensive process that not all users can afford. In this light, it was important to create an appellate forum that is not as resource-intensive to engage with.
  • The government’s motivation behind creating this appellate committee seems to come from other factors as well. According to the government, it created this tier because “currently there is no appellate mechanism provided by intermediaries nor is there any credible self-regulatory mechanism in place”.
  • But the government and social media platforms converged on a self-regulatory approach as the most suitable design for an appellate mechanism, even though its bare minimum structure remains unclear.

CONCERNS ABOUT A SELF-REGULATORY MODEL AFTER GRM 2021

POLITICAL BIAS

  • Social media platforms have not been paragons of objectivity in deciding which content they want to host or take down. Their political biases have become visible through their decisions to either amplify or restrict certain kinds of content.
  • For example, while Twitter is commonly understood to be more partial to liberal/Leftist views, Facebook has been alleged to be partial to Rightist stances. An internal appellate mechanism will likely toe the line of the organization and carry and reinforce the same biases in deciding whether a piece of speech should be allowed or not.

APPELLATE MECHANISM IS NOT TRULY INDEPENDENT

  • Even if a number of social media platforms come together to form an appellate tier instead of individual appellate mechanisms, the members of this appellate tier will not have functional independence.
  • As long as social media platforms control the members’ appointments, terms of employment and service conditions, they will be wary of taking decisions that may hurt the platform.

TRUST ISSUE

  • A self-regulatory approach to adjudicating speech is likely to be riddled with trust issues. Consider the case of Facebook. The platform’s solution for ensuring transparency and impartiality in its content moderation decisions was to constitute the Oversight Board. Facebook created a $130 million irrevocable trust to maintain the Board’s independence and the latter did overturn many of Facebook’s content moderation decisions. But now, the Board has come under severe criticism that its existence has not substantially improved Facebook’s content moderation practices.

These concerns are amplified if, at a later stage, social media platforms are made subject to penalties for wrongfully suspending or terminating a post or user account. It can hardly be expected that social media platforms will design self-regulatory mechanisms in a manner that will encourage them to be held liable and penalized for their decisions.

THEN, SHOULD THE GOVERNMENT REGULATE ONLINE CONTENT IN INDIA?

There are two approaches to the regulation of social media:

1. Self-regulation

2. State regulation

Let us discuss the pros and cons of both.

1. SELF-REGULATION

Pros

  • Relies on moral pressure rather than legal coercion.
  • Effective in ensuring freedom of expression.
  • No state interference.
  • In-house regulatory mechanisms designed by each company to suit its platform.
  • Media unaffected by state control is necessary for a democratic society.

Cons

  • Lack of accountability.
  • No transparency.
  • No clear roadmap for content.
  • Illegal activities may go unchecked.
  • No codified procedure for action against unlawful content.

2. STATE REGULATION

Pros

  • A clear roadmap for the content.
  • It limits criminal activity.
  • It helps protect children.
  • It strengthens online security.
  • It sets standards for what should not be published.

Cons

  • Freedom of speech and expression will be at stake: the state may exercise absolute power over content.
  • It limits access to important information.
  • Limits economic opportunities.
  • Freedom of the media will be under threat: state interference at every level will affect the freedom of the media.

The above analysis shows that self-regulation is the more suitable mechanism for social media, and online content should be self-regulated. But there is a need for improvement in the present arrangement, which the following steps can bring about:

  • There should be a clear codification of norms of self-regulation, which all stakeholders should follow.
  • All websites, companies and other stakeholders should respect the norms of the IAMAI initiative.
  • There should be a proper mechanism for data security.
  • Companies should make people aware of data security, fake news, hate speech and other dangers.
  • Content should be lawful and in line with societal norms.
  • Companies should develop a strong mechanism against objectionable content.

THE WAY FORWARD:

Although self-regulation is preferable, the state should fulfil its duty in certain areas, such as:

  • Data protection
  • Stopping rumours
  • Stopping cybercrime
  • Resolving the issue of Section 66A of the IT Act and freedom of expression

1. The issue of Section 66A of the IT Act and freedom of speech

  • Some parts, such as Section 66A, were successfully challenged in court and struck down as unconstitutional in 2015 in the Shreya Singhal case. But the Act still empowers the government to block, filter and take down content online.
  • The government is also empowered to turn off internet access completely. These options are, in fact, exercised by the Indian government on a regular basis (between January 2010 and April 2018, there were more than 164 incidents of internet shutdown in different parts of India).
  • Section 66A of the IT Act sought, ambitiously, to crack down on information online which could cause “annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred or ill will.” It imposed a punishment of up to three years’ imprisonment along with a fine.

2. Solutions to Tackle Hate Speech:

  • The Information Technology Act, 2000 needs to be amended and effectively implemented.
  • There can be an internationally accepted law that places the responsibility on social media companies like Facebook to tackle hate speech by deleting obviously illegal content within 24 hours if there is a request from the government of a particular nation.
  • Generating counter-narratives on social networks and raising public awareness through campaigns to tackle extremism.
  • Social media platforms need to take responsibility to ensure transparency, accountability and a system of rules and guidelines that users can recognize as standards and which, when enforced in a regularized fashion, can begin to act as precedents.
  • The Indian government has been pushing for internet platforms to locate their servers in the country, which might help address dangerous speech in real-time.

3. Solution for data protection: The recommendations of the B.N. Srikrishna Committee should be implemented:

  • The processing (collection, recording, analysis, disclosure, etc.) of personal data should be done only for “clear, specific and lawful” purposes.
  • The government may process the personal data if this is considered necessary for any function of Parliament or State Legislature.
  • A ‘right to be forgotten’ should be provided.
  • This right is one of several given to data principals, including the right to confirm what information is being held or disclosed about them and to get this corrected if necessary.
  • Personal data will need to be stored on servers located within India, and transfers outside the country will need to be subject to safeguards.
  • Critical personal data, however, will only be processed in India.
  • “Sensitive” personal data should not be processed unless someone gives explicit consent, which factors in the purpose of processing.
  • Setting up a Data Protection Authority.

4. For cybercrime, cyber cells should be established:

  • In Maharashtra, for example, a cyber police station has been set up in every district SP headquarters and under every Police Commissioner.
  • Each team, consisting of one police inspector and three sub-inspectors, investigates all offences related to the Internet.

GLOBAL PRACTICES IN THIS REGARD

AUSTRALIA

  • Australia’s government agreed to make the News Media Bargaining Code law.
  • Australia passed the Sharing of Abhorrent Violent Material Act in April 2019, introducing criminal penalties for social media companies, possible jail sentences of up to three years for tech executives, and financial penalties of up to 10% of a company’s global turnover.

USA

  • Recently, the US government released an executive order to revisit a law that gives broad immunity to social media platforms. The US government is currently in the process of determining what exactly should happen to Section 230 of the Communications Decency Act (the federal law that protects internet companies from liability for user-generated content disseminated on their platforms).

UK

  • Under the UK government’s new Online Safety Bill, social media sites, websites, apps and other services that host user-generated content or allow people to talk to others online will face fines of up to £18 million ($24 million) or ten per cent of their annual global turnover if they fail to remove and limit the spread of harmful content (a rough fine calculation is sketched below).
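
As a rough illustration of how such a fine cap works, the short sketch below computes the maximum exposure for a hypothetical platform; it assumes the higher of the two figures (the £18 million floor or 10% of annual global turnover) applies, which is an assumption about the Bill rather than settled text.

    # Illustrative only: maximum fine under the cap described above,
    # assuming the larger of the fixed amount and the turnover share applies.
    FIXED_CAP_GBP = 18_000_000
    TURNOVER_SHARE = 0.10

    def max_fine_gbp(annual_global_turnover_gbp: float) -> float:
        return max(FIXED_CAP_GBP, TURNOVER_SHARE * annual_global_turnover_gbp)

    # A hypothetical platform with GBP 2 billion in annual global turnover
    print(max_fine_gbp(2_000_000_000))   # 200000000.0 -> 10% of turnover exceeds the GBP 18m floor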

GERMANY

  • It introduced the NetzDG Law in 2018, which states that social media platforms with more than two million registered German users have to review and remove illegal content within 24 hours of being posted or face fines of up to €50m (£42m).

NEW ZEALAND

  • The Aotearoa New Zealand Code of Practice for Online Safety and Harms is a pact agreed to by tech giants in New Zealand to curb harmful online content.

THE CONCLUSION: It is clearly evident that social media is a very powerful means of exercising one’s freedom of speech and expression. However, it is also increasingly used for illegal acts, which has given force to the government’s attempts at censoring social media. While, on the one hand, the misuse of social media entails the need for legal censorship, on the other hand, there are legitimate fears of violation of the civil rights of people as an inevitable consequence of censorship.

Keeping all this in mind, it is suggested that the government should form a Committee including technical experts to look into all the possible facets of the use and misuse of social media and recommend a suitable manner in which it can be regulated without hindering the civil rights of citizens.

QUESTIONS TO PONDER

  1. “Freedom of expression on social media is integral to a healthy, thriving democracy. We will be stronger by enabling and cultivating it, not curtailing it.” Analyze the statement.
  2. “It is imperative for the government to recognize the menace of hate speech and ensure that there is proper regulation in place to tackle the issue”. In the light of the statement, discuss what should be the structure for online content regulation in India.
  3. “Regulation of social media content should be best left to the tech companies themselves”. Do you agree with the statement? Justify your view.
  4. Should social media be self-regulated or state-regulated? Substantiate your opinion.

ADDITIONAL INFORMATION

GUIDELINES RELATED TO SOCIAL MEDIA TO BE ADMINISTERED BY THE MINISTRY OF ELECTRONICS AND IT

Ø  Due Diligence To Be Followed By Intermediaries: The Rules prescribe due diligence that must be followed by intermediaries, including social media intermediaries. In case due diligence is not followed by an intermediary, the safe harbour provisions will not apply to it.

Ø  Grievance Redressal Mechanism: The Rules seek to empower the users by mandating the intermediaries, including social media intermediaries, to establish a grievance redressal mechanism for receiving and resolving complaints from the users or victims. Intermediaries shall appoint a Grievance Officer to deal with such complaints and share the name and contact details of such officer. Grievance Officer shall acknowledge the complaint within twenty four hours and resolve it within fifteen days from its receipt.

Ø  Ensuring Online Safety and Dignity of Users, Especially Women Users: Intermediaries shall remove or disable access, within 24 hours of receipt of a complaint, to content that exposes the private areas of individuals, shows such individuals in full or partial nudity or in a sexual act, or is in the nature of impersonation, including morphed images, etc. Such a complaint can be filed either by the individual concerned or by any other person on his/her behalf.

Ø  Two Categories of Social Media Intermediaries: To encourage innovations and enable the growth of new social media intermediaries without subjecting smaller platforms to significant compliance requirements, the Rules make a distinction between social media intermediaries and significant social media intermediaries. This distinction is based on the number of users on the social media platform. Government is empowered to notify the threshold of user base that will distinguish between social media intermediaries and significant social media intermediaries. The Rules require the significant social media intermediaries to follow certain additional due diligence.
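
A minimal sketch of how this two-tier classification might be applied is given below; the threshold figure is only an assumption taken from the five-million-user number cited in the 2018 draft discussed earlier, since the Rules leave the actual threshold to government notification.

    # Illustrative only: classify an intermediary by its registered-user count.
    # The threshold is an assumed figure; the officially notified number may differ.
    ASSUMED_SIGNIFICANT_THRESHOLD = 5_000_000

    def classify_intermediary(registered_users_in_india: int) -> str:
        if registered_users_in_india >= ASSUMED_SIGNIFICANT_THRESHOLD:
            # additional due diligence obligations apply to this category
            return "significant social media intermediary"
        return "social media intermediary"

    print(classify_intermediary(400_000))      # social media intermediary
    print(classify_intermediary(50_000_000))   # significant social media intermediary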

Ø  Additional Due Diligence to Be Followed by Significant Social Media Intermediary:

  • Appoint a Chief Compliance Officer who shall be responsible for ensuring compliance with the Act and Rules. Such a person should be a resident in India.
  • Appoint a Nodal Contact Person for 24×7 coordination with law enforcement agencies. Such a person shall be a resident in India.
  • Appoint a Resident Grievance Officer who shall perform the functions mentioned under Grievance Redressal Mechanism. Such a person shall be a resident in India.
  • Publish a monthly compliance report mentioning the details of complaints received and action taken on the complaints as well as details of contents removed proactively by the significant social media intermediary.
  • Significant social media intermediaries providing services primarily in the nature of messaging shall enable identification of the first originator of the information that is required only for the purposes of prevention, detection, investigation, prosecution or punishment of an offence related to sovereignty and integrity of India, the security of the state, friendly relations with foreign States, or public order or of incitement to an offence relating to the above or in relation with rape, sexually explicit material or child sexual abuse material punishable with imprisonment for a term of not less than five years. Intermediary shall not be required to disclose the contents of any message or any other information to the first originator.
  • Significant social media intermediary shall have a physical contact address in India published on its website or mobile app or both.
  • Voluntary User Verification Mechanism: Users who wish to verify their accounts voluntarily shall be provided an appropriate mechanism to verify their accounts and provided with a demonstrable and visible mark of verification.
  • Giving Users an Opportunity to Be Heard: In cases where a significant social media intermediary removes or disables access to any information of its own accord, prior intimation shall be communicated to the user who shared that information, along with a notice explaining the grounds and reasons for the action. Users must be given an adequate and reasonable opportunity to dispute the action taken by the intermediary.

Ø  Removal of Unlawful Information: An intermediary, upon receiving actual knowledge in the form of a court order, or on being notified by the appropriate government or its agencies through an authorised officer, should not host or publish any information which is prohibited under any law in relation to the interest of the sovereignty and integrity of India, public order, friendly relations with foreign countries, etc.

Ø  The Rules come into effect from the date of their publication in the gazette, except for the additional due diligence for significant social media intermediaries, which comes into effect 3 months after publication of the Rules.
