
This article sheds light on the ‘Analysis of the Regulation of Deepfake Technologies under India’s IP Regime’.

Introduction

Deepfake technology[1] facilitates the synthesis of electronic media: it uses artificial intelligence[2]-driven algorithms to alter or doctor images, videos, and audio. These systems depend heavily on data sampling[3], and the quality of the output is largely determined[4] by the standard, variety, and volume of the data sampled. The technology enables cheap and efficient[5] fabrication of digital footage and has been widely regarded as a tool that could be used in a variety of practical and constructive ways.[6]
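By way of illustration only, the sketch below (in Python, using the PyTorch library) shows the basic structure commonly attributed to face-swap style deepfake systems: a shared encoder paired with one decoder per identity. All names, dimensions, and details here are hypothetical simplifications and do not describe any particular system discussed in this article.

import torch
import torch.nn as nn

# Hypothetical, heavily simplified sketch of the autoencoder pair often used in
# face-swap deepfakes: a shared encoder learns general facial features from
# sampled footage of both persons, while each decoder reconstructs one identity.
class Encoder(nn.Module):
    def __init__(self, in_dim=64 * 64 * 3, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, latent_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256, out_dim=64 * 64 * 3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, out_dim), nn.Sigmoid())

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# At inference, a frame of person A is encoded and then decoded with B's decoder,
# producing the swapped output; quality depends on the volume and variety of the
# data sampled during training.
frame_of_a = torch.rand(1, 3, 64, 64)
swapped_frame = decoder_b(encoder(frame_of_a))

The legal questions examined below arise on both sides of such a pipeline: the sampled footage fed into it, and the fabricated output it produces.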

In practice, however, it has largely been propagated as an instrument of misrepresentation and misappropriation, particularly with regard to media featuring public figures, whose likenesses are more susceptible to fabrication because of the comparatively large volume of data available on such individuals.

Regulation of such technologies in India suffers from a mix of disregard and incomprehension on the part of lawmakers,[7] as well as the perceived inadequacy of India’s existing intellectual property regime[8] in dealing with the problems posed by the prospective proliferation of deepfake technology within the country. Addressing the issue requires an analysis of both the current laws and alternative recourses.

Examination of the protections available under the Copyright Act

The Copyright Act, 1957 is designed to prevent infringement of artistic productions. Unauthorized adaptations also fall within the purview of such protection, as Section 14 of the Act assigns the exclusive right of adaptation to the owner. Thus, only the owner can convert, edit, or alter their work, which should, in principle, hinder the kind of unauthorized audio-visual sampling often required in the production of contested deepfake content.

However, Section 51 of the Act sets definite parameters as to what constitutes infringement, and as held in R.G. Anand v. Deluxe Films (1978), the works in question must share fundamental similarities. Since deepfake technologies are, by design, intended to effect substantial alterations to the original work, they may arguably fall outside the purview of the section.

The aspects of the data used in sampling are often non-copyrightable,[9] and the end product is frequently distinct enough to qualify as original content. This qualification rests on the principle[10] that the nature of an adaptation depends on the product itself, not the method by which it was made.

When analysing the validity of the technology on the basis of the fair use doctrine under Section 52 of the Act, it is apparent that the utilisation of protected content by data-sampling algorithms is not specifically protected thereunder; the section permits unauthorized use of original works only in limited circumstances.

Moreover, because of the extensive alterations the technology permits, identifying the infringement itself may prove difficult. The fair use doctrine also carves out exceptions: Clause (1)(b) of the section specifies that the use of protected material for the public propagation of news or other factual information qualifies as ‘fair use’.

Section 57 of the Act confers moral rights on the creators of original works: any alteration or adaptation that may prejudice their reputation would violate the section, and its dissemination can be restrained. However, as already discussed, the nature of the altered content is often such that identifying the violation is difficult.

Specific Limitations of the Copyright regime in the regulation of Deepfake Technologies

Copyright protections are assigned to individuals to safeguard their economic interests[11] in the monetization of their works. However, deepfake creators often use protected works in a manner that leaves the creator’s economic interests unaffected; such uses are usually not monetized and can therefore be argued to be non-infringing under Section 52(1) of the Act. This, combined with the inherent difficulty of identifying deepfaked material, makes the diagnosis of infringement particularly arduous.

The copyright protection afforded to audio-visual works, especially films and other similarly distributed media, is often fragmented,[12] as the rights are divided amongst broadcasters, producers, and other entities.

This is particularly significant because the individual on screen, who is usually the person adversely affected by such fabricated content, often does not possess the right to claim damages on the ground of copyright infringement. A similar principle was advanced in Fortune Films v. Dev Anand (1974), where it was held that an individual does not automatically acquire rights over their performance in a film.

Conclusion

Thus, it can be concluded that the existing IP regime is not capable of adequately regulating the unauthorized production of deepfakes within the country, or of protecting the rights of individuals affected by their dissemination. A blanket ban on the technology, however, may not be advisable either, as deepfakes can be put to educational and welfare purposes, and the technology itself is significant enough to warrant further research and development.

Section 17 of the Data Protection Bill, 2021 provides individuals with the right to be forgotten,[13] which an aggrieved party may invoke against any deepfaked footage of them that is circulated, provided the individual concerned holds the copyright in the original work.

Another step the state could take towards regulating the technology is to formulate legislation similar to the prospective Deepfakes Accountability Act[14] in the US, which makes the assignment of an identifying watermark on all deepfaked works compulsory and also mandates the accreditation of all footage sampled for the creation of such works.
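As a purely illustrative sketch (assuming a Python environment with the Pillow imaging library), the snippet below shows what a simple, visible disclosure watermark on a single frame could look like. The wording, placement, and function names are hypothetical and are not drawn from the text of the Act.

from PIL import Image, ImageDraw

def stamp_disclosure(frame, text="SYNTHETIC MEDIA"):
    # Hypothetical example: overlay a plain-text disclosure in the corner of a frame.
    stamped = frame.copy()
    draw = ImageDraw.Draw(stamped)
    draw.text((10, stamped.height - 20), text, fill=(255, 255, 255))
    return stamped

# Usage: stamp a dummy frame and save it to disk.
frame = Image.new("RGB", (640, 360), color=(30, 30, 30))
stamp_disclosure(frame).save("disclosed_frame.png")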

The introduction of such regulations, along with specific amendments to the existing Copyright Act, may be necessary to develop a productive and accountable regime for the regulation of deepfakes in the country.


[1] What are deepfakes – and how can you spot them? Available here

[2] Stephen F. DeAngelis, Artificial Intelligence: How Algorithms Make Systems Smart, Available here

[3] Sindhu Seelam, What Is Data Sampling and Statistical Techniques for Effective Sampling in Machine Learning, Available here

[4] Mika Westerlund, The Emergence of Deepfake Technology: A Review, Available here

[5] Johannes Langguth, Konstantin Pogorelov & others, Don’t Trust Your Eyes: Image Manipulation in the Age of DeepFakes, Available here

[6] Simon Chandler, Why Deepfakes Are A Net Positive For Humanity, Available here

[7] John Xavier, Deepfakes enter Indian election campaigns, Available here

[8] Purvi Nema, Understanding copyright issues entailing deepfakes in India, Available here

[9] Ivy Attenborough, Voices, Copyrighting and Deepfakes, Available here

[10] Copyright Law in India, Available here

[11] Arathi Ashok, Economic Rights of Authors under Copyright Law: Some Emerging Judicial Trends, Available here

[12] Copyright of Cinematograph Films and Sound Recording, Available here

[13] Data Protection Bill has provisions for ‘right to be forgotten’, Centre tells HC, Available here

[14] DEEP FAKES Accountability Act, Available here



Melvin Joseph

National University of Advanced Legal Studies (NUALS), Kochi, Kerala
