Deepfake Technology as a Threat and Challenge to Evidentiary Proceedings
Marcin Galiński, M.A., University of Szczecin
This article is a paper delivered during the seminar “Evidentiary Procedure in Civil Law and Common Law Systems – Challenges and Pitfalls” on 27 September 2022, within the project “The Effective Justice. International and Comparative Approaches”.
It is a truism that technological development today proceeds faster than in any previous age. It changes every aspect of human life at a previously unknown scale and speed. Some of these changes bring opportunities to modern societies, while others can be dangerous, especially if they get out of control.
One of the newest manifestations of technological change is deepfake technology. The term “deepfake” derives from “deep learning”, one of the forms of Artificial Intelligence (briefly, deep learning allows a computer to learn and perform tasks that come naturally to the human brain, such as recognizing voices, identifying images, or processing natural language). The knowledge gained through deep learning makes it possible to modify (or, more precisely, to manipulate) existing videos by changing a person's voice, face, or body movements, or by synchronizing a person's face with added sound. Thus, deepfake technology is a set of algorithms allowing the creation of ultra-realistic fake films in which people say and do things that never happened in objective reality (Wasiuta, Wasiuta, 2019, pp. 21-22; Kietzmann et al., 2019, pp. 1-2). Deepfakes can be used for various purposes, such as entertainment (including satire), education, disinformation, discrediting, or committing a crime [1].
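As a purely illustrative technical aside, the sketch below (in Python, using the PyTorch library) shows in a highly simplified form the shared-encoder, two-decoder autoencoder design on which classic face-swap deepfakes are commonly based: a single encoder learns pose and expression from faces of both persons, while each person has a separate decoder, so that a frame of person A can be decoded “as” person B. The layer sizes and the swap_face helper are assumptions made for the example, not a description of any real tool.

```python
# Minimal sketch of the shared-encoder / two-decoder autoencoder design
# behind classic face-swap deepfakes (illustrative only; layer sizes and
# the swap_face helper are assumptions, not a real deepfake tool).
import torch
import torch.nn as nn


class FaceAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        # One encoder is trained on faces of BOTH person A and person B,
        # so it learns identity-independent features (pose, lighting, expression).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )
        # Each identity gets its own decoder, trained to reconstruct that person.
        self.decoder_a = self._make_decoder(latent_dim)
        self.decoder_b = self._make_decoder(latent_dim)

    @staticmethod
    def _make_decoder(latent_dim: int) -> nn.Module:
        return nn.Sequential(
            nn.Linear(latent_dim, 64 * 16 * 16),
            nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),     # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),   # 32 -> 64
        )

    def swap_face(self, frame_of_a: torch.Tensor) -> torch.Tensor:
        """Encode a frame of person A, then decode it with B's decoder:
        the output keeps A's pose and expression but wears B's face."""
        return self.decoder_b(self.encoder(frame_of_a))


if __name__ == "__main__":
    model = FaceAutoencoder()
    dummy_frame = torch.rand(1, 3, 64, 64)   # stand-in for a cropped face image
    fake_frame = model.swap_face(dummy_frame)
    print(fake_frame.shape)  # torch.Size([1, 3, 64, 64])
```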
Crucially, at its current stage of development deepfake cannot be treated as a finished technology. Like other technologies, it will become more popular, more widely available, and easier for casual users to operate. Moreover, its further development will increase the possibility of creating videos containing false pictures of reality, eroding the fine line between reality and fiction.
The issue of deepfakes therefore cannot be a matter of indifference to the law. Deepfakes affect both substantive law (e.g. a deepfake can be a means of committing a given crime) and procedural law (when deepfake videos influence decisions or verdicts in a particular case). The considerations below are limited to procedural law, namely to indicating the impact of deepfake technology on evidentiary proceedings.
The provisions regulating legal proceedings differ from country to country. Nevertheless, in democratic states a crucial element of every legal procedure is that proceedings are based on evidence. Every legal system has its own definition of evidence and its own catalog of sources that can be treated as evidence. Hence, for the clarity of the deliberations, the issue outlined above will be presented on the basis of Polish criminal and civil proceedings.
Both the Polish Code of Civil Procedure and the Polish Code of Criminal Procedure express the principle that there is no hierarchy of evidence: no legal norms in the Polish legal system establish a hierarchy or assign a particular value to given evidence (although the legislator provides for some exceptions to this principle). Thus, for instance, the testimony of a witness has, from a legal point of view, the same value as an audio or video recording (Rudkowska-Ząbczyk, 2021, Legalis; Sakowicz, 2021, Legalis). The assessment of the trustworthiness and usefulness of a piece of evidence rests with the judge. Although that assessment is made freely, it must cover all the evidence gathered during the proceedings and rest on the judge's own conviction; Article 7 of the Code of Criminal Procedure requires that it take into account the principles of correct reasoning as well as the indications of knowledge and life experience. Nevertheless, in practice video or audio recordings are often a better means of proof than, for example, witness testimony, because video recordings have traditionally been difficult to fabricate. As a consequence, such evidence is treated as reliable, persuasive, and hard to refute as to the existence or non-existence of the facts that are the subject of the proceedings, which makes it significant (and sometimes even decisive) in evidentiary proceedings.
Against this background, the negative impact of deepfakes on evidentiary proceedings should be indicated. First and foremost, this technology calls into question the assumption that video recordings are fully credible. A video recording fabricated with deepfake technology does not faithfully depict the past events at issue in the proceedings. Parties, other participants, or even persons prima facie unconnected with a particular case can use deepfakes to create a false picture of reality in the minds of judges, juries, or other competent organs, which may lead to an unjust and counter-factual verdict. Various forms of using deepfakes in evidentiary proceedings can be distinguished. The main (but not the only) ones are:
presenting in a fabricated video a situation that actually occurred, while changing some element of reality (e.g. the behavior of a certain person, the object of that behavior, or the circumstances of the presented situation);
using a deepfake video to erase proof of real events that are the subject of a given proceeding;
creating proof of facts that never occurred in objective reality (McPeak, 2021, pp. 435-440; Sloot et al., 2021, p. 3 ff.; Delfino, 2022, p. 10 ff.).
The first group of situations involves modifying the picture of reality by changing some of its elements. The forms and goals differ and depend on the circumstances that are the subject of the proceeding. Such modifications may aim to alter the content of a civil-law transaction or a declaration of will (e.g. modifying a testator's words in order to obtain an undue inheritance), to obtain a milder penalty for a crime, or to avoid punishment by suggesting the existence of circumstances excluding criminal responsibility (even though there is no reason for lenient treatment of the perpetrator).
The second group of activities involves using deepfake tools to remove evidence of facts important for a particular proceeding. For example, deepfake technology can be used to remove criminal activity from a video recording, or to modify the activity shown on video in such a way that it can no longer be considered a crime. Another example is a video recording fabricated to avoid legal responsibility for a tort.
The last group covers behaviors that never occurred in objective reality and were invented by the author of the deepfake. For instance, a video recording can be fabricated to falsely accuse a person of a crime that never took place. A deepfake video presenting a non-existent situation can also be used in proceedings to obtain undue compensation for a fictitious tort or breach of contract, or to present a fabricated last will of the testator (compiled in video form) in order to become an unrightful heir.
As can be seen, the range of possible uses of deepfakes in evidentiary proceedings is very wide. Taking into account the importance and factual weight of a video recording as documentary evidence (under Polish law), it is necessary to indicate a direction for solving the problems that deepfakes pose to evidentiary proceedings. Because of the complexity and interdisciplinary character of the deepfake issue, in my opinion specific solutions and means of counteracting its negative impact should not be the product of one person's reflections alone. The considerations below are therefore limited to indicating possible anti-deepfake actions.
The means of counteracting the negative impact of deepfake technology on evidentiary proceedings can take various forms. The first is a legal response. It may differ between domestic legal systems and consists of amending procedural provisions so as to give judicial organs more tools for checking the reliability of video recordings. One can point, for example, to the possibility of appointing an expert to prepare an opinion on the possible use of deepfake technology, or even to imposing an obligation to check video evidence for possible modifications. Another idea is the penalization of the intentional use of fabricated video recordings during legal proceedings in order to mislead the authorities and obtain a counter-factual verdict [2].
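As a simple illustration of what checking video evidence for possible modifications may involve at the most elementary technical level, the sketch below (Python standard library only; the file path and the stored digest are hypothetical) compares the cryptographic hash of a submitted recording with the hash recorded when the original evidence was secured. A matching hash shows that the file has not been altered since that moment; it cannot, of course, prove that the original recording itself was not a deepfake, so such a check complements rather than replaces expert analysis.

```python
# Minimal sketch: checking that a video file submitted as evidence is
# byte-for-byte identical to the original secured earlier (paths are hypothetical).
import hashlib
from pathlib import Path


def sha256_of_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Digest recorded at the moment the original evidence was secured (placeholder value).
    original_digest = "0000000000000000000000000000000000000000000000000000000000000000"
    submitted = Path("evidence/submitted_recording.mp4")  # hypothetical path
    if sha256_of_file(submitted) == original_digest:
        print("The submitted file is identical to the secured original.")
    else:
        print("The submitted file differs from the secured original: possible modification.")
```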
Apart from domestic measures, provisions of international law (especially those of international organizations such as the EU) create a legal framework for cooperation between countries, member states of international organizations, officials, judicial organs, scientists, and researchers in the research, prevention, and detection of deepfake usage and in the transfer of knowledge.
The second form of possible action against the use of deepfake videos in evidentiary proceedings is practical cooperation between scientists, experts, officials, judges, NGO members, and other interested parties. Such cooperation is crucial because legal frameworks alone are insufficient and must be supplemented by other activities. It should concern the technical issues connected with deepfakes, such as methods of video fabrication, ways of recognizing that a deepfake has been used, and the possibilities of restoring the original content of fabricated video recordings, as well as the exchange of experience, the transfer of knowledge, and the development of international rules, and even legal regulations, that help to counteract the use of deepfakes in legal proceedings.
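To give a concrete, deliberately simplified picture of what the detection methods mentioned above can look like in practice, the sketch below (again Python, with PyTorch and torchvision) shows the core of a frame-level deepfake detector: a standard image-classification network is given a single output interpreted as the probability that a face crop has been fabricated. The choice of network, the input size, and, above all, the labeled training data are assumptions; real forensic detectors are considerably more sophisticated.

```python
# Minimal sketch of a frame-level "real vs. fake" classifier
# (illustrative only; a real forensic detector is far more elaborate).
import torch
import torch.nn as nn
from torchvision import models


def build_detector() -> nn.Module:
    # A standard ResNet-18 backbone; in practice one would start from
    # pretrained weights and fine-tune on labeled real/fake face crops.
    model = models.resnet18(weights=None)
    # Replace the final layer with a single output: probability that the crop is fake.
    model.fc = nn.Sequential(nn.Linear(model.fc.in_features, 1), nn.Sigmoid())
    return model


if __name__ == "__main__":
    detector = build_detector()
    detector.eval()
    face_crop = torch.rand(1, 3, 224, 224)  # stand-in for a preprocessed face crop
    with torch.no_grad():
        fake_probability = detector(face_crop).item()
    print(f"Estimated probability that the frame is fabricated: {fake_probability:.2f}")
```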
The third possibility of anti-deepfake activity is education. Education on deepfake technology should include information on what deepfake technology is, the threats associated with it, methods of recognizing fabricated content, the impact of deepfakes on administrative and legal proceedings, how to counteract misinformation caused by deepfakes, where to report video materials containing false information, and the legal and factual consequences of creating and publishing deepfake videos. Such education should be broad: it should not be limited to a single form of conveying knowledge but should include all possible forms. It should be directed at every interested person. However, special attention should be given to educating officials, clerks, judges (and other judicial organs), and other legal professionals in this area. People who make decisions or pass verdicts must be aware of possible attempts to mislead them in particular cases.
In the title, I described deepfake technology as a threat and a challenge. Indeed, deepfake technology is a real danger to legal proceedings, but also to other disciplines and areas of life. It blurs the line between truth and falsehood, making them difficult to distinguish. Consequently, every current dogma, view, and belief about the world may be undermined easily and irreversibly. Deepfakes can also be seen as a challenge: a challenge for authorities, for lawyers, for scientists, and for experts. As presented above, there are many methods of counteracting the negative impact of deepfakes on evidentiary proceedings. The application of appropriate actions and measures will reduce the danger of reality being distorted by fabricated videos in, for example, civil or criminal proceedings. Whether deepfakes prove to be more of a threat or more of a challenge depends on the activity and perspective of judges, other lawyers, officials, scientists, and researchers. The second attitude is preferable, because thinking of deepfakes only as a threat can be paralyzing and cause us to refrain from taking the necessary actions.
[1] It should be added that among deepfakes a category of deep porn is distinguished. Deep porn is a fabricated pornographic video in which the face of a certain person is placed on an actor's body (Ziobroń, 2021, p. 226 ff.; Meskys et al., 2020, pp. 24-31).
[2] On proposals for legal measures see Meskys et al., 2020, pp. 24-31 and Sloot et al., 2021, pp. 7-17.
References
Act of 17 November 1964 – Code of Civil Procedure. Journal of Laws 2021, item 1805, as amended.
Act of 6 June 1997 – Code of Criminal Procedure. Journal of Laws 2022, item 1375.
Delfino, R.A. (2022). Deepfake on Trial: a Call to Expand the Trial Judge’s Gatekeeping Role to Protect Legal Proceedings from Technological Fakery. Loyola Law School, Los Angeles Legal Studies Research Paper, 2, 1-51. DOI: 10.2139/ssrn.4032094.
Kietzmann, J., et al. (2019). Deepfakes: Trick or treat? Business Horizons, 63(2), 1-12. DOI: 10.1016/j.bushor.2019.11.006.
McPeak, A. (2021). The Threat of Deepfakes in Litigation: Raising the Authentication Bar to Combat Falsehood. Vanderbilt Journal of Entertainment and Technology Law, 2, 433-450.
Meskys, E., et al. (2020). Regulating deep fakes: legal and ethical considerations. Journal of Intellectual Property Law & Practice, 15(1), 24-31. DOI: 10.1093/jiplp/jpz167.
Rudkowska-Ząbczyk, E. (2021). Commentary on Article 233 of Code of Civil Procedure. In: E. Marszałkowska-Krześ (ed.), Code of Civil Procedure. Commentary. Warsaw: C.H. Beck.
Sloot, B.v.D., et al. (2021). Summary: Deepfakes: The Legal Challenge of a Synthetic Society. Tilburg: Tilburg Institute for Law, Technology and Society. https://www.tilburguniversity.edu/sites/default/files/download/Deepfake%20EN.pdf.
Sakowicz, A. (2021). Commentary on Article 7 of Code of Criminal Procedure. In: A. Sakowicz (ed.), Code of Criminal Procedure. Commentary (Legalis). Warsaw: C.H. Beck.
Wasiuta, O., Wasiuta, S. (2019). Deepfake as a complicated and deeply false reality. Annales Universitatis Paedagogicae Cracoviensis. Studia de Securitate, 9(3), 19-30. DOI: 10.24917/26578549.9.3.2.
Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39-52. DOI: 10.22215/timreview/1282.
Ziobroń, A. (2021). Deepfake and the criminal law. De lege lata and de lege ferenda remarks regarding fake pornography. Student Journal of Law, Administration and Economics, 21, 225-238. DOI: 10.19195/1733-5779.37.15.