(2025) 5 Turf Law Journal 1-14 | 2789-746X | 1
The Influence of ChatGPT-generated Data on the Administration
of Justice in South Africa
Isiphile Petse * and Usenathi Phindelo**
* LLB, LLM (University of the Western Cape), Lecturer (Walter Sisulu University, School of Law) <https://
orcid.org/0000-0002-4079-5544> e-mail: ipetse@wsu.ac.za
** LLB student (Walter Sisulu University, School of Law) <https://orcid.org/0009-0003-8940-3542> e-mail:
221135952@mywsu.ac.za
Received: 28 January 2025, Accepted: 23 March 2025, Published: 25 April 2025, Doi: https://doi.org/10.62726/tlj.v5.60
Abstract
ChatGPT, as a medium for South African legal practitioners to conduct legal research, is a
threat to the courts and the administration of justice. ChatGPT has become a manipulative
tool that deceives legal professionals. For instance, the chatbot has generated false or misleading legal documents. The information generated by ChatGPT not only misleads the court but also misleads legal practitioners who use the information without verification. Failing to
verify the information generated by ChatGPT means that legal practitioners could be found
guilty of committing the common-law crime of perjury, especially when a legal practitioner
presents incorrect information during court proceedings. This article examines the cases of Mavundla v MEC: Department of Cooperative Governance and Traditional Affairs and
Others; Michelle Parker v Amanda Forsyth NO and Others and Roberto Mata v Avianca
to illustrate the impact of ChatGPT-generated information on the courts and the administration of justice. The article also refers to various methods and guidelines that South African legal practitioners can use to verify the accuracy of information produced by ChatGPT.
Furthermore, the article examines existing codes of conduct and suggests sanctions that can
be used when incorrect or biased information is presented in court. In conclusion, the article
suggests that the EU AI Act is a framework South Africa can consider when developing its
legal guidelines for artificial intelligence.
Keywords
Chat Generative Pre-trained Transformer (ChatGPT), administration of justice, South African
courts, legal practitioners, perjury, code of conduct, Legal Practice Council
1. Introduction
Chat Generative Pre-trained Transformer (ChatGPT)1 is a language model that was
1 Wilson, M ‘ChatGPT Explained: Everything You Need to Know About the AI Chatbot’ (2023)
<https://www.techradar.com/news/chatgpt-explained> accessed 31 August 2023; Michelle
Parker v Amanda Forsyth NO and Others case no 1585/20 (29 June 2023) para 86. The court defined a chatbot as a computer program designed to simulate conversation with human users over the Internet.
released in November 2022 by OpenAI.2 ChatGPT uses advanced artificial intelligence (AI) techniques to generate coherent, natural-sounding, human-like language responses to prompts or inputs.3 ChatGPT can fulfil a wide range of requests made via text, including generating a gratitude letter4 and assisting with legal research.5 Beyond its ability to answer questions and help with legal research, the chatbot has changed how society obtains legal information.6 This has heightened exposure to legal claims arising from AI-
related violations of fundamental human rights.7 South African lawyers’ use of ChatGPT
to conduct legal research has become an increasing problem. In a regional court case,
Michelle Parker v Amanda Forsyth NO and Others,8 the plaintiff's attorney explained that
his partner had sourced cases using ChatGPT.9 Lawyers and judges in other jurisdictions, such as the United States of America,10 Colombia11 and the Netherlands,12 have also used ChatGPT.
ChatGPT has proven to be a significant problem and a threat to the integrity of the courts and the quality of the legal information accessed and used. For example, a judge in a subdistrict court in Gelderland, Netherlands, based a verdict on information generated by ChatGPT. This reliance on chatbot-generated content astonished experts,
2 Perlman, A 'The Implications of ChatGPT for Legal Services and Society' <https://papers.ssrn.
com/sol3/papers.cfm?abstract_id=4294197> accessed 1 September 2023; Schulman, J, Zoph,
B & Kim, C et al ‘Introducing ChatGPT’ (2022) <https://openai.com/blog/chatgpt> accessed
1 September 2023.
3 Kalla, D & Smith, N 'Study and Analysis of ChatGPT and its Impact on Different Fields of
Study’ (2023) 8 International Journal of Innovative Science and Research Technology 828; Lo
Chung, K ‘What is the Impact of ChatGPT on Education? A Rapid Review of the Literature’
(2023) 13 Education Science 1.
4 Lund, BD ‘A Brief Review of ChatGPT: Its Value and the Underlying GPT Technology’ (2023)
<https://www.researchgate.net/publication/36680957> accessed 1 September 2023; Michelle
Parker (note 1) para 86. ChatGPT can also compose various forms of written content, including
articles, social media posts, essays, code and email.
5 Michelle Parker (note 1) para 87 (in this case, the plainti’s attorneys used ChatGPT to conduct
legal research); Perlman (note 2) 2.
6 Perlman (note 2) 1–2. The author also confirms that 'AI tools like ChatGPT hold the promise of
altering how we generate a much wider range of legal documents and information’.
7 Ka Mtuze, S & Morige, M 'Towards Drafting Artificial Intelligence (AI) Legislation in South
Africa’ (2024) Obiter 163.
8 Note 1.
9 Para 86.
10 Roberto Mata v Avianca Inc case 22-cv-1461 (PKC) (the plainti’s counsel submitted six cases
that were generated by ChatGPT); Maruf, R ‘Lawyer Apologizes for Fake Court Citations
from ChatGPT’ (2023) <https://edition.cnn.com/2023/05/27/business/chat-gpt-avianca-mata-
lawyers/index.html> accessed 3 September 2023.
11 Luke, T ‘Colombian Judge Says He Used ChatGPT in Ruling’ (2023) <https://www.theguardian.
com/technology/2023/feb/03/colombia-judge-chatgpt-ruling> accessed 3 September 2023
(Judge Manuel, a judge in Colombia, used ChatGPT to make a ruling); Janus, R ‘A Judge Just
Used ChatGPT to Make a Court Decision’ (2023) <https://www.vice.com/en/article/k7bdmv/
judge-used-chatgpt-to-make-court-decision> accessed 3 September 2023.
12 Quekel, S 'Disbelief at Dutch Judge Using ChatGPT in Verdict: "This is really unacceptable"'
(2024) <https://www.ad.nl/binnenland/ongeloof-om-nederlandse-rechter-die-chatgpt-gebruikt-
in-vonnis-dit-kan-echt-niet~ae3288e10/?referrer=https%3A%2F%2Fwww.dutchnews.nl%2F>
accessed 31 April 2025.
who voiced significant concerns regarding the potential for inaccuracies. They emphasised that employing such technology in the judiciary raises serious ethical and practical challenges.13 The use of ChatGPT by lawyers and judges has shown that the chatbot can be discriminatory14 and misleading.15 This article analyses the legal impact of ChatGPT-generated information on the courts and the administration of justice. The analysis identifies and discusses existing codes of conduct prohibiting South African legal
practitioners from presenting incorrect and biased information in court.
Secondly, the article explores existing sanctions that could be imposed by the courts,
especially when legal practitioners provide biased and incorrect information. Lastly, to
protect the independence, impartiality, dignity, and accessibility of the courts, the organs of
state bear a paramount duty to ensure that legislative measures are executed accordingly.16
This article calls for ChatGPT to be human-centric and to maximise the benefits of AI
solutions while minimising lawyers’ risk and exposure to legal claims arising from AI-
related violations of fundamental human rights.17
The article suggests that the EU's AI Act is a framework that South Africa can consider in developing its legal guidelines for AI. It recommends different methods that South African
legal practitioners can use to verify the accuracy of the information generated by ChatGPT.
2. The impact of ChatGPT on the courts and the administration of justice
ChatGPT has generated false or misleading legal information that has been used in the
courts.18 In Michelle Parker,19 the court heard that an attorney used information generated
by ChatGPT20 without verifying the accuracy of the information generated by the
chatbot.21 The court inquired whether the information that was generated by the chatbot was, in fact, accurate. The court asked the defendant and the plaintiff to make a concerted effort to locate the suggested information22 or authority.23 The plaintiff's counsel conceded
13 Ibid.
14 Janus, R ‘Facebook’s AI Chatbot: “Since Deleting Facebook My Life has been Much Better”’
(2022) <https://www.vice.com/en/article/qjkkgm/facebooks-ai-chatbot-since-deleting-facebook-
my-life-has-been-much-better> accessed 3 September 2023: 'In 2016, Microsoft unleashed an AI chatbot called Tay, which was shut down after it turned into a racist, holocaust-denying conspiracy theorist after less than a day of interacting with users on Twitter.'
15 Janus, R 'Stack Overflow Bans ChatGPT for Constantly Giving Wrong Answers' (2022) <https://www.vice.com/en/article/wxnaem/stack-overflow-bans-chatgpt-for-constantly-giving-wrong-answers> accessed 3 September 2023; Roberto Mata v Avianca (note 10) para 4 (the court stated that 'five decisions submitted by plaintiff's counsel contain similar deficiencies and appeared to be fake as well'); Michelle Parker (note 1) para 87.
16 Constitution of the Republic of South Africa, 1996, s 165(4).
17 Ka Mtuze & Morige (note 7) 163.
18 Perlman (note 2) 18.
19 Note 1. The case was heard on 22 May 2023.
20 Para 86: 'the plaintiff's counsel explained that his attorney had sourced the cases through the medium of ChatGPT'.
21 Para 87. The court noted that the plaintiff's attorneys used the AI medium to conduct legal research and accepted the results without satisfying themselves about the accuracy of the information generated.
22 Para 79.
23 Para 80. The information or authority referred to was case law.
that the information could not be sourced,24 which resulted in the defendant’s attorneys
being unable to access the information generated by the chatbot.25 The court found that the citations, facts, and decisions generated by the chatbot were fictitious.26 The information
generated by the chatbot can not only mislead the court,27 it can also mislead any reckless
legal practitioner who uses and presents the generated information without verication.
In the United States, a similar incident occurred in Roberto Mata v Avianca Inc,28
where an attorney submitted six cases generated by the chatbot.29 The court confirmed that the citations and decisions in the submitted cases did not exist.30 The Roberto Mata
case showed that the use of information generated by ChatGPT impacts the courts, and
any legal practitioner who uses the chatbot must be aware that the information it generates
could be false.31
In South Africa, the East London Circuit Local Division held in Mzayiya v Road
Accident Fund 32 that legal practitioners must ensure that court documents such as
pleadings, affidavits, and heads of argument are authentic. If they mislead the court by
using ChatGPT, they risk committing the common-law crime of perjury.33 It is important
to note that when legal practitioners rely on information generated by ChatGPT, they
must remember their paramount duty to the courts and the administration of justice.34 In
General Council of the Bar of South Africa v Geach and Others35 it was correctly emphasised
legal practitioners’ duty towards the courts:
[Legal practitioners] are the beneficiaries of a rich heritage and the mantle of responsibility that they bear as the protectors of our hard-won freedoms is without parallel. As officers of our courts lawyers play a vital role in upholding the Constitution and ensuring that our system of justice is both efficient and effective. It, therefore, stands to reason that absolute personal integrity and scrupulous honesty are demanded of each of them ...36
Legal practitioners not only have a positive duty to be honest in the information they
present, but they must also uphold the Constitution, 1996 and ensure that justice is
24 Ibid.
25 Para 85.
26 Para 87.
27 Para 88. The defendants' counsel submitted that the attempt to provide fictitious information misleads the court and such an act must be met with an appropriate punitive cost order.
28 Note 10.
29 Paras 25, 27, 29; Maruf (note 10).
30 Para 2. e court stated that one of the six cases, Varghese, contained a decision with internal
citations and non-existent quotes.
31 Maruf (note 10). e attorney admitted that he was unaware that ChatGPT’s information could
be false.
32 Mzayiya v Road Accident Fund [2021] 1 All SA 517 (ECL).
33 Ibid.
34 Ibid. Chapter 8 of the Constitution provides for the protection of the courts and the
administration of justice.
35 2013 (2) SA 52 (SCA).
36 Para 87.
correctly administered even outside the courts.37 Section 165(4) of the Constitution
states that ‘[o]rgans of state, through legislative and other measures, must ensure that
independence, impartiality, dignity, accessibility and effectiveness of the courts are
protected’.38 In Mzayiya, the court explained the extent to which legal practitioners have a
legal duty towards the court as follows:
(i) They are encouraged to refuse client mandates that would jeopardise the administration of justice.39
(ii) They have a legal duty not to use evidence and legal points that mislead the court.40
(iii) They are required to conduct themselves with honesty and integrity.41
Legal practitioners must not commit an act of perjury or sabotage the administration of
justice42 by using the chatbot. In South Africa, information obtained or presented in a
manner that violates any right contained in the Bill of Rights of the Constitution must be
excluded or considered inadmissible if the admission of such information would render
the trial unfair or detrimental to the administration of justice.43 This means that when a legal practitioner presents or relies on incorrect information generated by ChatGPT without proper verification, they increase the risk of violating the public trust44 and the
rights protected by the Bill of Rights, such as a client’s right to a fair trial. Even though
such information can be excluded and deemed inadmissible by the courts, no procedural
guidelines or court precedents exist to emphasise that ChatGPT’s false information can
render a trial unfair or undermine the administration of justice, as illustrated in this
article.
Moreover, in many cases, excluding information obtained or presented in ways
that violate Bill of Rights protections is not absolute. Consequently, there is a significant
likelihood that courts may accept evidence or data generated by ChatGPT that is either
unconstitutional or illegally obtained. is may happen when the court cannot determine
whether the legal practitioner has verified the information.
37 Section 2 of the Constitution not only invalidates law and/or conduct that is inconsistent with
it but requires that any obligation imposed by it must be fulfilled.
38 Section 165(4) of the Constitution.
39 Mzayiya (note 32) para 93. See Van Eck, M ‘Duties of Practitioners to Court?’ (2022) available
at <https://www.derebus.org.za/duties-of-practitioners-to-court/> accessed 9 September 2023.
40 Mzayiya (note 32) para 83. In addition, the court stated that legal practitioners should not be
‘permitted to knowingly oer or rely on false evidence or to misstate evidence’. See Van Eck
(note 39).
41 Mzayiya (note 32) para 82; Van Eck (note 39).
42 Mzayiya (note 32) para 83. Kroon AJ conrmed that ‘[t]o suppress evidence or worse still to
suborn perjury, is to sabotage the administration of justice and it strikes at the heart of the legal
practitioner’s duty to court’.
43 Section 35(5) of the Constitution.
44 Mavundla v MEC: Department of Cooperative Governance and Traditional Affairs and Others
[2025] ZAKZPHC 2 para 24.
Currently, South African courts and the Legal Practice Council have the discretion to
verify and investigate the accuracy of information generated by ChatGPT and presented
by legal practitioners. In a recent South African case, Mavundla v MEC: Department of
Cooperative Governance and Traditional Affairs and Others,45 the High Court emphasised the need to scrutinise and investigate artificially generated fictional citations presented
by a legal practitioner in court. In investigating the accuracy of the information, Judge
Bezuidenhout appointed two legal researchers to verify the presented information, and
they discovered that the citations were from ChatGPT.46 In many cases, courts usually
regard information presented by legal practitioners as authoritative, meaning that the
courts usually trust that the information provided during court proceedings has been
verified before being presented.47
Thus, the court in the Mavundla case viewed the inclusion of false citations by legal practitioners as a serious threat to fundamental trust in the legal system.48 As a result, due to the implications of ChatGPT for the legal profession, the court referred the matter to the Legal Practice Council for further investigation,49 without providing a precedent or court judgment that can assist or influence the decisions made by courts in the future.
The Mavundla case illustrates that while precedent holds significant weight in common-law systems, the fabricated information generated by ChatGPT undermines the principles of legal reasoning and academic integrity, especially if the data is unverified and uncorrected by the courts. Therefore, as part of the investigation, courts should
also examine the conditions under which the information was acquired and potentially
establish common-law rules that govern the use of ChatGPT by legal practitioners without
infringing on rights, in accordance with section 36 of the Constitution.
Section 36(1) of the Constitution states that ‘the rights contained in the Bill of
Rights may be limited only in terms of the law of general application and to the extent
that such limitation is reasonable and justiable in an open and democratic society based
on human dignity, equality, and freedom’.50 In S v Naidoo,51 the court held that the law
of general application can be a statutory or a common-law rule52 that permits illegal
or unconstitutional information to be admitted.53 This is regarded as a constitutionally
permissible limitation of the rights enshrined in the Bill of Rights.54 However, in the
absence of statutory and common-law rules that regulate the use of ChatGPT-generated
data in South African courts, legal practitioners will continue to waste valuable judicial
time and resources.55
45 Note 44.
46 Para 50.
47 Para 20.
48 Para 24.
49 Ibid.
50 Section 36(1) of the Constitution, 1996.
51 1998 (1) SACR 479 (N).
52 S v Naidoo 1998 (1) SACR 479 (N) 500.
53 Schwikkard, P & Van der Merwe, S Principles of Evidence 4 ed (Juta & Co, 2015) 240.
54 Ibid.
55 Mavundla (note 44) para 16.
3. Ethics of Legal Practitioners’ Use of ChatGPT
In South Africa, the professional conduct of legal practitioners must be regulated,
and accountability must be ensured.56 The South African Legal Practice Council (the
Council) must uphold and advance the rule of law, the administration of justice and
the Constitution.57 The Council has developed a Code of Conduct that can be reviewed
and amended and applies to all legal practitioners in South Africa,58 particularly when
addressing the use of ChatGPT-generated data by legal practitioners in court.
In the Mzayiya case, regarding the Council’s Code of Conduct (the Code), the court
emphasised that legal practitioners should maintain the highest standards of honesty
and integrity by upholding the principles and values enshrined in the Constitution.59
This means that legal practitioners remain subject to their legal duty to the court,60 the interests of justice,61 and the observance of the law.62 Legal practitioners must
bear in mind article 57.1 of the Code before they use misleading information generated by
ChatGPT in courts. Article 57.1 of the Code states:
A legal practitioner shall take all reasonable steps to avoid, directly
or indirectly, misleading a court or a tribunal on any matter of fact or
question of law. In particular, a legal practitioner shall not mislead a
court or a tribunal in respect of what is in papers before the court or
tribunal, including any transcript of evidence.63
Even though the Mzayiya case did not refer to the Code, it correctly discussed principles
similar to those in the Code. Legal practitioners in South Africa are governed by the Constitution, court judgments, the Legal Practice Act, and the Code; under these sources, legal practitioners are accountable for recklessly presenting misleading information generated by ChatGPT during court proceedings.64 Recklessness affects the
courts and the administration of justice; therefore, this conduct infringes on the rights
conferred by the Constitution.65
Even though the Legal Practice Act and the Code do not make provision for an
incident where the legal practitioner provides misleading ChatGPT information in court,
the Mzayiya case confirmed that a counsel who is dishonest is guilty of misleading the court and is thus party to perjury.66 For example, in the Michelle Parker case, the defendant's counsel alleged that the plaintiff's counsel attempted to mislead the court; the court
56 Legal Practice Act 28 of 2014, long title – summary of the scope and purpose of the legislation.
57 Section 5 of the Legal Practice Act.
58 Section 36(1) of the Legal Practice Act.
59 Articles 3.1 and 3.2 of the Legal Council’s Code of Conduct 2019.
60 Articles 3.3 and 3.3.1 of the Code.
61 Article 3.3.2 of the Code.
62 Article 3.3.3 of the Code.
63 Article 57.1 of the Code.
64 Wilson (note 1) refers to Michelle Parker (note 1).
65 Section 2 of the Constitution, 1996 not only invalidates law and/or conduct inconsistent with it
but requires that any obligation imposed by it must be fulfilled.
66 Mzayiya (note 32).
highlighted that such an act must be met with a punitive costs order, and the attorney’s
conduct must be reported to the Legal Practice Council.67
While the court has mentioned the possibility of a lenient sanction, it is essential to
recognise that cases involving AI must be addressed on a case-by-case basis. This need
stems from the complex nature of AI and its unique inuence on the administration
of justice. For example, although legal practitioners relied on incorrect information
generated by ChatGPT in the Michelle Parker case, that information was not presented
in court. In contrast, the Mavundla case involved a candidate attorney using the chatbot,
with the principal presenting that information in court. Each case presents a distinct issue,
indicating that different sanctions may be appropriate. The question remains: can the use
of misleading information generated by ChatGPT be considered perjury?
4. Sanctions for the use of misleading information generated by ChatGPT
In the Michelle Parker case, the court warned that sometimes legal practitioners can become
careless and overly zealous. This can lead them to place excessive trust in the integrity of legal research generated by AI, often recklessly.68 This leaves the court with no option but
to question the intent of the legal practitioner presenting the misleading information.69
Legal practitioners must be held liable even if they did not intend to mislead the court.
This was elaborated upon in the Michelle Parker case, where the court stated:
Although the plaintiff's attorneys did not intend to mislead anyone, the inevitable result of his debacles was that the defendant's attorneys were indeed misled into thinking that these authorities were real. As a result, they would have invested a significant amount of time and effort in their futile attempts at tracking down these cases ...70
In the Michelle Parker case, the court did not decide whether a legal practitioner who shares
incorrect and/or misleading information should be found guilty of perjury. Still, the court
confirmed that the embarrassment associated with a legal practitioner relying on non-existent information generated by the chatbot does call for punishment.71 The acts of dishonesty and misleading the court may be seen as perjury.72 Snyman states that '[p]erjury consists in the unlawful and intentional making of a false statement in the course of a judicial proceeding by a person who has taken the oath or made an affirmation before, or who has been admonished by, somebody competent to administer or accept the oath, affirmation or admonition'.73
Snyman's definition of perjury does not extend to false statements, affidavits, declarations or case law authority, especially when a legal practitioner shares a list of information generated by ChatGPT with the other party.74 Section 9 of the Justices of the
67 Michelle Parker (note 1) para 89.
68 Ibid.
69 Para 91.
70 Ibid.
71 Ibid.
72 Mzayiya (note 32).
73 Snyman, C Criminal Law 6 ed (LexisNexis, 2014) 332.
74 The court in Michelle Parker (note 1) para 81 stated that the plaintiff's attorneys forwarded a list of cases to the defendant's attorney as constituting the authorities.
Peace and Commissioners of Oaths Act (JPCOA)75 extends the scope of false statements
to affidavits and declarations.
Section 9 of the JPCOA states that '[a]ny person who, in an affidavit, affirmation or solemn or attested declaration made before a person competent to administer an oath or affirmation or take the declaration in question, has made a false statement knowing it to be false, shall be guilty of an offence and liable upon conviction to the penalties prescribed by law for the offence of perjury'.76 Section 9 of the JPCOA is broader than Snyman's definition as it includes affidavits, affirmations and declarations but does not state whether these affidavits, affirmations, declarations and false statements should be made during judicial proceedings or outside of court.77 In providing clarity, a statement constituting perjury can be verbal or an affidavit.78 Snyman confirms that the statement (an affidavit or verbal
statement) does not have to be material to the issue to be decided in the court proceedings;
however, there must be an inference from the evidence presented during the judicial
proceedings.79
In the Michelle Parker case, the court confirmed that the plaintiff's legal team did not submit the information (list of cases) generated by ChatGPT to the court as binding authorities.80 The list of non-existent cases was only sent to the other party's legal practitioner as the authority that the plaintiff's legal team would rely on during the court proceedings.81 The court confirmed that if it had been satisfied that the plaintiff's legal team had attempted to mislead the court, the outcome would have been far more severe.82 Perhaps such an act would have constituted perjury.
Unlike the Michelle Parker case, in the Roberto Mata83 case in the USA, the plaintiff's legal representatives submitted a legal brief that included six non-existent judicial opinions and citations before receiving an order from the court.84 The court affirmed that it was presented with an unprecedented circumstance.85 A legal practitioner unlawfully and intentionally used false information generated by ChatGPT during the
75 16 of 1963.
76 Section 213(6) of the Criminal Procedure Act 51 of 1977 states: 'Any person who makes a statement which is admitted as evidence under this section and who in such statement wilfully and falsely states anything which, if sworn, would have amounted to the offence of perjury, shall be deemed to have committed the offence of perjury and shall, upon conviction, be liable to the punishment prescribed for the offence of perjury.'
77 Raith Gourmet Trading (Pty) Ltd v Halligan; In re: FOCSWU obo Davids and Others v De Kock and Others [2014] ZALCJHB 259 para 25. A false statement made in the course of judicial proceedings has been held to include an affidavit 'which the law permits to be used in judicial proceedings as evidence, for example, in motion proceedings or in actions where the Court or the law permits evidence to be given by means of affidavits'. See Snyman (note 73) 333.
78 Snyman (note 73) 332.
79 Ibid 332; R v Matakane 1948 (3) SA 384 (A) paras 391–393; R v Wallace 1959 (3) SA 828 (R)
paras 829–830.
80 Michelle Parker (note 1) para 89.
81 Ibid.
82 Ibid.
83 Roberto Mata (note 10).
84 Paras 24, 25, 27 and 29.
85 Para 21; Maruf (note 10). Judge Kevin Castel of the Southern District of New York wrote: 'The court is presented with an unprecedented circumstance.'
judicial proceedings. The court found that the lawyers acted in bad faith and made false and misleading statements to the court, while they could have avoided such conduct.86 The judge ordered the legal practitioners to pay a fine of $5,000.87
In the Michelle Parker case, the court also granted a costs order, which the
defendant sought as an appropriate sanction.88 Even though the dispute was not about
legal practitioners misleading the court, this case is relevant. In South Africa, the only available sanctions for the use of incorrect, non-existent and misleading information generated by ChatGPT in court are those dealing with perjury89 or possible referral to the Legal Practice Council for misconduct,90 both of which are too lenient. In both criminal
and civil proceedings,91 a person who knowingly makes a false statement in an affidavit is guilty of an offence, with penalties that are the same as those for perjury.92 Section 28
of the Criminal Procedure Act (CPA) partly governs the substantive consequences of the
unlawful act of giving false information.93 Section 28(2) of the CPA states that
Where any person falsely gives information on oath under section
21 (1) or 25 (1) and a search warrant or, as the case may be, a warrant
is issued and executed on such information, and such person is in
consequence of such false information convicted of perjury, the court
convicting such person may, upon the application of any person who
has suffered damage in consequence of the unlawful entry, search or
seizure, as the case may be, or upon the application of the prosecutor
acting on the instructions of that person, award compensation in
respect of such damage, whereupon the provisions of section 300
shall mutatis mutandis apply with reference to such award…94
This means that evidence led during court proceedings assists the court in determining
the amount to be awarded. Perjury is sanctioned on a case-by-case basis, depending on
the matter in dispute, and is evidence-led.95 Legal practitioners should never present
86 Merken, S ‘New York lawyers sanctioned for using fake ChatGPT cases in legal brief’ (2023)
<https://www.reuters.com/legal/new-york-lawyers-sanctioned-using-fake-chatgpt-cases-
legal-brief-2023-06-22/> accessed 12 September 2023.
87 Ibid. $5,000 equals R94762.60.
88 Michelle Parker (note 1) para 91. The court stated that the embarrassment associated with an incident of a legal practitioner relying on non-existent cases was sufficient punishment for the plaintiff's attorney.
89 Mzayiya (note 32); Michelle Parker (note 1); s 9 of the JPCOA; s 319(3) of the CPA.
90 Mzayiya (note 32) para 94; Michelle Parker (note 1): ‘If a legal practitioner makes himself guilty
of misconduct, then an obligation rests on the professional organisation which has jurisdiction
over the practitioner, the Legal Practice Council, to bring the errant practitioner to book, it is
the watchdog of the profession.’
91 Talacar Holdings (Pty) Ltd v City of Johannesburg Metropolitan Municipality and Others [2023]
ZAGPJHC 250 para 16.
92 Pete, S, Hulme, D & Du Plessis, M et al Civil Procedure: A Practical Guide (Juta & Co, 2016) 194.
93 Swanepoel, JP, Joubert, JJ & Terblanche, SS Criminal Procedure Handbook 12 ed (Juta & Co,
2020) 188.
94 Section 28(2) of the CPA.
95 S v Ncamane [2019] ZAFSHC 220. The accused was convicted of a contravention of s 9 of
the JPCOA. The court sentenced the accused to six months’ imprisonment which was wholly
suspended for five years on condition that he was not convicted of perjury again.
AI-generated information, such as that produced by ChatGPT, under oath without proper
verication, as doing so could amount to perjury and undermine the integrity of the legal
profession in South Africa. In Rondel v Worsley,96 Lord Denning emphasised that legal
practitioners must be loyal to a higher cause: truth and justice.97 This means that a legal
practitioner must not consciously misstate the facts or knowingly conceal the truth.98 The
only way a legal practitioner can be loyal to his or her legal duty to the courts is to verify
the information before presenting it.
5. Limited methods to verify information generated by ChatGPT
ChatGPT can be regarded as a helpful tool,99 but the answers provided by the chatbot are
not always reliable.100 In the age of instant gratification from technology, legal practitioners
should not forget good old-fashioned independent reading.101 Independent reading must
be infused with the efficiency of modern technology.102 That will enable legal practitioners
to verify the accuracy of the information generated by the chatbot.
In the Parker case, the court confirmed that legal practitioners must come to court
with a legally independent mind that does not merely repeat the unverified research of a
chatbot.103 Legal practitioners must also use ChatGPT as a starting point rather than the
final product; legal practitioners must ensure that they clearly define tasks and validate
information against credible sources involving human expertise and interpersonal
interaction,104 such as the Bing search engine, which improves ChatGPT.105 Apart from
using Bing to improve the quality and accuracy of information generated by ChatGPT,106
technical measures should also be considered. Whenever data collection and processing
96 [1966] 3 All ER 657 (Eng CA).
97 At 665; Mzayiya (note 32) para 89.
98 Ibid.
99 Frackiewicz, M ‘The Advantages and Limitations of Using ChatGPT for Legal Research and
Analysis’ (2023) <https://ts2.space/en/the-advantages-and-limitations-of-using-chatgpt-for-
legal-research-and-analysis/> accessed 17 September 2023: ‘ChatGPT can be used to automate
the summarisation of legal documents, allowing researchers and analysts to quickly grasp the
key points of a document without spending hours reading through it.’
100 Ibid; Manie, A ‘My Learned Bot – ChatGPT can be a Powerful Tool for Lawyers, but Caution
is the Name of the Game’ (2023) <https://www.dailymaverick.co.za/opinionista/2023-02-12-
my-learned-bot-chatgpt-can-be-a-powerful-tool-for-lawyers-but-caution-is-the-name-of-the-
game/> accessed 17 September 2023: ‘ChatGPT’s responses can be incomplete, inaccurate and
dangerously misleading at times.’
101 Michelle Parker (note 1) para 90.
102 Ibid.
103 Ibid.
104 Manie (note 100). The author sets out strategies to enhance the effectiveness and address the
limitations of ChatGPT in the legal industry.
105 Perlman (note 2).
106 Ibid; Ortiz, S ‘What is ChatGPT and Why Does It Matter? Here’s What You Need to Know’ (2023)
<https://www.zdnet.com/article/what-is-chatgpt-and-why-does-it-matter-heres-everything-
you-need-to-know/> accessed 17 September 2023. Due to ChatGPT’s data being limited up to
2021, the chatbot was unaware of events or news that had occurred since then. This problem
was addressed through ChatGPT’s integration into Bing as its default search engine. Bing is a
plugin that gives ChatGPT the ability to index and provide citations.
are deemed illegal, such data should not be used and must be excluded, particularly if it
produces discriminatory results.107 is indicates that legal practitioners require not only
a solid understanding of the law but also interdisciplinary skills to adapt to technological
advancements. They should be mindful of when to employ tools like ChatGPT,
acknowledging that it may occasionally contain outdated or fabricated information.108
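The verification workflow described above can be expressed as a simple routine. The sketch below is purely illustrative and assumes a hypothetical trusted source: `TRUSTED` stands in for a real law-report index (such as a subscription database), which a practitioner would query instead of a hard-coded list. Any citation in a chatbot's answer that cannot be confirmed in the trusted source is flagged for manual checking before it goes anywhere near a court filing.

```python
def flag_unverified_citations(chatbot_citations, trusted_database):
    """Return the citations that could not be confirmed in a trusted source.

    A flagged citation is not necessarily fabricated, but it must be
    independently verified by the practitioner before being relied on.
    """
    verified = {c.strip().lower() for c in trusted_database}
    return [c for c in chatbot_citations if c.strip().lower() not in verified]


# Illustrative data only: a stand-in for a real law-report index.
TRUSTED = [
    "Rondel v Worsley [1966] 3 All ER 657",
    "S v Ncamane [2019] ZAFSHC 220",
]

# Citations as they might appear in a chatbot's answer; the second is
# a hypothetical, fabricated-looking citation used for illustration.
ANSWER_CITATIONS = [
    "Rondel v Worsley [1966] 3 All ER 657",
    "Smith v Jones [2021] ZAGPJHC 999",
]

suspect = flag_unverified_citations(ANSWER_CITATIONS, TRUSTED)
print(suspect)  # → ['Smith v Jones [2021] ZAGPJHC 999']
```

A real implementation would match on normalised citation fields rather than exact strings, but the principle is the same: the chatbot's output is a starting point, and anything it cites must survive a check against an authoritative source.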
The data produced by ChatGPT represents a serious threat to the integrity of our courts
and the administration of justice, as well as to the work of legal researchers and practitioners.
To protect the legal profession from these dangers, the courts must take decisive action
by imposing significant sanctions on legal practitioners who irresponsibly mislead them.
However, without legislation, neither legal professionals nor the South African courts will
be able to maintain the integrity of the legal system and uphold justice for all. ChatGPT’s
harmful and misleading information will remain unregulated, and legal practitioners who
use ChatGPT will not know how to safeguard their rights and obligations.
6. Steps for developing a regulatory framework for ChatGPT in the
legal profession
In formulating a regulatory framework for ChatGPT, we should consider the key regulatory
issues in the European Union (EU), the first jurisdiction to enact AI legislation.109 The
EU’s AI framework can provide valuable insights for developing an AI legal framework
in South Africa.110 European authorities have implemented a comprehensive approach to
safeguarding ChatGPT users and upholding their rights. This strategy not only protects
users, including legal professionals, but also ensures that the chatbot’s developers and
providers are held accountable.
The EU AI Act111 does not explicitly define ChatGPT; however, it generally defines AI
systems as machine-based systems designed to operate with varying levels of autonomy.112
The Act also offers a neutral definition of a general-purpose AI system, which refers to an
AI system based on a general-purpose AI model capable of serving various purposes, for
both direct application and integration into other AI systems.113
According to these definitions, ChatGPT is a versatile general-purpose AI system. Its
ability to generate text tailored to user prompts in diverse contexts showcases its flexibility
and power. Moreover, its functionality allows for seamless integration with other AI
systems, making it an invaluable tool for various applications.114
107 Terzidou, K ‘Generative AI for the Legal Profession: Facing the Implications of the Use of ChatGPT
Through an Intradisciplinary Approach’ (2023) <https://www.medialaws.eu/generative-ai-for-
the-legal-profession-facing-the-implications-of-the-use-of-chatgpt-through-an-intradisciplinary-
approach/> accessed 23 January 2025.
108 Ibid.
109 Hickman, T, Lorenz, S & Teetzmann, C ‘Long Awaited EU AI Act Becomes Law after Publication
in the EU’s Official Journal’ (2024) <https://www.whitecase.com/insight-alert/long-awaited-
eu-ai-act-becomes-law-after-publication-eus-official-journal> accessed 24 January 2025.
110 Ka Mtuze & Morige (note 7) 168.
111 European Union Artificial Intelligence Act 2024/1689.
112 Article 3(1) of the EU AI Act.
113 Article 3(66) of the EU AI Act.
114 Terzidou (note 111).
The definitions are essential as South Africa starts to establish a legal framework for
AI. Incorporating ChatGPT into forthcoming South African legislation will assist legal
practitioners in evaluating the associated risks and enable them to determine whether
the chatbot poses an unacceptable, high, limited or minimal risk,115 as outlined in article
6 of the EU AI Act. Article 6 of the EU AI Act effectively outlines the classification rules
for AI systems deemed to be high-risk, making clear reference to Annex III of the EU AI
Act.116 This structured approach enhances transparency and accountability in deploying
advanced AI technologies. These classifications do not explicitly label ChatGPT as a high-
risk AI system. However, since the chatbot is employed by judicial authorities to research
and interpret facts and legal matters, the Act recognises that these intended uses can
affect the administration of justice.117 When a chatbot produces frivolous information, a
considerable risk is involved.
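The Act's tiered approach can be illustrated with a short sketch. This is a simplification for exposition, not a legal determination: the `HIGH_RISK_USES` set below is a hypothetical stand-in for the Annex III use cases, and the default tier assigned to a general-purpose chatbot is an assumption made for the example.

```python
from enum import Enum


class RiskTier(Enum):
    """The four risk levels contemplated by the EU AI Act."""
    UNACCEPTABLE = "unacceptable"  # prohibited AI practices
    HIGH = "high"                  # Annex III use cases, stricter obligations
    LIMITED = "limited"            # transparency obligations only
    MINIMAL = "minimal"            # no additional obligations


# Hypothetical, simplified stand-in for the Annex III high-risk use cases
# relevant to this article (administration of justice).
HIGH_RISK_USES = {
    "administration of justice",
    "judicial research and interpretation of facts and law",
}


def classify_intended_use(intended_use: str) -> RiskTier:
    """Rough illustration: a general-purpose chatbot is treated as
    minimal-risk by default, but a high-risk intended use (such as
    assisting a judicial authority) pulls it into the high-risk tier."""
    if intended_use.strip().lower() in HIGH_RISK_USES:
        return RiskTier.HIGH
    return RiskTier.MINIMAL


print(classify_intended_use("administration of justice").value)  # → high
```

The point of the sketch is the one the article makes in prose: the same system can fall into different tiers depending on its intended use, so a practitioner deploying ChatGPT for legal research is operating in a context the Act treats as capable of affecting the administration of justice.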
We recommend that South Africa should prioritise several essential components
in developing its AI legal framework: transparency, human oversight, mandatory risk
management and enhanced content moderation.118 Furthermore, as outlined in article
16 of the EU AI Act, chatbot providers should be held liable and accountable for the
information their systems generate.119
7. Conclusion
Legal practitioners’ increasing reliance on ChatGPT in South Africa presents a serious
challenge that cannot be ignored. Without appropriate legislation, regulations or official
policies governing the ethical use of this technology, the legal profession is at risk.120
Developing frameworks to protect the integrity of legal practice is crucial. To achieve this,
we urgently need more legal literature to inform and guide the creation of robust policies
and ethical standards. Acting now is essential for the future of the country’s legal profession.
Strict measures and sanctions are needed to hold legal practitioners accountable121 for
any negligent and misleading actions. A coordinated approach is essential to ensure the
effective use of chatbots while safeguarding the administration of justice and human rights.
The Michelle Parker and Mavundla cases emphasise that legal practitioners must remember
their obligations to the courts and the administration of justice during legal research.
South Africa should use the EU AI Act as a model to create a robust AI framework. This
framework will ensure accountability for legal practitioners and those who provide and
develop chatbots. Embracing this approach will foster responsibility and trust in our society
regarding AI technology.
115 Ibid.
116 Article 6(2).
117 Annex III, point 8(a) of the EU AI Act.
118 Article 2 of the EU AI Act.
119 Article 16 of the EU AI Act.
120 Ka Mtuze & Morige (note 7) 167.
121 Brand, D ‘Responsible Artificial Intelligence in Government: Development of a Legal Framework
for South Africa’ (2022) 14 eJournal of eDemocracy 146; Gless, S ‘AI in the Courtroom: A
Comparative Analysis of Machine Evidence in Criminal Trials’ (2020) 51 Georgetown Journal
of International Law 251. Gless believes that the authenticity and legitimacy of AI should be
human-centred.
How to cite:
Isiphile Petse and Usenathi Phindelo ‘The Influence of ChatGPT-generated Data on the
Administration of Justice in South Africa’ (2025) 5 Turf Law Journal 1-14.