Report on AI Chatbots’ Impact on Writing Centers
Nathan Lindberg, Amanda Domingues, Thari Zweers, Selin Goktas
Cornell’s English Language Support Office
ChatGPT’s rapid rise has surprised educators. Conversations about its impact are seemingly
ubiquitous at conferences, on social media, and on listservs. Some educators worry that students
are using ChatGPT to complete their assignments, and teachers have no reliable way of
determining authorship. However, some argue educators must promote the use of artificial
intelligence (AI) chatbots, such as ChatGPT, because students who do not use them are at a
disadvantage compared to those who do.
Undoubtedly, AI chatbots will have, or are already having, an effect on writing centers. AI chatbots could potentially replace tutoring/consulting sessions, and writing centers could see a drop in appointments made. Tutor/consultant training may need to include working effectively with AI chatbots and raising awareness of ethical issues, such as using AI chatbots without acknowledgment. Understanding the current sentiments of writing center interested parties can help assess the situation and inform planning. However, few studies so far have focused on such sentiments.
This report provides perspectives on how writing centers are being affected by AI chatbots. It presents data from an IRB-approved survey distributed on professional listservs and social media groups as well as through personal contacts. To ensure the report’s timely release, it has not gone through peer review. Later, the data from this study will be considered alongside findings from an ongoing study focused on students’ perspectives. The result will be a research article intended for publication.
From this current study, the main findings are as follows:
• Writing center administrators are feeling generally negative about the impacts of AI
chatbots.
• Generally, writing center administrators do not personally use AI chatbots or only use
them a little.
• Writing center administrators predict AI chatbots will change the way writing centers
work with clients and how tutors/consultants are trained; however, the changes will not
be major.
• Institutions are still developing policies for AI chatbots. Most have division-level policies (e.g., for programs or departments) or policies that individual professors have created, rather than university-wide policies.
• Institutions are providing support for faculty to understand and work with AI chatbots.
• Writing center clients are using AI chatbots for a variety of reasons, none of which are
dominant.
This report continues with a brief overview of germane literature, followed by the survey
methods and findings. At the end of the report, the Conclusions and Discussions section includes
some ideas on how writing center administrators can approach working with AI chatbots. These
ideas will be expanded later in the article written for publication. In the meantime, we welcome
any suggestions about changes or additions to this report that could be made for the article.
Please, send them to Nathan Lindberg at nwl26@cornell.edu.
Relevant Literature
Invented in the 1960s, AI chatbots are computer programs that use natural language to
communicate with users (Shawar & Atwell, 2007). Recently, AI chatbots have exploded in
popularity due to the success of ChatGPT, a large language model chatbot. Launched in
November 2022, ChatGPT reached a million users in five days, a feat that took the social media
site Instagram 15 times longer (Chartr, 2022). While a variety of other AI chatbots have been or are being introduced (e.g., Google’s Bard), ChatGPT is currently the most popular.
ChatGPT has had a significant impact on education, in particular on writing. Students use it to come up with ideas, create outlines, and even write content. In reaction, some schools have banned using ChatGPT (Yadava, 2023), but, as yet, teachers have no reliable way of detecting AI-generated text (Terry, 2023). Trends indicate that AI chatbots are only going to become more effective and ubiquitous writing tools.
Using AI chatbots to assist with writing can be problematic. The texts they generate can look like
academic writing, but contain mistakes or even fabricated information (Ali & Singh, 2023).
Beyond inaccuracies, teachers and students have expressed concerns that AI chatbots are
negatively impacting education and replacing humans (Shoufan, 2023). Additional concerns are
that they could create a divide between those who can afford the technology and those who
cannot (Yan, 2023) and that they may further solidify the use of colonial languages (e.g.,
Standard English) and, thus, the coloniality of power (Madianou, 2021).
Despite these challenges, using AI chatbots for writing has advantages. AI chatbots can function
like a personal tutor, accessible any time for any question (Rahman & Watanobe, 2023). Having
such a tutor can create a more equitable environment for those who have difficulty adjusting
their written accent, such as an English as an additional language (EAL) student writing in North
American academic English (Teubner et al., 2023).
Though there are fears that AI chatbots might replace writing instructors, we might instead look
at them as a powerful tool that we can enable our students to use. In fact, it has been argued that
teachers are ethically obligated to teach the use of AI chatbots, or they risk disadvantaging their
students (Jeon & Lee, 2023).
Methods of this Study
Data for this study were gathered using an IRB-approved survey. The survey was sent out in August 2023 on several writing center professional listservs and posted on a writing center Facebook group. There were 98 responses, but 23 were deleted because either no input was given or only the first item was addressed (i.e., “What is your role with the writing center(s) at your institution?”). Nine others were only partially completed; however, they contained relevant responses, so they are included in this study. The other 66 responses were complete.
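To illustrate the screening logic, a minimal sketch in Python follows. This is only an illustration of the rules described above, not the actual cleaning procedure; the file name (survey_export.csv) and the column layout (a "role" column for the first item) are assumptions made for the example.

import pandas as pd

# Hypothetical export of the raw survey responses.
responses = pd.read_csv("survey_export.csv")

# Rule 1: drop responses with no input, or with only the first item ("role") answered.
substantive_cols = [c for c in responses.columns if c != "role"]
screened = responses[responses[substantive_cols].notna().any(axis=1)]

# Rule 2: keep partially completed responses, but flag them so complete and
# partial responses can be counted separately (66 complete, 9 partial above).
screened = screened.assign(
    status=screened[substantive_cols].notna().all(axis=1).map({True: "complete", False: "partial"})
)
print(screened["status"].value_counts())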
Survey participants primarily identified as writing center administrators (n=62). In addition, nine
identified as consultants or tutors, and three identified as interested parties (e.g., writing faculty).
The participants are affiliated with 65 institutions, five of which were listed twice. However, the responses from duplicated institutions were often different, even for questions that have a definitive answer, such as Does the Institution You Are Affiliated with Have Policies on Using AI Chatbots? We felt answers reflected opinions and perceptions, not definitive truths, so we included all responses from duplicate institutions.
Participants were asked to identify the primary and non-primary clients of the writing centers
they are affiliated with. Most primarily serve undergraduate students (n=55) and, second,
graduate and professional students (n=13). Their non-primary clients are graduate and professional students (n=38), postdocs (n=23), faculty (n=23), other (n=17), none (n=14), and
undergraduates (n=7).
Results
Participants’ Perceptions and Experiences with AI chatbots
Participants were asked how they generally felt about the impact of ChatGPT and other AI
chatbots on the writing center. As seen in Figure 1, most sentiments were negative (n=36),
though 18 were neutral. Only 12 felt positive. None of the participants chose “there is no
impact.”
Figure 1
Question: Generally, How Do You Feel About the Impact of ChatGPT and Other AI Chatbots on Writing Centers?
[Figure 1 data: very negative 10; a little negative 26; neutral 18; a little positive 9; very positive 3; there is no impact 0]
Participants were asked to elaborate. A little more than half of the 47 who responded believe that AI chatbots are useful and can allow writing centers to focus on different tasks, such as spending less time looking for “mistakes” and more time on using language creatively. Approximately
half of the others are not as optimistic. These respondents are worried AI chatbots will negatively
impact the learning process and fear that students will increasingly rely on AI chatbots to do their
work. One respondent even argued that the reliance on these kinds of technologies may diminish
people's cognitive skills.
Participants were asked if they used ChatGPT or other AI chatbots personally. The majority do
not or only use them a little (n=55). Only four reported using them a lot (Figure 2).
Figure 2
Question: Do You Personally Use ChatGPT or Other AI Chatbots?
[Figure 2 data: no 23; yes, a little 32; yes, sometimes 8; yes, a lot 4]
Respondents were asked to elaborate. Some indicated that they use ChatGPT to learn how
it works. Some stated they use it to brainstorm and/or help with designing classes or creating
training materials. Of those who do not use AI chatbots, most declared they do not have a need to. However, two raised ethical concerns, and others stated that AI chatbots are too limited in output and reliability. Two others raised issues of privacy and were reluctant to create an
account.
Institutions’ Approach to AI Chatbots
Participants were asked if the institution they are affiliated with has policies for using AI
chatbots. While the most common answer was yes (n=24), the policies were only for specific “parts” (e.g., programs, individual professors). Thirteen indicated that there are no policies, but 11
thought policies were being developed. Only seven indicated their institution has university-wide
policies (Figure 3).
Figure 3
Question: Does the Institution You Are Affiliated with Have Any Policy/Policies About Using AI Chatbots?
[Figure 3 data: no 13; no, but being developed 11; yes, but only for parts 24; yes, university-wide 7; I'm not sure 1; other 12]
Participants were invited to elaborate. Of the 46 who responded, most stated that the institution
they are affiliated with allows individual faculty members to decide their own policies. Some
stated that administrators have developed policies both in favor of and against the use of AI
chatbots, highlighting ethical concerns and the need for transparency. Most stated that the
policies being developed, or already in practice, go hand in hand with existing policies, such as
those on plagiarism.
Even though policies at institutions may not have been fully developed, 42 participants indicated
that their institution provides support, while only eight indicated theirs does not (Figure 4).
Figure 4
Question: At the Institution That You Are Affiliated With, Is There Support (e.g., Workshops, Handouts, Lists of Resources) for Using ChatGPT or Other AI Chatbots?
[Figure 4 data: no 8; no, but support is being developed 11; yes 42; I'm not sure 6]
Respondents were asked to elaborate. Of the 46 who responded, most mentioned support from
teaching centers (e.g., center for teaching innovation), and some from IT departments. A few
mentioned that the library is involved. The most common form of support seems to be
workshops, but some respondents mentioned informal conversations with faculty members.
Clients Using AI Chatbots
Participants were asked if their writing center clients were using AI chatbots. Fifteen indicated that they were not, 37 thought they were, and 20 were not sure. That so many were unsure suggests that administrators had either not explored the issue or had found their evidence inconclusive.
The 37 respondents who thought clients were using ChatGPT were asked what they were using it
for. The most popular answer was “coming up with ideas” (e.g., brainstorming topics, outlining,
drafting text) (n=28), followed by “adjusting for second language issues” (e.g., non-native
phrasing/vocabulary, non-native syntax) (n=26), and then “writing low-stakes texts” (e.g.,
emails, minor class assignments) (n=25). Generally, however, no single answer was dominant
(Figure 5).
Figure 5
Question: What Do You Think Clients Are Using Them for? (Choose as Many as Apply.)
[Figure 5 data: proofreading 20; adjusting for second language issues 26; adjusting "tone" or "style" 15; ideas 28; writing low-stakes content 25; writing high-stakes content 21; reading 19; other 3; I'm not sure 8]
Note: Choices not discussed above and their parentheticals are as follows:
“proofreading” (e.g., checking for mistakes),
“tone or style” (e.g., formal/informal, academic/general audience),
“high-stakes texts” (e.g., major class assignments, job applications, research articles, grant
proposals), and
“reading” (i.e., summarizing/explaining texts).
Participants were asked what evidence they based their answers on, and 12 indicated it was
anecdotal, i.e., from conversations with faculty, clients, and/or tutors/consultants. Only four
indicated they had surveyed clients.
Impacts of AI Chatbots on Writing Center Appointments
If AI chatbots can be used as a personal writing assistant (Rahman & Watanobe, 2023), then it
stands to reason that clients might be using an AI chatbot instead of going to a
tutoring/consulting session. Thus the number of appointments made would decline. However,
this idea was only partially supported. For the academic year 2022-2023, 23 participants reported
a decrease in appointments made, while 24 reported an increase, and 22 indicated the number of
appointments was about the same.
Those who saw a decline were asked how much. Rates ranged from 1% to 70%, with an average
decline of 24%. Participants were asked if the decline was caused by clients using ChatGPT
instead of tutoring/consulting; five indicated no, six indicated yes, and 12 were not sure. When
asked to further explain declines, 12 indicated they were due to the pandemic (e.g., burnout,
disruption, services slow to recover). Other reasons included a decrease in student enrollment
and budget cuts, both of which led to fewer appointments being offered.
Those who saw an increase in appointments made were asked to estimate how much. Rates
ranged from 3% to 41%, with an average of 16%. Eleven participants surmised the cause of the
increase was again primarily pandemic related (e.g., services were rebounding when physical
spaces were opened). Other reasons included increased budgets and student enrollment.
Determining the cause of an increase or decrease in appointments is difficult because there are many variables involved, including some outside the writing center (e.g., school enrollment, budgets). For future studies, rewording the survey item could help eliminate some variables. For example, asking whether occupancy (appointments filled relative to appointments offered) declined or rose might yield more germane data.
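To make the suggested measure concrete, the sketch below shows how an occupancy rate could be computed and compared across years. The appointment counts are hypothetical, chosen only to illustrate how raw appointment totals and occupancy can move in opposite directions.

# Occupancy = appointments filled / appointments offered (hypothetical counts).
offered = {"2021-2022": 1200, "2022-2023": 1000}  # slots the center made available
filled = {"2021-2022": 900, "2022-2023": 780}     # slots clients actually booked

for year in offered:
    occupancy = filled[year] / offered[year]
    print(f"{year}: {filled[year]} of {offered[year]} slots filled ({occupancy:.0%} occupancy)")

In this hypothetical case, appointments made drop from 900 to 780, yet occupancy rises from 75% to 78% because fewer slots were offered; the decline reflects capacity, not demand. Framing the survey item around occupancy would help separate such capacity effects (budgets, staffing) from a genuine shift in clients' demand for sessions.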
AI Chatbots’ Impact on Working with Clients and Training Tutors/Consultants
Participants’ answers to two survey questions indicate that they believe AI chatbots will change
aspects of the writing center. In the first question, participants were asked if they thought AI
chatbots would change the way writing centers work with clients. Fifty-three indicated they did,
16 were not sure, and only two reported they did not (Figure 6). Again, it is notable that so many
were not sure, implying they had not investigated the matter or that they had and the results were
inconclusive.
Figure 6
Question: Do You Think ChatGPT or Other AI Chatbots Will Change the Way Writing Centers Work with Clients?
[Figure 6 data: no 2; yes 53; I'm not sure 16]
Participants were asked to elaborate. Of the 49 who responded, almost all mentioned that
tutors/consultants will need to discuss with clients the ethics of using AI chatbots, specifically,
being transparent about using them and university policies on the matter. In terms of the writing
center, there seems to be a consensus that AI chatbots will significantly change the way things
are done, especially how tutors/consultants are trained and the feedback that clients expect or
need. For example, there will be more need for feedback on ideas and structure and less for line-
by-line editing. The consensus also seems to be that writing centers need to make students aware
of the limitations of the AI chatbots and teach clients how to use them critically. The most
pessimistic sentiments were from two respondents who thought that AI chatbots will bring the
end of writing centers. Another particularly notable response was, "It's not just AI itself that
changes the situation; it's how faculty react to it and what policies institutions adopt about it.”
In the second question, participants were asked if they thought AI chatbots would change the
way writing center tutors/consultants are trained. Fifty-five indicated yes, 11 were not sure, and
three indicated that they would not (Figure 7). Notably, those who were unsure represent about 16% of the respondents.
Figure 7
Question: Do You Think ChatGPT or Other AI Chatbots Will Change the Way Writing Center Consultants/Tutors Are Trained?
[Figure 7 data: no 3; yes 55; I'm not sure 11]
Participants were asked to elaborate. Of the 47 who responded, some highlighted the need to
understand the ethical implications of AI and make sure tutors are following the policy of the
university. Others highlighted the need for tutors to understand the strengths of AI and its
limitations. Some emphasized the need for the development of guidelines for tutors and possibly
spending more time on invention and structure than on grammar.
Though participants felt AI chatbots will change the writing center, the change may not be
substantial. When asked how much AI chatbots will change the way tutors are trained, the
majority indicated “some”; only three indicated a lot or everything will change (Figure 8).
Figure 8
Question: How Much Will AI Chatbots Change the Way Tutors/Consultants Are Trained?
[Figure 8 data: not much 3; some 37; a lot 10; 100%--everything has to change 3]
Conclusions and Discussions
When this study was conducted, ChatGPT had been released only nine months prior, so the situation will most likely change. However, based on the data gathered, it appears that writing
center administrators are personally not widely using AI chatbots, but they recognize that their
clients are. They also feel that AI chatbots will have an effect on the writing center, specifically,
how tutors/consultants are trained and how they work with clients. However, perhaps these
changes will not be major.
So far, evidence is lacking that AI chatbots are replacing writing center appointments. It may be too early to tell. When participants who witnessed a decline in appointments were asked
what caused it, the majority were unsure. Future studies should probably word survey items
differently. This study asked if there was a decline or increase in appointments made in the
academic year 2022-2023. However, appointments can vary due to budgets, school enrollment,
and other factors outside the writing center. Participants instead might be asked whether writing centers experienced a decline in occupancy (appointments filled relative to appointments offered), which could provide more relevant data. However, as noted, there are multiple variables to consider.
Moving forward, this study gives evidence that writing centers will most likely need to make
adjustments for the use of AI chatbots. What adjustments should be made? The article we will
write for publication that follows this report will attempt to address this by considering clients’
perspectives (e.g., how they use AI chatbots). Here, though, one point can be put forth: if we believe that AI chatbots will change the way writing centers work, yet we are not using AI chatbots ourselves, we face a dilemma. How can we guide writing centers if we do not know where to go?
One notion is that writing center administrators will need to become more familiar with AI chatbots, even using them in their own work. (In fact, the writers of this report consulted ChatGPT 4 for style suggestions and proofreading.) Another notion is that if clients are readily using AI chatbots, we might look to them for guidance. What are they using them for? What techniques have they developed? Which techniques can we pass on to others? Asking these questions can lead to knowledge that can be shared with tutors/consultants and clients. We may even learn something ourselves.
About the Researchers
Dr. Nathan Lindberg is a senior lecturer for the English Language Support Office (ELSO) at
Cornell University and Director of ELSO's Writing & Presenting Tutoring Service.
Amanda Domingues is a PhD candidate in Science and Technology Studies at Cornell University
and a former tutor for the English Language Support Office (ELSO).
Thari Zweers (MA, M. Ed.) is a PhD candidate in the Medieval Studies Program and a tutor for
the English Language Support Office (ELSO) at Cornell University.
Selin Goktas is a PhD candidate in Psychology and a tutor at the English Language Support
Office (ELSO) at Cornell University.
References
Ali, M. J., & Singh, S. (2023). ChatGPT and scientific abstract writing: Pitfalls and caution.
Graefe's Archive for Clinical and Experimental Ophthalmology.
https://doi.org/10.1007/s00417-023-06123-z
Chartr. (2022, December). ChatGPT: The AI bot taking the tech world by storm.
https://www.chartr.co/stories/2022-12-09-1-chatgpt-taking-the-tech-world-by-
storm#:~:text=Built%20on%20the%20architecture%20of,spits%20something%20back%
20to%20you
Fang, T., Yang, S., Lan, K., Wong, D., Hu, J., Chao, L. S., & Zhang, Y. (2023). Is ChatGPT a
highly fluent grammatical error correction system? A comprehensive evaluation. ArXiv.
https://doi.org/10.48550/arXiv.2304.01746
Heaven, W. D. (2023). The education of ChatGPT. MIT Technology Review, 126 (3), 42-47.
ISSN: 2749-649X
Jeon, J., & Lee, S. (2023). Large language models in education: A focus on the complementary
relationship between human teachers and ChatGPT. Education and Information
Technologies. https://doi.org/10.1007/s10639-023-11834-1
Madianou, M. (2021). Nonhuman humanitarianism: When ‘AI for good’ can be harmful. Information, Communication & Society, 24(6), 850–868. https://doi.org/10.1080/1369118X.2021.1909100
Rahman, M., & Watanobe, Y. (2023). ChatGPT for education and research: Opportunities, threats, and strategies. Applied Sciences, 13(9). https://doi.org/10.3390/app13095783
Shawar, B. A., & Atwell, E. (2007). Chatbots: Are they really useful? Journal for Language Technology and Computational Linguistics, 22(1), 29-49. ISSN 0175-1336
Shoufan, A. (2023). Exploring students' perceptions of ChatGPT: Thematic analysis and follow-
up survey. IEEE Access, 11, 38805-38818.
Terry, O. K. (2023). I'm a student. You have no idea how much we're using ChatGPT: No
professor or software could ever pick up on it. Big Bot on Campus: The Perils and
Potential of ChatGPT and other AI. The Chronicle of Higher Education.
Teubner, T., Flath, C. M., Weinhardt, C., Van Der Aalst, W., & Hinz, O. (2023). Welcome to the
Era of ChatGPT et al.: The prospects of large language models. Business & Information
Systems Engineering, 65(2), 95–101. https://doi.org/10.1007/s12599-023-00795-x.
Yadava, O. P. (2023). ChatGPT—A foe or an ally? Indian Journal of Thoracic and Cardiovascular Surgery, 39, 217–221. https://doi.org/10.1007/s12055-023-01507-6
Yan, D. (2023). Impact of ChatGPT on learners in a L2 writing practicum: An exploratory
investigation. Education and Information Technologies, 4(1),1.
https://doi.org/10.1007/s10639-023-11742-4