Debugging code. Fixing grammatical errors. Summarizing research papers. Composing entire essays. Artificial intelligence has significantly broadened academic resources, prompting professors to resist, adapt to or embrace an educational shift.
A July 2023 report by a University committee on generative artificial intelligence in education urged educators to consider GAI when developing class objectives, providing a flexible framework with options to prohibit, allow with attribution or actively encourage the use of GAI.
The committee, co-chaired by Kavita Bala, dean of the Bowers College of Computing and Information Science, and Alex Colvin Ph.D. ’99, dean of the School of Industrial and Labor Relations, broadly recommended that instructors rethink the purpose of education in a GAI-enabled world, address the pitfalls of new technology and explicitly state policies for GAI use in their courses.
Bala, who conducts research in AI and machine learning, hoped the report would address faculty’s uncertainty around the rising influence of AI, especially with ChatGPT, a popular AI chatbot.
In an email to The Sun, Bala explained that some instructors were excited about the customized and elevated learning experiences that new AI systems offered, while others feared the technology could replace human intelligence.
“There was a lot of misunderstanding around what the technology could or could not do, and it was important to educate our faculty and students to understand the technology so their expectations were realistic,” Bala wrote. “It is important that [students] learn to work with this technology and incorporate it into their professional lives ethically and responsibly.”
The Sun interviewed teaching assistants and professors about how they have instituted AI policies in line with the report's recommendations, as well as how students have responded to these course policies.
Prof. Jeffrey Perry, global development, who teaches several education classes, described the varied approaches to incorporating new technology into classes.
“We’ve got to be aware, from the negative perspective, that it’s going to influence how we change our assessment tools, because students’ writing [may use AI to cheat] … and almost work against what AI can do for us,” Perry said. “Or we lean in using a positive perspective and use it as a tool, but add the expectation for higher order thinking.”
Musckaan Chauhan grad, a Ph.D. candidate in the Department of Government, is a graduate teaching assistant in GOVT 2817: America Confronts the World. She discussed the balanced approach employed in the class regarding AI-use stipulations.
The class has used a mix of assignments: some ask students to complete work entirely without AI, while others invite them to use AI as a brainstorming resource.
“We decided on creating assignments that would ask them to treat AI as an interlocutor while writing as opposed to something that supplants writing,” Chauhan said. “They can develop their own relationship with [large language models] as opposed to us dictating what they should think about LLMs. It was a very reflective and open relationship that allowed for this kind of experimentation to occur.”
Prof. Christopher Byrne, communication, has adopted a similar technique, clarifying whether AI should be used in each of his assignments in his course COMM 2010: Writing and Producing the Narrative for Digital Media. Byrne has his students prompt AI to produce creative content, like public service announcements and scripts.
He has been both critical and accepting of the technology, asking his students to consider how bland ChatGPT’s output can be.
“It’s a great way to reinforce the problems with cliches,” Byrne said. “When you ask ChatGPT to create a script for a [public service announcement], you’re going to see every cliche out there. You really need to come up with something that’s more unique and more creative.”
Despite professors' efforts to challenge students to surpass ChatGPT's abilities, some Cornell students simply use ChatGPT to critique its own initial output, even when instructed not to.
One student, who requested anonymity due to fears of academic disciplinary action, explained how they used ChatGPT to complete work beyond the portion of the assignment they were allowed to use it for.
“I’m in [ANTHR 2400: Cultural Diversity and Contemporary Issues] and my first assignment had us use ChatGPT to write out an answer to the essay prompt and then we had to critique it,” the anonymous student said. “I used ChatGPT to both produce the answer to the prompt and then to critique itself.”
Jasmine Samadi ’25, who studies operations research and information engineering, told The Sun that her professors, while taking different approaches depending on their course objectives, always provide clear instructions as to when and how AI can be used.
“In some classes, they say that it is acceptable to use ChatGPT, but at the beginning of the homework, you have to say that you used a large language model, the same way you would let them know if you had studied with a friend,” Samadi said.
A couple of Samadi’s professors deter students who might be tempted to cheat with AI on certain assignments by noting that when they tested their questions on ChatGPT, the AI produced incorrect answers.
The committee’s report stated that professors should emphasize the pitfalls of GAI, including hallucinations — inaccuracies in GAI output stemming from biases in the data used to train models — and ask students to critically analyze systems’ output before accepting it and using it in their own work.
Despite his wariness of AI’s writing abilities, Byrne emphasized how crucial it is for current students to use AI to their benefit, rather than avoiding it out of fear of its potential to dominate certain industries.
“I heard somewhere recently that AI may not replace you in your job, but what will replace you will be someone who knows AI,” Byrne said. “No matter what field you’re going into, you’re going to have to have some sort of working understanding of how artificial intelligence can help you.”