Using AI for class is cheating only if professor says so
- GNP
- Dec 16, 2025
- 5 min read
Adam Gottlieb, Benjamin Barnes III, Caleb Johnson, Dakota Hamm | GNP contributors
It’s been three years since ChatGPT changed the world, and universities are struggling to keep up.
One of the biggest concerns for teachers is how students are using artificial intelligence for coursework. ECU and other universities are leaving those rules up to individual professors.
That can create a confusing policy climate for students like ECU sophomore Georgia Wann.
“I actually don’t really know about the AI policy that much,” she said. But she does know how it is being enforced in her classes.
One of her professors chooses not to ban AI use outright, as an incentive for students to be upfront about using it on assignments in that class, Wann said. Others ban AI for coursework altogether.
Still, “I think letting them decide is a good option,” she said.
Officially, ECU’s AI policy became part of its “University Regulation on Academic Integrity” last August. It classifies AI use as “cheating,” but with a catch: it’s only cheating if it’s an “unapproved use … to generate assigned written material.”
The Greenville News Project set out to learn how other universities in the UNC System and on ECU’s list of “peer institutions” are dealing with AI.
It combed through their AI policies and spoke with officials, and it found that many universities are still struggling to get ahead of a technology ChatGPT popularized three years ago.
Making policy
ECU developed its AI policy by drawing on the policies of its peer universities. ECU lists 11 peers across the country to which it compares itself on a variety of metrics.
Many of its peers have chosen to leave it to individual teachers to decide how AI will be used in their classes. Opinions vary about whether and how much AI use is suitable for students.
At ECU, writing center director Will Banks thinks AI can be helpful to students when it is used ethically. But, of course, students can also use it to cheat by asking it to write an entire essay or assignment for them.
“We’re teaching a set of skills that you can’t learn if AI does it for you,” Banks said. “So, in that case, you may not want to use any AI or you may only use AI in a very limited way.”
Four years ago, Banks served on a committee that helped craft ECU’s AI policy. The policy is now posted on the website of the Office for Faculty Excellence, and it encourages professors to add a section to their syllabi about AI use.
Such a statement “makes students aware of the faculty member's policy and alerts them to the importance of academic integrity in the ECU community,” the policy says.
ECU also has put up a website titled “Artificial Intelligence at ECU” to help faculty and students better understand AI and its ethical uses.
English Professor Michelle Eble is the driving force behind the website, and she wants it to serve as a hub for generating ideas on how to use AI.
The site includes links to workshops on various uses of AI. One workshop, presented by social work professor Michael Daniels, shows teachers how to create quiz questions with AI.
Eble coaches her students on how best to use AI, and on how to think critically about the information it returns in answer to their prompts. “If you can’t verify [the information] yourself, and if you can’t see where it comes from, then I would not use it because you cannot verify it,” she said.
Joyner Library is part of students’ AI support network as well. It “plays an important role because of the LibGuide we maintain,” said Jan Lewis, the library’s director.
The library’s “Guide to Artificial Intelligence for Students” “focuses on helping students understand the use of generative AI products,” she said. It’s updated regularly to help users learn how to cite AI properly, address ethical concerns, and create effective prompts.
“We help them learn how to make good prompts, how to generate images, and how to evaluate AI outputs,” Lewis said, noting that librarians tailor their instruction to different academic disciplines.
Still, none of these efforts are definitive solutions, and faculty say AI is already creating problems in the classroom.
Math Professor Jorge Montero Vallejo has had issues with students using AI for assignments. He said even though he has seen AI used in math coursework a couple of times, it’s more problematic in the humanities and social sciences.
“Projects and papers do have room for AI to actually undermine the education you’re supposed to be getting,” he said.
Even some students believe that using AI for schoolwork is unethical. “We're in college now, and there’s a lot more writing, so people just ask AI to do it for them,” said musical theater major Corian Simicare.
But that makes students lazy, she said. They lose the motivation to think and learn for themselves.
Simicare said that although she occasionally uses AI to look up definitions or for fun activities like generating pictures, it shouldn't replace genuine effort in learning. College is meant for studying something you’re passionate about, she said, not letting technology do the work for you.
‘There’s no paved road’
ECU’s peer institutions also lack a blanket AI policy. And like ECU, they place their AI guidelines under an academic integrity or honor code. So do other schools in the UNC System.
“There's no paved road. You've had to sort of make your own trail” on AI policy, said Wade Maki, a UNC-Greensboro professor and member of the UNC System AI oversight committee.
“We recognize from a policy perspective that you can't come in and put in a whole lot of rules. You have to see what's happening and then be responsive,” he said.
Even though universities do not have blanket policies, GNP found they’re trying new and creative ways to regulate AI.
Arizona State University took the path of creating a task force. “I believe [it] has created a set of principles that support thoughtful deployment of AI applications across ASU currently,” said Professor Diana Bowman, a task force member.
She said it should be up to universities to decide how to handle AI because every university is different.
“Universities and schools that are looking to adopt entity-wide updates of AI should give thought to their ethical responsibilities of rolling out new models and tools,” she said. “This looks different for each entity individually.”
Universities across the country have partnered with AI platforms or created their own. ASU and Duke, for example, have built their own tools, Create AI and DukeGPT, respectively.
“We're trying to create a space where people can build educational and scholarly AI experiences in an informed way,” said ASU Vice Provost Anne Jones. “The goal of the environment is to carry out regular AI functions in a scholarly way.”
Schools in the UNC System also are working to build relationships and strike deals with various AI platforms.
“You should expect additional products being approved at UNC-G as they become available through the UNC system,” Maki said. “The system is doing all of that negotiating for all of our campuses.”
Mississippi State University published a 20-page AI report in August 2023 that includes updates to the school’s academic integrity language.
The MSU policy on AI was influenced by other universities, said Tommy Anderson, the Honors College dean who had a hand in writing it. “UNC (Chapel Hill) is someone that I spoke to … to see what they’re doing,” he said.
Gottlieb, Barnes, Johnson and Hamm produced this story for the Fall 2025 course, In-depth Reporting.