Universities: is text generated by ChatGPT plagiarism?

Universities are starting to rethink their plagiarism policies as students use tools such as ChatGPT to write their theses. Students and professors still disagree on whether the Artificial Intelligence (AI) "chatbot" is a research tool or a cheating engine, and the dispute is growing as these virtual tools play an ever larger role in academic work.

"AI could be dumber than humans," said Rutgers University student Kai Cobbs, who put the tool to the test with an essay on the history of capitalism. To his surprise, it produced a generic, poorly written article that no one would dare claim as their own, he said. Speaking to Wired, Cobbs remarked that "the writing quality was horrible. The writing was awkward and lacked complexity."

The OpenAI chatbot launched in November, and educators have since found themselves in a new struggle, unsure how to handle the surge of student work produced by AI. Many educational institutions, such as New York public schools, have banned the use of ChatGPT in an effort to curb cheating, although universities have been reluctant to follow suit.

The university view of ChatGPT

The arrival of this AI tool in education has forced a complete rethinking. Digital resources have long been a research option that teachers and students must take into account, and this chatbot is not the first source of concern. As early as 2001, with the rise of Wikipedia, universities had to work out what counted as "honest scholarly work."

That is when institutions began drawing the first policy lines to keep pace with technological innovation. Now the game has become even more complicated: in addition to detecting plagiarism, schools must determine whether a piece of work is a student's own or the product of a bot.

Plagiarism and the new reality

Plagiarism is defined as the act of using someone else's work or ideas without giving due credit to the original author. The problem arises when a work has been generated by something, rather than someone. Students' use of generative AI has therefore become a critical point of contention.

"If [plagiarism] is stealing from a person, then I don't know if we have a person who is being stolen from."

Emily Hipchen, a member of the Brown University Academic Code Committee.

Other members of academic bodies have been grappling with how to grade an algorithm's work, particularly when the text was machine-generated. Universities must confront this fundamental shift and rethink the relationship between technology and academic integrity.
