ChatGPT Usage Banned in French University Over Concerns About Fraud, Plagiarism

By Reuters | Updated: 28 January 2023

Sciences Po, one of France’s top universities, has banned the use of ChatGPT, an artificial intelligence-based chatbot that can generate coherent prose, to prevent fraud and plagiarism.

ChatGPT is a free programme that generates original text about virtually any subject in response to a prompt, including articles, essays, jokes and even poetry, raising concerns across industries about plagiarism.

The university said on Friday it had emailed all students and faculty to announce a ban on ChatGPT and all other AI-based tools at Sciences Po.

“Without transparent referencing, students are forbidden to use the software for the production of any written work or presentations, except for specific course purposes, with the supervision of a course leader,” Sciences Po said, though it did not specify how it would track usage.

ChatGPT has already been banned in some public schools in New York City and Seattle, according to US media reports, while several US universities have announced plans to do fewer take-home assessments and more hand-written essays and oral exams.

Sciences Po, whose main campus is in Paris, added that punishment for using the software may go as far as exclusion from the institution, or even from French higher education as a whole.

“The ChatGPT software is raising important questions for educators and researchers all around the world, with regards to fraud in general, and particularly plagiarism,” it said.

Microsoft last week announced a further multibillion-dollar investment in OpenAI – the artificial intelligence research lab behind ChatGPT – building on a bet it made on OpenAI nearly four years ago, when it dedicated $1 billion (roughly Rs. 8,200 crore) for the startup co-founded by Tesla’s Elon Musk and investor Sam Altman.

© Thomson Reuters 2023