
An ex-colleague of mine is a professor, and he recently caught a student handing in a homework assignment written by ChatGPT. The only reason he noticed is that he wanted to check one of the references and ended up nowhere. The student was stupid enough to let ChatGPT generate the bibliography too.


Handing in a paper written by the AI, to the point that it includes a hallucinated reference, is cheating.

What if a student writes their own outline, then asks ChatGPT to write an outline and pulls a few ideas from the AI's outline?

What if a student asks ChatGPT to rate their paper, and implements two of the AI's suggestions?

I think both of my examples are ethical uses of AI.


There is a difference between using GPT as a tool and treating it as someone to outsource your work to. As a tool, it's like a super-powered spell checker, and I don't think any university in the world bans the use of spell checkers. Outsourcing your work to it, on the other hand, is cheating by any standard.



