  • Does the term "plagiarism" make any sense for copying from a work not subject to copyright? Commented Dec 12, 2022 at 20:33
  • 9
    @JonathanReez So Plagiarism and Copyright are orthogonal concepts. Works that are not subject to copyright can still be plagiarized (used without crediting the source), and you can commit copyright infringement without committing plagiarism (cited the source but used too much of the copyrighted work and violated Fair Use). Important Distinction.
    – Xirema
    Commented Dec 13, 2022 at 1:08
  • @JonathanReez Beyond the question of whether any of the works used to train these AIs are, in fact, subject to copyright (they may or may not be, and the fact that we can't tell is the root of the problem, given that, again, these works are not properly cited), failing to properly cite those works would still constitute plagiarism regardless.
    – Xirema
    Commented Dec 13, 2022 at 1:11
  • 4
    AI writing text by learning from other texts isn’t any different from humans doing the exact same thing. You don’t have to attribute text you write to every single book on the subject that you’ve read. The tricky part is that prior to ~2021 any tool whatsoever was considered fair game to use to help you write text - spell checker, Google Translate, thesaurus, tools that help rephrase things, etc. But all of a sudden it’s claimed that this particular tool goes too far and no longer counts as a “tool”. Commented Dec 13, 2022 at 1:29
  • We consider plagiarism to be bad because it allows credit to be stolen. No such concern exists for tools because they don’t require human input. If you can write amazing novels using ChatGPT, why should you give your tool any credit? Commented Dec 13, 2022 at 1:32
  • 9
    @JonathanReez "AI writing text by learning from other texts isn’t any different from humans doing the exact same thing." This is not true. It's just flat-out completely false. Proselytizers for AI-generated content will sometimes make this claim because they want to capitalize on hype around AI and/or are jonesing for valuable Venture Capital funding, but the neural networks that power these algorithms are extremely unlike human thinking, and should not be treated as though they are performing original thought.
    – Xirema
    Commented Dec 13, 2022 at 1:41
  • 2
    Um… humans don’t learn how to write text or code by reading the works of others? That’s certainly news to me. Humans need a significantly lower number of samples to learn something but the general principle is the same. There’s nothing magical about how our brain works, it’s just a neural network. Commented Dec 13, 2022 at 1:46
  • 8
    @JonathanReez This isn't a debate about how Humans learn, it's about how AI learns, and specifically, how Neural Networks operate. Mass-scale Matrix Multiplication is not analogous to human learning.
    – Xirema
    Commented Dec 13, 2022 at 5:07
  • 5
    The brain uses the biological equivalent of matrix multiplication to achieve the same result. My question is why it's fair for a human to read a few articles and then write their own "original" article on the subject but not fair for an AI to do the same thing. You seem to assume that the human brain does some "magical" process while in reality it "auto completes" text in a fashion not quite dissimilar to ChatGPT. Commented Dec 13, 2022 at 5:14
  • @JonathanReez: I think the point being made here (underneath all the "AIs are not like humans" chatter, which IMHO is frankly just irrelevant and distracting) is that humans are normally expected to cite their sources, and ChatGPT is currently unable to do so. If you cite your sources, then as a rule, that is generally understood to be enough to defeat a charge of plagiarism. (There may be copyright issues if the text is very similar to the original, but that's a separate issue, as Xirema's comment acknowledges.)
    – Kevin
    Commented Dec 22, 2022 at 9:55
  • @Kevin People rarely cite their sources on Stack Overflow, and many sites, like Politics, see rampant failure by humans to attribute sources. Commented Dec 22, 2022 at 12:35
  • @Kevin I wouldn't have had to add that section if I didn't keep getting comments from people insisting otherwise. There are already quite a few deleted comments on this answer from people doing that...
    – Xirema
    Commented Dec 22, 2022 at 17:09
  • @JonathanReez Which is why we don't want to introduce a tool that will make the problem exponentially worse.
    – Xirema
    Commented Dec 22, 2022 at 17:10
  • 2
    @JonathanReez Humans think about things. Transformers do not think about things. The precise mechanics of human learning aren't relevant, because this distinction is enough to explain a lot of the difference between human output and ChatGPT output. (Ask ChatGPT not to plagiarise, and it'll tell you it's not plagiarising while plagiarising just as much.)
    – wizzwizz4
    Commented Dec 29, 2022 at 20:02
  • 1
    @JonathanReez IQ tests don't test critical thinking. If even Nobel prize winners, who have made novel contributions to the body of human knowledge, can fail to apply critical-thinking, how would "capacity to solve IQ puzzles" be indicative? (But I suppose I'm just making your point again. As a bonus, I haven't proof-read this comment, so I betcha I've made it a third time, too.) Regardless, experts tend to think about their subject matter, and those are the people who are answering questions on Stack Exchange.
    – wizzwizz4
    Commented Dec 29, 2022 at 22:40
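
For readers unfamiliar with what "mass-scale matrix multiplication" and "auto-completing" text concretely mean in the exchange above, here is a minimal, purely illustrative sketch of a next-token predictor in Python. Everything in it (the tiny VOCAB, the matrices E, W, and U, and their random values) is a hypothetical placeholder; a real system such as ChatGPT stacks many trained attention layers over a huge vocabulary, but its core operations are the same kind of matrix multiplies followed by a softmax.

    # Minimal illustrative next-token predictor (not ChatGPT's actual
    # architecture). All parameters are random placeholders; a real model
    # would learn them from a training corpus.
    import numpy as np

    rng = np.random.default_rng(0)

    VOCAB = ["the", "cat", "sat", "on", "mat"]  # hypothetical tiny vocabulary
    V, D = len(VOCAB), 8                        # vocab size, embedding width

    E = rng.normal(size=(V, D))  # token embedding matrix
    W = rng.normal(size=(D, D))  # one hidden transformation
    U = rng.normal(size=(D, V))  # projection back to vocabulary scores

    def next_token_probs(token):
        """Map one input token to a probability distribution over VOCAB."""
        x = E[VOCAB.index(token)]          # embedding lookup
        h = np.tanh(x @ W)                 # matrix multiply + nonlinearity
        logits = h @ U                     # matrix multiply to vocab logits
        p = np.exp(logits - logits.max())  # numerically stable softmax
        return dict(zip(VOCAB, p / p.sum()))

    print(next_token_probs("cat"))  # a distribution over the five tokens

Whether that computation is or is not analogous to what a human brain does is precisely what the commenters dispute; the sketch only pins down what the machine side of the comparison looks like.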