‘ChatGPT detector’ catches AI-generated papers with unprecedented accuracy
  • kirklennon@kbin.social · 1 year ago

    Why should someone bother to read something if you couldn’t be bothered to write it in the first place? And how can they judge the quality of your writing if it’s not your writing?

    • Deckweiss@lemmy.world · 1 year ago

      Science isn’t about writing. It is about finding new data through the scientific process and communicating it to other humans.

      If a tool helps you do any of it better, faster or more efficiently, that tool should be used.

      But I agree with your sentiment when it comes to, for example, creative writing.

      • sab@kbin.social · edited · 1 year ago

        Science is also creative writing. We do research and write up the results, producing something original. Something new is created; it’s creative.

        An LLM is just reiterative. A researcher might feel like they’re producing something, but they’re really just reiterating. Even if the product is better than what they would have produced themselves, it is still worth less, as it is not original and will not make a contribution that hasn’t been made already.

        And for a lot of researchers, the writing and the thinking blend into each other. Outsource the writing, and you’re crippling the thinking.

      • Laticauda@lemmy.ca · 1 year ago

        Science is about thinking. If you’re not the one writing your own research, you’re not the one actually thinking about it and conceptualizing it. The act of writing a research paper is just as important as the paper itself.

    • agent_flounder@lemmy.world · 1 year ago

      To me this question hints at the seismic paradigm shift that comes from generative AI.

      I struggle to wrap my head around it, and part of me just wants to give up on everything. But… we now have to wrestle with questions like:

      What is art, and do humans matter in the process of creating it? Whether it’s novels, graphic arts, plays, or whatever else?

      What is the purpose of writing?

      What, if anything, is still missing from generative writing versus human writing?

      Is the difference between human intelligence and generative AI just a question of scale and complexity?

      Now or in the future, can human experience be simulated by a generative AI via training on works produced by humans with human experience?

      If an AI can now or some day create a novel that is meaningful or moving to readers, with all the hallmarks of a literary masterwork, is it still of value? Does it matter who/what wrote it?

      Can an AI have novel ideas and insights? Is it a question of technology? If so, what is so special about humans?

      Do humans need to think if AI one day can do it for us and even do it better than we can?

      Is there any point in existing if we aren’t needed to create, think, generate ideas and insights? If our intellect is superfluous?

      If a human relationship conducted in text and video can be simulated on one end by a sufficiently complex AI, well enough to fool the human on the other end, is it really a friendship?

      Are we all just essentially biological machines and our bonds simply functions of electrochemical interactions, instincts, and brain patterns?

      I’m late to the game on all this stuff. I’m sure many have wrestled with a lot of this. But I also think maybe generative AI will force far more of us to confront some of these things.