this post was submitted on 07 Nov 2023
145 points (82.2% liked)

Technology

[–] TropicalDingdong@lemmy.world 1 points 1 year ago (1 children)

If you use ChatGPT you should still read over the output, because it can say something wrong about your results, and you should run a plagiarism tool on it, because it can plagiarize unintentionally. So what's the big deal?

There isn't one. Not that I can see.

[–] Jesusaurus@lemmy.world 8 points 1 year ago (1 children)

At least within a higher-education environment, the problem is who does the critical thinking. If you just offload a complex question to ChatGPT and submit the result, you don't learn anything. One of the purposes of paper-based exercises is to get students thinking about topics and understanding concepts so they can apply them to other areas.

[–] TropicalDingdong@lemmy.world 2 points 1 year ago (1 children)

You are considering it from a student perspective. I'm considering it from a writing and communication/publishing perspective. I'm a scientist, and I think a decent one, but I'm only a proficient writer and I don't want to be a good one. It's just not where I want to put my professional focus. However, you cannot advance as a scientist without being a 'good' writer (and I don't just mean proficient). I get to offload all kinds of shit to ChatGPT. I'm even working on some stuff where I can dump in a folder of papers and have it go through and statistically review all of them, to give me a good idea of what the landscape I'm working in looks like.

Things are changing ridiculously fast. But if you are still relying on writing as your pedagogy, you're leaving a generation of students behind. They will not be able to keep up with people who directly incorporate AI into their workflows.

[–] KingRandomGuy@lemmy.world 1 points 1 year ago (1 children)

I'm curious what field you're in. I'm in computer vision and ML, and most conferences have clauses saying not to use ChatGPT or other LLM tools. However, most of the folks I work with see no issue with using LLMs to assist with sentence structure, wording, etc., but they generally don't approve of using LLMs to write accuracy-critical sections (such as background or results) beyond rewording.

I suspect part of the reason conferences are hesitant to allow LLM usage has to do with copyright, since that's still somewhat of a gray area in the US AFAIK.

[–] TropicalDingdong@lemmy.world -1 points 1 year ago

I work in remote sensing, AI, and feature detection, though almost exclusively for private industry, generally in the natural-hazard and climate-mitigation space.

Lately, I've been using it to summarize big batches of publications into tables that I can then analyze statistically (because the LLMs don't always get it right). I don't have the time to read like that, so it helps me build an understanding of a space without having to actually read it all.
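As a rough illustration of that kind of pipeline (a sketch only: `summarize_paper` here is a hypothetical stand-in for whatever LLM call you'd actually make, and the extracted fields are made up):

```python
import csv
import statistics
from pathlib import Path

def summarize_paper(text: str) -> dict:
    """Hypothetical stand-in for an LLM call that pulls structured
    fields out of one paper. A real version would prompt a model
    and parse its reply; this stub just grabs trivial features."""
    return {
        "title": text.splitlines()[0],
        "n_words": len(text.split()),
    }

def build_table(paper_dir: str, out_csv: str) -> list[dict]:
    """Summarize every paper in a folder into one row each,
    writing the result out as a CSV for later analysis."""
    rows = [
        summarize_paper(p.read_text())
        for p in sorted(Path(paper_dir).glob("*.txt"))
    ]
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["title", "n_words"])
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

The point of the table is that you do the statistics yourself (e.g. `statistics.mean([r["n_words"] for r in rows])`) rather than trusting the model's own synthesis, which is the "LLMs don't always get it right" part.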

I think the hand-wringing is largely that. I'm not sure it's going to matter in six months to a year. We're at the inflection point (like pre-AlphaGo) where it's clear that AI can do this thing that was thought to be solely the domain of humans. It doesn't necessarily do it better than the best of us yet, but we know how this goes. It will surpass us, and likely by a preposterous margin. Pandora's box is wide open. No closing this up.