While I don't fully share the notion and tone of the other commenter, I gotta say LLMs have absolutely tanked education and science, as many have noted and as I've witnessed firsthand.
I'm a young scientist on my way to a PhD, and I get to assist in a microbiology course for undergraduates.
The amount of AI slop in student assignments is astounding, and worst of all, they don't see it themselves. When I check their actual knowledge, the results are devastating.
And it's not just undergrads - many scientific articles now show signs of AI slop too, which undermines research to a concerning degree.
Personally, I tried using more specialized tools like Perplexity in Research mode to look for sources, but it royally messed up the citations - it pulled actual information from scientific articles, but then referenced entirely different articles that had no relation to it.
So, in my experience, LLMs can be useful for generating simple text or helping you tie known facts together. But as a learning tool... be careful, or rather just don't use them for that. Classical education exists for a good reason: you learn to find factually correct and relevant information, analyze it, and keep it in your head for future reference. It takes more time, but it's ultimately well worth it.
Sure, many don't care, and I've experienced this too, but it's a fabulous way to quickly get a glimpse of a subject, to get started, or to learn more. It's not always correct, but for well-known subjects it's pretty good.
Anything related to law or really specialized subjects will be horrible, though.
Sure, but not everyone teaches well enough, and LLMs are one way to balance that out, kinda.
And if you don't understand, then... yeah, it's still useful as a way to avoid failing a year, which is morally questionable, but hey, that's another topic.
Alright, we generally seem to be on the same page :)
(Except that numerous great books and helpful short materials exist for virtually any popular major, and while they take longer to study, they provide an order of magnitude better knowledge.)