[–] kogasa@programming.dev 9 points 4 months ago (1 children)

If you fine-tune an LLM on math equations, odds are it won't actually learn how to reliably solve novel problems, just as it won't become a subject matter expert on any topic. But it's a lot harder to write simple math that "looks, but is not, correct" than it is to waffle vaguely about a topic. The idea of an LLM building a robust model of the semantics of the text it's trained on is plausible at face value; it just doesn't seem to actually happen in practice.

[–] ignotum@lemmy.world -3 points 4 months ago (1 children)

Prompt:

What is 183649+72961?

ChatGPT:

The sum of 183649 and 72961 is 256610.

It's trained to generate what is most plausible, but with math the only plausible response is the correct answer (assuming it has been trained on data where that was the case).
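For what it's worth, the arithmetic in the quoted answer does check out; a trivial check in any Python interpreter:

```python
# Verify the sum from the quoted exchange.
print(183649 + 72961)  # 256610, matching the quoted answer
```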

[–] kogasa@programming.dev 4 points 4 months ago (1 children)

ChatGPT uses auxiliary models to perform certain tasks like basic math and programming. Your explanation about plausibility is simply wrong.

[–] ignotum@lemmy.world -2 points 4 months ago (1 children)

It has access to a Python interpreter and can use that to do math, but it shows you when this is happening, and it did not when I asked it.

I asked it to do another operation, this time specifying that I wanted it to use an external tool, and it did.

You have access to a dictionary; that doesn't prove you're incapable of spelling simple words on your own. Goddamn, people, what's with the hate boners for AI around here?

[–] kogasa@programming.dev 4 points 4 months ago

> It has access to a Python interpreter and can use that to do math, but it shows you when this is happening, and it did not when I asked it.

That's not what I meant.

> You have access to a dictionary; that doesn't prove you're incapable of spelling simple words on your own. Goddamn, people, what's with the hate boners for AI around here?

??? You just don't understand the difference between an LLM and a chat application using many different tools.
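To make the distinction concrete, here's a minimal sketch (purely illustrative, not how ChatGPT is actually built): a bare LLM only ever emits plausible-looking text, while a chat application wrapping it can intercept a math query and hand it to a real interpreter. Every name here (`dispatch`, `evaluate_math`, the regex) is invented for the example.

```python
import re

def llm_generate(prompt: str) -> str:
    # Stand-in for a bare LLM: it only produces plausible-looking text,
    # with no guarantee the digits are right.
    return "The sum of 183649 and 72961 is 256610."

def evaluate_math(expression: str) -> str:
    # Tool call: delegate the arithmetic to an actual interpreter.
    # (Toy sandboxed eval for illustration only, not production-safe.)
    return str(eval(expression, {"__builtins__": {}}))

def dispatch(prompt: str) -> str:
    # Chat-application layer: route arithmetic to the tool,
    # everything else to the language model.
    match = re.search(r"(\d+)\s*([+\-*/])\s*(\d+)", prompt)
    if match:
        return evaluate_math("".join(match.groups()))
    return llm_generate(prompt)

print(dispatch("What is 183649+72961?"))  # 256610, computed rather than predicted
```

In the tool path the digits are computed rather than predicted token by token, which is the distinction the last two comments are arguing over.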