this post was submitted on 07 Dec 2023
539 points (87.8% liked)

Asklemmy

[โ€“] Wheaties@hexbear.net 2 points 11 months ago (1 children)

You are ~30 trillion cells all operating concurrently with one another. Are you suggesting that is in any way similar to a Turing machine?

[โ€“] DahGangalang@infosec.pub 1 points 11 months ago (1 children)

Yes? I think that depends on your specific definition and requirements of a Turing machine, but I think it's fair to compare the amalgamation of cells that is me to the "AI" LLM programs of today.

While I do think that the complexity of input, output, and "memory" of LLM AIs is limited in current iterations (and thus makes them feel like a far cry from "human" intelligence), I do think the underlying process is fundamentally comparable.

The things that make me "intelligent" are just a robust set of memories, lessons, and habits that allow me to assimilate new information and experiences in a way that makes sense to (most of) the people around me. (This abstracts away the fact that the process is largely governed by chemical reactions, but considering that consciousness appears to be just a particularly complicated chemistry problem, I think that reinforces the point I'm trying to make.)

[โ€“] Wheaties@hexbear.net 0 points 11 months ago (1 children)

My definition of a Turing machine? I'm not sure you know what Turing machines are. It's a general-purpose computer, described in principle. And, in principle, a computer can only carry out one operation at a time. Modern computers are fast, and they may have several CPUs stitched together and operating in tandem, but they are still fundamentally limited by this. Bodies don't work like that. Every part of them is constantly reacting to its environment and its neighboring cells - concurrently.
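To make the "one step at a time" point concrete, here is a toy Turing-machine simulator (a hypothetical sketch of my own, not anything either commenter wrote): each loop iteration reads exactly one cell, applies one transition rule, and moves the head, so the whole computation is strictly serial.

```python
# Toy Turing machine: a rule table, a tape, and a head that
# reads/writes a single cell per step. "_" is the blank symbol.

def run_turing_machine(tape, rules, state="start", halt="halt"):
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else "_"
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1  # one move per step, never in parallel
    return "".join(tape)

# Example program: flip every bit, halt on the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", rules))  # 0100_
```

However many states or symbols you add, the machine still touches one cell per step, which is the contrast being drawn with trillions of cells all reacting at once.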

You are essentially saying, "Well, the hardware of the human body is very complex, and this software is(n't quite as) complex, so the same sort of phenomenon must be taking place." That's absurd. You're making a lopsided comparison between two very different physical systems. Why should the machine we built for doing sums just so happen to reproduce a phenomenon we still don't fully understand?

[โ€“] DahGangalang@infosec.pub 1 points 11 months ago* (last edited 11 months ago) (1 children)

That's not what I intended to communicate.

I feel the Turing machine portion is not particularly relevant to the larger point. Not to belabor it, but to be as clear as I can: I don't think, nor do I intend to communicate, that humans operate in the same way as a computer. I don't mean that we have a CPU handling instructions (more or less) one at a time, with specific arguments that determine the flow of data, the way a computer does with assembly instructions. I agree that anyone arguing human brains work like that is missing a lot in both neuroscience and computer science.

The part I mean to focus on is the models of how AIs learn, specifically in neural networks. There might be some merit in likening a cell to a transistor/switch/logic gate for some analogies, but for the purposes of talking about AI, I think comparing a brain cell to a node in a neural network is most useful.

The individual nodes in a neural network each have minimal impact on converting input to output, yet each one influences the processing of the others. And with the way we train AI, how each node tweaks the result depends solely on the past input that has been given to it.

In the same way, when met with a situation, our brains process information comparably: any given input is processed by a practically uncountable number of neurons, each influencing our reactions (emotional, physical, chemical, etc.) in minuscule ways based on how our past experiences have "treated" those individual neurons.

In that way, I would argue that the processes by which AI are trained and operated are comparable to that of the human mind, though they do seem to lack complexity.
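The node-level picture described above, a unit whose tiny contribution is shaped only by the inputs it has seen in the past, can be sketched in a few lines. This is a hedged illustration of my own (the function names and learning rate are made up for the example, not anyone's real training code):

```python
import math

# One "node": a weighted sum of inputs squashed by a sigmoid activation.
def node(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# One gradient-descent step: nudge the weights based on the error,
# so the node's future behavior is determined by its past inputs.
def train_step(inputs, weights, bias, target, lr=0.5):
    out = node(inputs, weights, bias)
    grad = (out - target) * out * (1.0 - out)  # chain rule through the sigmoid
    new_weights = [w - lr * grad * x for w, x in zip(weights, inputs)]
    return new_weights, bias - lr * grad

weights, bias = [0.1, -0.2], 0.0
for _ in range(1000):
    weights, bias = train_step([1.0, 0.5], weights, bias, target=0.9)
print(node([1.0, 0.5], weights, bias))  # close to 0.9 after training
```

No single weight "knows" the answer; the behavior emerges from many small adjustments accumulated over training, which is the parallel being drawn to neurons shaped by experience.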

Ninjaedit: I should proofread my post before submitting it.

[โ€“] Wheaties@hexbear.net 2 points 11 months ago (1 children)

I agree that there are similarities in how groups of nerve cells process information and how neural networks are trained, but I'm hesitant to say that's the whole picture of the human mind. Modern anesthesiology suggests microtubules, structures within cells, also play a role in cognition.

[โ€“] DahGangalang@infosec.pub 2 points 11 months ago

Right.

I don't mean to say that the mechanism by which human brains learn and the mechanism by which AI is trained are 1:1 directly comparable.

I do mean to say that the process looks pretty similar.

My knee-jerk reaction is to analogize it as comparing a fish swimming to a bird flying. Sure, there are some important distinctions (e.g. birds need to generate lift while fish can rely on buoyancy), but in general the two do look pretty similar (i.e. they both take a fluid medium and push it to generate thrust).

And so with that, it feels fair to say that learning, the storage and retrieval of memories/experiences, and the way that stored information shapes our subconscious (and probably conscious too) reactions to the world around us all seem largely comparable to the processes that underlie the training of "AI" and LLMs.