fullsquare

[–] fullsquare@awful.systems 1 points 1 day ago (3 children)

funny that you say that, because my normal day job is in pharma. nobody serious is trying to make people live forever; there are enough real problems with treating and curing diseases as it is. we don't know the first thing about the primary cause of alzheimer's that would actually be useful for treating it, or even for keeping it from getting worse; we might have just barely figured out, maybe, what the cause of depression is; we're clueless about the finer details of other mental illnesses; and there's a dazzling array of thousands upon thousands of cancers and autoimmune and degenerative diseases. if you wanted immortality, you'd have to figure all of that out, make it work, and then some. whether you like it or not, people will keep dying, maybe slightly later and maybe after enjoying more years of healthy life, but it will still happen. and that's as long as climate change doesn't get too badly in the way; if it does, even that won't happen

but there are also grifters and fantasists and downright idiots who thought their favourite scifi was a documentary, and they sincerely believe, or sell the belief, that cryonics or brain uploading or unrestricted use of magic pills or fusion with the holy machine or a variety of other overhyped bullshit is real and will save them, and that they will become oligarchs eternal. this is especially true of the current crop of tech billionaires who grew up on these scifi works and took them too seriously, and who also have disposable money to be grifted out of, and in particular peter thiel, who has a downright pathological fear of death after being traumatized as a kid, when he was around a slave-operated uranium mine in occupied Namibia that fueled the South African nuclear weapons program (i'm not making this up, look it up on your own)

immortality is maybe the last great promise of alchemy that was neither solved by modern science nor abandoned, and futurists and altmed and others will proudly carry that mantle, as long as the cash flows, that is

[–] fullsquare@awful.systems 1 points 1 day ago (1 children)

> even LLM development, trying to copy the way brains work

no, i'm gonna stop you right there. llms weren't made to mimic the human brain, or anything like that; llms were made as tools to study language. it's categorically impossible for llms to provide anything leading to agi. these things don't think, don't research, don't hallucinate, don't have agency or cognition, and don't have working memory the way humans do. these things do one thing and one thing only: generate the string of tokens that was most likely to follow a given prompt, given what was in the training data. that's it; that's all there is to it. i know you were promised superhuman intelligence in a box, but if you're using a chatbot, all the intelligence there is your own. if you think otherwise, you're falling for a massive ELIZA effect, a thing that has been around for fifty-odd years now, augmented by a blizzard of openai marketing propaganda, helped along by tech journalists who never questioned these hypesters, and funded by the fake nerd billionaires of silicon valley who misremembered old scifi and went around building torment nexii, but i digress
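to make the "most likely next token" bit concrete, here's a toy sketch of that loop, a bigram counter in python. to be clear, this is not how a real llm is built (those use neural networks trained over huge corpora); it's just an illustration of the underlying idea, and every name in it is made up for the example: count which token followed which in the training data, then replay the likeliest continuation over and over. no thinking, no memory, just replay.

```python
# toy illustration (not a real llm): count which token follows which in the
# "training data", then generate by repeatedly picking the likeliest next token.
from collections import Counter, defaultdict

def train_bigram(corpus_tokens):
    # for each token, count what came right after it in the training text
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus_tokens, corpus_tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, prompt_token, length=10):
    out = [prompt_token]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break
        # no reasoning here, just "what usually came next in the training data"
        out.append(followers.most_common(1)[0][0])
    return out

tokens = "the cat sat on the mat and the cat slept".split()
model = train_bigram(tokens)
print(generate(model, "the"))  # e.g. ['the', 'cat', 'sat', 'on', 'the', ...]
```

real llms replace the counting with a transformer and the words with subword tokens, but the generation loop, "append the most likely next token and repeat", is the same shape.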

> Where does intelligence come from? Can it be duplicated in other ways?

i'm not saying that intelligence is exclusively and always an entirely biological thing, but i do think the state of neuroscience, psychology, and the computational side of the research is woefully short of anything resembling a pathway to solving this problem. instead, this is what i think is going to happen:

llms are a dead end in this sense, but they also take the bulk of ai/ml funding now, so all the other approaches get starved. historically, after every period of intense hype of this nature comes an ai winter; this one is bound to happen too, and it might be worse, since this hype also fueled an investment bubble propping up a large part of the american economy. so when the bubble pops, on top of the historically usual negative sentiment stemming from overpromising and underdelivering, there's gonna be resentment about aibros worming their way into management and causing mass layoffs, replacing juniors with idiot boxes and lobotomizing any future pipeline of seniors, etc etc.

what typically happens next is that a steady supply of research in the cs/math departments of many universities accumulates over a decade or two, and when some good-enough new development comes along, and everyone has forgotten the previous failures, the hype train starts again. this step will be slowed down both by the current american administration cutting off funding to many types of research, and by the incoming bubble crash, which will make people remember for a long time what kind of thing aibros are up to.

when, not if, the most credulous investors' money thrown into openai, softbank's included, gets burnt through, which i think might take a couple of years tops (i would be very surprised if any of these overgrown startups isn't a smoking crater within five years), very few people will want to have anything to do with any of this. and when the next ai spring happens, it might be well into the 40s or 50s, and by then i'd guess the effects of climate change will be too strong to just ignore and try to catch another hype train; there are gonna be much more pressing issues. this is why i think anything resembling agi won't come up during my lifetime, and if you want to discuss gpt41 overlords in the year 3107, feel free to discuss it with someone else.

[–] fullsquare@awful.systems 20 points 3 days ago

ohhhh nooooo did actions happen to have consequences??

[–] fullsquare@awful.systems 29 points 3 days ago* (last edited 3 days ago)

if the valley had fresh ideas for profitable businesses, they wouldn't have gone all in on ai in the first place. lol

big-brained sfba ceos try to remake reality in the image of scifi they misinterpreted when they watched it 15 years ago, and go around building torment nexii. behold, disruption! (snow crash|ready player one|who knows what else)

[–] fullsquare@awful.systems 12 points 5 days ago

par for the course for him, i guess, look up akon city

it's just something that attracts grifters to russia ig

[–] fullsquare@awful.systems 3 points 1 week ago (2 children)

otoh E contains an active warzone and D two of them or more, depending on how you count
