Oh I see! I'm only really familiar with the normie usage: making soulless remixes of your profile picture, or those lawyers who keep getting held in contempt for submitting documents full of hallucinations to the court.
luchuan
I feel so conflicted about this. On the one hand, huge reductions in the resource consumption of these things are good for everyone; the Western ones are incredibly wasteful for how quickly and widely they're being pushed. On the other hand, these things still feel like technology in search of a problem.
Society will be radically changed long before AI radically changes society.
Given the mostly white, bourgeois preoccupation with "x-AI risk" (existential/extinction), I think the real "risk" is that the self-legitimating myths of capitalism will fall on muted microphones. Even ten years ago, when AI was still called machine learning and was much less impressive (its outputs were exclusively categorizations of inputs), and when it would have needed decades of breakthroughs, plus being hooked up to every input in society and multiplexed with every output, to do anything "harmful," the x-AI risk people were running around crying. The same holds true today of LLMs and other statistically-likely-to-exist content emitters.
The pitch is always that the AI will decide the needs of the many outweigh the needs (private property rights) of the few. This is only scary if you are among that few. Even property-rights-obsessed liberals don't think of themselves as among the few who will be exproprAIted, but they're outraged by the expropriation itself. It's a boogeyman spewed by the people who are the problem, and we're asked to share their fear. Ridiculous.
Unlike other private property and artifacts of capital accumulation, which are inert (the workers may organize against you, but the steel mill itself won't), the AI their capital gives birth to might, in several decades' time, maybe organize against you (but not really).
A data scientist can tell most of someone’s life story given their zip code, and we try not to think too hard about why that’s always the most predictive feature in the model.
Hmmmmm
The Silicon Valley attitude of collect now, monetize later, and maybe secure eventually is why. Every one of these people thinks they can make some AI that will magically make their app, and your life, better.
My thoughts exactly. Society will have radically changed before these AIs radically change society.
I will be citing Beavis and Butthead v. The Commonwealth of Deez Nutz, since the AI told me it applies.
I think that's because there are leaks on Blind characterizing what the AI org is doing, whereas I haven't seen comparable assessments of leadership reactions leak out of OpenAI.
Microsoft leadership should also be panicking, given their big investment in OpenAI, but no one seems to have said anything about that either.