hikaru755

joined 1 year ago
[–] hikaru755@feddit.de 5 points 4 months ago (1 children)

IIRC, Mass Effect lets you buy anything you missed in a store later, at least

[–] hikaru755@feddit.de 1 points 4 months ago (1 children)

Except that if you continue reading beyond your quote, it goes on to explain why that actually doesn't help.

[–] hikaru755@feddit.de 14 points 4 months ago

Companies and their legal departments do care, though, and that's where the big money lies for Microsoft when it comes to Windows.

[–] hikaru755@feddit.de 11 points 4 months ago

Training and fine-tuning happen offline for LLMs; it's not like they continuously learn by interacting with users. Sure, the company behind one might record conversations and use them to further tune the model, but these models don't inherently need that.

[–] hikaru755@feddit.de 11 points 4 months ago

And it makes sense to me that a business would leverage that data in ways to benefit themselves.

Big fat nope on that one. This is exactly what the GDPR is about. I'm giving you my data for a specific purpose, and unless I tell you otherwise, you have no fucking business using that data for anything else. Gonna be interesting to see how this one plays out in the EU.

[–] hikaru755@feddit.de 9 points 4 months ago

Happened with Lone Echo for me. It's a VR game where you're in a space station, and you move around in zero g by just grabbing your surroundings and pulling yourself along or pushing yourself off of them. I started reflexively attempting to do that in real life for a bit after longer sessions

[–] hikaru755@feddit.de 10 points 4 months ago

Isn't that the difference between a slur and an insult, that a slur is offensive in itself against a certain group of people*, while an insult depends on context?

*Unless used by people from that group itself

[–] hikaru755@feddit.de 1 points 4 months ago

HTTP is not Google-controlled, you don't need to replace that in order to build something new without Google

[–] hikaru755@feddit.de 26 points 4 months ago (5 children)

I agree with your first point, but the latter two:

— GPS data that could be stored and extracted from the dealership and sold or given to the government, insurance companies, and law enforcement.

— GPS data that could be sent in real time if the car has a cellular connection or hijacks the cellular connection in your phone when you connect it to the car.

Why do you think this is more likely to happen with this new regulation, when most modern cars already have a functioning GPS module for navigation and a cellular connection for software updates?

[–] hikaru755@feddit.de 3 points 4 months ago

Yeah, it certainly still feels icky, especially since a lot of those materials will, in all likelihood, still have ended up in the model without the original photo subjects knowing about it or consenting. But that's at least much better than having a model straight up trained on CSAM, and at least hypothetically, there is a way to make this process entirely "clean".

[–] hikaru755@feddit.de 6 points 4 months ago (2 children)

There are legit, non-CSAM types of images that would still make these changes apparent, though. Not every picture of a naked child is CSAM. Family photos from the beach, photos in biology textbooks, even comic-style illustrated children's books will allow inferences about what real humans look like. So no, I don't think that an image generation model has to be trained on any CSAM in order to be able to produce convincing CSAM.
