this post was submitted on 07 Jun 2024
1229 points (92.8% liked)

Programmer Humor

36161 readers
327 users here now

Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

founded 5 years ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[โ€“] eestileib@sh.itjust.works 29 points 1 year ago (1 children)

LLM system input is unsanitizable, according to NVidia:

The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.

https://developer.nvidia.com/blog/securing-llm-systems-against-prompt-injection/

[โ€“] MalReynolds@slrpnk.net 2 points 1 year ago

Everything old is new again (GIGO)