armchair_progamer

joined 1 year ago
[–] armchair_progamer@programming.dev 5 points 2 months ago (2 children)

But is it rewritten in Rust?

[–] armchair_progamer@programming.dev 78 points 3 months ago (2 children)

“I’ve got 10 years of googling experience”.

“Sorry, we only accept candidates with 12 years of googling experience”.

[–] armchair_progamer@programming.dev 29 points 5 months ago* (last edited 5 months ago)

C++’s mascot is an obese sick rat with a missing foot*, because it has 1000+ line compiler errors (the stress makes you overeat and damages your immune system) and footguns.

EDIT: Source (I didn't make up the C++ part)

[–] armchair_progamer@programming.dev 51 points 6 months ago* (last edited 6 months ago) (6 children)
public class AbstractBeanVisitorStrategyFactoryBuilderIteratorAdapterProviderObserverGeneratorDecorator {
    // boilerplate goes here
}

A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.

The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis. It asks users to enter various ingredients they have at home and auto-generates a meal plan or recipe, along with cheery commentary. It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.

When customers began experimenting with entering a wider range of household shopping list items into the app, however, it began to make even less appealing recommendations. One recipe it dubbed “aromatic water mix” would create chlorine gas. The bot recommends the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses”.

“Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

New Zealand political commentator Liam Hehir posted the “recipe” to Twitter, prompting other New Zealanders to experiment and share their results on social media. Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.

A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”. In a statement, they said that the supermarket would “keep fine tuning our controls” of the bot to ensure it was safe and useful, and noted that the bot has terms and conditions stating that users should be over 18.

A warning notice appended to the meal-planner states that the recipes “are not reviewed by a human being” and that the company does not guarantee “that any recipe will be a complete or balanced meal, or suitable for consumption”.

“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot,” it said.

From https://www.reddit.com/r/NoStupidQuestions/comments/14phpbq/how_is_it_possible_that_roughly_50_of_americans/

The question above is pretty blunt, but I was doing a study for a college course and came across that stat. How is that possible? My high school sucked, but even with that substandard level of education I was well equipped for college. Obviously income is a thing, but to think 1 out of 5 American adults is categorized as illiterate is…astounding. Now poor media literacy I get, but not this.

Edit: this was from a Department of Education report from 2022, just in case people are curious where that comes from. It does also specify literacy in English, so maybe it's not as grim as I thought.