‘Eugenics on steroids’: the toxic and contested legacy of Oxford’s Future of Humanity Institute
(www.theguardian.com)
This is the best summary I could come up with:
The institute, which was dedicated to studying existential risks to humanity, was founded in 2005 by the Swedish-born philosopher Nick Bostrom and quickly made a name for itself beyond academic circles – particularly in Silicon Valley, where a number of tech billionaires sang its praises and provided financial support.
Bostrom is perhaps best known for his bestselling 2014 book Superintelligence, which warned of the existential dangers of artificial intelligence, but he also gained widespread recognition for his 2003 academic paper “Are You Living in a Computer Simulation?”.
His office, located in a medieval backstreet, was a typically cramped Oxford affair, and it would have been easy to dismiss the institute as a whimsical undertaking, an eccentric, if laudable, field of study for those, like Bostrom, with a penchant for science fiction.
Among the other ideas and movements that have emerged from the FHI are longtermism – the notion that humanity should prioritise the needs of the distant future because it theoretically contains hugely more lives than the present – and effective altruism (EA), a utilitarian approach to maximising global good.
Fifteen months ago, Bostrom was forced to issue an apology for comments he had made in a group email back in 1996, when he was a 23-year-old postgraduate student at the London School of Economics.
Just a month before Bostrom’s incendiary comments came to light, the cryptocurrency entrepreneur Sam Bankman-Fried was extradited from the Bahamas to face charges in the US relating to a multibillion-dollar fraud.
The original article contains 1,246 words, the summary contains 245 words. Saved 80%. I'm a bot and I'm open source!