this post was submitted on 30 Mar 2024
543 points (96.4% liked)
Not The Onion
Welcome
We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!
The Rules
Posts must be:
- Links to news stories from...
- ...credible sources, with...
- ...their original headlines, that...
- ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Comments must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.
And that’s basically it!
you are viewing a single comment's thread
The report actually suggests a new bias and neutrality editing framework with its own edit history, unrelated to existing content editing tools.
In other words, the argument is that the current editing framework does not do enough to specifically address bias and neutrality. That seems pretty clear to me regardless of current events.
I know edits both to add and to correct bias do happen. I agree it would be nice if power editors, at least, were not anonymous. I wish there were a Wikipedia that could only be edited by verified, trusted experts. The potential is there with the fediverse. And in fact I thought Wikipedia was working on this. I requested an invite but never got one.
Such edits for neutrality (as well as to insert bias) are made. There is a history. It is talked about and recorded. It is searchable. It is distributed. Man, you should hear these Wikipedia editors talk to each other if you haven't, it's like a different language.
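If you've never poked at that history yourself, it's even exposed through the public MediaWiki API. Here's a rough little Python sketch (the page title and User-Agent string are just placeholders I picked) that pulls the last few edits for an article, with who made them and their edit summaries:

```python
# Rough sketch: peek at the public revision history of a Wikipedia article
# via the standard MediaWiki API. The page title below is just a placeholder.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Wikipedia:Neutral point of view",  # any article title works
    "rvprop": "ids|timestamp|user|comment",       # who, when, and the edit summary
    "rvlimit": 5,                                 # last five edits
    "format": "json",
    "formatversion": 2,
}

resp = requests.get(API_URL, params=params,
                    headers={"User-Agent": "history-peek/0.1 (example)"})
resp.raise_for_status()

for page in resp.json()["query"]["pages"]:
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```

Every username (or IP), timestamp, and edit summary is sitting right there in the open, which is exactly what all that talk-page arguing is built on.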
Anyway: the source article suggests an extra layer on top of that system, with public standards and criteria supported by research (which it also proposes), and suggests that editors could be monitored for bias against those standards.
I see the potential for draconian abuse but this is one website. As I said, I hoped there would be a fediverse instance to consolidate legitimate expert, factual information. Someone shared a website with me the other day that included such technical analysis for current events. I will link it when I get another minute.
E: here's that link https://www.sciencemediacentre.org
Wikipedia does lock articles so that only editors in good standing can change them. But obviously that's not necessary for every article, because 99% of articles are not political and are in fact about a type of moss that grows in the Canary Islands.
That's what the world is about, so 99% of articles being about that moss makes sense
A Wikipedia written only by verified, trusted experts is called an encyclopedia, and we have those online now. I think there was once a Wikipedia-like online encyclopedia way back in the late 90s or early 2000s that would only allow verified experts in a given subject to edit and create articles. I can't find what I'm talking about atm, but it basically died from lack of participation and only had a hundred or so entries.
The current platform does enough to address bias and neutrality. If you are doing so badly that you want a lopsided view of what you did, you're supposed to fork it and let it die like other free-speech oppressors do, not compile a PDF of stupid suggestions for mainline.
Everything has to be cited to a reputable source, so I don't see why this is a huge problem. As long as they're sourcing their edits and using reputable, verifiable sources, why should it matter whether they're anonymous or not?
Also, reading the 3 pages of recommendations again, I don't think that's what it said:
That sounds like normal editing history for everything to me.
There's also an existing template to mark the talk pages of editors suspected of having a conflict of interest based on their edit history.
A 'pedia written by invite only was Nupedia, which has been dead for a very long time. So basically you meant that the article suggests adding a forked history for a more neutral version? Not sure if that makes it dumber or smarter.
Rather than talk about what Wikipedia should or shouldn't do to improve, people should take the initiative of helping to improve it themselves. Wikipedia is ultimately a collective of its volunteer editors, so the best way of enacting change on the platform is getting more people to make informed, unbiased improvements to articles.
!remindme 2 days
https://www.sciencemediacentre.org/