[–] Truck_kun@beehaw.org 1 points 11 months ago* (last edited 11 months ago)

Well, maybe Google could add a curated feature (not curated by them, that would suck), whereby users can publish lists of trusted sites to search, and a user can optionally select a curated list from someone they trust, and Google will only search sites on that list.

Possibly allow multiplexing of lists.

So say I am looking into computer security: I could use a curated list of sites Steve Gibson trusts, plus a list of trustworthy sources Bleeping Computer uses, and anything I search for would use both lists as the base for the search.
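As a rough illustration (a Python sketch, with made-up list contents), "using both lists as a base" could be as simple as unioning the trusted domains and restricting the query with site: operators:

```python
# Minimal sketch: combine two published trust lists into one
# site-restricted query. The list contents below are made up;
# site: is standard search-operator syntax.

def build_restricted_query(terms: str, *trust_lists: list[str]) -> str:
    """Union the trusted domains and append a site: filter."""
    sites = sorted({site for lst in trust_lists for site in lst})
    site_filter = " OR ".join(f"site:{s}" for s in sites)
    return f"{terms} ({site_filter})"

gibson_list = ["grc.com", "arstechnica.com"]           # hypothetical
bleeping_list = ["bleepingcomputer.com", "grc.com"]    # hypothetical

print(build_restricted_query("buffer overflow mitigation",
                             gibson_list, bleeping_list))
# -> buffer overflow mitigation (site:arstechnica.com OR
#    site:bleepingcomputer.com OR site:grc.com)
```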

Maybe it isn't something people even publish to the search engine; maybe they publish a file on their own site that people can point the search engine to, like (in Steve Gibson's case) the fictitious file grc.com/search.sources, or create a new file format like .cse (curated search engine): grc.com/index.cse

Maybe allow individual lists to multiplex other lists. Something like this, which multiplexes two lists and adds some additional sites, subdomains, directories, and a * wildcard for all subdomains (a parsing sketch follows the list):

multiplex: grc.com/search.cse
multiplex: bleepingcomputer.com/search.sources
arstechnica.com
*.ycombinator.com
stackoverflow.com
security.samesite.com
linux.samesite.com
differentsite.com/security
differentsite.com/linux
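To make the format concrete, here is a rough sketch of how a crawler might flatten a file like that into an allowlist. Everything in it is an assumption built from the comment above: the .cse file name, the multiplex: directive, and the pattern rules (*. for any subdomain, a path for a directory restriction, a bare domain for an exact host).

```python
# Rough sketch of resolving a hypothetical .cse file into a flat
# allowlist. The file name, the "multiplex:" directive, and the
# pattern rules are all assumptions taken from the comment above.
from urllib.request import urlopen

def resolve_cse(url: str, seen: set[str] | None = None) -> set[str]:
    """Fetch a list file and recursively expand multiplex: lines."""
    seen = set() if seen is None else seen
    if url in seen:                      # guard against multiplex cycles
        return set()
    seen.add(url)

    with urlopen(f"https://{url}") as resp:
        lines = resp.read().decode().splitlines()

    sites: set[str] = set()
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("multiplex:"):
            sites |= resolve_cse(line.split(":", 1)[1].strip(), seen)
        else:
            sites.add(line)              # domain, *.wildcard, or path
    return sites

def allowed(result_url: str, allowlist: set[str]) -> bool:
    """Check a result's host and path against the allowlist patterns."""
    host_path = result_url.split("://", 1)[-1]
    host = host_path.split("/", 1)[0]
    for pattern in allowlist:
        if pattern.startswith("*."):     # *.example.com: any subdomain
            if host.endswith(pattern[1:]):
                return True
        elif "/" in pattern:             # example.com/dir: path prefix
            if host_path.startswith(pattern):
                return True
        elif host == pattern:            # bare domain: exact host
            return True
    return False
```

A real search engine would presumably cache these files and re-crawl them periodically, the way it does robots.txt, but that's the general shape.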

Honestly this sounds like a horrible idea, but in a world flooded with AI-generated content, it may become a necessity.

Anyways, I officially put the above idea into the public domain. Anyone can use or modify it; feel free, Google/Bing.

EDIT: It was putting all the fake addresses on the same line, so I'm trying to force them onto separate lines.

[–] Truck_kun@beehaw.org 1 points 11 months ago

Apparently in the time I put thought into this, typed it up, changed things, etc., someone else posted a curating idea, so maybe it's not such a bad idea after all. An AI-content internet is going to suck.

To expand on why it sounds like a horrible idea: mainly, if people rely too much on it, it creates a bubble and limits the ability to discover new things or ideas outside that bubble. But if everything outside the bubble just sucks or is inaccurate, meh, what are you going to do? Especially if you are researching something you are working on, whether a paper, a project, or something with dire financial or safety consequences if you get it wrong, you may need the information to be reliable.