Now I want to mark this site (in Google, say) with a "high quality" measure.

Then I want my search results to prefer this page, and other pages marked high for quality by people who [some algorithm] to this page.

Then Google would be far more useful to me than it is now, when screeds of blogspam always seem to occupy the top organic results for any "how to" tech questions.

In turn I would be a happier and more loyal Google customer (not a high bar).



And then less reputable websites will hire consultants and influencers to mark their sites as high quality. And then click farms will appear in India and China to mark websites as high quality for a price. And that is why we cannot have nice stuff / nice search results...


It's not about who else marked this as high quality. I want my signal to count (disproportionally).


YouTube has what you want and it sucks. You end up in an echo chamber of your own preferences. I know what you mean, but sometimes you're wrong about the world and what you think you know, so it makes sense to make your preferences compete with others'.


YouTube can be house-trained to occasionally bring you something interesting. It does take effort, but it's not as bleak as the complaints I've seen make it out to be.


This is something that would be really cool to have and I think it would work if you could set up a proper incentive structure so it doesn't get gamed.

Some comments I had on this a little while back: https://news.ycombinator.com/item?id=20282606


You're contradicting your first comment:

> I want my search results to prefer [.....] other pages marked high for quality by people who [some algorithm] to this page.

Facebook/YouTube already do this and it is a disaster, one which may even have indirectly contributed to the rise of antivax, Q, the Trump election, the Capitol attack, etc.

Before personalized recommendation systems, when an antivaxxer made a video, hardly anyone saw it; when someone liked it, the like led to nothing.

Once their likes started affecting their personal search results, liking one antivax video made search produce hundreds more of them. It surfaces so much of the same stuff that they no longer have the free time to check an opposing opinion.

You're operating under the assumption that your judgement is more correct or better for you, and that the entire problem is finding gems in an ocean of trash. The reality is more complicated: to you, antivax content is trash; to them, it is your content that is trash.


I see what you mean, but like the OP, I would rather have Google search prefer resources that I like over resources that everyone else on the internet likes.

Like, if I think Wikipedia is a better resource than Crunchbase, then let me make that decision. I don't care how good their SEO is.


This is not really whataboutism; there aren't two equivalent sides here.

And in any case, I don't care. I know what I feel is good content and it's stuff like this, and I want more of it, and less blogspam.

So yes, my assumption is that my judgement is more correct for me - because it is.


> my judgement is more correct for me - because it is

That is plain arrogant. Not only do you refuse to accept that you may be wrong, you also want search engines to protect you from ever discovering that.

What if your neighbor likes something that will eventually hurt you? For example, burning down the nearby 5G tower that serves your phone, because they like the theory that doing so will improve everyone's health. Is their judgement correct for them? Do you want them to get more of the stuff they like? Should their community of like-minded people be assisted by search engines in avoiding (what they think is) spam?

I understand everybody wants less SEO spam; I'm simply pointing out that the solution you're thinking of has already been tried and was found to have consequences no one expected. "Give me more stuff that I like" is the old problem. The new problem is "how to find more stuff that I like without creating an echo chamber".


You seem to be all about straw man situations affecting other people - I'm (perhaps selfishly) only interested in the quality of my own search results, and how much better they could be if I could give feedback to the search engine.

If my neighbour burns down the cell tower, or if another neighbour has trapped themselves in an antivax echo chamber on Facebook - well, those are hypotheticals that have literally nothing to do with my point, which is simply that I want better search results, driven by my own selections.


Have you tried making your own Google custom search? You can pick which sites it will search. I'm not sure if you can say "prioritize these sites, but still give me results from other sites too".

I do wish there was something next to each search result though that was like "I like this site, include it more often." or "This site is garbage, ignore it forever"

I think it was this thing:

https://programmablesearchengine.google.com/about/


I subscribe to sites I like in an RSS reader. Then you can search your feeds, and if that fails, you can broaden the search (e.g. to Google).
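
Something like that workflow can be scripted in a few lines. Here's a rough sketch of the "search my feeds first, then fall back" idea, assuming the feedparser library is installed; the feed URLs and query are placeholders:

    # search_feeds.py - grep my own RSS subscriptions before falling back to a web search
    import feedparser

    FEEDS = [
        "https://example.com/blog/rss",   # placeholders: your subscribed feeds
        "https://example.org/atom.xml",
    ]

    def search_feeds(query):
        q = query.lower()
        hits = []
        for url in FEEDS:
            for entry in feedparser.parse(url).entries:
                text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
                if q in text:
                    hits.append((entry.get("title"), entry.get("link")))
        return hits

    if __name__ == "__main__":
        results = search_feeds("rust async")
        for title, link in results:
            print(title, link)
        if not results:
            print("no hits in my feeds, broaden the search (e.g. Google)")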


How do you rate the quality of things on Facebook?

My app only allows me to "like" things and I like some stuff that is low quality because I like to see it in my feed.


I think what they're going for is more personalized results -- i.e., don't aggregate my preferences with those of a bunch of click farms.


Let everyone mark sites as high or low quality, and then count other people's rankings in proportion to how well they correlate with your own.

(A vague idea I've wanted to implement for a while now, for both search engines and HN/Reddit-like sites... but the amount of effort to do it well would be really high.)
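
For what it's worth, the core of the idea is small. A minimal sketch of the correlation-weighted voting, with made-up users, sites, and +1/-1 ratings (a real system would need a proper correlation measure, spam resistance, and scale):

    # Weight each other user's ratings by how often they agree with mine on
    # sites we've both rated, then score sites I haven't rated yet.
    MY_RATINGS = {"wikipedia.org": 1, "blogspam.example": -1, "goodblog.example": 1}

    OTHER_USERS = {
        "alice": {"wikipedia.org": 1, "blogspam.example": -1, "newsite.example": 1},
        "bob":   {"wikipedia.org": -1, "goodblog.example": -1, "newsite.example": 1},
    }

    def agreement(mine, theirs):
        """Crude correlation: (agreements - disagreements) / shared sites, in [-1, 1]."""
        shared = set(mine) & set(theirs)
        if not shared:
            return 0.0
        return sum(1 if mine[s] == theirs[s] else -1 for s in shared) / len(shared)

    def personalized_score(site):
        """Sum of other users' ratings for `site`, each weighted by agreement with me."""
        return sum(
            agreement(MY_RATINGS, ratings) * ratings[site]
            for ratings in OTHER_USERS.values()
            if site in ratings
        )

    # alice always agrees with me (weight 1.0), bob always disagrees (weight -1.0),
    # so their opposite-weighted votes for the new site cancel out to 0.0.
    print(personalized_score("newsite.example"))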


I would guess that, apart from the immense effort of building it, delivering personalised search results like this would be enormously expensive in storage for the search engine. Much more than sorting people into a few cohorts/buckets.

But FFS, it's 2021, we deserve some decent search engine results.

I doubt Google would do it unless they absolutely had to, so I hope you or someone else forces their hand and shows them that it's time.


> I would guess that, apart from the immense effort of building it, delivering personalised search results like this would be enormously expensive in storage for the search engine.

How expensive would it really be?

You have O(the_internet) in pages and metadata, and you have O(world_population) in user preferences. So long as your index structure allows those to be mostly decoupled (if I had to take a first crack at it I'd probably try to embed preferences and pages into a vector space and build a projection index -- exact matches are hard in that system, but decent personalized results are easy), I don't think it'd be all that much more space than a non-personalized search engine, especially given that the world population is kind of small compared to the size of the internet.
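
A toy version of that decoupled layout is easy to write down. The embeddings below are random stand-ins, and a real engine would use learned vectors plus an approximate-nearest-neighbor index rather than brute-force dot products:

    # Pages and users share one vector space: page embeddings are stored once
    # (O(the_internet)) and each user adds only a single small preference vector
    # (O(world_population)).
    import numpy as np

    DIM = 64
    rng = np.random.default_rng(0)

    page_vecs = rng.normal(size=(1_000, DIM))          # one row per crawled page
    page_vecs /= np.linalg.norm(page_vecs, axis=1, keepdims=True)

    user_vec = rng.normal(size=DIM)                    # one small vector per user
    user_vec /= np.linalg.norm(user_vec)

    def personalized_rank(base_scores, alpha=0.3, k=10):
        """Blend ordinary query relevance with similarity to the user's tastes."""
        personal = page_vecs @ user_vec                # cosine similarity to my preferences
        blended = (1 - alpha) * base_scores + alpha * personal
        return np.argsort(blended)[::-1][:k]           # indices of the top-k pages

    base_scores = rng.random(1_000)                    # stand-in for the normal query ranking
    print(personalized_rank(base_scores))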

For that matter, the web isn't thaaat big (ignoring images and video). The entire Common Crawl can fit on a single $3k-$5k disk uncompressed.


A web-of-trust approach might help with that.

Select your own trusted reviewers, and unsubscribe from any that start recommending garbage.
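
In code, the explicit-trust version is almost trivial; a sketch with made-up reviewer names and URLs (the hard parts, transitive trust and discovering reviewers in the first place, are left out):

    # Follow a handful of reviewers; drop any whose recommendations turn to garbage.
    TRUSTED = {"alice", "carol"}                 # reviewers I currently trust

    RECOMMENDATIONS = {
        "alice": ["https://goodblog.example/post"],
        "bob":   ["https://blogspam.example/10-best-things"],
        "carol": ["https://docs.example/howto"],
    }

    def trusted_links():
        return [url
                for who, urls in RECOMMENDATIONS.items() if who in TRUSTED
                for url in urls]

    TRUSTED.discard("carol")                     # "unsubscribe" when they go downhill
    print(trusted_links())                       # only alice's link remains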


Something like this would feel like a small first step, maybe the other way around though.

I'd like to put a timeout on any blogspam websites I come across, and not have them show up in my search results again for one year. By that time they might have switched to having decent content, or maybe just gone down the toilet anyway or been pushed down by Google's algorithms. Or if not, I could banhammer them again.
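
That could live entirely client-side, e.g. in a browser extension. A sketch of the one-year timeout list (file name and domains are placeholders):

    # Keep a per-domain expiry timestamp and filter search results against it.
    import json, time
    from urllib.parse import urlparse

    BAN_FILE = "banned_domains.json"             # {"domain": unix expiry timestamp}
    ONE_YEAR = 365 * 24 * 3600

    def load_bans():
        try:
            with open(BAN_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return {}

    def ban(domain, bans):
        bans[domain] = time.time() + ONE_YEAR    # re-banning just resets the clock
        with open(BAN_FILE, "w") as f:
            json.dump(bans, f)

    def filter_results(urls, bans):
        now = time.time()
        return [u for u in urls if bans.get(urlparse(u).netloc, 0) < now]

    bans = load_bans()
    ban("blogspam.example", bans)
    print(filter_results(["https://blogspam.example/top-10",
                          "https://goodblog.example/post"], bans))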


Maybe you want to set up a programmable search engine? https://support.google.com/programmable-search/answer/451388...

Haven't tried it myself, but I remember using a predecessor to this maybe 15 years ago.

"You may want to augment your results with general Web Search results. This includes results from anywhere on the web, but places emphasis on your personalized results"


That's interesting, but it lost me at "all you have to do is choose which sites to search".


I was hoping you would just need to add URLs to it.

I'm guessing the right way to handle this problem is to search your browser history or bookmarks, but those are additional searches.


I'd love something like this.

I've managed to get partway there by using ArchiveBox, feeding it my bookmarks, and then using Sonic to search through them.

Would be great to have this more streamlined into a normal Google/DuckDuckGo/whatever search workflow, with an option to also search your friends' bookmarks.
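
For a rough idea of what the "search my own stuff first" step looks like without ArchiveBox/Sonic at all, here is a stdlib-only sketch that greps the bookmarks.html export every major browser can produce (file name and query are placeholders):

    # Extract (title, url) pairs from a Netscape-format bookmarks export and
    # do a naive substring search over them.
    from html.parser import HTMLParser

    class BookmarkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links, self._href = [], None

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href")

        def handle_data(self, data):
            if self._href:                        # text inside the current <a> tag
                self.links.append((data.strip(), self._href))
                self._href = None

    def search_bookmarks(path, query):
        parser = BookmarkParser()
        with open(path, encoding="utf-8") as f:
            parser.feed(f.read())
        q = query.lower()
        return [(t, u) for t, u in parser.links if q in t.lower() or q in u.lower()]

    print(search_bookmarks("bookmarks.html", "postgres"))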



This instinct is so right, but the multitude of comments shows how hard a problem this is. I'd be elated to work on something that solves this, but I think the answer may lie in self-regulation.

We software people are generally mini-philosophers and it can be easy to lose sight of ourselves. But can you or I solve Search?

I guess what I've learned before replying is that it is often a high bar, whether we recognize it as such or not.



