> At one point, Google ran a test that pitted its search engineers against RankBrain. Both were asked to look at various web pages and predict which would rank highest on a Google search results page. RankBrain was right 80 percent of the time. The engineers were right 70 percent of the time.
I don't really understand the point of this metric. Why are they predicting what ranks highest on Google search? Wouldn't a better metric be who predicts the correct place a user was looking for?
Is the thinking that if they are using machine learning, then whatever the user is looking for should have bubbled up to the top?
These days I am getting the impression that whatever Google is doing, it is focused more and more on presenting Google's biggest clients to the user and hoping that will be useful.
I am having more and more trouble finding what I want. People used to consider me a master of Google-fu, able to find whatever random stuff they wanted; now I am struggling, especially after Google changed the meanings of + and "" (+ went from meaning "mandatory" to triggering a Google+ search, and "" went from meaning "literal string" to meaning "a sort of mandatory thing").
If I need to find some obscure term, I know now that Google won't find it, despite having found that same term in the past. Finding pages with a certain piece of information on them never happens anymore, even using the "" trick.
For example, I own an ASUS N46VM laptop with nVidia Optimus. This laptop is terrible, and I am always having to look online for how to make it behave properly. Before the "+" change, I could type +N46VM and be guaranteed I would only get relevant information. Recently I was desperately searching for something and found that no matter what I typed into Google, it returned completely bogus results where the string N46VM was nowhere on the page, not even in the "time" dimension (i.e. if I load the page on archive.org and scan every version of it, N46VM has never been on it; Google just heuristically decided the page was relevant and wrongly gave it to me).
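(That "scan every archived version" check can be scripted against the Wayback Machine's CDX API. The sketch below is only an illustration of the idea, not something from the original comment; the page URL is a placeholder and the endpoint/parameters are assumptions based on the public CDX API.)

```python
# Hypothetical sketch: check whether a term ever appeared in any archived
# version of a page, using the Wayback Machine CDX API.
# The page URL below is a placeholder; endpoint and fields are assumptions.
import json
import urllib.parse
import urllib.request

def term_ever_in_page(page_url: str, term: str) -> bool:
    # List all archived snapshots of the page (JSON output, first row = header).
    cdx = "http://web.archive.org/cdx/search/cdx?" + urllib.parse.urlencode(
        {"url": page_url, "output": "json", "fl": "timestamp,original"}
    )
    with urllib.request.urlopen(cdx) as resp:
        rows = json.load(resp)
    for timestamp, original in rows[1:]:
        snapshot = f"http://web.archive.org/web/{timestamp}/{original}"
        with urllib.request.urlopen(snapshot) as resp:
            if term.lower() in resp.read().decode("utf-8", "replace").lower():
                return True
    return False

print(term_ever_in_page("example.com/some-page", "N46VM"))
```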
EDIT: I am having some success with DuckDuckGo
Although their search system is clearly cruder than Google's, with far fewer heuristics and whatnot, I frequently find the stuff I want more easily on DuckDuckGo anyway, after browsing a few pages of results. On Google I browse 40 pages and all of it is completely irrelevant and unrelated, while DuckDuckGo shows me 40 pages containing the term I want, just in the wrong context.
Same experience. Changing the special characters' meanings ruined my google-fu, and starting around ~'08-'09 they seemed to give up fighting linkfarm sites and just stopped ever ranking small sites highly unless you search for something so specific that only one small site could possibly match.
It's made the web feel way, way smaller than it used to, and probably has made it smaller by starving small-but-relevant sites of traffic. Meanwhile any content site (news sites and such) that appears on the first page of results is now virtually indistinguishable from the horrible old link farms, in terms of screen real estate devoted to scummy ads.
The Old Web had popups and such, but even so the modern web feels way dirtier, somehow.
>"It's made the web feel way, way smaller than it used to, and probably has made it smaller by starving small-but-relevant sites of traffic. "
That's the beauty of it (for Google, anyway). Now those small, traffic-starved sites are dependent on getting traffic via paid AdWords. Either that, or they have to rely on "organic" social-media cruft to get anywhere meaningful.
There is a "verbatim" mode of search that may help in these cases, which looks like it turns off a bunch of search heuristics. When you get your result, there's a "Search tools" button that reveals a dropdown that defaults to "All results". In that dropdown is a "Verbatim" option. Try that.
You can add the query parameter "tbs=li:1" to get verbatim results right away.
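A quick sketch of tacking that parameter onto a search URL (the query string here is just an example; only the tbs=li:1 parameter comes from the comment above):

```python
# Minimal sketch: build a Google search URL with verbatim mode enabled
# via the tbs=li:1 query parameter. The example query is a placeholder.
from urllib.parse import urlencode

def verbatim_search_url(query: str) -> str:
    return "https://www.google.com/search?" + urlencode(
        {"q": query, "tbs": "li:1"}  # li:1 = verbatim results
    )

print(verbatim_search_url('"N46VM" nvidia optimus'))
# -> https://www.google.com/search?q=%22N46VM%22+nvidia+optimus&tbs=li%3A1
```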
> "verbatim" mode... which looks like it turns off a bunch of search heuristics.
Ugh. This sounds like one of those families of PHP escaping functions: `escape_string()`, `really_escape_string()`, `escape_string_all_the_way()`, `no_really_i_mean_it_this_time_escape_string()`, etc. Google seems to have improved somewhat at fighting SEO spam, but their efforts to "helpfully" change queries have consistently made their service worse.
I'm with you and I have a pet theory why this is so:
When Google was new I found it strange that when I looked up "Apache" pages about the web server appeared before pages about the Native American tribes.
I was happy because the webserver was what I was looking for, but I found it strange because this was not what most people around me would have expected.
DuckDuckGo is like the old Google, it's great for tech and science queries. It's great for masters of the old Google fu.
Google on the other hand tries to cater for a huge and diverse audience with different expectations from the search results.
Some would expect information about the web server, some about Native American tribes. As Google's user base grew, improving overall search quality meant worse results for some specific user groups.
Google tries to solve this problem with personalized search.
My pet theory is that Google puts almost all of its effort into personalized search. Regular search is probably not so important to them anymore, because almost everyone is logged into Google or at least has a long-lived Google cookie.
tl;dr DuckDuckGo is the old Google, the new Google is only good with personalized search
I feel the same way. I used to be able to find very hard-to-find things on Google. Now it's hard to find relatively common things using Google.
I don't know if it's because they've deprecated some search operators, or the index has been nerfed by some invisible changes, or I'm just remembering things as better than they were.
It is definitely worse. My intuition is that it's driven both by a conscious decision to care more about the masses than about power users, and by a desire to drive users to partner/customer sites. There is a conflict of interest here: it's in Google's interest to push users to the sites that generate the most revenue for Google, whether through advertising or otherwise.
If they are interested in ranking results based on the profile they have on you (search history, interests, demographics, and so on), trying to predict the rankings is understandable.