I doubt the effectiveness of doing this just once. I already go on stints looking for things that aren't in my long-standing interests; my browsing history will correct the aberration afterwards.
IMO, to hide true traffic you'd need to run a constant stream of noise requests, in which the real visits would be indiscernible. And the noise might need to be non-uniform, unpredictable, and still follow the patterns of normal traffic, so it can't be filtered out.
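A rough sketch of what such a noise stream's timing might look like (the decoy URLs are made up, and the log-normal distribution is just one plausible choice for "bursty, human-looking" gaps rather than a uniform tick an adversary could filter out):

```python
import random

# Hypothetical pool of decoy sites; a real version would need a much
# larger, regularly refreshed list resembling actual popular traffic.
DECOY_URLS = [
    "https://example.com/news",
    "https://example.org/video",
    "https://example.net/shop",
]

def noise_schedule(n, seed=None):
    """Return n (delay_seconds, url) pairs for background noise requests.

    Delays are drawn from a log-normal distribution, which is
    heavy-tailed: mostly short gaps with occasional long pauses,
    loosely mimicking human browsing bursts. A fixed interval would
    stick out immediately in traffic analysis.
    """
    rng = random.Random(seed)
    return [(rng.lognormvariate(2.0, 1.0), rng.choice(DECOY_URLS))
            for _ in range(n)]
```

The hard part, as the comment says, isn't generating requests but making the distribution of sites and timings statistically indistinguishable from real visits.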
I propose peer-to-peer networked browsing. Whenever someone else visits a website, it loads in your browser, with your cookies. Their browser anonymously and securely sends the URL to yours, which runs in the background, loading fully rendered web pages headlessly. These URLs never make it into your browsing history; in fact, you never see them at all, for the privacy of the network. They exist only for the trackers.
I thought of something similar some time ago: rather than loading whole pages of sites users don't visit themselves, share cookies for tracking sites, so every now and then my google ad cookies get swapped with someone random, my facebook tracking cookies with someone else, etc. This would get around the "background-visiting something you really don't want anyone thinking you deliberately accessed" problem, because you only load what you would have loaded anyway, just with different tracking cookies each time.
Thinking it through, though, there are potentially massive problems. The first is that I can think of ways to circumvent it to a large extent with deeper fingerprinting, and if I can, no doubt the ad servers could if the system became popular enough for them to care. Other issues would include making sure useful cookies were not swapped out, which would cause both security and usability issues for the participants.
Yeah, swapping cookies would require some sort of predictable swap mechanism. With your idea you don't want to swap out all of your cookies, just the tracking ones, since some websites actually need cookies to function. And of course you also don't want to swap out cookies that contain sensitive information like "first name, last name, address", which some websites may store because they designed their site with the idea that your cookies will not be shared.
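A minimal sketch of that selective swap, assuming cookies can be classified by domain (the tracker-domain set here is a made-up sample; a real version would need a maintained tracker list, and this crude domain split would still miss the sensitive-cookie problem described above):

```python
# Domains whose cookies we treat as pure tracking state.
# A made-up subset for illustration; a real implementation would
# use a maintained tracker-protection list.
TRACKER_DOMAINS = {"doubleclick.net", "facebook.com", "google-analytics.com"}

def split_cookies(jar):
    """Partition a {domain: {name: value}} cookie jar into
    (tracking, functional) halves by domain."""
    tracking = {d: c for d, c in jar.items() if d in TRACKER_DOMAINS}
    functional = {d: c for d, c in jar.items() if d not in TRACKER_DOMAINS}
    return tracking, functional

def swap_tracking(jar_a, jar_b):
    """Exchange only the tracking cookies between two users,
    leaving login/session cookies untouched."""
    track_a, func_a = split_cookies(jar_a)
    track_b, func_b = split_cookies(jar_b)
    return {**func_a, **track_b}, {**func_b, **track_a}
```

Even this toy version shows the classification problem: everything hinges on the tracker list being complete and on no "functional" cookie doubling as a tracking identifier.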
> because they have designed their website with the idea that your cookies will not be shared
Or they were designed by people with little knowledge of relevant security matters.
Or little care for security matters: all too often people throw together a proof-of-concept (naively leaving out thought of concerns like that, as they can be addressed in proper development later if the PoC is successful) then find it being rushed into production without the required review/refactor...
That depends on whether they're tailored to show me products I might be interested in, or misleading information that has been tested to work well to influence people with my political inclinations.
It already exists. It's called AdNauseam. It's an ad blocker that opens every ad in the background. Google has banned it from the Chrome store, so that says something.
If we want to fight tracking, both users and publishers have to see some benefit in it. Right now publishers want first-party tracking and don't care about third-party, while users mostly don't care at all.
I already have to do that. If you're on Firefox, you get treated like a bot and have to solve 10+ puzzles on reCAPTCHA, possibly even just ending up on a "network error" screen regardless.
I don't run with resist fingerprinting, and the problem persists even if I am on a fresh Firefox install with no add-ons and try to log into my Google account.
I enable all that stuff, and the number of reCAPTCHAs, while annoying, is probably limited to like 30% of the sites I visit. And most of those I can live without.
DuckDuckGo might be a cure for this problem. I set it up as the default search engine on my main home PC a few weeks ago, and although I still sometimes have to resort to Google for a few searches, it has gotten remarkably better compared to the past.
> Which has the side effect of requiring you to input a captcha every time you quickly want to google anything.
As I mainly use DDG and considering I get a fuck-you hardest-level (or simply "sending too many requests") recaptcha every time I want to use a site that uses it… What difference does it make? For the few cases where DDG's stupid "let's ignore search terms" decision bites me I can simply use startpage with !sp
No, AdNauseam doesn't work this way, and there are no problems with captcha whatsoever. Other anonymizing solutions might have this effect, but not AN.
The OP is implying that by using AN you might trigger some "bot warnings" and Google will start showing captchas more often. Considering that AN behaves quite like a bot, why wouldn't this be the case?
"Which has the side effect of requiring you to input a captcha every time you quickly want to google anything" - this is a common scenario not with AN but in scenarios when Google doesn't have much data about you - a fresh setup, no Google cookies, blocking cookies, Tor browser etc. AN is orthogonal to that - it simply "clicks" ads, bringing Google more money at the expense of advertisers. I use AN on several computers and observe no above-mentioned side effects when Google has enough data on you. When they don't, they will show the captcha. So it's orthogonal to AN.
Sorry, I have to disagree. I have been using DuckDuckGo for years now, and 60% of the time I add !g to my queries. It is slowly improving over time, but it's not quite there yet.
(Links to the blog post about this - https://blog.mozilla.org/firefox/hey-advertisers-track-this/)