
So does this mean that something like example.com/password-reset?token_id=2h2GV4nhySERT9pJ may get the random token stripped?

I don't understand how you can have a heuristic that doesn't break things.



I use an addon called NeatURL that strips out tracking parameters.

It has a specific blacklist of parameters to strip. In the several years I've been using it, only two websites have broken because of it, both legitimate surveys I needed to take.


I use a Firefox extension for the same purpose. Its developers maintain a large database of known tracking parameters and strip them. This means a new or unknown parameter occasionally slips through the cracks, but overall it's very effective.
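Not the extension's actual code, but a minimal Python sketch of the list-based approach (the parameter set here is a tiny illustrative subset):

    from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

    # Tiny illustrative subset; real extensions ship lists with
    # hundreds of known tracking parameters.
    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                       "gclid", "fbclid"}

    def strip_tracking(url: str) -> str:
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
                if k not in TRACKING_PARAMS]
        return urlunsplit(parts._replace(query=urlencode(kept)))

    # Unknown parameters survive, which is why the password-reset token
    # from the top comment passes through untouched:
    # strip_tracking("https://example.com/reset?token_id=abc&gclid=xyz")
    # -> "https://example.com/reset?token_id=abc"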


Won’t the tracking companies work around this by giving each account its own set of unique, obfuscated tracking names and keywords that get mapped back behind the scenes? It's impossible to build a database that way.
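Roughly, as a sketch (all the names here are hypothetical): the server issues per-account aliases for its tracking fields and translates them back on arrival.

    import secrets

    CANONICAL_FIELDS = ["click_id", "campaign", "source"]  # hypothetical fields

    def issue_aliases() -> dict:
        # One random alias per tracking field, generated per account.
        return {secrets.token_hex(4): field for field in CANONICAL_FIELDS}

    def resolve(aliases: dict, params: dict) -> dict:
        # Map the obfuscated parameter names back to canonical ones.
        return {aliases[k]: v for k, v in params.items() if k in aliases}

Since every account's parameter names are different, a shared blocklist has nothing stable to match against.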


They already somewhat have.

View the demo in normal mode at https://fingerprint.com/ and then open it again in Incognito.


Or simply generate a whole new URL with a UUID for every share.
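Something like this sketch (the store and URL shape are made up): the shared link carries no query parameters at all, just an opaque ID the server resolves.

    import uuid

    # Hypothetical store: opaque share id -> (destination, sharer)
    shares = {}

    def make_share_url(destination: str, sharer: str) -> str:
        share_id = uuid.uuid4().hex
        shares[share_id] = (destination, sharer)
        return f"https://example.com/s/{share_id}"

There's nothing for a client-side filter to strip; the tracking lives entirely in the server-side lookup.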


Probably one day, yes. And then the cat-and-mouse game will continue.


Not one day. Facebook started this practice a year ago.


I assume it only blocks known tracking parameters, using lists, similar to how content/ad blockers work.


That wouldn't work - people would just start giving their query parameters different names, e.g. instead of "gclid" it might be "dave".



