I use an addon called NeatURL that strips out tracking parameters.
It has a specific blacklist of parameters to strip. In the several years I've been using it, I've only had two websites break from it, both being legit surveys that I needed to take.
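The blacklist approach is simple enough to sketch. This is just an illustration of the general idea, not NeatURL's actual code or parameter list (the names below are common tracking parameters, chosen as examples):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative blacklist; a real extension ships a much longer list.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

def strip_tracking(url: str) -> str:
    parts = urlsplit(url)
    # Keep only query parameters whose names are not on the blacklist.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/page?id=42&utm_source=news"))
# → https://example.com/page?id=42
```

Because it only removes exact, known names, anything not on the list passes through untouched, which is why breakage is rare but unknown parameters slip by.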
I use a Firefox extension for this same purpose. They maintain a large database of known tracker parameters and strip them. This means a new or unknown one occasionally slips through the cracks, but overall it's very effective.
Won't the tracking companies work around this by giving each account its own set of unique, obfuscated tracking parameter names that get mapped back behind the scenes? It would be impossible to build a database against that.
I don't understand how you can have a heuristic that doesn't break things.
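To make the concern concrete, here's a sketch of a naive heuristic (my own illustration, not any particular extension's logic): drop any parameter whose value looks like a long opaque token. It catches per-account random tracking IDs, but it also eats legitimate parameters, like a survey's session token:

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Heuristic: treat any long base64-ish value as a tracking token.
OPAQUE = re.compile(r"^[A-Za-z0-9_-]{20,}$")

def strip_opaque(url: str) -> str:
    parts = urlsplit(url)
    # Keep only parameters whose values do NOT look like opaque tokens.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not OPAQUE.match(v)]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# A survey link whose session token the site genuinely needs:
print(strip_opaque("https://survey.example/take?session=aGVsbG8td29ybGQtZm9vLWJhcg"))
# → https://survey.example/take  (the session param is stripped, breaking the survey)
```

The heuristic can't distinguish a tracking ID from a functional token because, to the client, both are just random-looking strings.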