Do you mean like ghosts or like quantum randomness and Heisenberg's uncertainty principle?
We cannot compute exactly what happens because we don't know what it is, and there's randomness. Superdeterminism is a common cop-out here. However, when I talk about whether something is computable, I mean whether that interaction produces a result more complicated than a Turing-complete computer can produce. If it's random, it can't be predicted. So perhaps a more precise statement is: my default assumption is that "similar" enough realities or sequences of events can be computed, given access to randomness, where "similar" means the simulation cannot be distinguished from reality by any means.
The last digit of pi doesn't exist, since pi is irrational. Chaitin's constant, later busy beaver numbers, or any number of functions may be uncomputable, but since they are uncomputable, I'd assume their realizations don't exist. Sure, we can talk about the concept, and they have a meaning in the formal system, but that's precisely my point: they don't exist in this world. They exist only as ideas.
Say, for instance, that you could arrange quarks in some way and out pops, from the fabric of the universe, a way to find the next busy beaver numbers. Well, we'd be really sorry then, not least because "computable" would turn out to be a misnomer in the formalism, and we'd have to call this clever party trick "mega"-computable. We'd have discovered something that exists beyond Turing machines; we'd have discovered, say, a "Turing oracle". Then we'd be able to "mega"-compute these constants. Another reason we'd be really sorry is that it would break all our crypto.
However, that's different from the "idea of Chaitin's constant" existing. That is, the idea exists, but we can't compute the actual constant itself; we only have a metaphor for it.
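To make that concrete, here's a minimal Python sketch (purely my own illustration, not anything from the formal literature) of why computing Chaitin's constant reduces to the halting problem: the `halts` function below is exactly the "Turing oracle" that only exists as an idea.

```python
# Hypothetical sketch: approximating Chaitin's constant would require a
# halting oracle. Omega is (roughly, ignoring the prefix-free detail)
# the sum of 2^-len(p) over all halting programs p.

def halts(program: str) -> bool:
    # This is the "Turing oracle": no Turing machine can implement it
    # for all inputs, because the halting problem is undecidable.
    raise NotImplementedError("only exists as an idea")

def omega_up_to(max_bits: int) -> float:
    total = 0.0
    for length in range(1, max_bits + 1):
        for n in range(2 ** length):
            program = format(n, f"0{length}b")  # enumerate bit strings of this length
            if halts(program):                  # the one uncomputable step
                total += 2.0 ** -length
    return total
```

Every line of that is ordinary, computable bookkeeping except the single call to `halts`; that missing piece is the gap between the idea of the constant and its realization.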
OpenGL and DirectX before version 12 were attempts at unifying video programming in an abstract way. It turned out that trying to abstract away what the low-level hardware was doing was more harmful than beneficial.
> It turned out that trying to abstract away what the low-level hardware was doing was more harmful than beneficial.
Abstraction isn’t inherently problematic, but the _wrong_ abstraction is. Just because abstraction is hard to do well doesn’t mean we shouldn’t try, and just because it gets in the way of certain applications doesn’t mean it isn’t useful in others.
Not to say nobody is trying, but there’s a bit of a catch-22 where those most qualified to do something about it don’t see a problem with the status quo. This sort of thing happens in many technical fields, but I just have to pick on GPU programming because I’ve felt this pain for decades now and it hasn’t really budged.
Part of the problem is probably that the applications for GPUs have broadened and changed dramatically in the last decade or so, so it’s understandable that this moves slowly. I just want more people on the inside to acknowledge the problem.
I think there are usually two: calculus for scientists and engineers, which is analytical and has lots of symbols, and calculus for everyone else, which is more practical.
Math majors might have their own. I also know they end up taking complex calculus.
Thinking about it, ours was a small college -- 2500 students. So there may have been a practical reason for everybody taking the same math courses. They were taught more as "service" courses for the sciences and engineering than as theoretical math courses. And the students who didn't need calculus typically satisfied their math requirement with a statistics course.
Complex analysis and real analysis were among the higher-level courses, attended mostly by math majors, with the proviso that there were a lot of double majors. That was where it got interesting.
The requirements for the physics major were only a handful of math credits shy of the math major.
>The requirements for the physics major were only a handful of math credits shy of the math major.
lol, that's how I ended up with a math major. Got lost in the physics (realized I had no intuition for what was actually happening, just manipulating equations), took a couple extra courses, and boom! Math!
They were not in the storage market. They were in the tape market. It just so happens that tape was used for storage at the time (a floppy is essentially tape in a circular shape).