If one wants to start talking cosmology, it's unlikely to be the case that arbitrarily long-lived computers are possible. I don't think any of the theories in [0] are conducive to either an infinite-time or infinite-memory computer, so the strict mathematical definition of Big-O doesn't hold up. IMO it's better to use Big-O as an effective theory for predicting runtime on human-scale computers than to take the mathematical formalism too literally.
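To be concrete about which definition I mean (the standard textbook one, nothing specific to this thread): Big-O quantifies over all sufficiently large n with no upper cutoff, which is exactly the part that breaks if only finite computers are physically possible.

  % Standard asymptotic definition of Big-O (textbook form)
  f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0:\ \forall n \ge n_0,\ f(n) \le c \cdot g(n)

The "for all n >= n_0" clause is what assumes arbitrarily large inputs; the effective-theory reading just restricts it to the n you can actually run.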
That doesn't apply to the Bekenstein bound, though.
Literally the first line of the Wikipedia article:
> In physics, the Bekenstein bound (named after Jacob Bekenstein) is an upper limit on the thermodynamic entropy S, or Shannon entropy H, that can be contained within a given *finite* region of space which has a *finite* amount of energy—or equivalently, the maximum amount of information that is required to perfectly describe a given physical system down to the quantum level.
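For reference, the bound being quoted is usually written as follows (standard form; my addition, not part of the quote):

  % Bekenstein bound: entropy S of a system with total energy E enclosed in a sphere of radius R
  S \le \frac{2 \pi k R E}{\hbar c}

so finite R and finite E give a finite entropy, i.e. finitely many distinguishable states.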
I mean, if you want to talk about our actual tech, it's bound by the lithography of silicon chips, which are largely n^2, on printed circuit boards, which are n^2, on the surface area of the Earth, which is n^2.
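A back-of-the-envelope way to see that chained area scaling (my own sketch, nothing rigorous, with l standing in for feature size):

  % Counting devices by stacking area ratios: Earth surface -> boards -> chips -> features
  N \sim \frac{A_{\mathrm{Earth}}}{A_{\mathrm{board}}} \cdot \frac{A_{\mathrm{board}}}{A_{\mathrm{chip}}} \cdot \frac{A_{\mathrm{chip}}}{l^{2}} = \frac{A_{\mathrm{Earth}}}{l^{2}}

Every factor is an area divided by an area, so the whole stack scales with surface area rather than volume.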
Okay, then we just use a bunch of tiny black holes and pack some extra dark energy between them, and we can get back to volumetric scaling. In fact, since dark matter isn't yet fully understood, once we can harness it we can fit several times as much black hole surface area in the same space as a single black hole.
[0] https://en.wikipedia.org/wiki/Ultimate_fate_of_the_universe?...