Is there a slightly less dense version of this for those of us a bit more removed from uni maths somewhere? Willing to read three times as much text to get the same insight. :)
> ... the conditional expected value of the next observation, given all the past observations, is equal to the most recent observation.
The book Introduction to Probability Models by Sheldon Ross is a solid undergraduate-level introduction to this material. Martingales are covered in chapter 10.
It does use plenty of math notation, but none of the abstract machinery used here (probability triples, filtrations, etc.). It might be a bit slow at first as you "reactivate" the mathy parts of your brain, but as long as you read with a pencil and notepad handy you should be able to work through it.
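If it helps to see the quoted definition as something executable before picking up the book, here's a toy Python check (my own sketch, not from the book or the article): a gambler's wealth under fair coin flips is a martingale, so the average next value, given the current one, should come out equal to the current one.

    import random

    # Wealth after n fair +1/-1 coin flips is a martingale:
    # E[X_{n+1} | X_0, ..., X_n] = X_n, since each flip adds 0 on average.
    def walk(n):
        x, path = 0, [0]
        for _ in range(n):
            x += random.choice((-1, 1))
            path.append(x)
        return path

    # Empirical check: among runs where X_10 == 2, average the next value X_11.
    paths = [walk(11) for _ in range(200_000)]
    samples = [p[11] for p in paths if p[10] == 2]
    print(sum(samples) / len(samples))  # ~2.0, i.e. equal to X_10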
Martingales are a representation of a class of "do a random thing 'forever'" problems, with a self-similarity constraint (exactly the quoted property: the expected next value, given everything so far, is the most recent value) and the restriction that your expected winnings stay finite. Those two conditions rule out most problems, but they make the ones that remain vastly more computationally tractable, and because martingales are so well studied, many of those useful properties have already been worked out for you.
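To make "computationally tractable" concrete, here's one classic payoff (again my own sketch, using the standard gambler's-ruin setup rather than anything from the article): because your wealth in a fair game is a martingale, the optional stopping theorem hands you the win probability with no path-counting at all.

    import random

    # Gambler's ruin: start with 3, bet fair +1/-1 until hitting 0 or 10.
    # Wealth is a martingale, so optional stopping says E[X_stop] = X_0:
    #   10 * P(reach 10) + 0 * P(go broke) = 3  =>  P(reach 10) = 0.3
    def play(start=3, top=10):
        x = start
        while 0 < x < top:
            x += random.choice((-1, 1))
        return x == top

    runs = 100_000
    wins = sum(play() for _ in range(runs))
    print(wins / runs)  # ~0.3, as the one-line martingale argument predicts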
The author then mentions a few of those computational properties (like Azuma's inequality) and gives a smattering of problems; directly applying the properties to those problems, and ignoring most of the rest of the article, is a good way to build intuition.
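For instance, Azuma's inequality bounds how far a martingale with bounded steps can wander: if each step changes the value by at most 1, then P(|X_n - X_0| >= t) <= 2*exp(-t^2 / (2n)). A quick numerical sanity check (my own sketch; the particular n and t are just illustrative):

    import math
    import random

    # Azuma's inequality for a martingale whose steps are bounded by 1:
    #   P(|X_n - X_0| >= t) <= 2 * exp(-t**2 / (2 * n))
    # Compare the bound against the empirical tail of a fair random walk.
    n, t, runs = 100, 25, 100_000
    hits = sum(
        abs(sum(random.choice((-1, 1)) for _ in range(n))) >= t
        for _ in range(runs)
    )
    print("empirical tail:", hits / runs)                    # roughly 0.01
    print("Azuma bound:", 2 * math.exp(-t**2 / (2 * n)))     # about 0.088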
The rest of the article is useful mostly because it puts all of the above in a more rigorous context, and because, if you're closer to uni maths, it helps build some intuition for why those restrictions are necessary and what might or might not be achievable under them. There's probably a less dense version out there somewhere, but I'm not quite sure what it would look like. Hopefully this short summary helps you know what to look for when you find it.