Matlab and Python are in the same ballpark. Easy syntax and large standard library. Matlab provides a lot more dedicated libraries for niche areas but the overall experience feels the same.
Mathematica doesn't really have a standard counterpart. Jupyter notebooks try to capture the experience but the support for symbolic expressions makes the Mathematica experience very different.
The funny thing is that an oxygen-rich environment is a hell-hole! Oxygen is insanely reactive and will corrode anything. Even early life on Earth found oxygen toxic. It was released as a waste product by early life, and they were so successful that all that oxygen accumulated, resulting in the Great Oxidation Event (https://en.wikipedia.org/wiki/Great_Oxidation_Event).
That likely resulted in many species going extinct!
And the concentration into BIFs, banded iron formations, was all but certainly the result of biological activity.
Our present technology based on iron and steel owes itself to early life on Earth, from 1.6 to as much as 4 billion years ago. As with petroleum and coal-bed formation, it's a process unlikely to repeat in Earth's future. Iron ores are abundant, but still a finite resource.
Human civilization feels so much more fragile to me since I realized how much we owe our technological progress to the accumulated effects of biological processes over geological timescales. Fossil fuels seem like the most obvious part of this story. If we had to start over "from scratch", would it even be possible? Or have we already so thoroughly exhausted the low-hanging energy stores that a second "industrial revolution" would be effectively impossible if our present civilization collapsed deeply enough?
I wasn't aware that concentrated stores of iron are also an important part of this story!
> Or have we already so thoroughly exhausted the low-hanging energy stores that a second "industrial revolution" would be effectively impossible if our present civilization collapsed deeply enough?
There's plenty of coal left, and we will likely never exploit it, because solar is getting so cheap.
Also, despite long prophecies, peak oil never arrived either. So it doesn't look like we are running out of that stuff.
That's a loss of 2/3 of production to non-scrap effluvia on an annual basis. I'll let you work out the ultimate resource depletion cycle from that. Recycling is useful, but it's no magic bullet, and there are always losses.
The most heavily recycled metal in the US is lead, per USGS data and prior comments of mine, with recovery rates of about 75%, accounting for 40% of net production.
>That's a loss of 2/3 of production to non-scrap effluvia
Considering that the amount of stuff in our world made from steel at any one time is steadily increasing, this makes sense.
>The most heavily recycled metal in the US is lead, per USGS data and prior comments of mine, with recovery rates of about 75%, accounting for 40% of net production.
There's little to no "post-consumer, pre-recycler" use for lead, whereas every Tom, Dick, and Harry can find a use for some old pipes or beams or whatever.
I once heard a similar point and it has fascinated me ever since: an alien observing human culture would be appalled at how dangerous our lives are.
Everything around us is bathed in warm oxygen, just waiting to catch fire! Our homes, our clothes, our fields, our possessions, …our hair. Ready oxidation brings vitality to Earth but it’s also ridiculously dangerous.
It might be a lot more sedate: imagine crystalline creatures from deep below the surface of an ice-ball that rely on indirect chemical gradients or geothermal energy.
"Your planet is how close to that star!? H20 would be liquid! How do you protect yourselves from the polar solvent leaking down into the rock?"
Conversely, a slower rate of reactivity suggests intelligent life might not yet have arisen in such environments, or might never arise before the opportunity passes.
They need a sufficiently dense energy source, sure. But it may not involve their atmosphere at all. Their plants could store solar energy in self-contained chemical batteries, and the aliens could be using those batteries to power their bodies. Instead of having to constantly breathe, they would instead need a daily battery swap.
I often like to quip that the [21%] corrosive gas is pretty nice today, and I think I'll go consume a big container of industrial solvent to help counteract the radiation from the uncontrolled nuclear [fusion] explosion in the sky.
An oxygen-rich environment is so thermodynamically unstable (it would lead to oxidation and rusting of virtually every other prevalent element) that it would be exceedingly short-lived without the presence of oxygen-liberating biological metabolism. To that extent, a high-oxygen atmosphere is one of the very clear and detectable indicators of probable life which we are capable of detecting even on extra-solar planets (via spectroscopic analysis of reflected or filtered light).
Definitely true, but oxygen is also immensely useful for life evolved to benefit from it, enabling much more complexity. I'm fascinated by the giant insects that got huge back when the oxygen level was much higher.
Related: I highly recommend Robert M. Hazen's Great Courses lectures and book.
I think a large enough org that needs many different certificates should have an internally-trusted CA. That would then allow the org to decide their own policy for all their internal facing certificates.
Then you have to follow the stricter rules only for the public-facing certs.
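To make that concrete, here's a minimal sketch of minting such an internal root CA, using the Python cryptography package (my own illustration; the names, key type, and 10-year lifetime are arbitrary choices, not anything prescribed above):

```python
# Minimal sketch: mint a self-signed internal root CA with the Python
# "cryptography" package. Names, key type, and lifetime are arbitrary.
from datetime import datetime, timedelta, timezone

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Corp Internal Root CA")])

now = datetime.now(timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                                # self-signed: subject == issuer
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + timedelta(days=3650))      # internal policy decides this
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(key, hashes.SHA256())
)

with open("internal-root-ca.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))
with open("internal-root-ca.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),                 # protect this key properly in practice
    ))
```

You'd then push that root into the org's trust stores and issue leaf certs from it under whatever internal lifetime and naming policy the org decides.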
We make extensive use of self-signed certificates internally on our infrastructure, and we used to manually manage year-long certs. A few months ago I built "LessEncrypt", which is a dead simple ACME-inspired system for handing out certs without requiring hijacking the HTTP port or doing DNS updates. Been running it on ~200 hosts for a few months now and it's been fantastic to have the certs manage themselves.
I've toyed with the idea of adding the ability for the server component to request certs from LetsEncrypt via DNS validation, acting as a clearing house so that individual internal hosts don't need a DNS secret to get certs. However, we also put IP addresses and localhost on our internal certs, so we'd have to stop doing that to be able to get them from LetsEncrypt.
Why or in which cases is opening a dedicated port better than publishing challenges under some /.well-known path using the standard HTTP port?
(You say hijacking the HTTP port, but I don't let the ACME client take over 80/443; I make my reverse proxy point the expected path to a folder the ACME client writes to. I'm not asking for a comparison with a setup where the ACME client takes over the reverse proxy and edits its configuration by itself, which I don't like.)
The case for it is where it's not easy to plop a file in a .well-known path on port 80/443. If you have a reverse proxy that makes it easy to publish that, then that's easier. I guess I could have used different wording; I do consider making the .well-known path available a subset of hijacking the port, but I can see why that would be confusing. ACME can still be trickier to set up, but it's definitely a good solution if it fits your environment.
It used to be that only a large enough organization needed this; smaller organizations could slap their PKI wildcard on everything. Between the 47-day lifetime and the removal of client authentication as a permitted key usage of PKI certs, everyone will need a private CA.
Active Directory Certificate Services is a fickle beast but it's about to get a lot more popular again.
The author is suggesting that websites care more about server-side issues than client-side issues, to the point that they don't realize when users stop using them.
I think that statement is way too strong and obviously not true of businesses. It might be true of hobbyist websites, where the creator is personally more interested in the server side, but it's definitely not true of professional websites.
Professional websites that have enough of a budget to care about the server side will absolutely care about the client side and will track usage. If 10% fewer people used the website, the analytics would show that and there would be a fire drill.
Where I can agree with the author is on a more nuanced point. Client-side problems are a lot harder and have a very long tail due to unique client configurations (OS, browser, extensions, physical hardware). With thousands of combinations, you end up with some wild and rare issues. It becomes hard to chase all of them down, and some you just have to ignore.
This can make it feel like websites don't care about the client side, but it just shows that the client side is hard.
> I think that statement is way too strong and obviously not true of businesses
Amazon.com Inc is currently worth 2.4 trillion dollars, and the only reason is that most businesses insist on giving their customers the worst online experience possible. I wish that I could one day understand the logic, which goes like this:
1. Notice that people are on their phones all the time.
2. And notice that when people are looking to buy something they first go on the computer or on the smart phone.
3. Therefore let's make the most godawful experience on our website possible, to make sure that our potential customers hate us and don't make a purchase.
4. Customers make their purchase on Amazon instead.
> Amazon.com Inc is currently worth 2.4 trillion dollars and the only reason is that most businesses insist on giving their customers the worst online experience possible.
This is an incredibly reductive view of how Amazon came to dominate online retail. If you genuinely believe this, I would strongly urge you to research their history and understand how they became the monopoly they are today.
I assure you, it's not primarily because they care more about the end user's experience.
It's just an example, and it holds true even if it's reductive. If businesses put just 5% of the effort into their online experience that they put into their physical stores or social media campaigns, they would see massive returns on that effort.
Respectfully, this argument reads like it is completely ignorant of the e-commerce landscape over the past 30 years and how much Amazon has shaped and innovated in the space. Not to mention that today they have several verticals beyond e-commerce that make up their valuation.
Okay go on and count only half for the sake of argument. That's still a trillion. Any business can do what Amazon does for their products and their customers. But they don't and they won't. Those who do experience great advantages.
Yer trollin', but yeah, I'll reply: normally when things are successful, people follow suit, so doing something is evidence that it is possible to do. Like, take nuclear weapons for example...
"Any business can do what Amazon does for their products and their customers."
What I meant is that any business can do for their products and their customers what Amazon does. Not that any business can do everything Amazon does.
There would be little reason for online marketplaces like Amazon to grow so huge if businesses had cared enough to provide a reasonable online experience, 20 years ago, 10 years ago, or 5 years ago. Now we are in 2025 and most businesses offer a worse online customer experience than what good businesses were offering 20 years ago. You can't be 20 years behind the times and say that it's impossible to compete. It's very possible to make a great customer experience and make money online, even for small businesses with limited means, as evidenced by the many companies doing exactly that.
It's the same with any marketplace like Booking.com or restaurant delivery apps. They wouldn't be half as big if the businesses they serve weren't too lazy and worthless to make a decent online experience for their customers. But here we are.
I think today it is a lot easier for businesses to do what Amazon does - but a lot of that is true because of Amazon. Shopify, Stripe, and logistics & last-mile providers fill some gaps, but they were not as widely known or as easy to integrate with long ago - most didn’t even exist until well into Amazon’s existence.
Can you make your point without resorting to insults?
Businesses don't need to be as good as Amazon or deliver as fast. Amazon is just an example. But businesses need to take their online experience seriously if they don't want to be pushed aside by Amazon and the like. And few businesses seem to do that, even though it's not hard.
I...don't have this experience. It doesn't hold true for me, and I suspect I am not alone. There are certainly some online stores that are not very great, but by and large, I just don't have problems with them. I prefer the seller's website over Amazon.
Amazon, on the other hand, is plagued with fake or bad products from copycat sellers. I have no idea what I am going to get when I place an order. Frankly, I'm surprised when I get the actual thing I ordered.
A couple years back I tried to buy some parts on Digi-Key and literally could not get the checkout to work without completely disabling NoScript (assuming that would've helped). They had like a dozen 3rd party tracking scripts. Eventually I gave up and used Amazon.
It's still the case today, in 2025: when I bought a Focusrite 18i20 mixer from Sweetwater that turned out to be defective, I had to spend a week in a lengthy, long-delayed conversation with their support department, convincing them that the unit was in fact defective and that I was using it correctly, before finally getting the prized RMA to return it. Whereas if I had bought it from Amazon, I would have received the original package more quickly, and when it turned out defective, I could have boxed it up and shipped it off from any local shipper that same day, with no emails or phone calls required, and a replacement arriving the next day. Amazon, even as the leader in "enshittification", still offers a dramatically better experience for a wide range of products (though certainly not all of them).
There are many one man online businesses with very smooth and user friendly customer experience. Does every business have to be as smooth as Amazon? No. But that's not an argument for giving up completely.
> Amazon.com Inc is currently worth 2.4 trillion dollars and the only reason is that most businesses insist on giving their customers the worst online experience possible
Not the GP, but from my own experience: some businesses use out-of-the-box online shop software that is not very good. I wouldn't say "most", but if you're buying certain niche products it becomes true. Slow pages, abysmal usability... one pet peeve is the brand filter with checkboxes in the left column: I want to select three brands, and every time I tick a checkbox the page reloads. Reloading is painfully slow, so it takes me a minute just to get to the search results. If I want to do several searches, it's too much time.
Also, at least in Spain, some delivery companies are awful. I have a package sitting at a convenience store right now. They refuse to give it to me because I don't have the delivery key; the courier never sent it to me. I try to get assistance on their website... and they ask me for the very key I want them to give me. Nice, huh?
I asked the shop for a refund. They have ghosted me in the chat, their return form doesn't work, their email addresses are no-reply, and the contact form doesn't work either. Now I have to wait until Monday to phone them.
I know the shop is legit. They're just woefully incompetent, and either don't know it or think that's the way things work.
For cheap and not-too-expensive products, Amazon just works. No "but I went to your house and there was nobody there" bullshit. No-questions-asked return policy.
I honestly love reading them. The fact that I got tagged means something went awry, and it's fun seeing HOW that happened: who misunderstood what, or who is missing what context.
Then I can make a bunch of people's work lives easier by clarifying the state for everyone.
The main point is that the conditional didn't actually introduce a branch.
Showing the other generated version would only show that it's longer; it is not expected to have a branch either. So I don't think it would have added much value.
But it's possible that the compiler is smart enough to optimize the step() version down to the same code as the conditional version. If true, that still wouldn't justify using step(), but it would mean that the step() version isn't "wasting two multiplications and one or two additions" as the post says.
(I don't know enough about GPU compilers to say whether they implement such an optimization, but if step() abuse is as popular as the post says, then they probably should.)
Okay, but how does this help the reader? If the worse code happens to optimize to the same thing, it's still awful and you get no benefit. It's also likely not to optimize down unless you have fast-math enabled, because the extra float ops have to be preserved to be IEEE 754 compliant.
Fragment and vertex shaders generally don't target strict IEEE754 compliance by default. Transforming a * (b ? 1.0 : 0.0) into b ? a : 0.0 is absolutely something you can expect a shader compiler to do - that only requires assuming a is not NaN.
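For readers following along without a GPU background, here is a rough NumPy analogue (my own illustration, not from the article or this thread; the values are made up) of the two formulations being compared:

```python
# Rough NumPy analogue of the two shader formulations under discussion.
# step() mimics GLSL's step(edge, x): 0.0 where x < edge, 1.0 otherwise.
import numpy as np

def step(edge, x):
    return np.where(x < edge, 0.0, 1.0)

x = np.linspace(0.0, 1.0, 9)
a = np.full_like(x, 3.0)

conditional = np.where(x >= 0.5, a, 0.0)   # the "b ? a : 0.0" form
arithmetic  = a * step(0.5, x)             # the "a * (b ? 1.0 : 0.0)" form

# Identical results; the arithmetic form just spends an extra multiply per
# element. They only diverge if a can be NaN/Inf, which is why the compiler
# transform mentioned above needs the "a is not NaN" assumption.
assert np.allclose(conditional, arithmetic)
```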
> Unless you’re writing an essay on why you’re right…
He's writing an essay on why they are wrong.
"But here's the problem - when seeing code like this, somebody somewhere will invariably propose the following "optimization", which replaces what they believe (erroneously) are "conditional branches" by arithmetical operations."
Hence his branchless codegen samples are sufficient.
Further, regarding the side issue "The second wrong thing with the supposedly optimizer [sic] version is that it actually runs much slower", no amount of codegen is going to show lower /speed/.
You missed the second part, where the article says that it "actually runs much slower than the original version", "wasting two multiplications and one or two additions", based on the idea that the compiler is unable to do a very basic optimization, implying that the compiler will actually multiply by one. No benchmarks, no checking the assembly, just straightforward misinformation.
The ICP-Brazil thing was just a surprise coincidence as I was finishing up the article. I had started collecting my notes on the topic when Entrust got distrusted.
Fun threads to read through:
That's really stupid: they got in trouble for writing down a requirement that wasn't actually a requirement and wasn't meant to be a requirement, in a human-readable document, and then for not revoking and reissuing the certificates that didn't meet it. Those certificates still don't meet it, so revoking and reissuing achieves nothing, and they weren't supposed to meet it anyway, so there's nothing actually wrong with them. Part of the point of having humans in a process is so that they can make sensible decisions when the process prescribes something nonsensical, but Mozilla wants to follow the nonsensical process at all costs.
No, that's not what they did. They had made promises about how they would handle issuing certs not in compliance with their CPS, then went back on them, after a long history of similar things.
"Hey, just so you know, we did read your brown M&Ms clause, but since last month M&M is running a special promotion where they only make rainbow colours, we got you a bowl of yellow ones instead."
Which is the correct course of action:
"Of course. Thank you for paying attention to the spirit of the rule."
Or: "No, fuck you, show is cancelled."
---
All of the stuff I said happened did happen. The extra context you are providing is irrelevant since it does not change the fact that what happened is stupid and it's Mozilla's fault that stupid stuff happened.
In the past, I have used manim to make mathematical animations: https://www.manim.community/ Manim is more flexible, but that comes with some overhead of complexity and learning. Examples of some animations made with manim:
I couldn't disagree more with this post, to be honest. Animations are good when they provide object permanence and let you track what's changing and how.
This post linearly interpolates complex functions blindly, which doesn't tell you anything useful, unless the thing being interpolated is an affine or projective transform, where that makes sense.
E.g. for complex powers, the most natural animation is to animate the exponent, which will show a continuous folding or unfolding. Here the squaring just looks like the extra 360° appearing out of nowhere.
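As a rough sketch of what animating the exponent could look like (my own illustration with NumPy/Matplotlib, not the post's code and not manim): map a set of points z to z^t as t runs from 1 to 2 using the principal branch of the logarithm, so each point's path unfolds continuously instead of the extra winding appearing out of nowhere.

```python
# Sketch: animate z^t by interpolating the exponent t from 1 to 2,
# using the principal branch of the complex logarithm.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

theta = np.linspace(-np.pi, np.pi, 400)
z = np.linspace(0.2, 1.0, 400) * np.exp(1j * theta)   # a spiral of sample points

fig, ax = plt.subplots()
scat = ax.scatter(z.real, z.imag, s=4)
ax.set_xlim(-1.5, 1.5)
ax.set_ylim(-1.5, 1.5)
ax.set_aspect("equal")

def update(t):
    w = np.exp(t * np.log(z))                          # z**t via the principal branch
    scat.set_offsets(np.column_stack([w.real, w.imag]))
    ax.set_title(f"z^t, t = {t:.2f}")
    return (scat,)

anim = FuncAnimation(fig, update, frames=np.linspace(1.0, 2.0, 60), interval=50)
plt.show()
```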
For Möbius-like transforms, interpolating the inverse might be better.
One particularly good example is e.g. visualizing equally spaced points on a circle, and their various combinations as roots and poles of complex functions.
The goal of math animation should be to highlight and travel the natural geodesics of the concept space, with natural starts and stops too. The rest is cargo culting.
> The goal of math animation should be to highlight and travel the natural geodesics of the concept space, with natural starts and stops too.
> The rest is cargo culting.
A geodesic as I understand it is the curve representing the shortest path between two points in some manifold.
So take one thing that I have found math animations useful for: showing the path of travel of some parametric system. Is that a geodesic? Not necessarily in the Cartesian space of the system. I don't know what it would mean for it to be a natural geodesic of the concept space.
For me, the goal of math animation is the same as the goal of any math visualisation: to improve understanding and intuition. When I animate something (which I only ever do for myself), that is why I do it. Am I cargo culting, in your estimation?
Let's take another example: Say I do an animation of some sort of force problem in mechanics. I can show the paths of some particles in the simulation and the magnitude and direction of the various vectors vs time. Is that cargo culting? It's definitely not any kind of geodesic. Does it help my understanding? Quite possibly.
In that sense, in the blog post you are addressing, the position-vs-momentum distribution animation is, in my opinion, really great, because it helps my intuition of how those probability distributions are related and how one changes as the other changes.
Please note that complex powers involve the complex logarithm, which is multivalued; it should be a surface in 3D to really see the whole function. The animation I made only takes one value of the power.