Game theory at work? Someone needs to maintain legacy code for free that hosts thousands of sites, and gets nothing but trouble (pride?) in return. Meanwhile the world offers riches and power in exchange for turning to the dark side (or maybe just letting your domain lapse and doing something else).
If security means every maintainer of every OSS package you use has to be scrupulous, tireless, and never screw up, for life, I'm not sure what to say when this kind of thing happens other than "isn't that the only possible outcome given the system and incentives, on a long enough timeline?"
Kind of like the "why is my favorite company monetizing now and using dark patterns?" Well, on an infinite timeline, did you think the service would remain high quality, free, well supported, and run by tireless, unselfish, unambitious benevolent dictators for the rest of your life? Or was it a foregone conclusion that it was only a matter of "when," not "if"?
It seems that when proprietary resources get infected, it's because hackers are the problem, but when open source resources get infected, it's a problem with open source.
But there isn't any particular reason why a paid/proprietary host couldn't just as easily end up being taken over or sold to a party intending to inject malware. It happens all the time, really.
Yes, the economic problem of absent rewards is exclusive to open source, and private software doesn't have it. It may have others, like an excess of rewards to hackers in the form of crypto ransoms, to the point that the Defense Department had to step in and ban payouts.
As long as business is not going as well as the owners want, the same economic problem exists in private software too. In fact, private companies get acquired all the time, and they get shut down, causing a denial of service for many of their clients.
One difference is that closed-source software is usually much less efficient; I cannot imagine "100K+" customers at a commercial org with just a single developer. And when there are dozens or hundreds of people involved, it's unlikely that new owners would turn to outright criminal activity like malware; they're much more likely to just shut down.
Agreed, but if a company is making millions from the security of its software, the incentive is to keep it secure so customers stick with it. Remember the LastPass debacle: a big leak, and it lost many customers...
Directly security-focused products like LastPass are the only things that face any market pressure whatsoever on this, and that's because they're niche products for which security is the only value-add, marketed to explicitly security-conscious people and not insulated by a whole constellation of lock-in services. The relevant security threats for the overwhelming majority of people and organizations are breaches caused by the practices of organizations that face no such market pressure, including constant breaches of nonconsensually harvested data, which aren't even subject to market pressure from their victims in the first place.
Even for security-related products the incentives are murky. If they're not actually selling you security but a box on the compliance bingo then it's more likely that they actually increase your attack surface because they want to get their fingers into everything so they can show nice charts about all the things they're monitoring.
Aye. My internal mythological idiolect's trickster deity mostly serves to personify the game-theoretic arms race of deception and is in a near-constant state of cackling derisively at the efficient market hypothesis
I 100% agree. I feel a huge part of my responsibility as a software "engineer" is to manage complexity. But I feel I'm fighting a losing battle; almost everyone seems to pull in the opposite direction.
Complexity increases your surface area for bugs to hide in.
I've come to the conclusion it's tragedy-of-the-commons incentives: People get promotions for complex and clever work, so they do it, at the cost of a more-complex-thus-buggy solution.
And yes, it's not just software, it's everywhere. My modern BMW fell apart, in many many ways, at the 7 year mark, for one data point.
But we have exceeded our ability to communicate the ideas and concepts, let alone the instructions of how to build and manage things.
Example: a junior Jiffy Lube high school dropout in 1960 could work hard and eventually own that store. Everything he would ever need to know about ICE engines was simple enough to understand over time… but now? There are 400 oil types, there are closed-source computers on top of computers, there are specialty tools for every vehicle brand, and you can’t do anything at all without knowing 10 different do-work-just-to-do-more-work systems. The high school dropout of 2024 will never own the store. Same kid. He hasn’t gotten dumber. The world just passed him by in complexity.
Likewise… I suspect that Boeing hasn’t forgotten how to build planes, but the complexity has exceeded their ability. No human being on earth could be put in a room and make a 747, even given infinite time. It’s a product of far too many abstract concepts in a million different places that have come together to make a thing.
We make super complex things with zero effort put into communicating how or why they work a way they do.
We increase the complexity just to do it. And I feel we are hitting our limits.
The problem w/ Boeing is not the inability of people to manage complexity but management's refusal to manage complexity in a responsible way.
For instance, MCAS on the 737 is a half-baked implementation of the flight envelope protection facility found on modern fly-by-wire airliners (all of them, except the 737). The A320 had some growing pains with this; in particular, it had at least two accidents where pilots tried to fly the plane into the ground, expecting the attempt to fail because of the flight envelope protection system, but they succeeded and crashed anyway. Barring that bit of perversity right out of the Normal Accidents book, people understand perfectly well how to build a safe fly-by-wire system. Boeing chose not to do that, and they refused to properly document what they did.
Boeing chose to not develop a 737 replacement, so all of us are suffering: in terms of noise, for instance, pilots are going deaf, passengers have their head spinning after a few hours in the plane, and people on the ground have no idea that the 737 is much louder than competitors.
Okay, but your entire comment is riddled with mentions of complex systems (flight envelope system?), which proves the point of the parent comment. "Management" here is a group of humans who need to deal with all the complexity of corporate structures, government regulations, etc., while also dealing with the complexities of the products themselves. We're all fallible beings.
Boeing management is in the business of selling contracts. They are not in the business of making airplanes. That is the problem. They relocated headquarters from Seattle to Chicago and now DC so that they can focus on their priority, contracts. They dumped Boeing's original management style and grafted on the management style of a company that was forced to merge with Boeing. They diversified supply chain as a form of kickbacks to local governments/companies that bought their 'contracts'.
They enshittified every area of the company, all with the priority/goal of selling their core product, 'contracts', and filling their 'book'.
We are plenty capable of designing engineering systems, PLMs to manage EBOMs, MRP/ERP systems to manage MBOMs, etc., to handle the complexities of building aircraft. What we can't help is the human desire to prioritize enshittification if it means a bigger paycheck. Companies no longer exist to create a product; the product is becoming secondary and tertiary in management's priorities, with management expecting someone else to take care of the 'small details' of why the company exists in the first place.
Boeing is a kickbacks company in a really strange way. They win contracts based on including agreements to source partly from the contractee's local area. Adding complexity for contracts' and management bonuses' sake, not efficiency, not redundancy, not expertise. Add onto that a non-existent safety culture and a non-manufacturing, non-aerospace-focused management philosophy, grafted on from a company that failed and had to be merged into Boeing, replacing the previous Boeing management philosophy. Enshittification in every area of the company. Heck, they moved headquarters from Seattle to Chicago, and now from Chicago to DC. Prioritizing being where the grift is over, you know, being where the functions of the company are, so that management has a daily understanding of what the company does. Because to management, what the company does is win contracts, not build aerospace products. 'Someone else' takes care of that detail, according to Boeing management. Building those products is now secondary/tertiary to management.
I did ERP/MRP/EBOM/MBOM/BOM systems for aerospace. We have that stuff down. We have systems for this kind of communication down really well. We can build, within a small window, an airplane with thousands of parts with lead times from 1 day to 3 months to over a year for certain custom config options, with each part's design/FAA approval/manufacturing/installation tracked and audited. Boeing's issue is culture, not humanity's ability to make complex systems.
But I do agree that there is a complexity issue in society in general, and a lot of systems are coasting on the efforts of those that originally put them in place/designed them. A lot of government seems to be this way too. There's also a lot of overhead for overhead's sake, but little process-auditing/iterative-improvement-style management.
Ironically I think you got that almost exactly wrong.
Avoiding "cowboyism" has instead led to the rise of heuristics for avoiding trouble that are more religion than science. The person who is most competent is also likely to be the person who has learned lessons the hard way the most times, not the person who has been very careful to avoid taking risks.
And let me just say that there are VERY few articles so poorly written that I literally can't get past the first paragraph, and an article that cherry-picks disasters to claim generalized incompetence scores my very top marks for statistically incompetent disingenuous bullshit. There will always be a long tail of "bad stuff that happens" and cherry-picking all the most sensational disasters is not a way of proving anything.
I'm predisposed to agree with the diagnosis that incompetence is ruining a lot of things, but the article boils down to "diversity hiring is destroying society" and seems to attribute a lot of the decline to the Civil Rights Act of 1964. Just in case anybody's wondering what they would get from this article.
> By the 1960s, the systematic selection for competence came into direct conflict with the political imperatives of the civil rights movement. During the period from 1961 to 1972, a series of Supreme Court rulings, executive orders, and laws—most critically, the Civil Rights Act of 1964—put meritocracy and the new political imperative of protected-group diversity on a collision course. Administrative law judges have accepted statistically observable disparities in outcomes between groups as prima facie evidence of illegal discrimination. The result has been clear: any time meritocracy and diversity come into direct conflict, diversity must take priority.
TL;DR "the California PG&E wildfires and today's JavaScript vulnerability are all the fault of Woke Politics." Saved you a click.
A more fundamental reason is that society is no longer interested in pushing forward at all cost. It's the arrival at an economical and technological equilibrium where people are comfortable enough, along with the end of the belief in progress as an ideology, or way to salvation somewhere during the 20th century. If you look closely, a certain kind of relaxation has replaced a quest for efficiency everywhere. Is that disappointing? Is that actually bad? Do you think there might be a rude awakening?
Consider: it was this sci-fi-fueled dream of an amazing high-tech, high-competency future that also implied machines doing the labour, an enlightened future relieving people of all kinds of unpleasantries like boring work, thereby preventing them from attaining high competency. The fictional starship captain, navigating the galaxy and studying alien artifacts, was always saving planets full of humans in a desolate mental state...
My own interpretation of the business cycle is that growth causes externalities that stop growth. Sometimes you get time periods like the 1970s where efforts to control externalities themselves caused more problems than they solved, at least some of the time. (E.g., see the trash 1974 model year of automobiles, where they hadn’t figured out how to make emission controls work.)
I’d credit Reagan’s success at managing inflation in the 1980s to a quiet policy of degrowth the Republicans could get away with because everybody thinks they are “pro business”. As hostile as Reagan’s rhetoric was towards environmentalism, note that we got new clean air and clean water acts in the 1980s, but that all got put on pause under Clinton, when irresponsible monetary expansion restarted.
> along with the end of the belief in progress as an ideology, or way to salvation somewhere during the 20th century.
That 20th century belief in technological progress as a "way to salvation" killed itself with smog and rivers so polluted they'd catch on fire, among other things.
Thank you for summarizing (I actually read the whole article before seeing your reply and might have posted similar thoughts). I get the appeal of romanticizing our past as a country, looking back at the post-war era, especially the space race with a nostalgia that makes us imagine it was a world where the most competent were at the helm. But it just wasn't so, and still isn't.
Many don't understand that the Civil Rights Act describes the systematic LACK of a meritocracy. It defines the ways in which merit has been ignored (gender, race, class, etc.) and demands that merit be the criteria for success, and absent the ability of an institution to decide on the merits, it provides a (surely imperfect) framework to force them to do so. The necessity of the CRA, then and now, is the evidence of absence of a system driven on merit.
I want my country to keep striving for a system of merit but we've got nearly as much distance to close on it now as we did then.
>Many don't understand that the Civil Rights Act describes the systematic LACK of a meritocracy. It defines the ways in which merit has been ignored (gender, race, class, etc) and demands that merit be the criteria for success
The word "meritocracy" was invented for a book about how it's a bad idea that can't work, so I'd recommend not trying to have one. "Merit" doesn't work because of Goodhart's law.
I also feel like you'd never hire junior engineers or interns if you were optimizing for it, and then you're either Netflix or you don't have any senior engineers.
FWIW, Michael Young, Baron Young of Dartington, the author of the 1958 book The Rise of the Meritocracy, popularised the term, which rapidly lost the negative connotations he put upon it.
He didn't invent the term though, he lifted it from an earlier essay by another British sociologist Alan Fox who apparently coined it two years earlier in a 1956 essay.
Everything has become organized around measurable things and short-term optimization. "Disparate impact" is just one example of this principle. It's easy to measure demographic representation, and it's easy to tear down the apparent barriers standing in the way of proportionality in one narrow area. Whereas, it's very hard to address every systemic and localized cause leading up to a number of different disparities.
Environmentalism played out a similar way. It's easy to measure a factory's direct pollution. It's easy to require the factory to install scrubbers, or drive it out of business by forcing it to account for externalities. It's hard to address all of the economic, social, and other factors that led to polluting factories in the first place, and that will keep its former employees jobless afterward. Moreover, it's hard to ensure that the restrictions apply globally instead of just within one or some countries' borders, which can undermine the entire purpose of the measures, even though the zoomed-in metrics still look good.
So too do we see with publically traded corporations and other investment-heavy enterprises: everything is about the stock price or other simple valuation, because that makes the investors happy. Running once venerable companies into the ground, turning merges and acquisitions into the core business, spreading systemic risk at alarming levels, and even collapsing the entire economy don't show up on balance sheets or stock reports as such and can't easily get addressed by shareholders.
And yet now and again "data-driven" becomes the organizing principle of yet another sector of society. It's very difficult to attack the idea directly, because it seems to be very "scientific" and "empirical". But anecdote and observation are still empirically useful, and they often tell us early on that optimizing for certain metrics isn't the right thing to do. But once the incentives are aligned that way, even competent people give up and join the bandwagon.
This may sound like I'm against data or even against empiricism, but that's not what I'm trying to say. A lot of high-level decisions are made by cargo-culting empiricism. If I need to choose a material that's corrosion resistant, obviously having a measure of corrosion resistance and finding the material that minimizes it makes sense. But if the part made out of that material undergoes significant shear stress, then I need to consider that as well, which probably won't be optimized by the same material. When you zoom out to the finished product, the intersection of all the concerns involved may even arrive at a point where making the part easily replaceable is more practical than making it as corrosion-resistant as possible. No piece of data by itself can make that judgment call.
Well, on the web side, it'd be a lot less complex if we weren't trying to write applications using a tool designed to create documents. If people compiled Qt to WASM (for instance), or for a little lighter weight, my in-development UI library [1] compiled to WASM, I think they'd find creating applications a lot more straightforward.
Most apps don’t need to be on the web. And the ones that need to be can be done with the document model instead of the app model. We added bundles of complexity to an already complex platform (the browser).
I don't think there's any. Too many luminaries are going to defend the fact that we can have things like "poo emojis" in domain names.
They don't care about the myriad of homograph/homoglyph attacks made possible by such an idiotic decision. But they've got their shiny poo, so at least they're happy idiots.
> Too many luminaries are going to defend the fact that we can have things like "poo emojis" in domain names. They don't care about the myriad of homograph/homoglyph attacks made possible by such an idiotic decision.
There is nothing idiotic about the decision to allow billions of people with non-latin scripts to have domain names in their actual language.
What's idiotic is to consider visual inspection of domain names a necessary security feature.
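To make the homograph point concrete, here's a small sketch (assuming Node.js with a WHATWG `URL` implementation that applies IDNA): a hostname containing the Cyrillic letter "а" (U+0430), visually near-identical to Latin "a", is encoded to an entirely different punycode name, so the two are distinct domains even though they look the same on screen.

```javascript
// The WHATWG URL parser converts non-ASCII hostnames to punycode (IDNA).
// "\u0430" is Cyrillic а, a classic homograph of Latin "a".
const lookalike = new URL('https://\u0430pple.com/'); // Cyrillic а + "pple"
const genuine   = new URL('https://apple.com/');

console.log(lookalike.hostname); // an "xn--" punycode name, NOT "apple.com"
console.log(genuine.hostname);   // "apple.com"
```

This is why browsers apply mixed-script heuristics before displaying Unicode hostnames: the registry sees two unrelated names, while a human inspecting the address bar may not.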
DNS could be hosted on a blockchain, with each person using his own rules for validating names, rejecting, accepting, or renaming any ambiguous or dangerous part of the name, in a totally secure and immutable way.
Blockchain has the potential to be the fastest and cheapest network on the planet, because it is the only "perfect competition" system on the internet.
"Perfect competition" comes from game theory, and "perfect" means that no one is excluded from competing. "Competition" means that the best performing nodes of the network put the less efficient nodes out of business.
For the moment unfortunately, there is no blockchain which is the fastest network on the planet, but that's gonna change. Game theory suggests that there will be a number of steps before that happens, and it takes time. In other words, the game will have to be played for a while, for some objectives to be achieved.
UTF-8 and glyphs are not related to supply chains, and that's a little bit off topic, but I wanted to mention that there is a solution.
In a strange way, this almost makes the behavior of hopping onto every new framework rational. The older and less relevant the framework, the more the owner's starry-eyed enthusiasm wears off. The hope that bigcorp will pay $X million for the work starts to fade. The tedium of bug fixes and maintenance wears on; the game theory takes its toll. The only rational choice for library users is to jump ship once the number of commits and the hype start to fall -- that's when the owner is most vulnerable to the vicissitudes of Moloch.
> in a strange way, this almost makes the behavior of hopping onto every new framework rational.
Or maybe not doing that and just using native browser APIs? Many of these frameworks are overkill and having so many "new" ones just makes the situation worse.
Many of them predate those native browser APIs. Polyfills, the topic at hand, were literally created to add modern APIs to all browsers equally (most notably old Safaris, Internet Explorers, etc.).
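For anyone unfamiliar with the pattern, a polyfill is just a feature-detect-then-define guard. A minimal sketch (deliberately simplified: the real `Array.prototype.includes` also handles `NaN` and a `fromIndex` argument):

```javascript
// Polyfill shape: only define the API if the runtime lacks it,
// so modern browsers keep their native implementation.
if (!Array.prototype.includes) {
  Array.prototype.includes = function (value) {
    // Simplified: indexOf-based lookup (does not match NaN like the real API).
    return this.indexOf(value) !== -1;
  };
}

const found = [1, 2, 3].includes(2); // works on old and new runtimes alike
```

The guard is exactly why a compromised polyfill host is so dangerous: the script runs before anything else and can redefine whatever it likes on every page that loads it.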
Good point. What's often (and sometimes fairly) derided as "chasing the new shiny" has a lot of other benefits too: increased exposure to new (and at least sometimes demonstrably better) ways of doing things; ~inevitable refactoring along the way (otherwise much more likely neglected); use of generally faster, leaner, less dependency-bloated packages; and an increased real-world userbase for innovators. FWIW, my perspective is based on building and maintaining web-related software since 1998.
to be fair there is a whole spectrum between "chasing every new shiny that gets a blog post" vs. "I haven't changed my stack since 1998."
There are certainly ways to get burned by adopting shiny new paradigms too quickly; one big example on the web is the masonry layout that Pinterest made popular, which in practice is so complicated that no browser has a full implementation of the CSS standard.
To be fair, when it comes to React, I don't think there is a realistic "new shiny" yet. NextJS is (was?) looking good, although I have heard it being mentioned a lot less lately.
Perhaps. I view it as the squalor of an entirely unsophisticated market. Large organizations build and deploy sites on technologies with ramifications they hardly understand or care about because there is no financial benefit for them to do so, because the end user lacks the same sophistication, and is in no position to change the economic outcomes.
So an entire industry of bad middleware created from glued together mostly open source code and abandoned is allowed to even credibly exist in the first place. That these people are hijacking your browser sessions rather than selling your data is a small distinction against the scope of the larger problem.
Tea is not the “replacement for homebrew” apart from the fact that the guy that started homebrew also started tea. There’s a bunch of good reasons not to use tea, not least the fact that it’s heavily associated with cryptocurrency bullshit.
Alternatively, if you rely on some code then download a specific version and check it before using it. Report any problems found. This makes usage robust and supports open source support and development.
I'm afraid this is hitting the other end of inviolable game theory laws. A dev who is paid for features and business value wants to read, line by line, a random package upgrade from version 0.3.12 to 0.3.13 in a cryptography or date lib they likely don't understand? And this should be done for every change of every library, for all software, by all devs, who will always be responsible, not lazy, and very attentive and careful?
On the flip side there is "doing as little as possible and getting paid" for the remainder of a 40 year career where you are likely to be shuffled off when the company has a bad quarter anyway.
In my opinion, if that was incentivized by our system, we'd already be seeing more of it, we have the system we have due to the incentives we have.
Correct. I don't think I have ever seen sound engineering decisions rewarded at any business I have worked for. The only reason any sound decisions are made is that some programmers take the initiative, but said initiative rarely comes with a payoff and always means fighting with other programmers who have a fetish for complexity.
If only programmers had to take an ethics oath so they have an excuse not to just go along with idiotic practices.
Then there are the programmers who read on proggit that “OO drools, functional programming rules” or the C++ programmers who think having a 40 minute build proves how smart and tough they are, etc.
> Report any problems found. This makes usage robust and supports open source support and development.
Project maintainers/developers are not free labor. If you need a proper solution to any problem, make a contract and pay them. This idea that someone will magically solve your problem for free needs to die.
Vendoring should be the norm, not the special case.
Something like this ought to be an essential part of all package managers, and I'm thinking here that the first ones should be the thousands of devs cluelessly using NPM around the world:
We've seen a lot more attacks succeed because somebody vendored an old vulnerable library than supply chain attacks. Doing vendoring badly is worse than relying on upstream. Vendoring is part of the solution, but it isn't the solution by itself.
Not alone, no. That's how CI bots help a lot, such as Dependabot.
Although it's also worrying how we seemingly need more technologies on top of technologies just to keep a project alive. It used to be just including the system's patched headers & libs; now we need extra bots surveying everything...
Maybe a linux-distro-style of community dependency management would make sense. Keep a small group of maintainers busy with security patches for basically everything, and as a downstream developer just install the versions they produce.
In the old ways, you mostly relied on a few libraries that each solved a complete problem and were backed by a proper community. The odd dependency was usually small and vendored properly. Security was mostly an environment concern (the OS), as the data was either client-side or on properly managed enterprise infrastructure. Now we have npm with its microscopic and numerous packages, everyone wants to be on the web, and they all want your data.
That isn't the plan. For this to work new versions have to be aggressively adopted. This is about accepting that using an open source project means adopting that code. If you had an internal library with bug fixes available then the right thing is to review those fixes and merge them into the development stream. It is the same with open source code you are using. If you care to continue using it then you need to get the latest and review code changes. This is not using old code, this is taking the steps needed to continue using code.
> did you think service would remain high quality, free, well supported, and run by tireless, unselfish, unambitious benevolent dictators for the rest of your life
I would run some of the things I run free forever if, once in a while, one user were grateful. In reality that doesn’t happen, so I usually end up monetising and then selling them off. People whine about everything and get upset if I don’t answer tickets within a working day, etc. Mind you, these are free things with no ads. The thing is, they expect me to fuck them over in the end as everyone does, so it becomes a self-fulfilling prophecy. Just a single email or chat saying thank you once in a while would go a long way, but alas; it’s just whining and bug reports and criticism.