I realized early on in my enterprise software job that if I produce code faster than average for my team, it will just get stuck in the rest of our review and QA processes; it doesn’t get released any faster.
It feels like LLM code gen can exacerbate and generalize this effect (especially when people send mediocre LLM-generated code for review, which then makes the reviews painful).
Theory of Constraints right there. Producing faster than the slowest resource in the chain is detrimental to the entire process. You waste resources and create difficulties upstream.
So much wasted time debating whether the 1000 lines of generated code are actually necessary when the actual transform in question is 3 of them. “But it works”, goes the refrain.
It doesn’t even need to be the case that the LLM produces worse code. Software development is like a gas that expands to fill its container. If the schedule allows a fixed amount of time before shipping, and the time to write the code shrinks to zero, it just means the other parts of the process will fill the remaining time, even if the LLM did a decent job in the first place.
I thought this was a really good piece of writing. It’s rare to do something like this because the job discourages it by putting PR filters on everything you say.
My uncle was a pretty big pop star in the 1960s. His group at one point had a big fanzine, they were household names across the country, over time they had stalkers and weird fans and all that, made movies and albums, had big parties and knew other famous people, pretty much all those things that the OP writes about (circa 50 years later, some of it has changed but not that much).
He could be charismatic and surprisingly eloquent and I could picture him writing a piece like this, if the mood had struck.
He also lost pretty much all the money through mismanagement (several times over), eventually moved out of LA, had a tumultuous family life with numerous spouses and wasn’t around much for his kids, and after his 40s was trapped in a sad cycle of reunion tours because the band still needed the money. The tours still had some level of excitement and crowd enthusiasm, even pretty late in life and I guess he always loved the stage, the performing, all that. But in the end, I kinda felt it seemed like a lonely existence. Hard to form really deep connections when you’re always traveling and often away in your head.
> after his 40s was trapped in a sad cycle of reunion tours because the band still needed the money.
Celebrity memoirs are often written for the same reasons, or to promote other ventures. For instance Peter Wolf seemingly reluctantly shared vignettes about Dylan, The Stones, Faye Dunaway, and rock 'n' roll life in the 1970s to promote his newer stuff:
"I was putting out solo CDs. Not to sound self-congratulatory, but I thought each one got better and better— but they weren’t finding an audience. I thought a book might encourage people to check out the other stuff. So basically, the intent of the book was to find a wider audience."
It was interesting and a fun read, but not a “good piece of writing” in my opinion. Apart from some spelling mistakes, the sentences droned on and it read more like a semi-coherent rant than a thoughtful piece on “being a pop star”.
I thought it was excellent for something that appears mostly off the cuff. This is what lots of good writing looks like before the editors get to it, btw
It is thoughtful, that's not the problem. It's just not written in the standard language "written English", but instead in "spoken English" with some attempts towards the former ("My final thought on ...") that sound like someone trying formal writing for the first time.
The sentences do drone on, but they're fully coherent; this is above-average writing. It wouldn't likely meet publishing standards, but it's a lot better than you'd expect a randomly-chosen person to produce.
I'm sure an editor would go through and suggest tightening up some points, but I agree it's good enough as a first draft.
The problem is there are two types of writers who don't get the help of an editor: those who are too big and famous to accept one, and those too poor to afford one.
I sort of feel the people who are saying it's bad aren't able to separate their own preferences from their judgment of quality.
I certainly believe that if you want to be a successful musician, not even a pop star necessarily, just one that's able to draw crowds large enough to sustain you financially, you are probably bound by certain norms and expectations. Not necessarily because audiences hate women (or men, for that matter) who break the mold, but they're not as easy to digest. It adds friction. And when there are thousands of other artists out there to listen to, that friction can be the difference between success and failure.
I agree with you though, if you're willing to live a small life where you only need the love and respect of a small handful of people, you can do almost anything and very few people will genuinely hate you.
> I wonder if one or both of us have biased vision
The more common term you're searching for is "privilege", and yes, you both have it.
Do you hang out a lot in professional entertainment circles? I'm not saying she's certainly correct, but if I were to wonder what problems a mid-20s female pop star faces, I'd buy her anecdata over a 50-ish man who posts on HN.
I admit I started reading with some skepticism. It didn't read like PR, so I assumed I was reading fanfic. By the midpoint, she managed to convince me otherwise.
I think the author is walking a tightrope between convincing the reader that she wrote this herself and that there's more depth to her than what we see on stage or in pop media. Writing this blog is definitely a tougher assignment than doing podcast interviews or behind the scenes videos.
You are right, of course, a good editor could make this better, but I think she's deliberately avoiding that here. A pop star is unwise to fire a good producer without a better replacement, but sometimes they have to bring out the piano and do an acoustic performance live.
As interesting as I find it, I cannot agree more. It's very childish writing - it feels a lot like it was written by a teenager. It sort of reminds me of my 8-year-old niece telling me a story she finds so exciting she barely comes up for air.
That's the fate of many acts from that period. So so so many artists who were stratospherically popular but are still touring for cash playing to nobody younger than them. It's sad.
"better to have loved and lost than never to have loved at all"
Is it sadder than any other individual who has to work into retirement age? Or is the fall itself what you find sad? I can imagine some artists might be happier in this latter stage of their lives where they can focus on their real fans and better fostering other personal relationships in their lives.
I misinterpreted the title and was hoping that this was going to be a post about realtime algorithmic music generation from the Postgres WAL, something like the Hatnote “listen to Wikipedia edits” project.
Agreeing with most of the other comments here that this discussion needs more context which we don't have...
If the request for additional access controls/access cleanup came from one of the Ruby Central funders, could we not know who that was and what exactly their ask consisted of? I am interested in knowing their side of the story, and what the motivation was. (But in general, cutting off long-time maintainers' access seems like a bad choice - as presumably they have long since proven their good will toward the ruby community as shepherds of these projects.)
I have a toy web application that accepts a very, very low rate of writes. It's almost all reads.
It is implemented like this:
- The front end uses react to draw a UI.
- It fetches data from a JSON file stored on S3.
- When we get a write request, a lambda function reads the JSON file, parses it, adds the new data, and writes it back to S3.
The main selling point for me is that almost all of it is static assets, which are cheap and simple to host, and the tiny bit of "back end" logic is just one nodejs function.
The previous implementation used SQLite and a golang back end, but I eventually got tired of having to maintain a virtual machine to host it.
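For anyone curious, here is a minimal sketch of what that single write-path function might look like, assuming Node.js with the AWS SDK v3 for JavaScript; the bucket name, key, and handler shape are hypothetical, and error handling is omitted:

    // Sketch of the write-path Lambda: read the JSON file from S3, append the
    // new record, write it back. Assumes AWS SDK v3; names are placeholders.
    import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";

    const s3 = new S3Client({});
    const Bucket = "my-app-data";   // hypothetical bucket name
    const Key = "data.json";        // the single JSON file holding all records

    export const handler = async (event) => {
      const newItem = JSON.parse(event.body);   // payload of the write request
      // Read and parse the current contents of the JSON file.
      const current = await s3.send(new GetObjectCommand({ Bucket, Key }));
      const items = JSON.parse(await current.Body.transformToString());
      items.push(newItem);                      // add the new data
      // Write the updated array back to S3.
      await s3.send(new PutObjectCommand({
        Bucket,
        Key,
        Body: JSON.stringify(items),
        ContentType: "application/json",
      }));
      return { statusCode: 200, body: JSON.stringify({ ok: true }) };
    };

Note that the read-modify-write cycle is racy if two write requests ever overlap, which is what the concurrency-limit suggestion in the reply below addresses.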
I once planned to do this with SQLite, loading the database into memory at app start and flushing data to S3 during runtime, but that creates more corner cases and logic to handle.
You can set concurrency limits per function on AWS, so you can apply a hard limit on your function to only have a single invocation running at the same time. That should give you a guarantee that data isn't lost without the producer noticing.
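A sketch of that setting via the AWS SDK v3 for JavaScript (the function name is a placeholder; the same thing can be done from the console or CLI):

    // Cap the function at a single concurrent invocation so the
    // read-modify-write of the JSON file can never overlap itself.
    import { LambdaClient, PutFunctionConcurrencyCommand } from "@aws-sdk/client-lambda";

    const lambda = new LambdaClient({});
    await lambda.send(new PutFunctionConcurrencyCommand({
      FunctionName: "json-write-handler",   // hypothetical function name
      ReservedConcurrentExecutions: 1,      // hard limit: one invocation at a time
    }));

With a reserved concurrency of 1, a synchronous invocation that arrives while another is running gets throttled with an error the caller can see, rather than racing the in-flight write.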
“If the devotion scholars feel toward their work is intense and sometimes irrational, it’s because this is one of the last spaces of unalienated labor”
Speaking as a former academic, I don’t really agree with this — I think academia can make you believe wrongly that it’s a kind of “unalienated labor,” but actually the alienation runs deep, all the deeper when it’s invisible at first glance.
Yes, you don’t have to make something that is sold to customers or that fits in a JIRA ticket. But when you stop and think about it, you’re going to be doing research based on topics and paradigms that other people have largely defined (advisors, peers); you have to publish in journals that are often for profit and pay you zero; when you teach you usually don’t get paid all the tuition that your students are paying per course (the institution takes a big cut); you end up doing a lot of silly things to have a solid institutional position… TLDR, it has great moments of course, but it isn’t unalienated.
Besides, in many fields, lots of interesting research is now being done at industrial labs. So there's no reason to cope with the abuse the article describes so well, e.g. "[...] a celebrity scholar who enjoys seeing his advisees suffer, plays them against each other, and likes to remind them that without him, their careers are nothing".
As an Oxbridge academic, I can confidently state that lots of things done by e.g. Isomorphic Labs, GSK AI, or Altos Labs are better than the stuff we do in the exact same subfield. Furthermore, they pay better, there is less drama, the workplace is much more professional and, above everything, they don't suffer from the power imbalance that has made academia so toxic.
"In the exact same subfield" is the key point. The academia is small. If a topic has enough direct monetary value to justify substantial spending in it, the industry will usually do better work. Academic research works better in topics that don't have such monetary value, at least not yet.
Academia lacks consistency, but I wouldn't characterize it as toxic. Many individual labs and departments are toxic, but academia as a whole isn't. The same freedom that lets individual PIs pursue their own directions in their own ways also lets many of them create toxic work environments. But curtailing the toxicity is difficult without sacrificing the freedom academia depends on.
I don't agree curtailing toxicity would sacrifice freedom. The toxicity I was referring to translates into power abuse, bullying, data fabrication, and all the different kinds of misconduct that emerge in systems where there is no control, no filtering, and no skin in the game. Actually, I think freedom and creativity would flourish if academic misconduct was pursued more actively. I have worked at a few top departments, and academic misconduct led to extremely low efficiency and resource waste. Everyone was either fighting or demotivated. Huge multi-million projects didn't get anywhere. Some minimal guardrails are needed.
If you are a PhD student or a postdoc, you are probably working in a PI's lab and often funded from their grants. It's also common that nobody else at the university understands the project well enough to replace the PI as your supervisor. That creates incentives to avoid reporting abuse and to tolerate unhealthy levels of toxicity, as the likely alternatives are switching to a new lab (and delaying your career) or leaving academia.
True, there are incentives to avoid reporting abuse. I think in cases where abuse is reported, universities tend to support abusers because they are the ones who bring in grant money. Furthermore, internal control systems are not independent, and they tend to be linked to senior faculty members, who are the ones usually breaching the academic code of conduct.
In many cases journals have retracted articles after evident image manipulation was discovered. University committees rarely take disciplinary action against fraudsters. In some prominent cases they have even issued statements of support. This is starting to change, albeit slowly. For example, Sweden now has a national integrity board that investigates those types of breaches; it is much more likely to be neutral, as it is not closely linked to the subjects under investigation.
You still have more freedom than almost any other job in the United States at that compensation level. Yes, you have to get a grant approved, but you can literally pivot to all sorts of topics within your domain if you just make a reasonable enough proposal. You can craft a class to your own liking and teach whatever you come up with. And if you have tenure you are basically set for life and don’t have to worry about the macroeconomy. You can die in office still engaging in interesting intellectual pursuits. Ageism actually works in your favor in academia, where wisdom is celebrated, unlike the private sector, where you look like a cost center at your paygrade, liable to retire and screw up your team at any moment.
Yes there are responsibilities but you’d be hard pressed to find a tenured professor who feels like they are really very onerous, especially considering how much they had to work their tail off in grad school, postdoc, and tenure track years with little to no ability to delegate any of that. Even as department chair, you will probably get assigned an admin assistant to manage that and you will pass that torch to a colleague before long.
It's important to understand that only a small percentage of academics will ever get tenure. The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry. That job will pay less than the equivalent job for someone who never took the PhD track.
Of course, the percentage of tenured winners varies a lot by fields. It's very low in the humanities, somewhat better in CS and math, etc.
Once you get tenure, if you ever do, you will indeed have a lot of freedom, but you will also have a lot of work to do. Sure you can pass grading and other jobs off to grad students and postdocs (which you were for the last decade...) but in many fields, the need to fundraise never ends. It's sort of like funding a new startup every year with a different set of grad students.
Most people don't want to sit alone in a closet and think deep thoughts (well, ok, mathematicians do...). But if you want to do something in the real world, you'll need funding, and that means writing a LOT of grant proposals.
> It's important to understand that only a small percentage of academics will ever get tenure. The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry.
There's also a good chunk of people who fail to advance past the assistant professor level, which is pre-tenure at US institutions (not sure about other countries). And it's up or out, so if you're an assistant professor and you don't get tenure within a certain number of years, you lose your job.
> The rest will keep toiling away on increasingly poorly paid and desperate postdocs until they finally age out and decide to take a job in industry.
…and that’s for the fortunate disciplines, like CS, where there is actually an “industry” to go to. Let’s just say things look rather less pleasant in the humanities.
My grandfather was tenured, published prolifically up until he retired at 73, and was sorely disappointed when I chose not to follow in his footsteps and go into academia. Why? Primarily because I had to hear him gripe about how poorly the school administration treated him and his colleagues, culminating in him having to sue the university several times for the same reasons (and winning every time in arbitration - he basically tripled his retirement savings).
I have a lot of respect for academics, but the culture around the administration of higher learning is putrid.
Unalienated labor doesn't exist. It is one of many fairy tales leftists like to tell themselves, like the idea of an objective, fair, universal price. The tale promises that work wouldn't be laborious and draining if it weren't for those damn capitalists; they would all labor happily in some kind of utopia. It would be comical, except people actually believe it. Notice their ears remain firmly plugged against testimonies of life as a worker under communism, which would tell them their framework is fatally flawed.
Of course, operating under such ideological blinkers, it is no wonder so many leftist grad students toil for the promised land. Others merely do the same for the supposed good hours and prestige, with no such delusions.
Remind me to stick to my hyperlocal fast food restaurant that only has one location and probably doesn't record every conversation you have with them or use any of the other gross surveillance technology that was recorded here.
The story is really about two things. Their information security is pathetic, but their actual surveillance tech is genuinely concerning politically. Even if it is technically legal, it's unethical to record conversations without consent.
>hyperlocal fast food restaurant that only has one location and probably doesn't record every conversation you have
Good news! With AI programming assistance, this invasive technology--with the concomitant terrible security--will be available to even the smallest business so long as nephews "who are good with computers and stuff" exist!
Regardless of the cost and capacity analysis, it's just hard to fight the industry trends. The benefits of "just don't think about hardware" are real. I think there is a school of thought that capex should be avoided at all costs (and server hardware is expensive up front). And above all, if an AWS region goes down, it doesn't seem like your org's fault, but if your bespoke private hosting arrangement goes down, then that kinda does seem like your org's fault.
> Can you explain on this claim, beyond what the article mentioned?
I run a lambda behind a load balancer; hardware dies, it's redundant, it gets replaced. If a database server fails, it re-provisions without saturating read IO on the SAN and causing noisy-neighbor issues.
I don't deal with any of it, I don't deal with depreciation, I don't deal with data center maintenance.
> I don't deal with depreciation, I don't deal with data center maintenance.
You don't deal with that either if you rent a dedicated server from a hosting provider. They handle the datacenter and maintenance for you for a flat monthly fee.
They do rely on you to tell them if hardware fails, however, and they'll still unplug your server and physically fix it. And there's a risk they'll replace the wrong drive in your RAID pair and you'll lose all your data - this happens sometimes - it's not a theoretical risk.
But the cloud premium needs reiteration: twenty-five times. For the price of the cloud server, you can have twenty-five-way redundancy.
> And there's a risk they'll replace the wrong drive in your RAID pair and you'll lose all your data - this happens sometimes - it's not a theoretical risk.
A medium to large size asteroid can cause mass extinction events - this happens sometimes - it's not a theoretical risk.
The risk of the people responsible for managing the platform messing up and losing some of your data is still a risk in the cloud. This thread has even already had the argument "if the cloud provider goes down, it's not your fault" as a cloud benefit. Either cloud is strong and stable and can't break, or cloud breaks often enough that people will just excuse you for it.
Many people have already had their data destroyed by remote hands replacing the wrong side of a RAID. Nobody has yet had their server destroyed by an extinction-level meteor.
There's a reason semiconductor manufacturing is so highly automated, and it's not labor cost.
Humans err. Computers only err when told to. But they'll repeat a task reliably, without random mistakes, if told what to do by a competent (manufacturing process) engineering organization. Yes, it takes more than one engineer.
> I think there is a school of thought that capex should be avoided at all costs
Yep, and it's mostly caused by the VC funding model - if your investors are demanding hockey-stick growth, there is no way in hell a startup can justify (or pay for) the resulting Capex.
Whereas a nice, stable business with near-linear growth can afford to price in regular small Capex investments.
> I think there is a school of thought that capex should be avoided at all costs (and server hardware is expensive up front).
Yes, there is.
Honestly, it looks to me that this school of thought is mostly adopted by people that can't do arithmetic or use a calculator. But it does absolutely exist.
That said, no, servers are not nearly expensive enough nowadays to move the needle for a company. The room around them often is, though, and that's why far more people rent the room than the servers in it.
I ran the IT side of a media company once, and it all worked on a half-empty rack of hardware in a small closet... except for the servers that needed bandwidth. These were colocated. Until we realized that the hoster did not have enough bandwidth, at which point we migrated to two bare metal servers at Hetzner.
In practice, all that except connectivity is relatively easy to have on-site.
Connectivity is highly dependent on the business location, local providers, their business plans and their willingness to go out of their way to serve the clients.
And I am not talking only about bandwidth, but also reserve lines and latency.
I think you hit the nail on the head. What enterprises are paying for is abstraction of responsibility. Suits would never criticise going with Microsoft or Amazon.
> if an AWS region goes down, it doesn't seem like your org's fault, but if your bespoke private hosting arrangement goes down, then that kinda does seem like your org's fault.
Never underestimate the price people are willing to pay to evade responsibility. I estimate this is a multi-billion dollar market.
To be clear - this isn't an endorsement on my part, just observations of why cloud-only deployment seems common. I guess we shouldn't neglect the pressure towards resume-oriented development either, as it undoubtedly plays a part in infra folks' careers. It probably makes you sound obsolete to be someone who works in a physical data center.
I for one really miss being able to go see the servers that my code runs on. I thought data centers were really interesting places. But I don't see a lot of effort to decide things based on pure dollar cost analysis at this point. There's a lot of other industry forces besides the microeconomics that predetermine people's hosting choices.