> All these questions are true for agriculture, yet you say "yes thank you, and please continue" for that industry I am sure, which seeks to improve product through random walk and unknown mechanisms.
Tell me you know nothing about modern agriculture without telling me that
> Genetic algorithms created our species, which are far more complex than anything we have written in computer science. I think they have stood up to the tests of creating a viable product for a given behavior.
Yes, and our species is a fragile, barely functioning machine with an insane number of failure points and hilariously bad, inefficiently placed components.
Similar? Very different. The HKmap.live app was built and marketed directly for the protests. It tracked social media and geolocated where the police and protesters were, etc. This is a big distinction.
Offtopic (I don't know if they are LLM or not, and I don't want to respond to it because many people already will), but someone should study the significance of em-dashes, because to me (and maybe you or others) it's one of the most significant indicators (sometimes a false one) as well.
If anyone here ever used em-dashes before, what are you using now? (I don't use em-dashes but I am curious!) And did you guys ever switch to purposeful linguistic errors to not look AI?
I am thinking of going back to a , comma like this with spaces on both sides intentionally, because I used to make this mistake in the beginning and I had people genuinely fume over this grammar nitpick more times than I can count. But after I stopped and got better at writing, I got called AI too (mfw I am a human helloooo).
Just writing what's on my mind recently; I find it funny to be changing back to grammar mistakes because of AI.
(I made the errors in this post as well! I try to ship ideas fast lol & brain.exe not working after being tired right now :] )
> If anyone here ever used em-dashes before, what are you using now? (I don't use em-dashes but I am curious!) And did you guys ever switch to purposeful linguistic errors to not look AI?
I keep using them because I've been using them on Mac keyboards since forever (and on iOS keyboards). I don't use them while on Linux because I couldn't be bothered to learn how to type them yet :)
I will not give up my ways just because AI is taking over
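For anyone in the same boat on Linux: under X11/XKB the Compose key makes em-dashes painless. A minimal sketch, assuming an X11 session and that you're willing to give up Right Alt (the `compose:ralt` option name is standard XKB; Wayland compositors vary in how they expose it):

```shell
# Map Right Alt as the Compose key for the current X session
setxkbmap -option compose:ralt

# With Compose active, the default sequences give you:
#   Compose, -, -, -   types an em-dash
#   Compose, -, -, .   types an en-dash

# To make it persistent, add XKBOPTIONS="compose:ralt" to
# /etc/default/keyboard (Debian/Ubuntu) or set the Compose key
# in your desktop environment's keyboard settings.
```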
Your problem is thinking that hype artists, professionals and skeptics are all the same voice with the same opinion. Because of that, you can't recognize when sentiment is changing among the more skeptical.
Functional illiteracy and the inability to hold any context longer than two sentences have long been a plague on HN. Now that we've outsourced our entire thinking process to "@grok is this true", it has claimed almost the entirety of the human race.
soulofmischief: complains that AI skeptics would say the Wright brothers were idiots because they didn't immediately implement a supersonic jet
ares623: we were promised supersonic jets today or very soon (translation: AI hype and scam artists have already promised a lot now)
eru: The passive voice is doing a lot of work in your sentence. (Translation: he questions the validity of ares623's statement)
me: Here are just three examples of hype and scam promising the equivalent of super jet today, with some companies already being burned by these promises.
Apply your own "functional literacy". I made a clarification that those outside of an industry have to separate the opinions of professionals and hype artists.
The irony of your comment would be salient if it didn't feel like I was speaking with a child. This conversation is over; there's no reason to continue speaking with you as long as you maintain this obnoxious attitude coupled with bad reading comprehension.
Here's Ryan Dahl, cofounder of Deno, creator of Node.js tweeting today:
--- start quote ---
This has been said a thousand times before, but allow me to add my own voice: the era of humans writing code is over. Disturbing for those of us who identify as SWEs, but no less true. That's not to say SWEs don't have work to do, but writing syntax directly is not it.
--- end quote ---
They have everything to gain by saying those things. It doesn’t even need to be true. All the benefits arrive at the point of tweeting.
If it turns out to be not true then they don’t lose anything.
So we are in a state where people can just say things all the time. Worse, they _have_ to say them. To them, not saying anything is just as bad as being directly against the hype. Zero accountability.
Last one is irrelevant. Of course some companies are miscalculating.
OpenAI never claimed they had achieved AGI internally. Sam was very obviously joking, and despite the joke being so obvious he even clarified hours later.
>In a post to the Reddit forum r/singularity, Mr Altman wrote “AGI has been achieved internally”, referring to artificial general intelligence – AI systems that match or exceed human intelligence.
>Mr Altman then edited his original post to add: “Obviously this is just memeing, y’all have no chill, when AGI is achieved it will not be announced with a Reddit comment.”
Dario has not said "we are months away from software jobs being obsolete". He said:
>"I think we will be there in three to six months, where AI is writing 90% of the code. And then, in 12 months, we may be in a world where AI is writing essentially all of the code"
He's maybe off by some months, but not at all a bad prediction.
Arguing with AI skeptics reminds me of debating other very zealous ideologues. It's such a strange thing to me.
Like, just use the stuff. It's right there. It's mostly the people using the stuff vs. the people who refuse to use it because they feel it'll make them ideologically impure, or they used it once two years ago when it was way worse and haven't touched it since.
> There are likely to be no devices running iOS 16
My iPhone X is stuck on iOS 16 with no way to upgrade.
However, the phone is still working well. Despite being in daily use for 8 years it still has 81% battery capacity, has never been dropped, has a great OLED screen, can record 4K@60 video. It is far more responsive than a brand new 2025 $200 Android phone from e.g. Xiaomi. It still gets security patches from Apple. The only real shortcoming compared to a modern iPhone is the low light camera performance. That and some app developers don't support iOS 16 anymore, so e.g. I can't use the ChatGPT app and have to use it via the browser, but the Gemini app works fine.
I visited a distillery in 2020. Their machines were managed by HP laptops running Windows XP. Those machines and those laptops and that Windows XP are probably still there with their old IE browser.
They will probably be there for as long as the capacitors last, but the critical thing is that they are almost certainly running some Win32 industrial process software with no need for web browsers or for that matter even Internet connectivity. In fact I hope they’re not on wifi given the state of legacy WinXP security!
Here's a logical step you skipped: a blind mathematician can do revolutionary work in mathematics. He is highly unlikely to do revolutionary work in agriculture.
Interesting example, as there was an article on the HN front page 10 days ago about exactly that: a blind person doing revolutionary work in agriculture. [0][1]
You can only call Shadcn a "baseline" if it's the baseline of the top floor of the Babel tower of abstractions.