Weird, as I see it as almost the flip side of the coin from the author.
I was just today discussing my playing with a resolution to - by year end - go back to a flip phone.
But the only way I see this as feasible in a modern world, to disconnect from a connected world, is to leverage AI to do so. Have it read and triage my inbox, and if it's important and urgent, call me. Otherwise just summarize it in an EOD back and forth.
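Roughly this shape, as a minimal sketch; llm_triage and place_call are hypothetical placeholders for whatever model and telephony service you'd actually wire up, and the mail fetching is just Python's stdlib imaplib:

```python
import email
import imaplib

def llm_triage(subject: str, body: str) -> str:
    # Placeholder heuristic; the real version would ask an LLM to classify the
    # message as "urgent" or "later" (model and prompt left open on purpose).
    return "urgent" if "urgent" in subject.lower() else "later"

def place_call(summary: str) -> None:
    # Placeholder; the real version would ring my flip phone via a telephony API.
    print("CALL ME:", summary)

def triage_inbox(host: str, user: str, password: str) -> list[str]:
    digest = []
    mail = imaplib.IMAP4_SSL(host)
    mail.login(user, password)
    mail.select("INBOX")
    _, data = mail.search(None, "UNSEEN")
    for num in data[0].split():
        _, msg_data = mail.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        subject = msg.get("Subject", "")
        body = str(msg.get_payload())  # naive: multipart mail would need msg.walk()
        if llm_triage(subject, body) == "urgent":
            place_call(f"Urgent mail from {msg.get('From')}: {subject}")
        else:
            digest.append(subject)     # saved for the end-of-day summary
    mail.logout()
    return digest
```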
Rather than seeing AI as enslaving me to overstimulation for my monkey brain I see it as a potential liberator at long last.
Similarly, while yes, Moore's Law means we're hitting a flattening curve for most traditional computing workloads, neural network workloads specifically still have massive potential for hardware gains over the next decade. And the proofs of concept we're already seeing for moving traditional computing tasks to AI have me wondering just how important traditional processing is actually going to be for the majority of my increasingly liberated computer usage.
For the first time in over a decade I'm actually looking forward to what tech has to bring to my life beyond a slightly faster processor with diminishing returns.
I see this sentiment a lot. People want AI to be on the internet for them instead of being on it themselves.
It really shed light on the 'always online' thing: we never really 'want' this, rather we feel that it is necessary for survival.
Given the same benefit, people would rather not be on the internet. That blows my mind a bit.
It's a "broken glass cornucopia": the Internet is a horn of plenty disgorging an unlimited amount of free, good content .. with harmful fragments mixed in.
Sometimes the effort of filtering out the harmful bits is too much. Sometimes one too many bits breaks through. Sometimes you become aware of it gradually, like a cigarette smoker. Sometimes the very plenty itself feels like the problem.
I think one of the things I'm going to spend 2024 looking at is trying to separate out "want to do", "enjoy doing", "feel compelled to do", and "feel good about having done".
This has been true since the very beginning of the information age, before the internet was a thing. The dream was to have "user agents", little AIs that would navigate and take action on our behalf. The BBC released this "Hyperland" video with Douglas Adams and Tom Baker in 1990. https://archive.org/details/DouglasAdams-Hyperland
I think it’s more that people want to pick and choose what parts of the internet they personally engage with, automating the loathsome-but-necessary parts and leaving the parts they enjoy for themselves.
It’s kind of chasing the internet as it was for many in the late 90s and early 00s, when it wasn’t yet common enough to be a hard requirement for most people, with the bulk of usage being for leisure or at least entirely voluntary.
It was an explosion of individuality, new information - I found this highly beneficial to personal growth. It is now a series of corporate sandboxes connected by steep-walled corridors, with each individual's actions bent and monetised for the benefit of others. The promise of the future is even worse - an AI that will curate a custom reality for you.
But then, who wants that? If that's it, may it die a quick death, or at least let its offerings be recognised as largely irrelevant to people, like some unhinged communist uber television channel.
My instagram feed is full of fake AI images. I really find it fucking disgustingly boring. Last month was the first time that I actually truly thought, "Instagram is boring, why bother".
I know Instagram isn't "the internet", but there are about 5 platforms that make up "the internet", and Meta's are up there.
But yeah, the less I'm on a computer or phone, the better.
> But the only way I see this as feasible in a modern world, to disconnect from a connected world, is to leverage AI to do so. Have it read and triage my inbox, and if it's important and urgent, call me.
It might shock you, but people lived like this for ages without AI, and plenty still do today!
Rather than retro, how about keeping the performance per cycle of earlier systems on modern hardware? Alan Kay used to say that there should be no more delay between pressing a key and having something happen on a computer than on a piano. This was in response to sluggish time-sharing systems of the 1970s. Now we have dedicated gigaflops per user and it can still take seconds to get simple things to happen.
If you want to do retro stuff for recreation, fine, but don't make a cult of it.
If modern software were as optimized as it had to be to run on Pentium 4s and PowerPC G5s, it’d be amazing on modern hardware.
Photoshop 7/CS1 was pretty zippy on a decent PowerMac back then, for instance. That would lead one to expect a similarly well-optimized graphics editor to be practically instantaneous on an M3 Max or Ryzen 7950X, given how ridiculously more powerful that hardware is, but no such editors seem to exist.
So when some numpty dev exclaims "it doesn't matter that it uses twice the bandwidth, cpu and hard drive space that it needs to, all those things are cheap and getting exponentially cheaper" I can officially slap them with a trout now?
I mean, maybe, but it seems like software isn't really getting significantly slower anymore. They've actually started optimizing their 7000 layer mega stacks, enough for Moore's Law to finally catch up, and things seem pretty OK, aside from video games.
Games need insane amounts of CPU despite not producing any more happiness than 90s games. The graphics and content really have expanded though, so it's not like it's all for nothing; it's just that the amount of enjoyment doesn't seem to scale linearly with graphics quality.
Still lots of great stuff that does run on an i5 though!
And that part is sort of waning too. Games are starting to 'tap out', so to speak, on the massively-growing CPU/GPU demands. Helped along by having really popular devices to target that are CPU/GPU/memory constrained (compared to a gaming PC or PS5 or such), and a desire by business to have highly-profitable massively-online or Games-as-a-Service titles as widely available as possible.
See how many of the most popular highest-grossing games, like say Overwatch or Warframe or No Man's Sky or Fortnite or Roblox or Minecraft or Apex Legends or such, all end up wanting to target at least one traditionally-lower-end device (a Steam Deck / Nintendo Switch / iPhone+iPad / Android / etc).
Games make a lot more money than they used to. Might want to see how many copies stuff like Quake sold: despite feeling like a global phenomenon, it wasn't really one by modern standards. Though I do agree with the sentiment about happiness, to a point. MMOs are extremely sophisticated these days. Even FPS games with big player counts are really interesting versus what we used to have.
Yes: even if you are not aware of it, each one of you is looking for ways to get away from the modern world. We all are. All the time. There are two kinds of people in our modern western world: exhausted people and liars.
I laughed out loud at this.
It's a bit unfair though. I'm looking for ways to get away from parts of the modern world, but certainly not all!
EDIT: Great article, one more note from me.
kids: we did not yet call them “apps” back then
So I nerded out and looked it up - the term "app" as a shortening of "application program" is attested from 1992. Anecdotally I remember it being used by Linux users pre-iPhone.
> First trend: Moore’s Law, or rather, its lack thereof. Despite all the efforts of Apple to hide its flattening curve with ARM-based chips, it is vox populi that a computer from 2014 is just about as powerful as the one you bought as a Christmas gift in 2023, bar the amount of RAM or the speed and capacity of the SSDs therein. On the other hand, consumer PCs of 2014 were, by all standards, widely superior to those of 2004, and immensely more than those of 1994 and, needless to say, 1984.
Generative AI and local LLMs have really changed this. It is an amazing capability to be able to run Stable Diffusion and LLMs like Mistral in the privacy and control of your own hardware.
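For example, the image side is a short script away with the Hugging Face diffusers library (the model ID and prompt below are just examples, and you need a GPU with enough VRAM); for text, a local runner like Ollama or llama.cpp can serve Mistral with nothing leaving your machine.

```python
# Minimal local Stable Diffusion sketch with the diffusers library.
# The checkpoint and prompt are only examples; everything runs on your own hardware.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("a watercolor of an old flip phone on a desk").images[0]
image.save("out.png")  # no prompt, image, or data ever leaves the machine
```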
Despite whatever assurances OpenAI may give you about the privacy of your data, you are not in control. They already deem certain prompts problematic. How long before they forward problematic prompts to law enforcement? Also, how safe is your IP? We already know that LLMs can reproduce some of their training material verbatim. Do you want your source code to be reproducible like that?
The latest consumer hardware is able to run a lot of these models at acceptable performance. I think that demand might drive at least some people to upgrade or get a more powerful computer and/or GPU.
> First trend: Moore’s Law, or rather, its lack thereof. Despite all the efforts of Apple to hide its flattening curve with ARM-based chips, it is vox populi that a computer from 2014 is just about as powerful as the one you bought as a Christmas gift in 2023, bar the amount of RAM or the speed and capacity of the SSDs therein.
But my eyes see something different. Let's take a very good game from 2014, Dragon Age: Inquisition, and a good game this year, say Cyberpunk 2077. There is progress between the two that sort of compensates for the lack of progress in single-thread benchmarks.
Ofc Moore's Law is done now if you look at the chips, but the pace of progress, if you look at the actual results where you need the processing, seems satisfying.
I do know what you mean though: since multi-core processors arrived, the experience of office-like work has changed little, and you can def run everything well with a 3GHz 4-core CPU today. The SSD was really more of a game changer.
Is the progress you're talking about in terms of the processing efficiency of those games? Or something else?
Of note, I would bet that if you took a modern GPU, an SSD, and 32GB of RAM, and stuck them on a 2014-era motherboard with an i7-4790K, it would run Cyberpunk 2077 just fine.
I don't think you could say the same about a Pentium 4 running Dragon Age: Inquisition, or even Skyrim.
Single-thread performance is only relevant for workloads that can't be parallelized, or for software that will never be adapted. It has been well known for more than a decade now that parallelization is the only way forward.
Edit: application-specific accelerators are another promising approach that has been in vogue for a long time and which achieves speedups that are obviously not easily visible by looking at simple GHz metrics
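To make that concrete, here's a rough illustration of my own (nothing from the article): a divisible, CPU-bound task scales with core count under a process pool, while the serial version is pinned to single-thread speed.

```python
# Minimal illustration: a CPU-bound task split across cores with ProcessPoolExecutor.
# The per-chunk work is arbitrary; the point is that total time scales with core
# count only for work that can actually be divided like this.
import time
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    chunks = [(i, i + 250_000) for i in range(2, 1_000_002, 250_000)]

    start = time.perf_counter()
    serial = sum(count_primes(c) for c in chunks)
    print(f"serial:   {serial} primes in {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:        # one worker per core by default
        parallel = sum(pool.map(count_primes, chunks))
    print(f"parallel: {parallel} primes in {time.perf_counter() - start:.2f}s")
```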
> only relevant for workloads that can't be parallelized
This ends up dominating anyway. Partly because the easy stuff does get parallelized. So you're left with things like "browser layout engine" being single threaded. Almost all systems have single-threaded UIs, because trying to do anything else becomes unreasonably hard to reason about. The UI thread will delegate work to other threads, but there is nearly always a single "UI thread" bottleneck.
For most desktop applications, this bottleneck is not that restricting though. Almost any noticeable UI delay is due to stuff that ought to be offloaded into a background thread, but isn't. For example, IntelliJ often freezes up when I enter a search string in the settings dialog, which is probably due to a search index being loaded or built on the UI thread.
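As a generic sketch of the pattern (not how IntelliJ actually structures it): run the slow lookup on a worker thread and marshal only the finished result back onto the single UI thread, so the interface never freezes.

```python
# A generic sketch of the single-UI-thread pattern: slow work runs on a
# background thread, results come back through a queue that the UI thread
# polls, so widgets are only ever touched from the UI thread.
import queue
import threading
import time
import tkinter as tk

results = queue.Queue()

def slow_search(query):
    time.sleep(2)                      # stand-in for loading a settings index
    results.put(f"results for {query!r}")

def poll_results(root, label):
    try:
        label.config(text=results.get_nowait())
    except queue.Empty:
        pass
    root.after(100, poll_results, root, label)   # keep polling on the UI thread

root = tk.Tk()
label = tk.Label(root, text="idle")
label.pack()
tk.Button(
    root,
    text="Search",
    command=lambda: (
        label.config(text="searching..."),
        threading.Thread(target=slow_search, args=("keymap",), daemon=True).start(),
    ),
).pack()
poll_results(root, label)
root.mainloop()
```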
Layout engines for browsers do get parallelized though. And the point of modern 3D APIs like Vulkan is precisely to allow calling them from more than one thread.