I understand artists and the like talking about AI in a negative sense: either they don't really get it, or it's against their self-interest, which means they subconsciously find bad arguments to support that interest.
However, tech people who think AI is bad, or not inevitable, are really hard to understand. It's almost like Bill Gates saying "we are not interested in the internet." It's pretty much being against the internet, industrialization, the printing press, or mobile phones. The idea that AI is anything less than paradigm-shifting, or even revolutionary, is weird to me. I can only conclude that being against it comes down to either self-interest or an inability to grasp it.
So if I produce something (art, a product, a game, a book), and it's good, and it's useful to you, fun to you, beautiful to you, and you cannot really determine whether it's AI: does it matter? How does it matter? Is it because they "stole" all the art in the world? Yet somehow, when a person is "influenced" by people, ideas, and art in a less efficient way, we applaud it, because what's the alternative: reinvent the wheel forever?
Apologies, but I'm copy/pasting a previous reply of mine to a similar sentiment:
Art is an expression of human emotion. When I hear music, I am part of that artist's journey, their struggles. The emotion in their songs comes from their first break-up, or an argument they had with someone they loved. I can understand that on a profound, shared level.
Way back, my friends and I played a lot of StarCraft. We only played cooperatively against the AI, until one day a friend and I decided to play against each other. I can't put into words how intense that was. When we were done (we played in different rooms of the house), we got together and laughed. We both knew what the other had gone through. We both said "man, that was intense!".
I don't get that feeling from an amalgamation of all human thoughts/emotions/actions.
One death is a tragedy. A million deaths is a statistic.
Yet humans are the ones enacting an AI for art (of some kind). Is it therefore not art because, even though a human initiated the process, the machine completed it?
If you argue that, then what about kinetic sculptures, what about pendulum painting, etc? The artist sets them in motion but the rest of the actions are carried out by something nonhuman.
And even in a fully autonomous sense: who are we to define art as artefacts of human emotion? How typically human (tribalism). What's to say an alien species doesn't exist, somewhere... out there? If that species produces something akin to art, but never evolved the chemical reactions that we call emotions... I suppose it's not art by your definition?
And what if that alien species is not carbon-based? Is it therefore much of a stretch to call what an eventual AGI produces art?
My definition of art is a superposition: everything and nothing is art at the same time, because art is in the eye of the beholder. When I look up at the night sky, that's art, yet no human emotion produced it.
You seem to be conflating natural beauty and the arts.
Just because something beautiful can be created without emotion, that doesn't mean it's art. It just means something pleasing was created.
We have many species on earth that are "alien" to us - they don't create with emotion, they create things that are beautiful because that's just how it ended up.
Bees don't create hexagonal honeycomb because they feel a certain way; it's just the most efficient way for them to build. Spider webs are also created for efficiency. Down to the single cell, things are constructed in beautiful ways not for the sake of beauty, but out of evolution.
The earth itself creates things that are absolutely beautiful, but are not art. They are merely the result of chemical and kinetic processes.
The "art" of it all, is how humans interpret it and build upon it, with experience, imagination, free will and emotions.
What you see in the night sky, that is not art. That is nature.
The things that humans are compelled to create under the influence of all this beauty - that is the art.
With a kinetic sculpture, someone went through the effort of designing it to do that. With AI art, sure, you ask it to do something, but a human isn't involved in the creative process in any capacity beyond that.
This is a very reductionist claim about how people use AI in their art process. The truth is that the best artists use AI in a sort of dance between the human and machine. But always, the human is the prime mover through a process of iteration.
Sure, but in the case of AI it resembles the relationship of a patron to an art director. We generally don't assign artistry to the person hiring an art director to create artistic output, even if it requires heavy prompting and back and forth. I am not bold enough to try to encompass something as large and fundamental as art in a definition, though I suppose that art does carry something about the craft of using the medium.
At any rate, though there is some aversion to AI art for art's sake, the real aversion is that it squeezes one of the last viable options for people to become 'working artists' and funnels that extremely hard-earned profit into the hands of the conglomerates with enough compute to train generative models. Is making a living through your art something we would like to value and maintain as a society? I'd say so.
No doubt, but if your StarCraft experience against an AI had "somehow" been exactly the same, gave you the same joy, and you couldn't even say whether it was an AI or another player, would that matter? I get that this is a kind of Truman Show-ish scenario, but does it really matter? If the end results are the same, does it still matter? If it does, why? I get the emotional aspect of it, but in practice you wouldn't even know. Is AI at that point for any of these? Probably not. We can tell AI apart right now in many interactions and art forms, because it's hollow and just "perfectly mediocre".
It's kind of the sci-fi cliche: can you have feelings for an AI robot? And if you can, what does that mean?
I have to say this sort of thing is hard to think about, as it's pretty hypothetical right now. But I can't imagine how the current iteration of AI could give me the same joy. My friend and I were roommates. We played other games together, including D&D. We struggled with another friend to build a LAN so we could play these games together.
I can't imagine having the same shared experience with an AI. Even if I could, knowing there is no consciousness there does change things (if we can know such a thing).
This reminds me of solipsism. I have no way of knowing if others are conscious, but it seems quite lonely to me if that were true. Even though it's the exact same thing from the outside, isn't it?
We lost that like 100 years ago. Sitting and watching someone perform music in an intimate setting rarely happens anymore.
If you listen to an album by your favorite band, it is highly unlikely that your feelings/emotions and interpretations correlate with what they felt. Feeling a connection to a song is just you interpreting it through the lens of your own experience, the singer isn't connecting with listeners on some spiritual level of shared experience.
I am not an AI art fan; it grosses me out. But if we are talking purely about art as a means to convey emotions around shared experiences, then the amalgamation is probably closer to your reality than a famous musician's. You could just as easily impose your feelings about a breakup or a death on an AI-generated classical piano piece, or a picture of a tree, or whatever.
So are photos that are edited via Photoshop not art? Are they not art if they were taken on a digital camera? What about electronic music?
You could argue all these things are not art because they used technology, just like AI music or images... no? Where does the spectrum of "true art" begin and end?
They aren't arguing against technology; they're saying that a person didn't really make anything. With Photoshop, those are tools that can aid in art. With AI, there isn't any creative process beyond thinking up a concept and having it appear. We don't call people who commission art artists, because they asked someone else to use their creativity to realise an idea. And even there, the artist still put creative effort into the composition, the elements, the things you study in art appreciation classes. Art isn't just aesthetically pleasing things; it has meaning and effort put into it.
There is also something personally ego-shattering about getting destroyed by another human. If I died 10 times trying to beat a boss in a game I wouldn't care much, but if someone beat me 10 times in a row at a multiplayer game I would be questioning everything.
I actually think this is the same point as the person you're responding to. If the human-vs-AI factor didn't matter, you wouldn't care whether it was a human or an AI in your co-op. The differences are subtle but meaningful and will always play a role in how we choose experiences.
I think your view makes sense. On the other hand, Flash revolutionized animation online by allowing artists to express their ideas without having to exhaustively render every single frame, thanks to algorithmic tweening. And yeah, the resulting quality was lower than what Disney or Dreamworks could do. But the ten thousand flowers that bloomed because a wall came down for people with ideas but not time utterly redefined huge swaths of the cultural zeitgeist in a few short years.
I strongly suspect automatic content synthesis will have a similar effect as people get their legs under them and learn how to use it, because I strongly suspect there are even more people out there with more ideas than time.
I hear the complaints about AI being "weird" or "gross" now and I think about the complaints about Newgrounds content back in the day.
It matters because the amount of influence something has on you is directly attributable to the amount of human effort put into it. When that effort is removed, so too is the influence. Influence does not exist independently of effort.
All the people yapping about LLMs keep fundamentally failing to grasp that concept. They think the output exists in a pure functional vacuum.
I don't know if I'm misinterpreting the word "influence", but low-effort internet memes have a lot more cultural impact than a lot of high-effort art. Also there's botnets, which influence political voting behaviour.
> low-effort internet memes have a lot more cultural impact than a lot of high-effort art.
Memes only have impact in aggregate, due to emergent properties in a McLuhanian sense. An individual meme has little to no impact compared to (some) works of art.
I see what you're getting at, but I think a better framing would be: there's an implicit understanding amongst humans that, in the case of things ostensibly human-created, a human found it worth creating. If someone put in the effort to write something, it's because they believed it worth reading. It's part of the social contract that makes it seem worth reading a book or listening to a lecture even if you don't receive any value from the first word.
LLMs and AI art flip this around because potentially very little effort went into making things that potentially take lots of effort to experience and digest. That doesn't inherently mean they're not valuable, but it does mean there's no guarantee that at least one other person out there found it valuable. Even pre-AI it wasn't an iron-clad guarantee, of course: copy-writing, blogspam, and astroturfing existed long before LLMs. But everyone hates those because they prey on the same social contract that LLMs do, except at a smaller scale and with a lower effort-in to effort-out ratio.
IMO though, while AI enables malicious / selfish / otherwise anti-social behavior at an unprecedented scale, it also enables some pretty cool stuff and new creative potential. Focusing on the tech rather than those using it to harm others is barking up the wrong tree. It's looking for a technical solution to a social problem.
Well, the LLMs were trained with data that required human effort to write; it's not just random noise. So the result they give is human effort, indirectly and probabilistically regurgitated.
I'm paying infrastructure costs for our little art community, with chatbots crawling our servers, ignoring robots.txt, and mining the work of our users so they can make copies. And being told that I just don't get it, because this is such a paradigm shift, is pretty great.
Yes, it matters to me because art is something deeply human, and I don't want to consume art made by a machine.
It doesn't matter if it's fun and beautiful; it's just that I don't want to. It's like other things in life I try to avoid, like buying sneakers made by children, or signing up for anything Meta-owned.
That's pretty much what they said about photographs at first. I don't think you'll find a lot of people who argue that there's no art in photography now.
Asking a machine to draw a picture and then making no changes? It's still art. There was a human designing the original input. There was human intention.
And that's before they continue to use the AI tools to modify the art to better match their intention and vision.
I would argue that with photography, only the tooling changed while the craft remained, whereas with AI the craft completely transforms into something else, which I'm just not interested in.
> I understand artists and the like talking about AI in a negative sense: either they don't really get it, or it's against their self-interest, which means they subconsciously find bad arguments to support that interest.
This is an extremely crude characterisation of what many people feel. Plenty of artists oppose copyright-ignoring generative AI and "get" it perfectly, even use it in art, but in ways that avoid the lazy gold-rush mentality we're seeing now.
I hear you; that's not a problem of AI but a problem of copyright and other things. I suppose they'd be enraged if an artist replicated their art too closely, rightly or wrongly. Isn't it flattery that your art is literally copied millions of times? I guess not when it doesn't pay you, which is a separate issue from AI in my opinion. Theoretically we could have one trained only on public domain work, which would address that concern.
Just as you cannot put piracy back in the bag when it comes to movies and TV shows, you cannot put AI back in the bag it came from. Bottom line: this is happening (more like it has happened), so now let's think about what that means and find a way forward.
A prime example is voice acting. I hear why voice actors are mad if someone can steal your voice. But why not work on a legal framework to sell your voice for royalties or whatever? If we can get that lovely voice of yours without you spending your weeks on it, and still compensate you fairly, I don't see how this is a problem. And I know this is already happening, as it should.
This kind of talk I see as an extension of the OP's rant. You talk as if the mass theft by these LLM-growing companies was inevitable. Hogwash, and absolutely wrong. It isn't inevitable and, in my opinion, it shouldn't be.
I work at a company trying very hard to incorporate AI into pretty much everything we do. The people pushing it tend to have little understanding of the technology, while the more experienced technical people see a huge mismatch between its advertised benefits and actual results. I have yet to see any evidence that AI is "paradigm shifting" much less "revolutionary." I would be curious to hear any data or examples you have backing those claims up.
In regards to why tech people should be skeptical of AI: technology exists solely to benefit humans in some way. Companies that employ technology should use it to benefit at least one human stakeholder group (employees, customers, shareholders, etc.). So far what I have seen is that AI has reduced hiring (negatively impacting employees), created a lot of bad user interfaces (bad for customers), and cost companies way more money than they are making off of it (bad for shareholders, at least in the long run). AI is an interesting and so far mildly useful technology that is being inflated by hype and causing a lot of damage in the process. Whether it becomes revolutionary like the internet or falls by the wayside like NFTs and 3D TVs is unknowable at this point.
Case in point: subscription costs for everything going up to justify the "additional value" that AI is bringing _whether you use it or not_.
This would have been an additional, more expensive, subscription tier in the past.
Anecdote: literally this morning, krisp.ai (noise-cancellation software that succumbed to slop-itis two years ago, adding AI notetaker and meeting-summarization features that are really difficult to turn off, which is insulting given that most people purchased the tool JUST FOR NOISE CANCELLING, but I digress) sent an email to its customers (me) announcing that it would no longer offer a free tier and would instead offer 14-day trials with all features enabled.
Why?
"As AI has become central to everyday work, we’ve seen that most people preferred the unlimited workflow once they tried it."
Big tech senior software engineer working on a major AI product speaking:
I totally agree with the message in the original post.
Yes, AI is going to be everywhere, and it's going to create amazing value and serious challenges, but it's essential to make it optional.
This is not only for the sake of users' freedom. This is essential for companies creating products.
This is Minority Report, until it is not.
AI has many modes of failure, exploitability, and unpredictability. Some are known and many are not. We have fixes for some, and band-aids for others, but many are not even known yet.
It is essential to make AI optional, to have a "dumb" alternative to everything delegated to a Gen AI.
These options should be given to users, but also, and maybe even more importantly, be baked into the product as an actively maintained and tested plan B.
The general trend of cost-cutting will not be aligned with this. Many products will remove, intentionally or not, their non-AI paths, and when the AI fails (not if), they will regret that decision.
This is not a criticism of AI or of the shift in trends toward it; it's a warning for anyone who does not take seriously the fundamental unpredictability of generative AI.
When people talk about AI, they aren't talking about the algorithms and models. They're talking about the business. If you can honestly stand up, look at the way the AI companies and related businesses are operating, and not feel at least a little unease, you're probably Sam Altman.
> I understand artists and the like talking about AI in a negative sense: either they don't really get it, or it's against their self-interest, which means they subconsciously find bad arguments to support that interest.
Yeah, no. It's presumptuous to say that these are the only reasons. I don't think you understand at all.
> So if I produce something (art, a product, a game, a book), and it's good, and it's useful to you, fun to you, beautiful to you, and you cannot really determine whether it's AI: does it matter? How does it matter?
Because to me, and many others, art is a form of communication. Artists toil because they want to communicate something to the world; people consume art because they want to be spoken to. It's a two-way street. Every piece created by a human carries a message, one sculpted by their unique life experiences and journey.
AI-generated content may look nice on the surface, but fundamentally it says nothing at all. There is no message or intent behind a probabilistic algorithm putting pixels on my screen.
When a person encounters AI content masquerading as human-made, it's a betrayal of expectations. There is no two-way communication, the "person" on the other side of the phone line is a spam bot. Think about how you would feel being part of a social group where the only other "people" are LLMs. Do you think that would be fulfilling or engaging after the novelty wears off?
Yes. A work of art should require skills that took years to hone, and innate talent. If it was produced without such, it is a fraud; I've been deceived.
But in fact I was not deceived in that sense, because the work is based on talent and skill: that of numerous unnamed, unattributed people.
It is simply low-effort plagiarism, presented as an original work.
> However, tech people who think AI is bad, or not inevitable, are really hard to understand.
I disagree. It's really not. Popular AI is extremely powerful and capable of a lot of things. It's also being used for nefarious purposes at the cost of our privacy and, in many cases, livelihoods.
> So if I produce something (art, a product, a game, a book), and it's good, and it's useful to you, fun to you, beautiful to you, and you cannot really determine whether it's AI: does it matter? How does it matter?
We don't live in a vacuum.
Every work that someone mostly generated from a prompt is work that another person (or other people) didn't get to create. This was "fine" when the scope of the automation was small, as it gave people time to re-skill or apply their skills elsewhere. It is not fine when those with capital are talking about using this for EVERY POSSIBLE skill. It is even less fine when you consider that the systems that learned how to produce these works were literally trained on stolen data!
Yes, there are plenty of jobs that are safe from today's AI. That doesn't stop the threat of possibility, however.
I also disagree that the crop of AI art that exists today is "good." Some of what's out there is pretty novel, but the vast, vast majority of it looks extremely same-y. Same color hues, same styles (see also: the pervasive Studio Ghibli look), DEFINITELY same fonts, etc. It's also kind of low-res, so it always looks sloppy when printed on large-format media. And that's before the garbled text that gets left in. A horrible look, IMO.
AI-generated audio is worse. The soundstage is super compressed and the output sounds low-bandwidth. This works great for lo-fi, however (I'm sure lo-fi artists will disagree).
I'm sure all of this will get better as time goes on and more GPUs are sacrificed for better training.
If I look at a piece of art that was made by a human who earned money for making that art, then it means an actual real human out there was able to put food on their table.
If I look at a piece of "art" produced by a generative AI that was trained on billions of works from people in the previous paragraph, then I have wasted some electricity even further enriching a billionaire and encouraging a world where people don't have the time to make art.
Yes, but that electricity consumption benefits an actual person.
I'm so surprised that I often find myself having to explain this to AI boosters but people have more value than computers.
If you throw a computer in a trash compactor, that's a trivial amount of e-waste. If you throw a living person in a trash compactor, that's a moral tragedy.
The people who build, maintain, and own the datacenters. The people who work at and own the companies that make the hardware in the datacenters. The people who work to build new power plants to power the data centers. The truck drivers that transport all the supplies to build the data centers and power plants.
Call me crazy, but I'd rather live in a world with lots of artists making art and sharing it with people than a world full of data centers churning out auto-generated content.
One thing I've noticed - artists view their own job as more valuable, more sacred, more important than virtually any other person's job.
They canonize themselves, and then act all shocked and offended when the rest of the world doesn't share their belief.
Obviously the existence of AI is valuable enough to pay the cost of offsetting a few artists' jobs, it's not even a question to us, but to artists it's shocking and offensive.
It's shocking and offensive to artists and to like-minded others because AI labs have based the product that is replacing them off of their existing labor with no compensation. It would be one thing to build a computerized artist that out-competes human artists on merit (arguably happening now), this has happened to dozens of professions over hundreds of years. But the fact that it was built directly off of their past labors with no offer, plan, or even consideration of making them whole for their labor in the corpus is unjust on its face.
Certainly there are artists with inflated egos and senses of self-importance (many computer programmers with this condition too), but does this give us moral high ground to freely use their work?
How many people is it OK to exploit to create "AI"?
Every piece of work is built off of previous work. Henry Ford designed his car based on the designs of previous cars, but made his much more efficiently. No difference here. It's always been the case that once your work is out in the world, the competition is allowed to learn from it.
Compensation for previous work just hasn't been the norm, and if it becomes the norm, countless humans will suffer due to the slowing of progress.
We will never be able to automate anything, because the risk-to-reward ratio simply won't be there if you have to pay off millions or billions of people. Progress will grind to a halt, but I guess we'll have preserved some archaic jobs while our children are denied a better world.
Certainly the Industrial Revolution would never have happened with this mindset.
I read this comment as implying a similar kind of exceptionalism for technology, but expressing a different set of values. It reminds me of the frustration I’ve heard for years from software engineers who work at companies where the product isn’t software and they’re not given the time and resources to do their best work because their bosses and nontechnical peers don’t understand the value of their work.
The opposite is also true: the tech world views itself as more sacred than any other part of humanity.
You say it's obvious that the existence of AI is valuable to offset a few artists' jobs, but it is far from obvious. The benefits of AI are still unproven (a more hallucinatory google? a tool to help programmers make architectural errors faster? a way to make ads easier to create and sloppier?). The discussion as to whether AI is valuable is common on hackernews even, so I really don't buy the "it's obvious" claim. Furthermore, the idea that it is only offsetting a few artists' jobs is also unproven: the future is uncertain, it may devastate entire industries.
> One thing I've noticed - artists view their own job as more valuable, more sacred, more important than virtually any other person's job.
> They canonize themselves, and then act all shocked and offended when the rest of the world doesn't share their belief.
You could've written this about software engineers and tech workers.
> Obviously the existence of AI is valuable enough to pay the cost of offsetting a few artists' jobs, it's not even a question to us
No, it's not obvious at all. Current AI models have made it 100x easier to spread disinformation, sow discord, and undermine worker rights. Those costs weigh more heavily for me than being able to more efficiently Add Shareholder Value.
I have noticed this, but it's not artists themselves. It's mostly coming from people who have zero artistic talent themselves, but really wish they did.
I would be fine if data centers paid the full cost of their existence, but that isn't what happens in our world.
Instead the cost of pollution is externalised and placed on the backs of humanity's children. That includes the pollution created by those datacentres running off fossil fuel generators because it was cheaper to use gas in the short term than to invest in solar capacity and storage that pays back over the long term. The pollution from building semiconductors in servers and GPUs that will likely have less than a 10 year lifespan in an AI data center as newer generations have lower operating cost. The cost of water being used for evaporative cooling being pulled from aquifers at a rate that is unsustainable because it's cheaper than deploying more expensive heat pumps in a desert climate.... and the pollution of the information on the internet from AI slop.
The short term gains from AI have a real world cost that most of us in the tech industry are isolated from. It is far from clear how to make this sustainable. The sums of money being thrown at AI will change the world forever.
Given that data centers use less energy than the human labor they replace, they actually reduce pollution. Replacing those GPUs with more efficient models also reduces pollution, because the replacements use less electricity for the same workload than the units they replaced.
This is such a wild take. You're 100% correct that AI-generated art consumes fewer resources than humans making art and having to, you know, eat food and stuff.
Obviously, the optimal solution is to eliminate all humans and have data centers do everything.
> I'm so surprised that I often find myself having to explain this to AI boosters but people have more value than computers.
That is true, but it does not survive contact with Capitalism. Let's zoom out and look at the larger picture of this simple scenario of "a creator creates art, another person enjoys art":
The creator probably spends hours or days painstakingly creating a work of art, consuming a certain amount of electricity, water, and other resources. The person enjoying it derives a certain amount of appreciation, say, N "enjoyment units". If payment is exchanged, it would reasonably be some function of N.
Now an AI pops up and, prompted by another human, produces a similar piece of art in minutes, consuming a teeny, teeny fraction of what the human creator would. This Nature study about text generation finds LLMs are 40-150x more efficient in terms of resource consumption (dropping to 4-16x relative to humans in India): https://www.nature.com/articles/s41598-024-76682-6 -- I suspect the ratio is even higher for something as time-consuming as art. Note that the time taken by the human prompter is probably even less: just the time to imagine and type out the prompt and maybe refine it a bit.
So even if the other person derives only 0.1N "enjoyment" units out of AI art, in purely economic terms AI is a much, much better deal for everyone involved... including for the environment! And unfortunately, AI is getting so good that it may soon exceed N, so the argument that "humans can create something AI never could" will apply to an exceptionally small fraction of artists.
There are many, many moral arguments that could be made against this scenario, but as has been shown time and again, the definition of Capitalism makes no mention of morality.
But it sounds like in this case the "morality" that capitalism doesn't account for is basically just someone saying "you should be forced to pay me to do something that you could otherwise get for 10x cheaper." It's basically cartel economics.
In isolation that makes sense, but consider that these AIs have been trained on a vast corpus of human creative output without compensating the human creators, and are now being used to undercut those same humans. As such there is some room for moral outrage that did not exist in prior technical revolutions.
Personally, I think training is "fair use", both legally and practically; in my mind, training LLMs is analogous to what happens when humans learn from examples. But I can see how those whose livelihood is being threatened can feel doubly wronged.
The other reason I'm juxtaposing Capitalism and morality is the disruption AI will likely cause to society. The scale at which this will displace jobs (basically, almost all knowledge work) and replace them with much higher-skilled jobs (basically, you need to be at the forefront of your field) could be rather drastic. Capitalism, which has already led to such extreme wealth inequality, is unsuited to solve for this, and as others have surmised, we probably need to explore new ways of operating society.
I think we're all just sick of having everything upended and forced on us by tech companies. This is true even if it is inevitable. It occurred to me lately that modern tech and the modern internet have sort of turned into something that is evil in the way that advertising is evil (this is aside from the fact, of course, that the internet is riddled with ads).
Modern tech is 100% about trying to coerce you: you need to buy X, you need to be outraged by X, you must change X in your life or else fall behind.
I really don't want any of this, I'm sick of it. Even if it's inevitable I have no positive feelings about the development, and no positive feelings about anyone or any company pushing it. I don't just mean AI. I mean any of this dumb trash that is constantly being pushed on everyone.
Well you don't, and no tech company can force you to.
> you must change X in your life or else fall behind
This is not forced on you by tech companies, but by the rest of society adopting that tech because they want to. Things change as technology advances. Your feeling of entitlement that you should not have to make any change that you don't want to is ridiculous.
I honestly cannot agree more with this, while still standing behind what I said on the parent comment.
As someone who's been in tech for more than 25 years, I started to hate tech because of all the things you've said. I loved what tech meant, and I hate what it became (to the point that I got out of the industry).
But the majority of these concerns disappear once we're talking about offline, open models. Some of that has already happened, and we know more will happen; it's just a matter of time. In that world, how can any of us say "I don't want a good amount of the knowledge in the whole fucking world on my computer, without even needing an internet connection, paying someone, or seeing ads"?
If your stance is like a vegetarian saying "I'm ethically against eating animals", I respect that and have no argument against it; it's not my ethical line, but I respect it. Beyond that point, though, what's the legitimate argument? Shall we make humanity worse by rejecting this paradigm-shifting, world-changing thing? Think about the people who will be able to read any content in the world in their own language, even if it's a very obscure one that no one cares to translate. What AI means for humanity is huge.
What tech companies and governments do with AI is horrific and scary. But governments will do it regardless, and tech companies will be backed by those powers regardless. So AI itself is not the enemy; let's aim our criticism and actions at the real enemies.
Well, it's not really about AI then, is it; it's about millennia of human evolution and the intrinsically human behaviours we've evolved.
Like greed. And apathy. Those are just some of the things that have enabled billionaires and trillionaires. Is it ever gonna change? Well it hasn't for millions of years, so no. As long as we remain human we'll always be assholes to each other.
> I can only say being against this is either it’s self-interest or not able to grasp it.
So we're just waving away the carbon cost, centralization of power, privacy fallout, fraud amplification, and the erosion of trust in information? These are enormous society-level effects (and there are many more to list).
Dismissing AI criticism as simply ignorance says more about your own.
> I understand artists etc. Talking about AI in a negative sense, because they don’t really get it completely, or just it’s against their self interest which means they find bad arguments to support their own interest subconsciously
> But somehow if a person “influenced” by people, ideas, art in less efficient way almost we applaud that because what else, invent the wheel again forever?
I understand AI perfectly fine, thanks. I just reject the illegal vacuuming up of everyone's art for corporations while things like sampling in music remain illegal. This idea that everything must be efficient comes from the bowels of Silicon Valley and should die.
> However tech people who thinks AI is bad, or not inevitable is really hard to understand. It’s almost like Bill Gates saying “we are not interested in internet”. This is pretty much being against the internet, industrialization, print press or mobile phones. The idea that AI is anything less than paradigm shifting, or even revolutionary is weird to me. I can only say being against this is either it’s self-interest or not able to grasp it.
Again, the problem is less the tech itself than the corporations who control it. Yes, I'm against corporations gobbling up everyone's data for ads and AI surveillance. I think you might be the one who doesn't understand that not everything is roses, and there might be more weeds in the garden than flowers.
It's not only corps though. AI at this point includes many open models, and we'll have more of them as we go, if needed. Just as the original hacker culture gave rise to the open source movement, AI will follow the same path.
When LLMs first took off, people were talking about how governments would control them, but anyone who knows the history of personal computing and hacker culture knew that's not the way things go in this world.
Do I enjoy corpos making money off of anyone's work, including obvious things like literally pirating books to train their models (Meta)? Absolutely not. But you are blaming the wrong thing here: it's not the technology's fault, it's that governments are always corrupted and side with money instead of their people. We should be lashing out at them, not at each other, not at the people who use AI, and certainly not at the people who innovate and build it.
Free AI evangelists need to get better at teaching people how to move "AI compute" away from the big tech platforms. They need to show that local models are equal to or better than ChatGPT et al., and show how people can set those systems up, without being ornery or obtuse.
The same actually goes for artists who want people to create and value human-made art. Everyone seems so resentful of the people who they should want to be on their side. You have to practice what you preach, man.
The people who "innovate" and build it are working for Meta and the other companies that shamelessly steal from individuals. Companies are made up of individuals who make these decisions.
> I understand artists etc. Talking about AI in a negative sense, because they don’t really get it completely, or just it’s against their self interest which means they find bad arguments to support their own interest subconsciously.
Running this paragraph through Gemini returns a list of the fallacies employed, including Attacking the Motive: "Even if the artists are motivated by self-interest, this does not automatically make their arguments about AI's negative impacts factually incorrect or 'bad'."
Just as a poor person is more aware, through direct observation and experience, of the consequences of corporate capitalism and financialisation, an artist at the coal face of the restructuring of the creative economy by massive 'IP owners' and IP pirates (i.e. the companies training on their creative work without permission) is likely far more in touch with the consequences of actually existing AI than a tech worker who is financially incentivised to view them benignly.
> The idea that AI is anything less than paradigm shifting, or even revolutionary is weird to me.
This is a strange kind of anti-naturalistic fallacy. A paradigm shift (or indeed a revolution) is not in itself a good thing. One paradigm shift that has occurred in recent geopolitics, for example, is the normalisation of state murder: extrajudicial assassination in the drone war, or the current US government's use of missile attacks on alleged drug traffickers. One can generate countless other negative paradigm shifts.
> if I produce something art, product, game, book and if it’s good, and if it’s useful to you, fun to you, beautiful to you and you cannot really determine whether it’s AI. Does it matter?
1) You haven't produced it.
2) Such a thing - a beautiful product of AI that is not identifiably artificial - does not yet, and may never exist.
3) Scare quotes around intellectual property theft aren't an argument. We can abandon IP rights - in which case hurrah, tech companies have none - or we can, in law at least, respect them. Anything else is legally and morally incoherent self-justification.
4) Do you actually know anything about the history of art, any genre of it whatsoever? Because suggesting originality is impossible and 'efficiency' of production is the only form of artistic progress suggests otherwise.
> I understand artists etc. Talking about AI in a negative sense, because they don’t really get it completely, or just it’s against their self interest which means they find bad arguments to support their own interest subconsciously
Why do you think that depicting artists who view AI negatively as clueless or self-interested, and dismissing their arguments as "bad", will foster an open discussion?
The fact that you are quoting Gates and not anybody else makes it clear why you don't understand the techies with anti-AI arguments. This is why it matters: LLMs are a tool, currently provided for use by the general population, but still a tool owned by massive corporations, which means it has to produce revenue and profit at some point. How would you feel if, in every AI art piece you make, a Coca-Cola bottle were auto-integrated and you could not delete it? Pay for the premium service and you get the same bottle, but a lot smaller. AI is first and foremost a software product. Product means marketing. A product that would not exist had the owning companies not used human-made art, without either consent or compensation, to make a profit for themselves. You do not enter into this equation, because you are just a potential consumer. The complete dehumanisation of all of us is why it matters.
Contemporary AI is bad in the same way a Walther P.38 is bad: it's a tool designed by an objectively, ontologically evil force specifically for its evil ends. We live in a world where there are no hunting rifles, no pea-shooters for little old women to protect themselves, no sport pistols. Just the AI equivalent of a weapon built for easy murder, by people whose express end is taking over the world.
...Okay, now maybe take that and dial it back a few notches of hyperbole, and you'll have a reasonable explanation for why people have issues with AI as it currently exists. People are not wrong to recognize that, just because some people use AI for benign reasons, the people and companies that have formed a cartel for the tech mainly see those benign reasons as incidental to becoming middle men in every single business and personal computing task.
Of course, there is certainly a potential future where this is not the case, and AI is truly a prosocial, democratizing technology. But we're not there, and will have a hard time getting there with Zuckerberg, Altman, Nadella, and Musk at the helm.
As someone who spends quite a bit of time sketching and drawing for my own satisfaction, it does matter to me when something is created using AI.
I can tell whether something is a matte painting, Procreate, watercolor, or some other medium. I have enough taste to distinguish between a neophyte and an expert.
I know what it means to be that good.
Sure, most people couldn’t care less, and they’re happy with something that’s simply pleasant to look at.
But for those people, it wouldn’t matter even if it weren’t AI-generated. So what is the point?
You created something without having to get a human to do it. Yaay?
Except we already have more content than we know what to do with, so what exactly are we gaining here? Efficiency?
Generative AI was fed on the free work and joy of millions, only to mechanically regurgitate content without attribution. To treat creators as middlemen in the process.
Yaay, efficient art. This is really what is missing in a world with more content than we have time to consume.
The point of markets, of progress, is the improvement of the human condition. That is the whole point of every regulation, every contract, and every innovation.
I am personally not invested in a world that is worse for humanity.
I mean, we already stopped caring about the dumb stock photo at the top of every blog post, so we already don't care about shit that's meaningless. Yet it's still happening, because there is an audience for it.
Art can be about many things; we have a lot of tech-oriented art (think of the demoscene). No one gives a shit about art that evokes nothing for them, so if AI art evokes nothing, who cares? And if it does evoke something, is it suddenly bad because it's AI? How?
Actually, I think AI will force a good number of mediums to their logical conclusion: if what you do is mediocre and unoriginal, and AI can do the same or better, then that's on you. Once you pass that threshold, that's when the world cherishes you as a recognized artist. And you can be an artist even if 99.9% of the world thinks what you produce is absolute garbage; that doesn't change what you do or what it means to you. Again, nothing to do with AI.
> I can only say being against this is either it’s self-interest or not able to grasp it.
I'm massively burnt out, what can I say? I can grasp new tech perfectly fine, but I don't want to. I quite honestly can't muster enough energy to care about "revolutionary" things anymore.
If anything I resent having to deal with yet more "revolutionary" bullshit.
The problem is that LLMs are just parrots that swoop into your house and steal everything, then claim it as theirs. That's not art; that's thievery and regurgitation. To resign oneself to this behavior as OK and inevitable is sad and cowardly.
To conflate LLMs with a printing press or the internet is dishonest; yes, it's a tool, but one which degrades society in its use.
Yep, that's the usual "art is a problem that must be solved, the process is an inconvenience" mindset. The CEO of Suno AI also said something similar: "people don't enjoy making music".
I mean, I understand the CEO of SUNO etc. Talking about self-actualization in a negative sense, because they don’t really get it completely, or just it’s against their business interest which means they find bad arguments to support their own interest openly.
As someone else put it succinctly, there's art and then there's content. AI generated stuff is content.
And not to be too dismissive of copywriters, but old Buzzfeed-style listicles are content as well. Stuff that people get paid pennies per word for, stuff that a huge number of people will bid on on a gig site like Fiverr or what have you, stuff that people churn out by rote: that's content.
Creative writing on the other hand is not content. I won't call my shitposting on HN art, but it's not content either because I put (some) thought into it and am typing it out with my real hands. And I don't have someone telling me what I should write. Or paying me for it, for that matter.
Meanwhile, AI doesn't do anything on its own. It can be made to simulate autonomy (by running it continuously without limits, or by feeding it a regular stream of prompts), but it won't suddenly go "I'm going to shitpost on HN today" unless told to.
…and the sting is that the majority of people employed in creative fields are hired to produce content, not art. AI makes this blatantly clear with no fallbacks to ease the mind.
To me, it matters because most serious art requires time and effort to study, ponder, and analyze.
The more stuff that exists in the world that superficially looks like art but is actually meaningless slop, the more likely it is that your time and effort is wasted on such empty nonsense.