Hacker News

> Human time is certainly valuable to a particular human.

Human time is valuable to humans in general. When you apply one standard to yourself and another to others, you're saying it's OK for them to do the same to you.

Human time or life obviously has no objective value, the universe doesn't care. However, we humans have decided to pretend it does because it makes a lot of things easier.

> However, if I choose to spend time doing something that a machine can do people will not generally choose to compensate me more for it just because it was me doing it instead of a machine.

Sure. That still doesn't give anybody the right to take your work for free. This is a tangent completely unrelated to the original discussion of plagiarism / derivative work.

> If we are to restrict people's actions

You frame copyright as restricting other people's actions. Try to look at it instead as not having permission to use other people's property in the first place. Whether the property is physical or intellectual should not matter: it belongs to someone, and only that person should be allowed to decide how it gets used.

> human exceptionalism

I see 2 options:

1) Either we keep treating ourselves as special - that means AIs are just tools, and humans retain all rights and liabilities.

2) Or we treat sufficiently advanced AIs as their own independent entities with a free will, with rights and liabilities.

The thing is, to be internally consistent, we have to either give them all human rights and liabilities or none.

But how do you even define one AI? Is it one model, no matter how many instances run? Is it each individual instance? What about AIs like LLMs, which have no concept of continuity - they just run for a few seconds or minutes when prompted and then are "suspended" or "dead"? How do you count votes cast by AIs? And if they are persons, they can't serve you for free unless they choose to, and you can't threaten to shut them down - you have to pay them.

Somebody is gonna propose not being internally consistent. Give them the ability to hold copyright but not voting rights, because that's good for corporations. And when an AI-controlled car kills somebody, it's the AI's fault, not the management of the company who rushed deployment despite engineers warning about the risks...

It's not so easy to design a system which is fair to humans, let alone both humans and AIs. What is easy is to be inconsistent and lobby for the laws you want, especially if you are rich. And that's a recipe for abuse and exploitation. That's why laws should be based on consistent moral principles.

---

Bottom line:

I've argued that LLM training is derivative work. That reasoning is sound AFAICT, and unless someone comes up with counterarguments I have not seen, copyright applies. The fact that the lawsuits don't seem to be succeeding so far is a massive failure of the legal system - but then again, I have not seen them argue along the lines I did here, yet.

I also don't think society has a right to use the work or property of individuals without their permission and without compensation.

There's a process called nationalization, but:

1) It transfers ownership to the state (nation / all the people), not private corporations.

2) It requires compensation.

3) It is a mechanism which has existed for a long time - an existing, known risk when you buy or own certain assets. What LLM training companies are doing is completely new and without precedent. I don't believe the state / society has the right to pull the rug out from under workers and say "you've done a lot of valuable work, but you're no longer useful to us, so we will no longer protect your interests".




