Hacker News

> I don’t mind big companies training on generally available data, I mind the IP-laundering

Large platforms saw AI and instantly locked themselves down, making it hard or impossible for external actors to mine that "generally available data," hurting their own users and the open web in the process, and then they mined the data themselves.



As long as a "user" can access those platforms, that data can and will be mined. The people working on such solutions just don't publish them publicly until the debate is settled. If I can view information on any website, authenticated or not, then I can build a bot that does the same. That doesn't mean it's being done for nefarious reasons. Simply automating the process of bringing what I consider valuable information to me is motivation enough. In my case, the only profit I make is the time saved not manually clicking around to reach the data I read over morning coffee.
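(To give a sense of how low the bar is, here's a minimal sketch of that kind of personal "morning coffee" scraper in Python. The URL and CSS selector are hypothetical placeholders, not any real site's layout, and a page behind a login would additionally need a logged-in session or a headless browser.)

    # Minimal personal-scraper sketch: fetch one page and print matching headlines.
    # The URL and selector below are made-up placeholders for illustration only.
    import requests
    from bs4 import BeautifulSoup

    FEED_URL = "https://example.com/news"   # hypothetical page read over coffee
    HEADLINE_SELECTOR = "a.headline"        # hypothetical CSS selector for titles

    def fetch_headlines(url: str, selector: str) -> list[str]:
        # Request the page the way an ordinary browser-like client would.
        resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Collect the visible text of every element matching the selector.
        return [el.get_text(strip=True) for el in soup.select(selector)]

    if __name__ == "__main__":
        for title in fetch_headlines(FEED_URL, HEADLINE_SELECTOR):
            print("-", title)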

The internet routes around censorship. It's impossible to hide information as long as it's meant to be accessed by a human. If companies want to spend engineering hours building locks, that's their waste.

Many businesses will fail by wasting time and money creating locks that can and will be circumvented.

I agree that a new social contract is inevitable because the only way to prevent data from being mined is to not produce it to begin with. Period. This I know.


When I was young, I used to upload fan art of Naruto to DeviantArt. My badly scanned drawings sucked. Everyone else's sucked. It was cool.

Today DeviantArt has its own AI, which it promotes over its own users' work. I've read threads by artists discussing where to go next, between DA, Instagram, ArtStation, and several newer and likely not much better platforms, and one comment that struck me was someone saying it just wasn't worth it anymore, and that their time was better spent networking offline at a gallery.

AI art might actually kill online art communities.

AI-generated articles might kill online publishing.

AI-generated spam bots might kill social media.

We've taken the Internet for granted even as grandma and grandpa joined it. Tomorrow people may just get sick of all these algorithms, put down their smartphones, and go touch some grass. Then every website is just going to be AI bots regurgitating each other's content ad nauseam.

Humans are on the web because of the reach. If AI-generated content steals all the reach, why would anyone post anything on publicly accessible venues instead of just using private ones?

"A new social contract is inevitable because the only way to prevent data from being mined is to not produce it to begin with." But is it though? You are assuming that "not produce it to begin with" is impossible. I'm afraid it's not impossible and the web experiment is at real danger. Maybe not immediately, but will it survive another 20 years in this environment?



