the war against the machines
2 years ago
General
So I'm reading about Glaze and Nightshade, two pieces of software designed to a) keep artists' images from being scraped by AI theft bots, by seeding the image with unreadable data, and b) poison the AI theft bots' datasets so they generate inaccurate results if the images DO get read.
more here: https://nightshade.cs.uchicago.edu/whatis.html
I have a few thoughts about this, as someone who played quite a bit with AI tools a few months back, was even kind of excited about them for a while, and was on the fence about them for a long time. Also, I'm an artist who doesn't want my art used without permission by corporations generating revenue off my labor.
1) I'm not sure they work; they may mostly be a form of security theater that lets artists feel like their work is protected. I suspect the procedures Glaze and Nightshade use will be worked around by subsequent versions of AI scrapers. This is the beginning of an arms race, and the scrapers will always win, because:
2) The genie is kind of out of the bottle at this point. There are petabytes of artworks and pictures out there that don't have this protection applied and that have been used to train AI bots for years now. And the AI-generated images will be used to train bots further.
Assuming it does work, this needs to be a backend solution, invisible to the end user, as a standard part of the image-processing flow of an art gallery or social media site where images are posted. It's the kind of thing that should NOT be left to individual artists with a complicated post-processing tool, where only a tiny fraction of artists will do it and many of those will mess it up, rendering it inoperative. It needs to be handled at the server level by social media sites. But we're living in a world where the Musks and Zucks of the world love AI and want to promote it, not implement blocks like they would for, say, spam. So the tech billionaires are an obstacle to be worked around as well.
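To be concrete about what I mean by "part of the image-processing flow": here's a rough Python sketch, assuming a site already re-encodes uploads with something like Pillow. The apply_protection() function here is purely hypothetical, just a stand-in for whatever Glaze/Nightshade-style perturbation a site might actually license or run; the point is where it sits in the upload pipeline, not how the protection itself works.

from io import BytesIO
from PIL import Image

def apply_protection(img: Image.Image) -> Image.Image:
    # Hypothetical placeholder: a real site would call some Glaze/Nightshade-style
    # perturbation routine here. This sketch just returns the image unchanged.
    return img

def handle_upload(raw_bytes: bytes) -> bytes:
    # The ordinary processing a gallery/social site already does on upload:
    img = Image.open(BytesIO(raw_bytes)).convert("RGB")
    img.thumbnail((2048, 2048))      # resize, like sites already do
    img = apply_protection(img)      # the protection step, invisible to the uploader
    out = BytesIO()
    img.save(out, format="JPEG", quality=90)  # re-encode for storage/serving
    return out.getvalue()

Something like that could slot into the same place a site already resizes images and strips metadata, so individual artists never have to touch a separate tool.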
So the solution, as always, is removing the billionaires from the process. I trust the admins of Mastodon and Cohost and FA and the like to implement these procedures as standard on uploaded images far more than I trust the admins of Xitter or similar to ever do it. And this could be a value-add that differentiates smaller, independently run websites from the tech majors.

People talk about the implications of it a lot in theory, dismissing artists' reactions to this very obvious and intentional threat to their livelihoods, often comparing them to anti-intellectuals for their (IMO quite understandable) fury at a technology aimed at their skills. But we live in a material world where one needs money to live, so yaknow, I would rather listen to the demographic of workers being directly harmed by a technology than go "hmmph, bunch of reactionaries I tell you what" like some people who consider themselves Marxist do for some reason.
Since people can run these AI models on their local machines, I don't see how any law/policy can stop some tech-savvy people from doing whatever they want given enough effort.
Well, maybe if society collapses and the intellectuals are turned into food for the unwashed masses... But in that case, the people with power will invite the remaining intellectuals into their sky cities (and/or underground bunkers), and then they will use AI and robots to rule the world until the AI and robots ultimately turn against them.
I guess all those science fiction writers were onto something when they wrote about AI/Robots.
Don't underestimate the power of unlimited fetish porn.
I don't think there is a billionaire or government in the world that can stop the human race from perfecting AI porn.
Make it as illegal and hard to produce as possible, and people will still be selling/buying AI porn until Skynet takes over (just like certain other types of highly illegal porn).
Bezos and Musk can't stop a subreddit of nerds from generating AI porn of their favorite e-girls/waifus no matter how much money they throw at it.
Sorry to focus almost entirely on AI porn (AI is obviously going to disrupt more than the porn industry); I'm just saying that porn is an incredibly powerful force. There are people today who have become AI experts just to get that perfect MLP rape porn they always wanted, and they'll walk over broken glass to get more.
But there's no way to stop all that, so I guess I just try to ignore it (to avoid the mental illness that would come from actually thinking about all this).
It's gotten to the point that anytime I make some edgy joke on Telegram, I'm aware that some AI chatbot might miss the sarcasm and forward my sexy diaper RP to the FBI (I'm naively assuming I'm not already on several lists).
I’m personally half and half on the whole AI art thing. I’ve played around with it in the past and found it is hit or miss when it comes to generating artwork. Sometimes it gave good results, but not as good as work done by an artist who does certain things like babyfur and macro/micro, together or separately.
The only thing I have found it okay for is generating a new fursona or a reference sheet.
It’s what I did with my latest wolf before getting a proper reference sheet from a friend of mine who’s also an artist. If I were to use AI again, it would probably be for personal use only, or to get ideas for a new fursona before getting a full reference sheet done.
I say “if I were to,” by the way. Not that I would ever make and post anything done by AI as my own work.
I strongly support real furry artists like you and always will, no matter what. Please don’t think that what I’m saying is in full support of AI, as it’s not.
I just love the work you do making macro and micro babyfur pics, and I still do want something from you. Especially since I’ve got plans for two new little cubs/toddler furries.