Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use
(venturebeat.com)
That's not something a technical solution will work for. We need copyright laws to be updated.
You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.
Yeah, that's what I'm saying - our current copyright laws are insufficient to deal with AI art generation.
They aren't insufficient, they are working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn't be trying to make that any worse.
Yep. Copyright should not include "viewing or analyzing the picture" rights. Artists want to start charging you or software to even look at art they literally put out for free. If you don't want your art seen by a person or an AI, then don't publish it.
Copyright should absolutely include analyzing when you're talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non-artists alike, but these kinds of people are ruining it for everybody.
What artists are asking for is ethical sourcing for AI datasets. We're talking paying a licensing fee or using free art that's opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. Already the music industry has made it illegal to use songs in AI without the artist's permission. You can't just take songs and make your own synthesizer out of them, then sell it. If you want music for something you're making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That's what artists want.
When an artist, who does art for a living, posts something online, it's an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn't want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what's left? Just AI art made as a derivative of AI art that was made as a derivative of other art.
You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.
MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists' work. The derivative works infringement is already happening right out in the open.
Something being derivative doesn't mean it's automatically illegal or improper.
You are expressly allowed to mimic others' works as long as you don't substantially reproduce their work. That's a big part of why art can exist in the first place. You should check out that article I linked.
I actually did read it, that's why I specifically called out MidJourney here, as they're one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists' works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist's name as part of the generating parameters and you get a piece trained on their art. Essentially using LLM to run an art-tracing scheme while skirting copyright violations.
I wanna make it clear that I'm not on the "AI evilllll!!!1!!" train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially creative commons licensing for works used in data sets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for use with a fee or make them openly available for use, however they see fit, while still ensuring that they still have the ability to protect their job as an artist from stuff like what MidJourney is doing.
I'm pretty sure that's all part of the discovery from the same case where Midjourney is named as a defendant along with Stability AI, it isn't its own distinct case. It's also not illegal or improper to do what they are doing. They aren't skirting copyright law, it is a feature explicitly allowed by it so that you can communicate without the fear of reprisals. Styles are not something protected by copyright, nor should they be.
You can't extract compensation from someone doing their own independent analysis with the aim of making non-infringing novel works, and you don't need licenses or permission to exercise your rights. Singling out AI in this regard doesn't make sense because it isn't a special system. That would be like saying Dolphin developers have to pay Nintendo every time someone downloads their emulator.
You do realize that you basically just confirmed every fear that artists have over AI, right? That they have no rights or protections to prevent anybody from coming along and using their work to train an LLM to create imitation works for cheaper than they can possibly charge for their work, thereby putting them out of business? Because in the end, a professional in any field is nothing more than the sum of the knowledge and experience they've accrued over their career; a "style" as you and MidJourney put it. And so long as somebody isn't basically copy+pasting a piece, then it's not violating copyright, because it's not potentially harming the market for the original piece, even if it is potentially harming the market for the creator of said piece.
The Dolphin analogy is also incorrect (though an interesting choice considering they got pulled from the Steam store after the threat of legal action by Nintendo, but I think you and I feel the same way on that issue - Dolphin has done nothing wrong). A better analogy would be if Unreal created an RPGMaker style tool for generating an entire game of any genre you want in Unreal Engine at the push of a button by averaging a multitude of games across different genres to generate the script. If they didn't get permission to use said games, either by paying a one time fee, an ongoing fee, or using games that expressly give permission for said use, I'm sure the developers/publishers would be rather unhappy with Unreal. Could it be incredibly beneficial and vastly improve the process of creating games for the industry? Absolutely. If they released it for free, could it be used by anybody and everybody to make imitation Ubisoft games, or any other developer, and run the risk of strangling the industry with even more trash games with no soul in them? Also absolutely. And a big AAA publisher has a lot more ability to deal with knock-offs/competition like that than your average starving artist. The indie game scene is the strongest it's ever been thanks to the rise of digital storefronts, but how many great indie game developers go under after producing their first game and never make a second? The vast majority. Because indie games almost never make a profit, meaning they can't afford to make another.
The issue with AI is that it opens a whole can of worms in the form of creating an industrial-scale imitation generator that anybody can use at the push of a button. And the general public have long made known their disdain for properly compensating artists for the work that they do, and have already been gleefully helping corporations by using AI to avoid having to hire artists. This runs the risk of creating a chilling effect in the field of creativity and the arts, as your average independent artist can no longer afford to keep doing art thanks to the wonders of capitalism. There will always be people who do art as a hobby, but professional artists as we think of them today? Why go into debt by training at an art school if all your job prospects have been replaced because people generate art for free with some form of LLM instead of hiring artists? I myself never went into art beyond a hobby level despite wanting to because of how abysmal the job prospects were even 15 years ago. And I simply cannot afford to do it as much as I'd like (if at all) between work, the time investment, and the expense of it. And that's not even getting into the issues of LLM-generated porn of people, advertisements generated using the voices of dead (and still alive) celebrities, scams made using the voices of relatives, and all the other ethical issues.
I used to work at a fish market with a kid who was a trained electrician who was set to follow in the footsteps of his dad who had been one of the highest paid electricians in the US, except he gave up on it because the thing he liked doing the most in the field was replaced with a machine by the time he graduated from technical school. Obviously the machine is more efficient (and probably safer), but instead of entering the field at all, he ended up working a job he hated and to this day has never found a job he has any passion for. What happens to art when professional artists are only NEETs, who have minimal living expenses, and those hired by corporations and the wealthy? Are we going to get the fine art market on steroids, with the masses only having access to AI generated art that will degrade in quality over time as the only new inputs are previous AI generated pieces, unless there's enough hobby artists to provide sufficient new art, while the wealthy hold a monopoly on human-made art that the rest of us will probably never see?
This is all pure speculation, but it's the Jurassic Park question: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."
A professional career can't be reduced down to a style. There's a lot more that goes into art than styles.
Profit shouldn't be the sole motivator for creative endeavors. If a tool like the one you describe existed, we wouldn't need to have to "afford" to make things. We could have more collaborative projects like SCP, but with more fleshed out rich detail. I certainly don't spend my time lamenting the fact that I can't monetize every one of my comments and posts.
Your problem is with Capitalism. Your friend is a victim of the capitalist logic of prioritizing cost-cutting over human well-being. The question of what happens to art under capitalism is also a valid one, as capitalism tends to reduce everything into a product that can be bought and sold, but I think the potential outcomes for art are less predetermined than you make them seem. As long as we keep encouraging and nurturing diverse voices, I think we can come out winners.
I don't think it's wrong to give people a free tool to expand their ability to communicate and collaborate.
See, I agree with pretty much everything you say here, because my (and the artists who are opposed to AI) problem is with Capitalism, full stop. People have to monetize skills in order to survive if they want to spend their time doing something they love. Even hobbies have now become "side hustles." Many of the indie game studios start out as a hobby, before the people working on the game use their savings to move to developing full-time as their job so they can work on their passion more. And then they don't turn a profit and have to go back to making smaller projects as hobbies while they do something else for work. This is where the fear is - artists love making art, but if you do it professionally, AI that mimics your art is basically on the same level as knock-off products of name brand designs.
My issue with MidJourney, for example, wouldn't be an issue if the concern over taking business away from artists was made moot. You say that a professional career can't be reduced down to a style, but then what is MidJourney doing and what is the difference? Because 4,000 of the 5,000 "style" prompts that you can input into MidJourney are artists' names, and that list is growing apparently according to the Discord logs - somebody mentioned having a list of 15,000 new artists' names to add to the prompts after they scrape their art. You can say "make me an impressionist landscape," but you can also say "make me a landscape by Sara Winters." Would having MidJourney make you paintings by a specific artist and then selling them be okay? Is that just a style or is it copyright infringement? Because I can easily see the case where that could be considered damaging to Sara's (in this example) "market" as a professional, even if you aren't selling the paintings you make. Because MidJourney is explicitly providing the tools to create work that mimics her art with the intent of cutting her out of the equation. At that point, have we crossed the line into forgery?
We unfortunately live in a capitalist society, and we have to keep that in mind. People need to market their skills as a job in order to afford the essentials to live. Beyond a certain point, the time investment in something demands that you make money doing it. AI as a tool has the capability to be absolutely monumentally helpful; we could even see a fundamental paradigm shift in how we think of the act of creativity. But it also has the possibility to be monstrously harmful, as we've already seen with faked nudes of underage teens and false endorsements for products and political campaigns. Somebody tried to threaten an artist by claiming they had the copyright to a picture the artist was working on, on Twitch, after they took a screenshot of it and supposedly ran it through some sort of image generator. There was even a DA whom somebody tried to scam using an AI-generated copy of his son's voice claiming that he was in prison. Letting it be unregulated is incredibly risky, and that goes for corporate AI uses as well. We need to be able to protect ourselves from them as much as from bad actors. And part of that is saying what is and what isn't an acceptable use of AI and the data that goes into training it. Otherwise, people are going to use stuff like Nightshade to attempt to protect their livelihoods from a threat that may or may not be imagined.
You should read the article I linked earlier. There's no problem as long as you're not using their name to sell your works. Styles belong to everyone, no one person can lay claim to them.
Specific expressions deserve protection, but wanting to limit others from expressing the same ideas differently is both selfish and harmful, especially when they aren't directly copying or undermining your work.
We already have countless laws for the misuse of computer systems, and they adequately cover these cases. I'm confident we'll be able to deal with all of that and reap the benefits.
Open-source AI development offers critical solutions. By making AI accessible, we maximize public participation and understanding, foster responsible development, and prevent harmful control attempts. Their AI will never work for us, and look at just who is trying their hand at regulatory capture. I believe John Carmack put it best.
It's sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.
What they want would score a huge inadvertent home run for corporations and swing the doors open for them hindering competition, stifling undesirable speech, and monopolizing spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there's nothing good that can be said about anyone trying to make them worse.
Also, rest assured they'd collude with each other and only use their new powers to stamp out the little guy. It'll be like American ISPs busting attempts at municipal internet all over again.
Disney lawyers just started salivating
Seems like Disney is as eager to adopt this technology as anyone
A few goofy Steamboat Willie knock-offs pale beside the benefit of axing half your art department every few years, until everything is functionally procedural generation.
They're playing both sides. Who do you think wins when model training becomes prohibitively expensive for regular people? Mega corporations already own datasets, and have the money to buy more. And that's before they make users sign predatory ToS allowing them exclusive access to user data, effectively selling our own data back to us.
Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off and with less than where we started.
We passed that point at inception. It's always been more efficient for Microsoft to do its training at a 10,000-petaflop giga-plant in Iowa than for me to run Stable Diffusion on my home computer.
Already have that. It's called a $5 art kit from Michaels.
This isn't about creation, it's about trade and propagation of the finished product within the art market. And it's here that things get fucked, because my beautiful watercolor that took me 20 hours to complete isn't going to find a buyer that covers half a week's worth of living expenses, so long as said marketplace is owned and operated by folks who want my labor for free.
AI generation serves to mine the market at near-zero cost and redistribute the finished works for a profit.
Copyright/IP serves to separate the creator of a work from its future generative profits.
But all this ultimately happens within the context of the market itself. The legal and financial mechanics of the system are designed to profit publishers and distributors at the expense of creatives. That's always been true and the latest permutation in how creatives get fucked is merely a variation on a theme.
AI art does this whether or not it's illegal, because it exists to undercut human creators of content by threatening them with an inferior-but-vastly-cheaper alternative.
The dynamic you're describing has nothing to do with AI's legality and everything to do with Disney's ability to operate as monopsony buyer of bulk artistic product. The only way around this is to break Disney up as a singular mass-buyer of artwork, and turn the component parts of the business over to the artists (and other employees of the firm) as an enterprise that answers to and profits the people generating the valuable media rather than some cartel of third-party shareholders.
You don't need industrial level efficiency or insane overhead costs, that's why it's a big deal. It's something regular people can do at home.
An art set from Michaels can only do so much. Having access to the most cutting-edge tools and techniques has always propelled artists and art forward. Imagine not having access to digital art tools, computer animation, digital photography, digital sculpting, and interactive media tools that expand artistic expression and allow for the creation of new forms, styles, and genres of art that weren't possible before.
Fighting their fight for them won't help in the end, don't make it easier for them.
It isn't necessarily a competitor or a threat, the tools are open source and free for all artists to use to enhance their creative process, explore new possibilities, and imagine novel outcomes. You can use it to help you reach new audiences, and discover new forms of expression. It's not a zero-sum game like you suggest.
That would still leave the baby-Disneys with way more money than your average Joe, solving nothing. Training models isn't so expensive that they wouldn't have enough money to train their own; that cost is only prohibitive for the working man.
The issue is simply reproduction of original works.
Plenty of people mimic the style of other artists. They do this by studying the style of the artist they intend to mimic. Why is it different when a machine does the same thing?
No, the issue is commercial use of copyrighted material as data to train the models.
It's not. People are just afraid of being replaced, especially when they weren't that original or creative in the first place.
Honestly, it extends beyond creative works.
OpenAI should not be held back from subscribing to a research publication, or buying college textbooks, etc. As long as the original works are not reproduced and the underlying concepts are applied, there are no intellectual property issues. You can't even say the commercial application of the text is the issue, because I can go to school and use my knowledge to start a company.
I understand that in some select scenarios, ChatGPT has been tricked into outputting training data. Seems to me they should focus on fixing that, as it would avoid IP issues moving forward.
AI image creation tools are apparently both artistically empty, incapable of creating anything artistically interesting, and also an existential threat to visual artists. Hmm, wonder what this says about the artistic merits of the work of furry porn commission artist #7302.
Retail workers can be replaced with self checkout, translators can be replaced with machine translation, auto workers can be replaced with robotic arms, specialist machinists can be replaced with CNC mills. But illustrators must be where we draw the line.
It's different because a machine can be replicated and can produce results at a rate that hundreds of humans can't match. If a human wants to replicate your art style, they have to invest a lot of time into learning art and practicing your style. A machine doesn't have to do these things.
This would be fine if we weren't living in a capitalist society, but since we do, this will only result in further transfer of assets towards the rich.
copyright laws need to be abolished
That would make it harder for creative people to produce things and make money from it. Abolishing copyright isn't the answer. We still need a system like that.
A shorter copyright period would encourage more new content, as creative industries could no longer rely on old, outdated work.
That would be an update, not sure it would be a good thing. As an artist I want to be able to tell where my work is used and where not. Would suck to find something from me used in fascist propaganda or something.
Truly a "Which Way White Man" moment.
I'm old enough to remember people swearing left, right, and center that copyright and IP law being aggressively enforced against social media content has helped corner the market and destroy careers. I'm also well aware of how often images from DeviantArt and other public art venues have been scalped and misappropriated even outside the scope of modern generative AI. And how production houses have outsourced talent to digital sweatshops in the Pacific Rim, Sub-Saharan Africa, and Latin America, where you can pay pennies for professional reprints and adaptations.
It seems like the problem is bigger than just "Does AI art exist?" and "Can copyright laws be changed?" because the real root of the problem is the exploitation of artists generally speaking. When exploitation generates an enormous profit motive, what are artists to do?
What is a "which way white man" moment?
They dutifully note that this is the next best thing.