Kickstarter ends campaign for AI porn group Unstable Diffusion amid changing policies
Unstable Diffusion, a group trying to monetize AI porn generation, raised more than $56,000 on Kickstarter from 867 backers. Now that Kickstarter is rethinking what kinds of AI-based projects it will allow, the crowdfunding platform has shut down Unstable Diffusion’s campaign. Since Kickstarter operates an all-or-nothing model and the campaign was still ongoing, all funds raised will be returned to the backers. In other words, Unstable Diffusion won’t see that $56,000, which was more than double its original $25,000 goal.
“Over the past few days we’ve engaged our Community Advisory Council and we’ve read your feedback to us through our team and social media,” CEO Everette Taylor said in a blog post. “And one thing is clear: Kickstarter must and always will stand on the side of the creative work and the people behind it. We are here to help creative work succeed.”
Kickstarter’s new approach to hosting AI projects is intentionally vague.
“This technology is really new and we don’t have all the answers,” Taylor wrote. “The decisions we make now may not be the ones we make in the future, so we want this to be an ongoing conversation with all of you.”
Currently, the platform says it is considering how projects interact with copyrighted material, particularly when artists’ work appears in an algorithm’s training data without consent. Kickstarter will also consider whether the project will “exploit a particular community or cause harm to anyone.”
In recent months, tools like OpenAI’s ChatGPT and Stability AI’s Stable Diffusion have surged in popularity, pushing conversations about the ethics of AI artwork to the forefront of public debate. If apps like Lensa AI can leverage the open source Stable Diffusion to instantly create artistic avatars that look like the work of a professional artist, how will that affect those same working artists?
Some artists took to Twitter to pressure Kickstarter to drop the Unstable Diffusion project, raising concerns about how AI art generators could jeopardize artists’ careers.
Hey @Kickstarter, so you’re just allowing an AI project whose main premise is generating (possibly non-consensual) porn, and whose main selling point is that people can steal from Greg Rutkowski? Or will you do something to protect creators and the public?
— Karla Ortiz 🐀 (@kortizart) December 10, 2022
Shame on @Kickstarter for approving the Unstable Diffusion crowdfund. They are enabling flagrant theft and funding a tool that can create abusive content like non-consensual pornography.
— Sarah Andersen (@SarahCAndersen) December 11, 2022
Many cite the fate of Greg Rutkowski’s work as an example of what can go wrong. A living illustrator who has created detailed, high-fantasy artwork for franchises like Dungeons & Dragons, Rutkowski was one of Stable Diffusion’s most popular search prompts when the model launched in September, letting users easily imitate his distinctive style. Rutkowski never consented to his artwork being used to train the algorithm, which has led him to speak out about how AI art generators affect working artists.
“With $25,000 in funding, we can afford to train the new model with 75 million high-quality images, composed of approximately 25 million anime and cosplay images, approximately 25 million artistic images from Artstation/DeviantArt/Behance, and approximately 25 million photographic images,” Unstable Diffusion wrote in its Kickstarter.
Spawning, a suite of AI tools designed to support artists, has developed a website called Have I Been Trained, where artists can check whether their work appears in popular training datasets and opt out. As of an April court case, there is precedent defending the scraping of publicly available data.
Inherent problems in AI porn generation
Ethical issues surrounding AI artwork become even murkier with projects like Unstable Diffusion, which focuses on developing NSFW content.
Stable Diffusion uses a dataset of 2.3 billion images to train its text-to-image generator, but only an estimated 2.9% of that dataset contains NSFW material, giving the model little to go on when it comes to explicit content. This is where Unstable Diffusion comes in. The project, which is part of Equilibrium AI, recruited volunteers from its Discord server to develop more robust porn datasets to fine-tune its algorithm, just as you would upload more images of sofas and chairs to a dataset if you wanted to create a furniture-generating AI.
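For a sense of where a figure like that 2.9% comes from, here is a minimal sketch of estimating a dataset’s NSFW share from its metadata. It assumes a LAION-style parquet shard with an “NSFW” tag column, as in LAION-400M’s published metadata; the file path, column name, and tag values are illustrative assumptions, not Unstable Diffusion’s actual pipeline.

    import pandas as pd

    # Hypothetical LAION-style metadata shard; real shard names and columns vary.
    meta = pd.read_parquet("metadata_shard_00000.parquet")

    # LAION-400M tags each image "NSFW", "UNSURE", or "UNLIKELY";
    # take the mean of the boolean mask to get the explicit share.
    nsfw_fraction = (meta["NSFW"] == "NSFW").mean()
    print(f"Estimated NSFW fraction: {nsfw_fraction:.1%}")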
But any AI generator tends to fall victim to the biases of the people behind its algorithm. Much of the free and easily accessible porn online is made for the male gaze, which means the AI will likely reproduce that, especially if those are the kinds of images users feed into the dataset.
In its now-suspended Kickstarter, Unstable Diffusion said it would work toward developing an AI art model that could “better handle human anatomy, produce in diverse and controllable artistic styles, more accurately represent undertrained concepts like LGBTQ and race and gender equality.”
Also, there is no way to verify that much of the free porn available on the web is consensual (although adult creators who use paid platforms like OnlyFans and ManyVids must verify their age and identity before using those services). Even if a model consents to appearing in porn, that doesn’t mean they consent to their images being used to train an AI. While this technology can produce astonishingly realistic images, it can also be weaponized to create non-consensual deepfake pornography.
Currently, few laws around the world address non-consensual deepfake porn. In the US, only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.
“One aspect I’m particularly concerned about is the differential impact of AI-generated porn on women,” Ravit Dotan, VP of responsible AI at Mission Control, told InNewCL last month. “For example, a previous AI-based app that can ‘undress’ people works only on women.”
Unstable Diffusion had not responded to a request for comment at the time of publication.