News

AI Backlash: Cara and ‘Not By AI’ Rage Against the Machine

A growing movement of creators and online platforms is expressly banning artificial intelligence (AI), labeling their websites as free of generative AI content and rebuffing attempts by AI developers to ingest their work for model training.

Seattle-based social network and artist portfolio platform Cara has captured this anti-AI zeitgeist, while Philadelphia-based Not By AI has set up a campaign to help artists and creators certify their human-created work.

Launched in October 2023 by artist and photographer Jingna Zhang, Cara aims to be an alternative to websites like DeviantArt and ArtStation, which have allowed images created with generative AI.

“I waited and waited for ArtStation to revise their stance,” Zhang said in a blog post. “I don’t know if they don’t care because they think people have nowhere left to go. So I thought, if that’s the case, I will build it.”

As Zhang explained, she wanted to build a platform that facilitates discussion and education and, critically, opts its content out of scraping by default. Her service will also not accept AI art until laws have been passed to protect artists and AI datasets are ethically sourced.

Curious about https://t.co/bIBAmGiZdX? We’re a platform with portfolio AND social feed. We have an AI detector that filters out AI images, custom home feed you can control, and custom crop tool for image thumbnails! ✨ Here’s a thread on how it works:👇🧵 pic.twitter.com/sTDYTuym5Z

— Cara: Art & Social (@Cara_HQ) April 12, 2023

According to the Cara FAQ, the site automatically adds “NoAI” tags to images uploaded to the platform, which are intended to instruct content-scraping bots from generative AI companies not to collect images hosted on Cara. But the site acknowledged that there are limitations to what the platform can do.

“No image that is publicly visible on the internet can be fully protected from unethical scraping if companies and bad actors are determined to do so,” Cara explains. “So while we are committed to taking as much of the burden off of artists to protect their creations from being added to AI datasets as possible, we are facing the same limitations every other website is currently navigating, and cannot guarantee that the tags alone will fully shield creators.”
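Cara has not published the exact mechanism behind its “NoAI” tags, but a common convention for this kind of opt-out is the “noai”/“noimageai” robots directive popularized by sites such as DeviantArt, served either as an HTML meta tag or as an X-Robots-Tag response header that some scraping tools honor voluntarily. The following is a minimal sketch of the header approach, assuming a Flask-style Python server; the server code, the “uploads” directory, and the route names are illustrative and not Cara’s actual stack.

```python
# Illustrative sketch only -- Cara has not disclosed its implementation.
# Assumes Flask and the voluntary "noai"/"noimageai" robots directives.
from flask import Flask, send_from_directory

app = Flask(__name__)

@app.after_request
def add_noai_directives(response):
    # Ask well-behaved crawlers not to use served content for AI training.
    # Compliance is voluntary; as Cara notes, tags alone cannot stop
    # determined scrapers.
    response.headers["X-Robots-Tag"] = "noai, noimageai"
    return response

@app.route("/images/<path:filename>")
def serve_image(filename):
    # "uploads" is a hypothetical directory holding user-submitted artwork.
    return send_from_directory("uploads", filename)

if __name__ == "__main__":
    app.run()
```

The same directive can also be expressed as a meta tag in each page’s HTML head, but in either form it is only a request: crawlers that ignore robots directives will ignore this one too, which is exactly the limitation Cara describes above.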

In addition to AI-generated images, Cara’s policy also prohibits the posting of NFTs created using generative AI.

“As a platform, we do not seek to foster a culture of get-rich-quick schemes or grifts, and using Cara to advertise such activities falls outside of our Terms & Conditions,” Cara said. “Additionally, many NFT projects use artwork created via AI image generation, which is considered a violation of our rules and would not be allowed in portfolios.”

To detect AI images, Cara turned to an unnamed third-party service, acknowledging that they “do not have the manpower to manually review every image upload,” and that the detection service does not itself involve AI model training.

Cara has separately partnered with the SAND Lab, developer of the Glaze Project, which is building a technology to broadly disrupt AI model training on artists’ images.

Glaze claims to use machine learning algorithms to make subtle changes to digital images. While these changes are invisible to the human eye, they alter the image enough behind the scenes to prevent AI model trainers from making use of it.
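Glaze’s actual perturbations are computed adversarially against image-generation models (the project calls them “style cloaks”), but the basic idea of changing pixels imperceptibly can be illustrated with a much simpler sketch. The example below only assumes NumPy, Pillow, and a hypothetical input file named artwork.png; it uses random noise purely for illustration and is not Glaze’s algorithm.

```python
# Toy illustration of an imperceptible pixel-level perturbation.
# NOT Glaze: Glaze optimizes targeted "style cloaks" against specific
# feature extractors, rather than adding random noise.
import numpy as np
from PIL import Image

def add_small_perturbation(path_in: str, path_out: str, epsilon: float = 2.0) -> None:
    # Load the image as floating-point RGB values in [0, 255].
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    # Bounded random noise of at most +/- epsilon per channel: far too small
    # for the eye to notice at epsilon = 2 out of 255.
    noise = np.random.uniform(-epsilon, epsilon, img.shape)
    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(path_out)

add_small_perturbation("artwork.png", "artwork_perturbed.png")
```

Real cloaking tools optimize the perturbation against a model’s learned features rather than sampling it at random, which is why a toy example like this would not actually deter training, while Glaze-style perturbations aim to.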

Hi everyone, sorry for the absence—we hit 300k users on Cara!!!!!! 🥹🤯🥳

We made it to top 5 in App Store for social networks in the US—passing Twitter, Discord, Reddit, LinkedIn, and Messenger! Unfortunately this traffic spike has caused the site to crash like crazy — 1/ pic.twitter.com/xsgB9pewwf

— Jingna Zhang @ cara.app/zemotion (@zemotion) June 3, 2024

Cara and its anti-AI stance have drawn a huge influx of users, which has brought some growing pains. When Zhang announced Cara had hit 300,000 users on June 2, the site was suffering from outages due to the load.

“Now we’re at nearly 900,000 users, but it’s been stressful,” Zhang told Wired yesterday. “I saw a bill from a service provider, and it was almost $100,000 for six days.”

Meanwhile, Not By AI provides artists, bloggers, and content creators with a subscription-based badge that can be added to their websites to declare that 90% of the content on the site is human-made.

The rule doesn’t exclude all AI: using it for inspiration, editing, or translating content is acceptable, the project explains.

“This means that if you use AI for brainstorming or concepting, generate 9% of your content, and employ AI to find typos in all content, you remain in compliance with the Not By AI 90% Rule,” the group said.

Subscriptions range from a $5.00 per month “Starter” tier to a $12.00 “Enterprise” tier, depending on the user’s needs, as well as a flat $99.00 badge-only option.

“Adding our Not By AI badges to your content is an efficient way to demonstrate your human-created content,” the group said. “However, recognizing that our badges can be misused, we strongly encourage you to set up a project page to describe your human-centric content approach, if using our badges commercially.”

Not By AI and Cara did not respond to Decrypt’s request for comment.

Artists and writers have faced an uphill battle to keep their work from being harvested by AI website scrapers who claim fair use. Last year, a court in Japan declared that using datasets to train AI models did not violate local copyright laws.

Even when an AI model developer complies with a request to remove content, it can reappear: digital artist Greg Rutkowski’s work, for example, was reuploaded by users after being removed from Stable Diffusion in July.

Artists have also turned to the legal system with varying degrees of success. In October, a federal judge ruled that plaintiffs did not provide enough evidence to support their claim of copyright infringement in a lawsuit brought against AI image generator Midjourney and digital art platform DeviantArt.

“Plaintiffs fail to allege specific plausible facts that DeviantArt played any affirmative role in the scraping and using of Anderson’s and others’ registered works to create the training images,” the judge wrote. “The complaint, instead, admits that the scraping and creation of training images was done by LAION at the direction of Stability, and that Stability used the training images to train Stable Diffusion.”

Edited by Ryan Ozawa.

Source: https://decrypt.co/235334/anti-ai-cara-artists-platform-not-by-ai
