
Artists and Researchers Team Up to Shield Creative Work From AI Copycats

In this post:

  • Artists are teaming up with researchers to stop AI from copying their work; free software like Glaze helps make art unrecognizable to AI.
  • Kudurru software detects image harvesting online and lets artists block access, while AntiFake safeguards voice recordings against deepfakes.
  • The push is for ethical AI data use, with hopes of a world where all data used is subject to consent and payment to protect artists and creators.

In an age where artificial intelligence continues to push the boundaries of technological innovation, a new collaboration between artists and university researchers is emerging to safeguard the intellectual property of creative minds. US illustrator Paloma McClain’s recent encounter with unauthorized AI replication of her artwork has sparked a proactive response within the artistic community, resulting in innovative software solutions to prevent AI-driven copycat activity.

US-based illustrator Paloma McClain found herself in the crosshairs of AI’s relentless pursuit of creative replication. Several AI models had “trained” on her art without her knowledge or consent, leaving her without due credit or compensation. McClain, a staunch advocate for ethical technological advancement, expressed her concerns: “I believe truly meaningful technological advancement is done ethically and elevates all people instead of functioning at the expense of others.”

In response to the growing issue of AI imitation, Paloma McClain turned to Glaze, a revolutionary free software created by researchers at the University of Chicago. Glaze’s primary function is to outsmart AI models during their training process, manipulating pixels in subtle ways that remain invisible to the human eye while drastically altering the appearance of digitized art to AI algorithms.
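Glaze’s actual algorithm is not described in this article; conceptually, tools of this kind add a small, bounded perturbation to an image so the change stays near-invisible to people while shifting what an AI model “sees.” The following is a minimal illustrative sketch of that bounded-perturbation idea only, using random noise rather than Glaze’s adversarially optimized perturbation, with the function name and parameters invented for illustration.

```python
import numpy as np

def cloak_image(pixels: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Illustrative only: add a small, bounded perturbation to an image.

    Real tools like Glaze compute the perturbation adversarially against
    a model's feature extractor; this sketch just demonstrates the
    'bounded, near-invisible change' idea, clipping each channel's
    change to roughly +/- epsilon out of a 0-255 range.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    cloaked = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return cloaked.astype(np.uint8)

# A flat mid-gray 4x4 RGB image as a stand-in for an artwork.
image = np.full((4, 4, 3), 128, dtype=np.uint8)
cloaked = cloak_image(image)
```

Because every per-pixel change stays within the epsilon budget, the cloaked image is visually indistinguishable from the original; the hard part, which this sketch omits, is choosing the perturbation so it meaningfully disrupts model training.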

Protecting human creators

Professor Ben Zhao, a key member of the Glaze development team, underscored their mission: “We’re basically providing technical tools to help protect human creators against invasive and abusive AI models.” This software, developed in four months, builds upon technology initially designed to disrupt facial recognition systems. Zhao emphasized the urgency of the situation, explaining, “We were working at super-fast speed because we knew the problem was serious. A lot of people were in pain.”

While some generative AI giants have formal agreements for data usage, a significant portion of the data employed to train AI, encompassing digital images, audio, and text, is scraped from the internet without explicit consent. This practice raises critical questions about the ethical use of AI and intellectual property rights.

Since its release in March 2023, Glaze has experienced widespread adoption, with over 1.6 million downloads, reflecting the urgent need for tools that protect artists from AI replication. The Glaze team is developing an enhancement known as Nightshade, which aims to further confound AI by distorting its interpretations, such as making it perceive a dog as a cat.

While endorsing Nightshade, Paloma McClain noted its potential to make a significant impact if widely adopted: “According to Nightshade’s research, it wouldn’t take as many poisoned images as one might think.” Several companies have already approached Zhao’s team to explore the use of Nightshade, underscoring its relevance in protecting both individual artists and organizations with substantial intellectual property.


Kudurru: Defending against image harvesting

Startup Spawning has contributed to the defense of creative works with its Kudurru software. Kudurru detects attempts to harvest large numbers of images from online sources, empowering artists to block access or serve misleading data that disrupts AI’s learning process. Over a thousand websites have already integrated into the Kudurru network.
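Kudurru’s real detection heuristics are not public in this article; one simple way to flag bulk image harvesting, sketched hypothetically below, is to count a client’s image requests in a sliding time window and flag clients that exceed a threshold, after which a site could block them or serve decoy data. The class name, thresholds, and interface here are all invented for illustration.

```python
from collections import defaultdict, deque
from typing import Optional
import time

class HarvestDetector:
    """Hypothetical sketch of scraper detection via a sliding window.

    This is NOT Kudurru's algorithm, only a common rate-based heuristic:
    a client making many image requests in a short window looks like a
    harvester rather than a human visitor.
    """

    def __init__(self, max_requests: int = 100, window_seconds: float = 60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits: defaultdict = defaultdict(deque)  # client -> request timestamps

    def is_harvesting(self, client_ip: str, now: Optional[float] = None) -> bool:
        """Record one image request and report whether the client
        exceeded the allowed rate inside the sliding window."""
        now = time.monotonic() if now is None else now
        q = self.hits[client_ip]
        q.append(now)
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_requests

# Eight rapid requests from one client, with a low threshold for the demo.
detector = HarvestDetector(max_requests=5, window_seconds=10.0)
flags = [detector.is_harvesting("203.0.113.7", now=float(t)) for t in range(8)]
```

In this demo the first five requests pass and the sustained burst after them gets flagged; a production system would tune the threshold and combine it with other signals.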

Spawning has extended its efforts with the launch of haveibeentrained.com, a website featuring an online tool that allows artists to ascertain whether their digitized works have been used to train AI models. This platform empowers artists to opt out of future unauthorized use, providing much-needed control over their creative content.

Researchers at Washington University in Missouri have ventured into safeguarding voice recordings with their AntiFake software. The program adds inaudible noise to digital voice recordings, making it “impossible to synthesize a human voice,” according to Zhiyuan Yu, the PhD student behind the project. AntiFake’s objective extends beyond thwarting unauthorized AI training to preventing the creation of deepfakes, a growing concern in the age of AI-driven misinformation.
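AntiFake’s perturbation is optimized specifically to break voice-cloning models, which this article does not detail. The sketch below shows only the underlying “inaudible additive signal” idea: mixing a very low-amplitude noise signal into a waveform so the recording sounds unchanged to a listener. The function name and noise level are assumptions for illustration, not AntiFake’s method.

```python
import numpy as np

def protect_voice(waveform: np.ndarray, noise_level: float = 0.002,
                  seed: int = 0) -> np.ndarray:
    """Illustrative only: mix low-amplitude noise into a voice recording.

    AntiFake optimizes its perturbation against voice-cloning models;
    this sketch just adds Gaussian noise far below audible levels and
    keeps samples in the valid [-1.0, 1.0] range.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, noise_level, size=waveform.shape)
    return np.clip(waveform + noise, -1.0, 1.0)

# One second of a 220 Hz tone at 16 kHz as a stand-in for speech.
t = np.linspace(0, 1, 16000, endpoint=False)
voice = 0.5 * np.sin(2 * np.pi * 220 * t)
protected = protect_voice(voice)
```

The added signal is orders of magnitude quieter than the voice itself; the research challenge AntiFake addresses is shaping that signal so it reliably defeats synthesis models rather than being plain noise.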

A call for ethical data usage

As these software solutions evolve, the conversation surrounding the ethical use of data for AI remains paramount. Jordan Meyer, a co-founder of Spawning, articulated the ultimate goal: “The best solution would be a world in which all data used for AI is subject to consent and payment.” Advocates hope to drive developers in this direction, prioritizing consent and fairness in AI data usage.

In conclusion, the collaboration between artists and researchers is spearheading the development of innovative software solutions to combat AI imitation of creative works. The Glaze, Nightshade, Kudurru, and AntiFake software platforms represent significant strides in protecting intellectual property in an increasingly AI-driven world. While these technologies provide valuable defense mechanisms, they also underscore the need for broader discussions on the ethical use of data in artificial intelligence. As artists and creators continue to assert their rights, the future of AI and intellectual property is poised for a more ethical and balanced evolution.


