Adobe is testing a new AI tool that can create music based on text prompts

The era of bad robot music is coming. Adobe is developing a new artificial intelligence tool that will let anyone become a music producer, no instrument or editing experience required.

The company announced “Project Music GenAI Control,” a mouthful of a name, this week. The tool lets users create and edit music simply by entering text prompts into a generative artificial intelligence model. Adobe explains that these prompts can include descriptions such as “powerful rock,” “happy dance,” or “sad jazz.”

Project Music GenAI Control will then create an initial tune based on the user’s prompts, which the user can also edit using text. Adobe said users can adjust the intensity of the generated music, extend the length of a clip, or create repeatable loops.

The new tool’s target audience includes podcasters, broadcasters, and “anyone else who needs audio with just the right mood, tone and length,” said Nicholas Bryan, a senior research scientist at Adobe and one of the creators of the technology.

“One of the exciting things about these new tools is that they don’t just generate audio, they take it to the level of Photoshop by giving creatives the same deep control to shape, adjust and edit audio,” Bryan said in a statement on the Adobe blog. “It’s pixel-level music control.”

Adobe uploaded a video showing how Project Music GenAI Control works, and it’s surprising how easy it is to create music with the tool. It also seems to work very quickly. While the music it generates won’t win any Grammys, I can definitely imagine hearing it in the background of a YouTube video, TikTok, or Twitch stream.

Project Music GenAI Control | Adobe Research

This isn’t entirely a good thing. Artificial intelligence is already having an impact on careers like writing and acting, forcing workers to take a stand to keep their livelihoods from being stolen. Comments on the company’s YouTube videos echoed these concerns, criticizing the company for creating “music by and for robots” and for “good corporate cringe.”

“Thank you Adobe for trying to find more ways for businesses to lose creative people’s jobs. Also, which artist did you steal the materials you used to train the AI from?” one user wrote.

When reached for comment by Gizmodo, Adobe did not disclose details about the music used to train the Project Music GenAI Control model. However, it did note that Firefly, its family of AI image generators, is trained only on content that is openly licensed or in the public domain, where copyright has expired.

“Project Music GenAI Control is a very early-stage technology being developed by Adobe Research, and while we haven’t revealed the details of the model yet, what we can share is this: Adobe always takes a proactive approach to ensuring we innovate responsibly,” Adobe spokesperson Anais Gragueb told Gizmodo in an email.

Music is art and an essential part of human nature. We should therefore be careful with new tools like Adobe’s; otherwise, the music of the future will sound as hollow as the machines that generate it.

Updated March 1, 2024 at 5:56 pm ET: This article has been updated with additional comments from Adobe.
