According to French music streaming service Deezer, around 50,000 fully AI-generated songs are uploaded to its platform every day. Many of these songs won’t reach a wide audience, but over the past year, some have garnered millions of listens.
Which raises the question: If our future is going to be filled with this kind of AI music, what does that future sound like?
Deni Béchard is a senior science writer at Scientific American. For the better part of a month, Béchard has allowed himself to listen only to his own AI-generated music, made with the Suno AI music app. He says the experiment is an attempt to think more critically about how we might relate to this type of music in the future.
Béchard spoke with Today, Explained host Noel King about what he’s learned so far and how his AI creations compare to human-made music. The conversation has been edited for length and clarity.
There’s plenty more in the full podcast, including snippets of Béchard’s songs, so give Today, Explained a listen wherever you get your podcasts, including Apple Podcasts, Pandora, and Spotify.
Okay, so you’re using Suno, you said, to create the songs.
I come up with a prompt and plug it in, and each prompt creates two songs, and I try to be as creative as possible. I usually plug it in two or three times and change it up, add different types of instruments or different types of vocals, and just plug in a bunch of them. One that made me laugh was a song called “Organ Trafficking.” I asked for a contemporary rap song with a female voice and fun, ironic lyrics, and this song came up where organ trafficking is kind of the central metaphor. I was quite surprised.
I think one of the things I’ve realized is that a lot of the mainstream music I listen to is what I would consider very processed music: music designed to have a large market. It doesn’t feel very personal to begin with, so I realized that in that particular context, [the music I made with AI] didn’t feel much different most of the time.
If someone had given you a playlist of 10 songs, where five were AI and five were not, do you think you would be able to tell the difference?
Wow. And what does that tell you?
I mean, it tells me that AI is getting really good.
One thing I noticed during this process was that a lot of the AI music that is popular, that people listen to on Spotify, songs with millions of listens, [are] songs that are very moving, very brave.
It’s like Xania Monet’s “Don’t Tread on Me,” or Solomon Ray, or Cain Walker, and Cain Walker is not a person. It’s an AI avatar, right? Or “Livin’ on Borrowed Time” by Breaking Rust. All of those songs feel really authentic, as if this person really suffered these things and felt these things. That’s how they come across.
I think AI tends to work better when it just leans into that authenticity, because it helps overcome the cognitive dissonance of thinking, “This isn’t really a deeply felt song.” And it strays away from mainstream human-made music, which is often designed to be a summer hit or to go viral in some way. A lot of the time that music doesn’t have that level of authenticity, that feeling of authenticity. I think when AI replicates it, we’re more aware that it’s superficial or artificial, because there’s already an element of artificiality there.
Do you think that when you finish your experiment you will continue making music with AI?
My God, you love it.
I think what’s surprised me is that I’ll walk somewhere and think, “What if I asked it to combine these styles, or put a banjo on a hip-hop track and add these kinds of vocals? What would I get?” I’m curious now.
I would say that I am now at the point where I don’t worry about the connection with the human. I did at the beginning. At first I was really wondering, “Who is this person?” It’s like when you’re reading a book and you’re halfway through and you think, “What human mind did this book come from?” And you turn the book over and see who the author is, and you Google them and you’re like, “How the hell did they come up with this?”
At first I very often had that impulse of wanting to know who felt this, who thought this. I would just have this cognitive dissonance. I was like, “This is a machine. This machine didn’t fall in love. This machine didn’t suffer these experiences. This machine didn’t wake up at two in the morning and write this song just needing to express itself.” It was actually bothering me a lot. In a way it prevented me from being able to enjoy the song.
And I thought, “Well, if someone created an AI avatar and gave it a personality, and it was a fictional character that existed in the metaverse, and that AI avatar was a song creator and sang this song, would that make it easier?” And, curiously, it did. It made it a little easier. So I was imagining these AI avatars, and I was like, “Okay, I’m imagining a fictional character singing this song.” That lasted maybe four or five days, and then I got used to listening to the music and stopped thinking about it.
Does doing this experiment and seeing how you react to this music change the way you think about AI?
I think my takeaway from this is that in 10, 15, 20 years, there will be a lot of teenagers who will look at the discussions we’re having now and say, “What are these people talking about? This is totally normal. Why would anyone feel so conflicted about this?”
I think we’re going to adapt pretty quickly. That’s my feeling. There are a lot of important questions around creators and protecting artists and what it means to be an artist. There are many questions that will arise from this, and I really hope that artists are protected as much as possible and adequately compensated. But I think this will fit into our lives much better than we think at the moment.