Human vs AI Music: Study Reveals Surprising Emotional Reactions

Music made by AI can stir emotions like human compositions—but your brain may need more effort to make sense of the sound.

AI-generated music excites listeners but also demands more brainpower, study finds. (CREDIT: CC BY-SA 4.0)

Generative artificial intelligence is changing how music is created, experienced, and understood—especially in film and video. A new study asks a bold question: Can music made by machines move people the same way as music written by human hands? The research was led by the Neuro-Com Research Group at the Autonomous University of Barcelona, in collaboration with the RTVE Institute in Barcelona and the University of Ljubljana in Slovenia. It focused on emotional impact—how AI-made music affects you emotionally and physically compared to traditional music.

The team used biometric tools and self-reported reactions to compare three music types played over identical video clips. The first was human-created music. The second came from AI, using carefully crafted keyword prompts that gave more emotional detail. The third used simpler, less nuanced prompts, mainly based on emotional labels and scales.

AI Music Hits the Brain—and the Body—Differently

When 88 participants viewed the videos, researchers recorded pupil size, blinking patterns, skin responses, and reported feelings. These data points helped them figure out whether listeners were emotionally stirred, paying close attention, or mentally overloaded.

Setup of the laboratory. (CREDIT: Nikolaj Fišer, et al.)

The clearest finding? AI-generated music—regardless of the prompt style—caused more pupil dilation than human-composed music. That’s a sign of emotional arousal. People’s eyes literally widened while listening to machine-made soundtracks.

However, when the music was made with more complex keyword prompts, participants blinked more often and had higher skin impedance readings. These are signs of increased attention and mental effort. That means this version of AI music didn’t just grab attention—it demanded more cognitive resources to process. Interestingly, both types of AI music stirred stronger excitement, but the human compositions felt more familiar. Even though AI music can stimulate you, it might not yet capture the emotional patterns your brain is used to hearing.

Emotional Stimulation Isn’t Enough

Lead author Nikolaj Fišer highlighted a key difference. “Both types of AI-generated music led to greater pupil dilation and were perceived as more emotionally stimulating compared to human-created music,” he explained. But that’s not the whole story. Fišer also noted that “decoding the emotional information in AI-generated music may require greater cognitive effort.”



In other words, AI can provoke emotion, but your brain might need to work harder to understand what it's feeling. That extra mental strain may not always be welcome, especially when you're watching a film or engaging with other content that also demands focus. This introduces a new design challenge. Content creators might need to weigh the emotional power of AI-made music against its potential to pull too much cognitive bandwidth from the viewer.

The Role of Prompts: Simple vs. Sophisticated

A surprising outcome came from comparing the two AI methods. Music created with basic emotional inputs—like labeling a mood “happy” or “sad”—had the same arousing effect as the more elaborately prompted tracks. This suggests that even basic AI tools can trigger strong responses. However, only the detailed, carefully prompted music raised indicators of deeper mental involvement. That means producers might be able to use simpler tools for raw excitement, but need advanced prompts to layer in complexity and nuance.

The way the AI is guided matters. A few emotional keywords can lead to one kind of experience. A fully mapped emotional arc, spelled out in words, results in something richer—but more demanding.

Chord diagram depicting the interactions among the recorded biometric data. (CREDIT: Nikolaj Fišer, et al.)

A Future with Machine-Made Soundtracks

The potential applications for AI in audiovisual production are vast. With the ability to stir emotion and increase attention, these tools might help customize music to match scenes with greater precision. Instead of spending hours composing or licensing tracks, editors and producers could generate music that fits perfectly—whether they want to inspire fear, joy, or suspense. It also opens the door for automated systems that tweak music on the fly, adapting sound to the emotional tone of a scene or even the mood of the viewer.

However, creators should remain cautious. This study shows that even though AI can match or even exceed emotional stimulation, it may lack the cultural or cognitive familiarity that comes with human-composed music. That missing piece might affect how audiences relate to content on a deeper level.

This also leads to a more philosophical question: What happens to the role of the human artist when machines can evoke feeling so powerfully? While many still believe creativity is something only people possess, studies like this show that algorithms are inching closer to closing that gap. Still, emotion isn’t just about sound or science. It’s also shaped by memory, context, and culture. These are areas where AI has room to grow.

Procedure of the experiment. (CREDIT: Nikolaj Fišer, et al.)

The Brain’s Emotional Circuit May Not Be Fooled

As AI grows more advanced, it will likely become better at mimicking human emotional structures in music. But for now, your brain still senses a difference. While your pupils may dilate, and your skin may respond to AI music, the added effort it takes to “read” that music hints at a disconnect.

That gap could shrink with time. Better algorithms, more refined prompts, and growing familiarity with AI-made art might eventually close it. For now, though, this study offers a unique snapshot of a moment when machines could touch your feelings—but not quite mimic the soul behind a song.

Research findings are available online in the journal PLOS One.

Note: The article above was provided by The Brighter Side of News.




Mac Oliveau
Science & Technology Writer | AI and Robotics Reporter

Mac Oliveau is a Los Angeles–based science and technology journalist for The Brighter Side of News, an online publication focused on uplifting, transformative stories from around the globe. Passionate about spotlighting groundbreaking discoveries and innovations, Mac covers a broad spectrum of topics—from medical breakthroughs and artificial intelligence to green tech and archeology. With a talent for making complex science clear and compelling, they connect readers to the advancements shaping a brighter, more hopeful future.