Deepfake videos can fake a heartbeat, making detection harder

A new study shows that deepfakes can fake a heartbeat, challenging old detection methods and raising urgent concerns about digital trust.


New study reveals deepfakes can fake a heartbeat, making them harder to detect and raising the stakes for digital forensics. (CREDIT: Shutterstock)

Deepfakes are spreading fast, and spotting them is no longer as easy as noticing a glitch or an unnatural expression. Digital forgeries have grown so advanced that even biological markers, once considered foolproof giveaways, now appear convincingly fake. Researchers are sounding the alarm, warning that the next stage in this technological arms race has already arrived.

When fake becomes nearly flawless

A new study led by Dr. Peter Eisert, a professor at Humboldt University of Berlin, has revealed a striking discovery. Deepfake videos can now reproduce the tiny heartbeat signals that run beneath the skin. These signals normally appear as subtle changes in skin color from blood flow, invisible to the naked eye but measurable with advanced tools.

Until now, researchers identified fakes by spotting their absence. That safeguard, however, may no longer hold. “Here we show for the first time that recent high-quality deepfake videos can feature a realistic heartbeat and minute changes in the color of the face, which makes them much harder to detect,” said Eisert.

Illustration of the temporal alignment process. A reference mesh (center), composed of 918 triangles formed from MediaPipe facial landmarks, is used to spatially warp each frame of the video sequence (top) to a reference position (bottom). (CREDIT: Peter Eisert, et al.)

This breakthrough means deepfake creators can pass off their work as genuine with far less risk of exposure. The potential consequences range from political sabotage to criminal fraud, and they raise the stakes in the battle to protect truth in the digital age.

A borrowed heartbeat

Deepfakes rely on deep learning, a form of artificial intelligence that allows computers to generate highly realistic images and videos. This technology can be used playfully—apps that transform you into an animal or an older version of yourself are powered by the same principles. But the same tools can also be weaponized.

Traditionally, scientists used remote photoplethysmography, or rPPG, to estimate vital signs through webcam video. This method tracks changes in light absorption as blood moves through the vessels under the skin. In medicine, similar approaches are used in pulse oximeters to measure oxygen levels. For years, researchers believed that while deepfakes could replicate appearance and movement, they could not mimic a pulse.
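The rPPG principle can be sketched in a few lines: average a color channel over the face in each frame, band-pass the resulting trace to the plausible heart-rate range, and read off the dominant frequency. The sketch below is a minimal illustration on synthetic data, assuming a fixed camera and a pre-cropped face region; the function name and all parameters are hypothetical, not the study's pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_bpm(green_means, fps=30.0):
    """Estimate heart rate from per-frame mean green values of a face region.

    Blood volume changes modulate light absorption under the skin, so the
    green channel carries a faint periodic pulse signal.
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()
    # Band-pass to the plausible heart-rate range (42-240 beats per minute).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)
    # The dominant frequency in that band is taken as the pulse.
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(filtered))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic 10-second clip at 30 fps with a 72 bpm pulse buried in noise.
fps, seconds, bpm = 30.0, 10.0, 72.0
t = np.arange(int(fps * seconds)) / fps
rng = np.random.default_rng(0)
trace = (120 + 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t)
         + 0.2 * rng.standard_normal(t.size))
print(round(estimate_bpm(trace, fps)))  # 72
```

Note that 10 seconds at 30 fps gives a frequency resolution of 0.1 Hz, i.e. about 6 bpm per FFT bin, which is consistent with the two-to-three bpm accuracy the study reports only once extra refinement is applied.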

That assumption has now fallen apart. Eisert and his colleagues built a detector that automatically extracts pulse signals from videos. The system compensates for head movement, filters noise, and needs only 10 seconds of footage to estimate a person’s heartbeat with remarkable accuracy.

Their tests showed that the detector could identify real heartbeats within two to three beats per minute of the true rate, and simultaneous electrocardiogram readings confirmed this. But when they applied the same detector to modern deepfakes, something surprising happened. The videos seemed to contain realistic pulse signals—even though the creators never deliberately inserted one.

Heart rate extraction pipeline. From the registered video sequence, we calculate a global rPPG signal for the face as well as for the background. We then determine the magnitudes in frequency space for each signal over time. To robustly extract the heart rate, the two signals are “subtracted”. (CREDIT: Peter Eisert, et al.)
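The “subtraction” step in that pipeline can be illustrated with a toy sketch: compare the magnitude spectra of the face and background signals so that artifacts shared by both, such as lighting flicker, cancel and only the pulse remains. The function name and synthetic signals below are assumptions for illustration, not the study's code.

```python
import numpy as np

def dominant_pulse_hz(face_signal, background_signal, fps=30.0):
    """Suppress shared noise by subtracting the background magnitude
    spectrum from the face magnitude spectrum, then pick the strongest
    remaining frequency in the heart-rate band."""
    face = np.asarray(face_signal, dtype=float)
    bg = np.asarray(background_signal, dtype=float)
    freqs = np.fft.rfftfreq(len(face), d=1.0 / fps)
    face_mag = np.abs(np.fft.rfft(face - face.mean()))
    bg_mag = np.abs(np.fft.rfft(bg - bg.mean()))
    # Clip so the residual magnitude cannot go negative.
    residual = np.clip(face_mag - bg_mag, 0.0, None)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return freqs[band][np.argmax(residual[band])]

fps = 30.0
t = np.arange(300) / fps
flicker = 0.8 * np.sin(2 * np.pi * 2.0 * t)  # shared illumination artifact
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)    # true heartbeat at 72 bpm
face = flicker + pulse
background = flicker
print(round(dominant_pulse_hz(face, background, fps) * 60.0))  # 72
```

Without the subtraction, the stronger 2 Hz flicker (120 bpm) would win; removing what the background also sees leaves the genuine 72 bpm pulse on top.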

How fakes inherit life-like signals

The team discovered that deepfakes can accidentally inherit a heartbeat from the original “driving” video used to guide facial expressions. Small variations in skin tone that genuine blood flow creates transfer into the forged video, producing a convincing but artificial pulse. “Our results show that an attacker may add a realistic heartbeat on purpose, but the fake can also inadvertently inherit one from the driving genuine video,” explained Eisert.

This finding shows just how far the technology has come. What was once a clear marker of authenticity—the presence of a heartbeat—now appears convincingly forged. For digital forensics, this shift means researchers must continue searching for new detection methods.

A new path for detection

The research team sees hope in looking beyond the heartbeat alone. While a deepfake may mimic the overall rhythm of blood flow, it fails to replicate the complex, localized changes in circulation across different areas of the face. In real humans, these changes vary in space and time in ways that current algorithms cannot fully reproduce. “Our experiments have shown that current deepfakes may show a realistic heartbeat, but do not show physiologically realistic variations in blood flow across space and time within the face,” said Eisert.

This finding points to a new strategy for exposing fakes: analyzing detailed patterns of circulation rather than global averages. If detectors measure these subtle variations, they might regain the advantage in the ongoing contest between forgery and forensics.
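One way such a detector cue could look, sketched under loose assumptions: extract an rPPG trace per facial region and score how much the regions disagree. In a real face, blood reaches the forehead, cheeks, and chin with slightly different delays and strengths, so region traces decorrelate; a fake that inherits one global pulse and pastes it everywhere looks near-identical in every region. The function and the synthetic "real"/"fake" data here are hypothetical illustrations, not the authors' method.

```python
import numpy as np

def spatial_pulse_diversity(region_signals):
    """Score spatial variation in blood flow: 1 minus the mean pairwise
    correlation between per-region rPPG traces. Higher values mean more
    region-to-region variation, as expected of genuine circulation."""
    X = np.asarray(region_signals, dtype=float)
    X = X - X.mean(axis=1, keepdims=True)
    corr = np.corrcoef(X)                      # pairwise correlation matrix
    off_diag = corr[~np.eye(len(X), dtype=bool)]
    return 1.0 - off_diag.mean()

fps = 30.0
t = np.arange(300) / fps
rng = np.random.default_rng(1)
pulse = np.sin(2 * np.pi * 1.2 * t)
# "Real": each region sees the pulse with its own delay plus noise.
real = [np.sin(2 * np.pi * 1.2 * (t - d)) + 0.3 * rng.standard_normal(t.size)
        for d in (0.00, 0.05, 0.10, 0.15)]
# "Fake": one inherited pulse copied uniformly across all regions.
fake = [pulse + 0.05 * rng.standard_normal(t.size) for _ in range(4)]
print(spatial_pulse_diversity(real) > spatial_pulse_diversity(fake))  # True
```

A real detector would need many more regions, motion compensation, and learned thresholds, but the underlying idea is the one the study describes: global rhythm can be faked, spatially resolved circulation so far cannot.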

This illustration presents two pairs of genuine and fake videos. On the left of each example, frames from each video sequence are displayed. On the right, the extracted reference rPPG signal is plotted for each paired fake and original video. Additionally, the measured heart rate of the person recorded is displayed. (CREDIT: Peter Eisert, et al.)

Why it matters

The stakes in this technological struggle are enormous. In a world where people can no longer trust videos, misinformation spreads unchecked. Criminals forge convincing evidence to manipulate courts, while hostile governments fabricate scandalous footage to destabilize elections or silence critics. Even everyday users fall victim to scams powered by hyper-realistic digital forgeries.

At the same time, the science behind this research promises progress in healthcare. Doctors could measure vital signs remotely using only a webcam, expanding telemedicine and bringing new tools to patients across the globe.

The overlap of health technology and digital forensics shows how tightly these fields connect. For now, though, the race continues.

Deepfake creators show they can imitate life itself. The real question is whether detectors can adapt quickly enough in today’s digital world.

Research findings are available online in the journal Frontiers in Imaging.






Mac Oliveau
Science & Technology Writer

Mac Oliveau is a Los Angeles–based science and technology journalist for The Brighter Side of News, an online publication focused on uplifting, transformative stories from around the globe. Passionate about spotlighting groundbreaking discoveries and innovations, Mac covers a broad spectrum of topics—from medical breakthroughs and artificial intelligence to green tech and archeology. With a talent for making complex science clear and compelling, they connect readers to the advancements shaping a brighter, more hopeful future.