Prosthetic arm blurs the line between machine and human body

In VR, a prosthetic arm that moved in about one second felt most natural; too fast or too slow reduced ownership and usability.

Written by: Shy Cohen. Edited by: Joseph Shavit.
VR tests show autonomous prosthetic arms feel most “yours” when they move at human-like speed, about a 1-second reach. (CREDIT: Shutterstock)

A virtual forearm can bend in a blink. It can also take its time, easing toward a target as if it is thinking about the move.

In a new virtual reality study, both extremes felt wrong.

When a prosthetic arm moves on its own, speed turns out to be more than a performance setting. It can shape whether the arm feels like it belongs to you, whether you feel any control over it, whether you would want to use it, and even whether the “robot” comes across as capable or unsettling. In this experiment, the sweet spot was a movement that looked a lot like an ordinary human reach: roughly one second.

The work comes from Harin Manujaya Hapuarachchi and colleagues. Hapuarachchi was a doctoral student when the study was done and is now an Assistant Professor in the School of Informatics at Kochi University of Technology. Their larger question sits a step ahead of today’s prosthetics.

Amputated virtual avatar with the robotic prosthetic arm autonomously bending towards the target. (CREDIT: Scientific Reports)

A lot of current research aims to help users control artificial limbs through intention, often by reading biosignals such as electromyography (EMG) or electroencephalography (EEG). But as machine learning improves, it is becoming more realistic to imagine prosthetic devices that sometimes act autonomously or semi-autonomously, helping a user by moving without a direct command.

That promise carries a risk. If a body part moves independently of your will, it can feel “unsettling” or like it is not part of you. That kind of mismatch could become a major barrier to real-world acceptance.

A speed test for “does this feel like my arm?”

To probe that problem safely, the researchers used virtual reality to simulate an avatar whose left lower arm had been amputated and replaced by a robotic prosthetic forearm. Nineteen male university students with a mean age of 24.15 years took part. The study was approved by the Ethical Committee for Human Subject Research at Toyohashi University of Technology, and participants provided written informed consent.

Participants wore a high-resolution head-mounted display (Varjo Aero, 2880 x 2720 pixels per eye, 90 Hz) and a motion capture suit tracked by a VICON system with 12 cameras recording at 250 Hz. They also wore a rigid brace on their real left arm to prevent bending at the elbow, so the virtual prosthetic could “do the bending” on its own.

The task was a reaching exercise. A purple sphere, 5 centimeters in diameter, appeared in front of the avatar. Participants moved their upper arm to bring the virtual elbow toward the sphere. Once the elbow got close enough, the virtual prosthetic forearm autonomously flexed toward the target along a minimum-jerk trajectory, the smooth accelerate-then-decelerate profile that characterizes natural human reaching.

Testing six movement durations

The key experimental variable was how long that autonomous bend took. The study tested six movement durations: 125 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, 2 seconds, and 4 seconds. Each condition ran as a block of 15 reaches, and each participant completed two blocks per condition.
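A minimum-jerk reach follows a standard quintic blend between the start and end angles, so the same path is traced at every duration and only the timing changes. The Python sketch below illustrates the idea; it is not the study's code, and the 0-to-90-degree flexion range is an assumed value, while the six durations are the ones tested.

```python
import numpy as np

def minimum_jerk(theta_start, theta_end, duration_s, n_samples=200):
    """Minimum-jerk angle profile: the standard quintic blend
    10*tau**3 - 15*tau**4 + 6*tau**5, where tau = t / duration."""
    tau = np.linspace(0.0, 1.0, n_samples)
    blend = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    t = tau * duration_s
    theta = theta_start + (theta_end - theta_start) * blend
    return t, theta

# The six movement durations tested in the study, in seconds.
for duration in (0.125, 0.25, 0.5, 1.0, 2.0, 4.0):
    # A 90-degree flexion is assumed here for illustration only.
    t, theta = minimum_jerk(0.0, 90.0, duration)
    peak_speed = np.max(np.gradient(theta, t))
    print(f"{duration:>5} s reach: peak angular speed ~ {peak_speed:7.1f} deg/s")
```

Because only duration varies while the trajectory shape stays fixed, speed can be isolated as the experimental variable.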

In virtual reality, participants embodied an avatar whose left forearm was replaced by an autonomous prosthetic arm that flexed toward a target at different movement speeds. (CREDIT: Toyohashi University of Technology)

After each block, participants filled out questionnaires measuring embodiment, usability, and social impressions of the prosthetic as a robot. Embodiment was split into two pieces: a sense of agency and a sense of ownership. Agency was measured by agreement with the statement, “The movements of the virtual prosthetic arm seemed to be my movements.”

Ownership was measured by agreement with, “I felt as if the virtual prosthetic arm I saw was my own left arm.” Usability was measured using the System Usability Scale (SUS), which produces a score from 0 to 100. Social impressions were measured using the Robotic Social Attributes Scale (RoSAS), which breaks down into competence, warmth, and discomfort.
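For context, the SUS score is computed from ten alternating positively and negatively worded items using a fixed formula. Here is a minimal Python sketch of that standard scoring; the example ratings are hypothetical, not data from the study.

```python
def sus_score(responses):
    """Standard System Usability Scale scoring.

    `responses` is a list of ten 1-5 Likert ratings in questionnaire
    order. Odd-numbered items are positively worded (score - 1);
    even-numbered items are negatively worded (5 - score). The 0-40
    raw total is multiplied by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10
    raw = sum((r - 1) if i % 2 == 0 else (5 - r)
              for i, r in enumerate(responses))
    return raw * 2.5

# Hypothetical ratings, for illustration only:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```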

One second felt best, and the extremes paid a penalty

Across the board, the one-second condition stood out.

A Friedman test found a significant effect of movement duration on ownership. Ownership in the 1-second condition was significantly higher than in the fastest and slowest conditions. Ownership at 500 ms was also significantly higher than at 125 ms and 4 s.

Agency followed the same pattern. A Friedman test showed a significant effect of movement duration on agency. Agency at 1 second was significantly higher than at 125 ms and 4 s. Agency at 500 ms was also significantly higher than at 125 ms and 4 s.

Usability climbed with the more human-like timing, too. A repeated measures ANOVA showed a significant effect of movement duration on SUS usability. Usability at 1 second was significantly higher than at 125 ms and 4 s. Usability at 500 ms was also significantly higher than at 125 ms. The 2-second condition scored significantly higher than the 4-second condition, and its difference from the 1-second condition fell just short of significance.

In other words, very fast and very slow movements both made the prosthetic feel less like part of the body and less pleasant to use. The middle range, especially 1 second, produced the strongest sense of ownership and agency along with the best usability.
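The Friedman test used throughout these comparisons is the standard non-parametric choice for repeated Likert-style ratings. As an illustration only, here is how such an analysis might look in Python with SciPy; the ratings below are simulated to mimic the reported inverted-U pattern and are not the study's data.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)

# Simulated ownership ratings (1-7 Likert) for 19 participants across
# the six duration conditions. The condition means peak near 1 second,
# echoing the reported pattern -- illustrative, not study data.
condition_means = (2.5, 3.5, 4.5, 5.0, 4.0, 2.5)
ratings = [np.clip(np.round(rng.normal(loc=mu, scale=1.0, size=19)), 1, 7)
           for mu in condition_means]

stat, p = friedmanchisquare(*ratings)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```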

Target area for the reaching task. (A) View from the front in the virtual setup. (B) View from the top in the virtual setup. (CREDIT: Scientific Reports)

Social impressions shift too, especially discomfort

The “robot personality” scores moved with speed, but not all in the same way.

Competence showed a significant effect of speed. Post-hoc tests found competence ratings at 500 ms and 1 second were significantly higher than at 4 seconds. Competence at 1 second was also significantly higher than at 2 seconds. There were no significant differences among the faster and moderate conditions (125 ms, 250 ms, 500 ms, and 1 s).

Warmth did not show a clear dependence on speed. A Friedman test found no significant effect.

Discomfort, however, spiked when the prosthetic moved fastest. A Friedman test showed a significant effect of movement duration on discomfort. The 125 ms condition produced significantly higher discomfort than 500 ms, 1 second, 2 seconds, and 4 seconds.

So if you only optimize speed, you might win points for “capable,” but you may also make the device feel scarier or more awkward. The fastest movement looked like the most uncomfortable one to live with.

Why one second might feel “right”

The researchers point to a simple idea: 1 second may be close to the timing people already expect from natural reaching movements. They cite prior work by Wang et al. (2016), which reported that when people reached “naturally” under instructions to be accurate, they tended to choose a movement duration close to 1 second. That resemblance could help explain why embodiment and usability peaked at 1 second in this experiment.

A physical pole was placed in front of the participant and matched in VR to prevent direct reaching and to ensure that target acquisition relied on the prosthetic arm flexion. (CREDIT: Toyohashi University of Technology)

At the same time, the paper cautions that reaching timing can depend on the task itself. Fitts’ law predicts that movement duration changes with things like the distance to the target and the target’s size. So a “human-like” duration might not always mean one second in every situation.
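Fitts' law is commonly written as MT = a + b * log2(2D / W), where D is the distance to the target and W its width. The Python sketch below uses placeholder constants, not parameters fitted in the study; the 5-centimeter target width matches the sphere used in the task.

```python
import math

def fitts_movement_time(distance_m, width_m, a=0.2, b=0.15):
    """Fitts' law: MT = a + b * log2(2D / W).

    `a` and `b` are device- and task-specific constants; the values
    here are placeholders for illustration.
    """
    index_of_difficulty = math.log2(2 * distance_m / width_m)
    return a + b * index_of_difficulty

# Reaching a 5 cm target from 30 cm away vs. from 60 cm away:
print(f"{fitts_movement_time(0.30, 0.05):.2f} s")  # ~0.74 s
print(f"{fitts_movement_time(0.60, 0.05):.2f} s")  # ~0.89 s
```

Under this model, a farther or smaller target naturally stretches the "human-like" duration beyond one second.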

Another interesting behavioral result showed up during the task. The researchers measured the time it took participants to move their upper arm toward the target, up to the moment the autonomous forearm motion kicked in.

A Friedman test found a significant effect of movement duration on this response time. Response time was significantly longer in the 4-second condition than in the 125 ms, 250 ms, 500 ms, and 1-second conditions. The data suggest participants slowed their own movements when the prosthetic was slow, almost as if they were matching their pace to what the “arm” was about to do.

Limits of a virtual arm, and why VR still helps

This study deliberately used a virtual prosthetic with healthy participants, plus a brace to constrain real arm bending. That design makes it possible to isolate speed as the main variable, but it cannot reproduce everything that would matter for an amputee using a physical device.

The authors note several missing real-world factors: forces generated by a physical prosthesis, its weight, and forces at the connection points with the residual limb. They also point out that in their setup, the target was always visible, which made the prosthetic’s intention easier to predict. Prior work suggests predictability can increase agency and ownership, so future studies may need to explore what happens when intention is less clear.

They also suggest extending the measures beyond subjective questionnaires, using methods such as intentional binding or physiological measures, to capture agency and embodiment in different ways.

Even with those limitations, VR offers a practical advantage. It can simulate prosthetic control styles that are not yet widespread, letting designers test acceptance problems early rather than after a device ships.

Research findings are available online in the journal Scientific Reports.

The original story "Prosthetic arm blurs the line between machine and human body" is published in The Brighter Side of News.



Shy Cohen, Science and Technology Writer
Shy Cohen is a Washington-based science and technology writer covering advances in artificial intelligence, machine learning, and computer science. He reports news and writes clear, plain-language explainers that examine how emerging technologies shape society. Drawing on decades of experience, including long tenures at Microsoft and work as an independent consultant, he brings an engineering-informed perspective to his reporting. His work focuses on translating complex research and fast-moving developments into accurate, engaging stories, with a methodical, reader-first approach to research, interviews, and verification.