New mobile phone app shows how well shoes will fit based on the 3D shape of the user’s foot

The app is designed for online shoe retailers to offer to their customers, to provide accurate fitting across different styles of shoe and shapes of foot.

[May 10, 2022: Roberto Cipolla, University of Cambridge]

The app technology is designed for online shoe retailers to offer to their customers, to provide accurate fitting of different styles of shoe. (CREDIT: University of Cambridge)

Snapfeet is a new mobile phone app that shows how well shoes will fit based on the 3D shape of the user's foot. It also offers a simple augmented reality (AR) visualization of what the shoes will look like on the feet.

The app technology is designed for online shoe retailers to offer to their customers, to provide accurate fitting of different styles of shoe and the opportunity to see how the shoes will look on the shopper's feet. This should lead to less footwear being returned. Returns carry a huge cost, both monetary and environmental; many shoe retailers make very little profit from online sales because of the high rate of returns, and the aim of this app is to change that.

Professor Roberto Cipolla and his team, Dr. James Charles and Ph.D. student Ollie Boyne, from the Machine Intelligence group created the app in collaboration with Giorgio Raccanelli and the team at Snapfeet.

The Snapfeet app allows the customer to try shoes on virtually via their phone, using AR to find their perfect shoe fit in a few moments.

Snapfeet creates an accurate 3D copy of the user's feet in real time: a model of both feet can be built in a few seconds, simply by taking a few mobile phone photographs from different viewpoints.

By comparing the user's foot shape with each shoe's geometry, Snapfeet is then able to recommend the correct size for each type of shoe, telling the user the degree of comfort to expect in the different parts of the foot: toe, instep, heel and sole.
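
As a toy illustration of that per-region feedback (not Snapfeet's actual algorithm; the region names follow the article, but the thresholds and measurements are invented), clearances between foot and shoe could be mapped to comfort labels like so:

```python
REGIONS = ("toe", "instep", "heel", "sole")

def comfort_label(clearance_mm: float) -> str:
    """Map a clearance in millimetres to a comfort label (thresholds illustrative)."""
    if clearance_mm < 0:
        return "too tight"
    if clearance_mm < 3:
        return "snug"
    if clearance_mm < 8:
        return "comfortable"
    return "loose"

def comfort_report(foot_mm: dict, shoe_mm: dict) -> dict:
    """Label each region by its clearance: shoe interior minus foot measurement."""
    return {r: comfort_label(shoe_mm[r] - foot_mm[r]) for r in REGIONS}

print(comfort_report(
    {"toe": 96, "instep": 62, "heel": 64, "sole": 102},   # hypothetical foot, mm
    {"toe": 100, "instep": 63, "heel": 70, "sole": 104},  # hypothetical shoe, mm
))
# {'toe': 'comfortable', 'instep': 'snug', 'heel': 'comfortable', 'sole': 'snug'}
```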


Giorgio Raccanelli says, "You download the Snapfeet app, register, take a few pictures all the way around the foot, and a 3D model of the foot will appear, allowing you to immediately start shopping. The application automatically compares the three dimensional image of the foot with the chosen shoe style, showing you how it will fit, or will directly suggest a style that is most suited to your foot shape."

Snapfeet's first big customers are Hugo Boss and Golden Goose.

Snapfeet's parent company, Trya, began by licensing novel photogrammetry software from Professor Cipolla's group in 2011 via Cambridge Enterprise.

The original photogrammetry technology used photos taken with a calibration pattern. The photos were uploaded to a server, where a multi-view stereo algorithm developed at Cambridge found point correspondences across images and generated a 3D model that explained all the different viewpoints and located the cameras in world space. This was the state of the art for reconstruction accuracy back in 2011.
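
At the heart of a multi-view stereo pipeline like this is triangulation: once the same physical point has been matched in two views with known camera matrices, its 3D position follows by linear algebra. Below is a minimal sketch of that single step, using the standard direct linear transform rather than the Cambridge code itself:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover one 3D point from a matched pair of image points.

    P1, P2 : (3, 4) camera projection matrices for the two views.
    x1, x2 : (u, v) pixel coordinates of the same point in each view.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Solve A @ X = 0: the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize to world coordinates
```

Repeating this over thousands of matched points, while also refining the camera poses, yields the kind of dense point cloud the original server-based system produced.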

Since 2019, Professor Cipolla's team have been working with Snapfeet to evolve the original photogrammetry technology into a mobile phone app that reconstructs the 3D foot shape live on the phone, without the need for any calibration pattern, and that correctly sizes and visualizes shoes in AR.

The original photogrammetry software was accurate to 1 mm, but it was slow and cumbersome: the accuracy was there, but the usability was not. It also did not exploit any prior knowledge of the object it was trying to reconstruct.

The team looked at how to make it faster and much more user-friendly, and the idea was born to do it all on a mobile phone, with no calibration pattern and no processing on a server, by exploiting exciting new developments in machine learning and the powerful processors on modern mobile phones.

A video of the app in action: building a 3D copy of the foot, suggesting sizes using machine learning, and visualizing the suggested size on the feet in real-time AR. (CREDIT: University of Cambridge)

"We were able to exploit new developments in machine learning (deep learning) for recognizing 3D objects and the advanced sensors and powerful processors on modern mobile phones to run the reconstruction algorithms in real-time on the phone. In summary we can combine a parametrized foot model and novel deep learning algorithms for recognizing curves and surfaces allowing us to run the 3D reconstruction algorithm in real-time on the device," said Professor Cipolla.

They used a parameterized foot model learned from many 3D scans of feet captured with the original photogrammetry technology. The 3D foot model that the app builds can be rendered in any graphics engine to visualize what it looks like. The shape of the foot is controlled by 10 parameters learned with machine learning, and the objective is to find the parameter values that produce a 3D foot best matching the user's.
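
A common way to build such a parameterized shape model is principal component analysis over registered scans: each scan becomes a vector of stacked vertex coordinates, and the parameters weight the main directions of shape variation. The sketch below illustrates the idea; Snapfeet's actual model is not public, so treat the construction as an assumption:

```python
import numpy as np

def learn_foot_model(scans: np.ndarray, n_params: int = 10):
    """Learn a low-dimensional shape model from registered foot scans.

    scans : (num_scans, 3 * num_vertices) array; every scan must have its
            vertices in correspondence (same vertex = same anatomical point).
    """
    mean = scans.mean(axis=0)
    # Principal directions of shape variation via SVD of the centered data.
    _, _, Vt = np.linalg.svd(scans - mean, full_matrices=False)
    return mean, Vt[:n_params]

def synthesize_foot(mean, basis, params):
    """Generate a foot mesh (stacked vertex coordinates) from the parameters."""
    return mean + params @ basis
```

Ten numbers are then enough to span most of the shape variation seen in the training scans, a compact description that helps make the later per-user fitting step tractable on a phone.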

The "master" foot model is called a "prior," short for prior knowledge about what feet look like. The app user still takes multiple images around the foot but instead of building point clouds (as in photogrammetry) the app uses machine learning to predict the higher level features that control the shape of the foot. The benefits are that the app user needs to take less photos, the returned foot model has less artifacts and the process is more robust should there be errors during a scan. The model is also much quicker to produce thanks to the real-time Deep Learning element of the app.

The team have just released the new version of the app that can do everything on the mobile device. The server is no longer needed.

Talking about the app, James Charles says: "I've always had difficulties with getting shoes of the correct size. I dislike the try-on process in shops, and the environmental impact of ordering lots of shoes online was a big concern for me. However, before this app there really was no other option. So I'm highly motivated in solving this problem and think we already have a pretty good solution."

When the user first opens the app there is a calibration phase, in which the phone begins tracking the camera using the latest AR frameworks on mobile phones: ARKit on iOS and ARCore on Android. These are the same routines that an interior design app would use to map a room and represent the physical space in graphic form.

During the calibration phase the phone camera is tracked. The app builds upon this AR technology to track the camera and calculate how far it is moving; it also detects the foot and the floor, giving it a good idea of world space. The app knows where the phone is to within 2 mm, and it is all done within a few seconds of loading the app.

As the phone moves around, key points of interest on the foot are detected to help determine the foot's length and width. A 3D mesh is then created from these measurements and overlaid on the user's foot in AR, so that they can visually validate whether it is correct.
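
As a simple illustration of that measurement step, the sketch below derives length and width from a few hypothetical 3D keypoints expressed in floor-plane coordinates (the keypoint names and values are made up for the example):

```python
import numpy as np

def foot_measurements(kp: dict) -> tuple:
    """Length and width from named 3D keypoints (metres, floor plane at z=0).

    Measurements are taken in the horizontal plane, so the z component
    (height above the floor) is dropped before computing distances.
    """
    length = np.linalg.norm(kp["toe_tip"][:2] - kp["heel_back"][:2])
    width = np.linalg.norm(kp["ball_inner"][:2] - kp["ball_outer"][:2])
    return length, width

keypoints = {
    "toe_tip":    np.array([0.245, 0.040, 0.02]),
    "heel_back":  np.array([0.000, 0.035, 0.03]),
    "ball_inner": np.array([0.170, 0.000, 0.02]),
    "ball_outer": np.array([0.175, 0.095, 0.02]),
}
print(foot_measurements(keypoints))  # roughly (0.245, 0.095) metres
```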

This is another key step, and one that sets the app apart from the competition. There are other apps on the market that can validate the model reconstruction in this manner, but they do not allow you to actively adjust the model. Snapfeet allows you to adjust the model in real time and then immediately obtain the 3D model of your foot on the phone itself, with no need for a server.

There are three machine learning algorithms in play. The first builds the parameterized foot model. The second recovers the parameters of that model from multi-view images as the phone moves around the foot. The third compares the 3D foot model against the shapes, or "lasts," of all the shoes the customer is interested in, and returns the size of each shoe that will best fit the user's foot. This is the virtual try-on.

When manufacturers build a shoe, they first build a shoe last: a solid model of the inside of the shoe, around which the shoe design is created. The last, along with the material used to make the shoe, determines the size and the comfort that someone will experience when they put their foot into that shoe.

The algorithm takes the foot model, digitally places it inside every shoe you are interested in, and gives each one a comfort score. You are then able to render a virtual shoe onto your feet using AR. The app also detects where the legs and trousers are so that it can produce the correct occlusion effect, using machine learning to track the foot.
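
One plausible way to score such a placement, sketched here under assumed inputs rather than as Snapfeet's published method, is to measure the signed clearance between points sampled on the foot surface and the inner surface of the last:

```python
import numpy as np
from scipy.spatial import cKDTree

def comfort_score(foot_pts, last_pts, last_normals):
    """Signed clearance between a foot surface and a shoe last's inner surface.

    foot_pts     : (N, 3) points sampled on the reconstructed foot mesh.
    last_pts     : (M, 3) points sampled on the last's inner surface.
    last_normals : (M, 3) unit normals pointing into the last's cavity.
    Positive values mean free space; negative means the foot would press
    into the shoe at that spot.
    """
    tree = cKDTree(last_pts)
    dist, idx = tree.query(foot_pts)          # nearest last point per foot point
    to_foot = foot_pts - last_pts[idx]
    inside = np.einsum("ij,ij->i", to_foot, last_normals[idx]) > 0
    signed = np.where(inside, dist, -dist)
    return signed.min(), signed.mean()        # worst-case and average clearance
```

Aggregating these signed distances by region (toe, instep, heel, sole) would then feed exactly the kind of per-region comfort feedback described earlier.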

Once the foot shape has been recovered, the AR element of the app lets the user see what the shoes will look like on their feet, giving them a feel for the shoes as if they were trying them on, and showing whether they go well with a particular outfit.

Snapfeet have generously funded a Ph.D. studentship enabling Ollie Boyne to extend the research in modeling feet from photographs. The app is now live on the App Store and is being used and tested by many shoe vendors to help reduce their returns from online sales. Download the app and try it on your own feet.

For more technology news stories check out our New Innovations section at The Brighter Side of News.


Note: Materials provided above by University of Cambridge. Content may be edited for style and length.


Like this kind of feel-good story? Get The Brighter Side of News' newsletter.


Tags: #New_Innovations, #Mobile_App, #AI, #Retail, #Shoes, #Shopping, #Science, #Research, #Technology, #The_Brighter_Side_of_News


Joseph Shavit
Space, Technology and Medical News Writer
Joseph Shavit is the head science news writer, with a passion for communicating complex scientific discoveries to a broad audience. With a strong background in science, business, product management, media leadership and entrepreneurship, Joseph possesses the unique ability to bridge the gap between business and technology, making intricate scientific concepts accessible and engaging to readers of all backgrounds.