“I see people toe the line (go for a race) without the proper base miles in (training miles). They get hurt and abandon running.... It takes proper training, and smart recovery to run smart and keep running through your 60s”
My primary research was with the UCLA track team, as well as local trainers at Equinox and Run with Us. I went down a rabbit hole of body scanning, wearable device analysis, and pressure mapping that left my head spinning with information overload. I was trying to understand recovery, but there had to be a better way to understand where I'm at and what I need.
“If runners call with an injury or some pain, I ask them to bring their shoes in… Looking at their footbeds can provide insight into problems with their running technique or shoes”
I started pulling footbeds. I hypothesized that if I could collect enough footbeds and annotate them correctly with an expert like Sergio, I could use machine learning to derive the same insights Sergio had, through a simple app and phone camera. These took a while to curate, but I was able to start experimenting after around 50, thanks to a used running shoe donation bin... Gross.
I was able to make some of the information that Sergio saw explicit through a machine learning segmentation model. This was exciting, but it didn't always work. Factors like the lighting of the photo, how worn in the footbed was, and even whether the runner had big dogs that pulled them around on walks could skew the data set, and therefore the insights. This didn't bother me much, because it meant there was room to innovate! Yes!
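To give a feel for what "making the information explicit" means here, this is a minimal sketch of labeling worn regions on a scan. It is purely illustrative: the function names, the fixed threshold, and the toy grid are my assumptions; the actual project used a trained segmentation model, not a hand-set cutoff.

```python
# Illustrative sketch only: segment a footbed scan into worn / unworn
# regions. `pixels` is a 2-D grid of floats in [0, 1], where higher
# values mean more visible wear. A real pipeline would use a trained
# segmentation model rather than this fixed threshold.

def segment_wear(pixels, threshold=0.5):
    """Label each pixel 1 ('worn') if its intensity exceeds the
    threshold, else 0 ('unworn')."""
    return [[1 if p > threshold else 0 for p in row] for row in pixels]

def wear_ratio(mask):
    """Fraction of the footbed area labeled as worn."""
    flat = [p for row in mask for p in row]
    return sum(flat) / len(flat)

# Hypothetical 3x3 scan: the heavier wear is along the right edge.
scan = [
    [0.1, 0.2, 0.8],
    [0.3, 0.9, 0.7],
    [0.0, 0.1, 0.6],
]
mask = segment_wear(scan)
print(wear_ratio(mask))  # 4 of 9 pixels exceed the threshold
```

Even this crude version shows why lighting matters: a dim photo shifts every intensity value, and a fixed threshold misclassifies the whole scan.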
Through prototyping, I learned I could tell more about the overall life left in the shoes from the outsoles than from the footbeds, because outsole rubber physically wears down through use much more than footbeds do. I also began to understand that while a solution incorporated into the outsole is great, it must be embedded into the shoe by the manufacturer, as opposed to slipping a pair of footbeds into any pair of shoes. Both methods were promising ways of capturing performance and gear insights, but the use cases for how this information would be scanned needed to be fleshed out.
The idea of a digital footprint came from the need to visualize the data from all these scans in a way that was digestible for users and accurate for computers. The digital footprint is essentially an evolving signature of the individual runner that carries a ton of valuable insights. It can be updated through either a footbed or an outsole scan.
I arrived at the colored square pattern for the footbed and outsole through research into machine-readable patterns. These are patterns that are easy for machines to read and encode high-fidelity data that can be observed through computer vision. Since this pattern would be the element connecting the physical with the digital to provide personalized information, I decided it should be celebrated aesthetically in other areas of my design process. Iconography in the UI was specifically influenced by this.
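As a rough illustration of how a colored-square grid can carry data, here is a toy decoder. The case study doesn't specify the actual encoding, so the two-bits-per-color mapping and the color names below are entirely hypothetical.

```python
# Hypothetical sketch of reading a colored-square pattern. The real
# pattern's encoding is not specified; here each square's color maps
# to 2 bits, and the grid decodes row-by-row into an integer ID.

COLOR_BITS = {"red": 0b00, "green": 0b01, "blue": 0b10, "yellow": 0b11}

def decode_pattern(grid):
    """Fold a row-major grid of color names into a single integer."""
    value = 0
    for row in grid:
        for color in row:
            value = (value << 2) | COLOR_BITS[color]
    return value

# A 2x2 pattern yields the bits 00 01 10 11 = 27.
pattern = [["red", "green"], ["blue", "yellow"]]
print(decode_pattern(pattern))  # 27
```

In practice, fiducial systems like QR codes or ArUco markers add error correction and orientation cues on top of this basic idea, which is what makes them robust to scuffed rubber and odd camera angles.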
It was imperative to make recovery data relevant and useful to the user. The sketch below shows how I envisioned that working. First, the user would scan the outsole wear pattern or the footbed pattern. Then, the image would be converted into data points. That abstracted scan would be added to a database of all other users' scans, where it would be analyzed by a machine learning program trained on that database. The program would derive meaning from the scan, such as gait and pronation, and then provide feedback to both the user and Under Armour. The catch is, there needed to be a feedback loop. The ML program is only as smart as its data set, and it needs to be trained carefully over time. It was therefore necessary for both users and UA to have the option to correct inaccurate feedback. The program would only get smarter and more accurate over time, thanks to the curation of the data set as a whole.
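The correction loop above can be sketched in a few lines. This is a stand-in, not the real system: the 1-nearest-neighbour classifier, the class name, and the labels ("neutral", "overpronation") are all illustrative assumptions, chosen only to show how user corrections flow back into the shared data set.

```python
# Sketch of the scan -> predict -> correct feedback loop. A trivial
# 1-nearest-neighbour classifier stands in for the real ML program;
# all names, features, and labels are hypothetical.

class GaitClassifier:
    def __init__(self):
        self.dataset = []  # (features, label) pairs from all users' scans

    def train(self, features, label):
        self.dataset.append((features, label))

    def predict(self, features):
        # 1-NN: return the label of the closest stored scan.
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(self.dataset, key=lambda ex: dist(ex[0], features))[1]

    def correct(self, features, right_label):
        # A user or UA flags inaccurate feedback; the corrected
        # example re-enters the data set and sharpens future answers.
        self.train(features, right_label)

model = GaitClassifier()
model.train([0.2, 0.1], "neutral")
model.train([0.9, 0.8], "overpronation")
print(model.predict([0.85, 0.75]))  # closest to the overpronation scan
model.correct([0.5, 0.5], "neutral")  # correction enlarges the data set
```

The design point this captures is the last sentence of the paragraph: predictions are only as good as the stored examples, so every accepted correction is effectively free, expert-curated training data.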
Austin became the chat-based interface that users would interact with. Because he is powered by machine learning and truly knows the runner in terms of recovery and goals, Austin is able to provide coach-like advice.
Any advice Austin provided needed to be actionable, so it became necessary to develop a way for users to see an overview of their recovery progress. This home screen UI for Austin was the evolution of that need. Users can see when they need to rest because their bodies are still recovering, and when they are good to push. It distills hundreds of recovery biomarkers down to simple, actionable advice. Scanning footbeds and outsoles was one way Austin would continuously learn about the runner's needs, recovery progress, running form, and footwear performance.