Exploration of the Future of Run, with an emphasis on recovery
Under Armour Sponsored Project
@ ArtCenter
UI Design
Interaction Design
Data Visualization
What We Made
During this Under Armour sponsored project, I explored the future of running. I developed a system called Austin, which provides actionable training and recovery advice through machine learning. By helping runners avoid injury and train to their full potential, Austin delivers a more meaningful experience for runners and valuable feedback for Under Armour.
Why we made it
The runners we worked with loved their sport, but more often than not they ended up injured. They did all they could to minimize that risk, but the right path was often unclear. Our solution provided that much-needed guidance.
Design Process
Materials Testing
Prototype Testing
Market Research
Sergio Medina - Run With Us Pasadena

Recovery is everything

“I see people toe the line (go for a race) without the proper base miles in (training miles). They get hurt and abandon running… It takes proper training and smart recovery to run smart and keep running through your 60s”
Information overload

My primary research was with the UCLA track team, as well as local trainers at Equinox and Run with Us. I went down a rabbit hole of body scanning, wearable device analysis, and pressure mapping that left my head spinning with information overload. I was trying to understand recovery, but there had to be a better way to understand where I’m at and what I need.

Understanding Recovery
Looking into the sole
Since my research had diverged into a massive pool of data and statistics, I felt it could help to focus on a specific element of recovery for the time being, and maybe that would help inform the bigger picture. I remembered that in an earlier interview, Sergio did something interesting with a runner who came into the store: he immediately pulled out her footbed and examined it. I returned to Run with Us with the intent to investigate further.
“If runners call with an injury or some pain, I ask them to bring their shoes in… Looking at their footbeds can provide insight into problems with their running technique or shoes”
-Sergio Medina

I started pulling footbeds. I hypothesized that if I could collect enough footbeds and annotate them correctly with the help of an expert like Sergio, I could use machine learning to derive the same insights Sergio had, through a simple app and a phone camera. These took a while to curate, but after around 50 (thanks to a used running shoe donation bin… gross), I was able to start experimenting.

Curating a data set
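A rough sketch of how that expert-annotated classification could work. The wear features and labels below are made-up stand-ins for the annotated footbed photos; a simple nearest-centroid comparison stands in for the full machine learning pipeline.

```python
# Minimal sketch: classify a footbed by comparing its wear features to
# expert-labeled examples (nearest centroid). Feature vectors and labels
# are illustrative stand-ins for the annotated footbed photos.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled_examples):
    """labeled_examples: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(vs) for label, vs in labeled_examples.items()}

def classify(model, features):
    """Return the label whose centroid is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Hypothetical features: [heel wear, midfoot wear, forefoot wear], 0..1
examples = {
    "neutral":       [[0.5, 0.3, 0.5], [0.6, 0.3, 0.4]],
    "overpronation": [[0.8, 0.6, 0.2], [0.9, 0.5, 0.3]],
}
model = train(examples)
print(classify(model, [0.85, 0.55, 0.25]))  # prints "overpronation"
```

Each expert-labeled footbed moves a centroid, which is why curating enough examples mattered before any experimenting could start.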

Using a machine learning segmentation program, I was able to make some of the information Sergio saw explicit. This was exciting, but it didn’t always work. Factors like the lighting of the photo, how worn in the footbed was, and even whether the runner had a big dog that pulled them around on walks could skew the data set, and therefore the insights. This didn’t bother me much, because it meant there was room to innovate! Yes!

Learning through machine learning
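The segmentation step can be reduced to its essence with a toy example. The tiny grayscale grid below is an assumption standing in for a footbed photo; a fixed threshold splits it into worn and unworn regions, and brightening the image shows how lighting alone can skew the result, as described above.

```python
# Illustrative sketch of the segmentation step: split a footbed image
# (a tiny grayscale grid, 0 = unworn, 255 = heavily worn) into "worn"
# vs. "unworn" regions with a simple threshold.

def segment(image, threshold=128):
    """Return a binary mask: 1 where wear exceeds the threshold."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def worn_fraction(mask):
    """Fraction of pixels flagged as worn."""
    flat = [px for row in mask for px in row]
    return sum(flat) / len(flat)

image = [
    [30, 40, 200],
    [35, 180, 220],
    [25, 45, 210],
]
mask = segment(image)
print(worn_fraction(mask))  # 4 of 9 pixels exceed the default threshold

# A brighter photo of the same footbed flags far more pixels as "worn" --
# exactly the kind of lighting skew that left room to innovate.
brighter = [[min(255, px + 100) for px in row] for row in image]
print(worn_fraction(segment(brighter)))
```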

Through prototyping, I learned that outsoles reveal more about the overall life left in a shoe than footbeds do, because the outsole rubber physically wears down through use much more than a footbed. I also began to understand that while a solution incorporated into the outsole is great, it must be embedded in the shoe by the manufacturer, as opposed to simply slipping a pair of footbeds into any pair of shoes. Both were promising ways of surfacing performance and gear insights, but the use cases for how this information would be scanned still needed to be fleshed out.

Outsole wear prototype

The idea of a digital footprint came from the need to visualize the data from all these scans in a way that was digestible for users and accurate for computers. The digital footprint is essentially an evolving signature of the individual runner that carries a wealth of valuable insights, and it can be updated through either a footbed or an outsole.

Digital footprint

I arrived at the colored square pattern for the footbed and outsole through research into machine-readable patterns: patterns that are easy for machines to read and carry a high fidelity of data observable through computer vision. Since this pattern would be the element connecting the physical with the digital to provide personalized information, I decided it should be celebrated aesthetically in other areas of my design process. The iconography in the UI was specifically influenced by it.

Machine readable patterns
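The core property of a machine-readable pattern is a lossless round trip between data and squares. A hedged sketch, with an assumed two-color palette and 4x4 grid: an ID is encoded as a grid of colored squares, and "computer vision" is reduced to reading the colors back out.

```python
# Sketch of a machine-readable colored square pattern: encode an integer
# as a grid of colors, then decode it back. The two-color palette and
# 4x4 grid size are assumptions for illustration.

PALETTE = {0: "white", 1: "red"}  # each square encodes one bit
REVERSE = {color: bit for bit, color in PALETTE.items()}

def encode(value, size=4):
    """Encode an integer as a size x size grid of color names (row-major bits)."""
    bits = [(value >> i) & 1 for i in range(size * size)]
    return [[PALETTE[bits[r * size + c]] for c in range(size)] for r in range(size)]

def decode(grid):
    """Read the colors back into bits and reassemble the integer."""
    size = len(grid)
    value = 0
    for r in range(size):
        for c in range(size):
            value |= REVERSE[grid[r][c]] << (r * size + c)
    return value

runner_id = 2025
assert decode(encode(runner_id)) == runner_id  # round-trips losslessly
```

Real fiducial systems add error correction and orientation markers on top of this idea, which is what gives them their high fidelity under imperfect camera conditions.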

It was imperative to make recovery data relevant and useful to the user. The sketch below shows how I envisioned that working. First, the user scans the outsole wear pattern or the footbed pattern. The image is converted into data points, and that abstracted scan is added to a database of all other users’ scans, where it is analyzed by a machine learning program trained on that database. The program derives meaning from the scan, such as gait and pronation, and provides feedback to both the user and Under Armour. The catch is that there needed to be a feedback loop: the ML program is only as smart as its data set, and it needs to be trained carefully over time. It was therefore necessary for both users and UA to have the option to correct inaccurate feedback. Thanks to that ongoing curation of the data set as a whole, the program only gets smarter and more accurate.

Creating a feedback loop
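The feedback loop above can be sketched end to end: predict from a scan, let the user or UA correct a wrong label, fold the correction back into the data set, and predict again. The 1-nearest-neighbor "model" and the feature values are illustrative assumptions.

```python
# Sketch of the feedback loop: the model is only as smart as its data
# set, so corrections from users and UA are appended back into it.

dataset = [  # (scan features, expert label) pairs
    ([0.2, 0.8], "supination"),
    ([0.8, 0.2], "overpronation"),
]

def predict(scan):
    """Label of the nearest stored scan (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(dataset, key=lambda pair: dist(pair[0], scan))[1]

def correct(scan, true_label):
    """User/UA feedback: store the corrected example so the model improves."""
    dataset.append((scan, true_label))

scan = [0.45, 0.6]
first = predict(scan)      # the nearest stored example gives a wrong answer
correct(scan, "neutral")   # the runner disputes it; an expert confirms "neutral"
second = predict(scan)     # the corrected data set now answers correctly
print(first, "->", second)  # prints "supination -> neutral"
```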
Introducing Austin

Austin became the chat-based interface that users would interact with. Because he is powered by machine learning and truly knows the runner in terms of recovery and goals, Austin can provide coach-like advice.

Any advice Austin provided needed to be actionable, so it became necessary to develop a way for users to see an overview of their recovery progress. This home screen UI for Austin evolved from that need. Users can see when they need to rest because their bodies are still recovering, and when they are good to push. This distills hundreds of recovery biomarkers down to simple, actionable advice. Scanning footbeds and outsoles was one way Austin would continuously learn about the runner's needs, recovery progress, running form, and footwear performance.

Actionable recovery
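The "distill biomarkers down to simple advice" step can be sketched as a small mapping. The biomarker names, the averaging, and the thresholds below are all illustrative assumptions, not Austin's actual model.

```python
# Minimal sketch: average a few normalized recovery signals
# (0 = depleted, 1 = fresh) into one readiness score, then map that
# score to Austin-style rest/push guidance.

def readiness(biomarkers):
    """Mean of normalized recovery signals, 0..1."""
    return sum(biomarkers.values()) / len(biomarkers)

def advice(biomarkers):
    score = readiness(biomarkers)
    if score < 0.4:
        return "rest"      # body is still recovering
    if score < 0.7:
        return "easy run"  # train, but keep the effort light
    return "push"          # good to go hard

today = {"sleep_quality": 0.9, "muscle_soreness": 0.8, "resting_hr": 0.85}
print(advice(today))  # prints "push"
```

However the score is actually computed, the design point stands: hundreds of signals collapse into one piece of advice a runner can act on today.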