"StridesAISteps": The Future of Biomechanical Analysis in Athletics

Hello everyone,

My name is Alejandro Lozano; I'm a Spanish decathlete with a deep passion for both athletics and technology. Your TrHackathon post caught my attention: it's an exciting call for innovative ideas to transform the sport I love. Together with my training partner Jorge Dávila (also a decathlete, and a computer engineer), we've come up with an idea we believe could revolutionize athletics.

What is “StridesAISteps”?

With “StridesAISteps,” we aim to provide real-time analysis of an athlete’s running form with the same precision as F1 telemetry for race cars. Our goal is to develop an AI-powered system that measures stride length, frequency, acceleration, and even subtle imbalances in technique using only video.
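As a sketch of how the video-only measurement could start: assuming a pose-estimation model has already produced a per-frame vertical ankle coordinate, stride cadence can be estimated with simple peak detection. The `stride_cadence` function and the synthetic trace below are illustrative only, not part of any existing system.

```python
import math

def stride_cadence(ankle_y, fps):
    # Estimate strides per second from a per-frame vertical ankle
    # position trace: each local maximum marks roughly one mid-swing.
    peaks = [i for i in range(1, len(ankle_y) - 1)
             if ankle_y[i] > ankle_y[i - 1] and ankle_y[i] >= ankle_y[i + 1]]
    if len(peaks) < 2:
        return 0.0
    # Average frame gap between consecutive peaks -> seconds per stride.
    period_s = (peaks[-1] - peaks[0]) / (len(peaks) - 1) / fps
    return 1.0 / period_s

# Synthetic check: a 3 Hz oscillation sampled at 30 fps should read
# as a cadence of ~3 strides per second.
fps = 30
trace = [math.sin(2 * math.pi * 3 * t / fps + 0.3) for t in range(2 * fps)]
print(round(stride_cadence(trace, fps), 2))  # -> 3.0
```

A real pipeline would need smoothing and outlier rejection on the keypoint trace before anything this simple would hold up, but the averaging-between-peaks idea is the core of it.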

Why It Matters

Today, precise biomechanical data is critical for improvement, but it’s locked behind expensive professional equipment (high-speed cameras, labs). We want to democratize this technology, making it accessible to anyone with a smartphone.

For Athletes:

  • Detect asymmetries (e.g., uneven leg load, injury risks).
  • Analyze stride efficiency (Is it optimal for your height? Can you conserve energy?).
  • Get instant feedback during training sessions.

For Fans & Broadcasts:

  • Live graphics showing why an athlete won (e.g., “Maintained 95% stride efficiency in the final lap”).
  • Video game-style head-to-head comparisons (“Top speed: 42 km/h vs. 39 km/h”).

What Do You Think?

We’d love to hear if you see potential in this or if you have ideas to make it even better! Collaboration is key, and we’re all ears.

Ready to bring this to life for TrHackathon. Let’s innovate together!

Alejandro & Jorge

Dear Alejandro, great idea, provided enough consistent video data can be recorded for training the AI system. I do wonder why you mention only running, when a decathlete is always working to improve all ten events? Perhaps I can assist you with a low-cost video recording system that I developed this year, specifically for decathletes. You can take a look (at a preview of one athlete) at https://www.products4sports.be/meerkamp2025/athletes/bib333/athlete.html , where we live-uploaded about 240 videos of 18 decathletes (under-18 age group) at our local combined-events meet. The goal was to record ALL age groups, so I built 10 computer-plus-camera setups, which would have yielded more than 3,500 videos recorded and uploaded. Unfortunately the weather was terrible that weekend, and our club did not have 9 tents to keep all the tables with portable computers permanently covered, so we only recorded one age group.

About your solution: I am afraid that recording enough video data, including a “quality ranking” of each training or performance session, in a consistent way and by a large group of users, is a real challenge. And the size and quality of the AI training/learning data will define the quality of the AI system. That is why I propose considering the addition of MULTIPLE (6-DoF or more) accelerometer sensors for races or field events, with or without cameras, to feed an LLM with a large amount of quality data sets.
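If the accelerometer route is taken, one early preprocessing step would be slicing the raw 6-DoF stream into fixed-length, overlapping windows that can serve as training examples. A minimal sketch; the `window_imu` function and the zero-filled fake stream are hypothetical:

```python
def window_imu(samples, window, step):
    # Slice a stream of 6-DoF samples (ax, ay, az, gx, gy, gz) into
    # fixed-length, overlapping windows usable as training examples.
    return [samples[i:i + window]
            for i in range(0, len(samples) - window + 1, step)]

# Fake stream: 100 samples of six zeros each (a real stream would come
# from the sensor module).
stream = [(0.0,) * 6 for _ in range(100)]
windows = window_imu(stream, window=50, step=25)
print(len(windows), len(windows[0]))  # -> 3 50
```

Window length and overlap would need tuning per event; a sprint stride and a discus throw live on very different timescales.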

Please contact me if I can help you with a camera system (recording and/or (Python) processing) or an accelerometer sensor module.
Good luck, and never lose your passion for inventing something totally new :folded_hands:

Alejandro, this is a great idea.

I heard of another startup working on this recently. It's more focused on gait analysis for recreational runners, but it's a similar idea: doing with a few off-the-shelf cameras what used to require force platforms and specialised rigs around a treadmill.

Note also the data we just added to the Git repo, with the live athlete data from Rome. This could be correlated with race footage on YouTube, and maybe that could yield some training data?
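One way that correlation could work: if the live timing data and a video clip share (or can be offset onto) a common clock, each split timestamp maps to a frame index. A tiny sketch with made-up numbers; `split_to_frame`, the split times, and the 10 s offset are all hypothetical:

```python
def split_to_frame(split_time_s, video_start_s, fps):
    # Map an official split timestamp onto the nearest video frame
    # index, assuming both clocks share the same time base.
    return round((split_time_s - video_start_s) * fps)

# Hypothetical numbers: two splits from live timing, against a clip
# that starts 10 s into the session and runs at 25 fps.
splits = [12.48, 24.92]
frames = [split_to_frame(s, video_start_s=10.0, fps=25) for s in splits]
print(frames)  # -> [62, 373]
```

The hard part in practice would be finding the offset between the timing clock and the video clock; broadcast footage rarely carries usable timestamps.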