JumpLens: Real-Time Horizontal Jumps Analysis and Comparison for Competitions

Hello,

At the beginning of this year, I started working on a project on horizontal jumps for my training group. We needed something to use during winter training that could help us understand how athletes are improving in the triple jump (TJ) and long jump (LJ). I have written a short description of it in this post:

A Hybrid, Multi-Sensor Tracking System for Triple Jump Performance Analysis (A Tracking System for Triple Jump Performance Analysis - Loops, Logic & Syntax)

The system is currently in the design stage, and I have been working on the design documentation and experimenting with relevant libraries in Python. The system comprises both hardware and software.

Given the nature of this competition, I decided to stay within the same topic and focus on horizontal jump analysis for field competitions, providing visual cues for spectators and viewers.

As my background is in horizontal jumps, I will focus on these events for the prototype. If successful, the same methodology can be applied to other field events.

The following is my idea; I will need to do some work over the coming days to refine it.

JumpLens - Real-Time Horizontal Jumps Analysis and Comparison for Competitions

With this project, I aim to improve the viewers’ experience by using data visualisation tools to make field events more engaging.

Each athlete’s attempt is analysed to extract metrics (e.g. acceleration, speed reached at takeoff, time of flight if possible). Using these data, feedback is provided comparing the current attempt with the previous one (if one exists).
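As a rough sketch of the attempt-to-attempt feedback, the comparison could be a per-metric delta between the current and previous attempt. The metric names and values below are illustrative, not from a real data feed:

```python
def compare_attempts(current, previous):
    """Return per-metric deltas (current - previous); empty if no previous attempt."""
    if previous is None:
        return {}
    return {
        metric: round(current[metric] - previous[metric], 3)
        for metric in current
        if metric in previous
    }

prev = {"takeoff_speed_ms": 9.4, "distance_m": 15.82}
curr = {"takeoff_speed_ms": 9.6, "distance_m": 16.04}
print(compare_attempts(curr, prev))  # {'takeoff_speed_ms': 0.2, 'distance_m': 0.22}
```

Positive deltas would be rendered as improvements on screen, negative ones as declines.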

A comparison with the athlete ranked higher on the leaderboard is also performed, giving insight into what differentiates their best performances.

This project also aims to support athletics commentators in telling the story of the event, helping viewers better understand the technical aspects of each performance in real time, and what differentiates one athlete’s performance from another’s.

Main features:

  • Real-time (or near real-time) data visualisation of athletes’ attempts, including metrics such as speed, acceleration, and phase distances (e.g. for TJ: hop, step, and jump, if tracking these phases is possible).
  • Comparison of each attempt with the athlete’s previous ones to highlight improvements or declines in performance.
  • Comparison with other athletes, so that spectators can see what distance the athlete needs to move up a position.
  • This data discovery helps spectators and viewers better understand how athletes differ in technique and execution.
  • Any other data visualisation technique that makes the competition more engaging and easier to follow. This would depend on the quality of the data fed to the software.
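The "distance needed to move up" feature could be sketched as follows. The leaderboard marks are illustrative, and the tie-breaking assumption (beating the mark by 1 cm, the measurement resolution) is a simplification of the actual rules:

```python
def distance_to_next_place(leaderboard, athlete_best):
    """Return the minimum improvement (in metres) needed to overtake the
    next athlete above on the leaderboard, or None if already leading."""
    better = sorted(m for m in leaderboard if m > athlete_best)
    if not better:
        return None
    # Jumps are measured to the centimetre, so beat the mark by 0.01 m.
    return round(better[0] + 0.01 - athlete_best, 2)

marks = [17.21, 16.98, 16.75]                # current top three
print(distance_to_next_place(marks, 16.60))  # 0.16
```

On screen, this would read as something like "needs 16.76 m to move into third".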

Data needed:

Some of the data I described are usually tracked during competitions (if not, additional computer vision techniques may need to be considered). These data would feed into the software to produce the outputs described above.

If sample data is available, I will experiment with it for development and testing. My next step is to identify the open data sources available for use.

Thank you.

Best wishes

@robotastray


This sounds fascinating! I won’t make any promises but will enquire what might have been captured at recent elite championships…


I found some papers with tables and could use the entries to create a sample dataset to demonstrate how it would work, if there is no other option.

Hello @robotastray, thank you for posting in our forum. Do you think you could officially participate in our TrHackathon? Presenting your prototype would be fascinating for our audience and the tech-savvy people who will attend our AthTech conference.
Looking forward to knowing more about your project.
Nicolas Launois, European Athletics

Hello, yes I would like to participate!
So far I have been able to create database tables using the data from the papers I posted above. I will use these tables to mimic a live data feed.
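One way to mimic a live feed is to replay stored attempts one by one from the database. The sketch below uses the stdlib sqlite3 module for a self-contained example (the real project uses PostgreSQL), and the table layout and values are illustrative:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attempts (athlete TEXT, attempt_no INTEGER, distance_m REAL)")
conn.executemany(
    "INSERT INTO attempts VALUES (?, ?, ?)",
    [("Athlete A", 1, 16.12), ("Athlete B", 1, 16.40), ("Athlete A", 2, 16.31)],
)

def live_feed(connection, delay_s=0.0):
    """Yield attempts in round order, pausing between rows to mimic real timing."""
    rows = connection.execute(
        "SELECT athlete, attempt_no, distance_m FROM attempts "
        "ORDER BY attempt_no, athlete"
    )
    for row in rows:
        time.sleep(delay_s)
        yield row

for athlete, rnd, dist in live_feed(conn):
    print(f"Round {rnd}: {athlete} jumps {dist:.2f} m")
```

With a non-zero `delay_s`, the consumer sees rows arrive over time just as it would from a real feed.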

Thank you.

Hi @robotastray, this is indeed a good start. You will be expected to submit your findings as part of your final submission. All the best, Nicolas

The data in the first paper are from the European Athletics Premium Meeting “Thessaloniki 2008”. I wonder if there are any recordings of the jumps from this competition, so I can try to create an overlay of the parameters and metrics and compare two athletes on screen. That would provide an example of how things would work.

Hello @andyrobinson.
Just to give an update on my project.

I have attached some screenshots of the work I have so far for the frontend. I have used the same colour scheme as the European Athletics website.

I have also created a read-only API to read the parameters from PostgreSQL tables, which I will use to display the information once the user is logged in.
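A minimal read-only endpoint might look like the sketch below. It uses the stdlib http.server in place of a full framework, and an in-memory list stands in for the PostgreSQL-backed data; the route name and fields are assumptions:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ATTEMPTS = [{"athlete": "Athlete A", "attempt": 1, "distance_m": 16.12}]

class ReadOnlyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Only GET is implemented, so the API stays read-only by construction.
        if self.path == "/attempts":
            body = json.dumps(ATTEMPTS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), ReadOnlyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/attempts"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
print(data)

server.shutdown()
```

The frontend would fetch from endpoints like this after login and render the rows in the dashboard.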

Currently, the assumption is that parameters can be recorded during competitions using a camera system similar to the one used at the European Athletics Premium Meeting “Thessaloniki" in 2008.

For the video comparison I played around with Kinovea.

The following is an example of a video comparison without graphic overlays. (I am currently working on how to superimpose the graphics.)

The competition summary will provide an overview of the most important metrics, while the dashboard and other menu options will have more detailed information.


I have added some additional features to the project to give coaches tools to view data from their athletes’ performance.

  1. During international or national competitions, data are measured (a combination of sensors and a two-camera system using computer vision).

  2. Basic data must be available as a live feed so it can be viewed while watching or on the stadium screen (just like in football).

  3. Once the competition is done, the data should be cleaned, standardised and stored in a database.

  4. Coaches can access their athletes’ metrics via their profile.
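Step 3 above (cleaning and standardising) could be sketched as below. The raw input formats are assumptions based on typical results listings, where fouls and passes appear as "x", "NM", or "-":

```python
def parse_mark(raw):
    """Convert a raw mark like '16.24', '16.24m' or a foul ('x', 'NM') to metres."""
    cleaned = raw.strip().lower()
    if cleaned in {"x", "nm", "-", ""}:
        return None  # foul, pass, or no mark
    return float(cleaned.rstrip("m"))

print([parse_mark(m) for m in ["16.24m", " 15.98 ", "x", "NM"]])
# [16.24, 15.98, None, None]
```

Storing `None` for fouls (rather than 0.0) keeps averages and best-mark queries honest.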

Additionally, when the coach uploads a video of an athlete jumping, information should be returned.

I am working on this now.

The data returned are only estimates, so they are not as precise as the metrics recorded by the hardware I mentioned earlier.

After analysis, I would give the coach the option to label the video and identify whether it belongs to any of their athletes.

If so, the data will be saved to the athlete’s record, which means graphs of their performance will have more data points.

Below is an example of the design.
This week I will develop the backend code for this feature further.
The library I am currently using to obtain the output is OpenCV.
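One estimate that can be derived from video alone is time of flight, counted in frames at a known frame rate, from which the rise of the centre of mass follows for a symmetric flight (h = g·t²/8). The frame numbers and fps below are illustrative, and as noted above these values are far less precise than dedicated competition hardware:

```python
G = 9.81  # gravitational acceleration, m/s^2

def flight_time(takeoff_frame, landing_frame, fps):
    """Time of flight estimated from the number of airborne frames."""
    return (landing_frame - takeoff_frame) / fps

def peak_height_gain(t_flight):
    """Rise of the centre of mass for a symmetric flight: h = g * t^2 / 8."""
    return G * t_flight ** 2 / 8

t = flight_time(takeoff_frame=120, landing_frame=150, fps=50.0)
print(round(t, 2), round(peak_height_gain(t), 3))  # 0.6 0.441
```

At 50 fps each frame is 20 ms, so the flight-time estimate is only accurate to about ±0.02 s; a higher frame rate tightens the estimate.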

A coach can access their athlete’s data via the page below.

NOTE: the data displayed in the first three screenshots are a mixture of real data (e.g. from Power of 10 or WA) and synthetic data.

Clicking through each row provides details about attempts and metrics recorded during the international and national competitions for that event.

The following is just an example:

With regard to the video analysis tab, as of now only angles are detected, using MediaPipe with a skeleton overlay. It will require more work, of course.

One of the issues I noticed is that, depending on how the video is recorded, the athlete’s skeleton may or may not be identified correctly, especially if there are multiple people moving in the frame.
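Once the pose estimator returns keypoints, a joint angle is just geometry on three 2D points (e.g. hip–knee–ankle). The sketch below works on any (x, y) coordinates, such as MediaPipe’s normalised landmarks; the example points are illustrative:

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0]) - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang  # fold into [0, 180]

hip, knee, ankle = (0.0, 0.0), (0.0, 1.0), (1.0, 1.0)
print(joint_angle(hip, knee, ankle))  # 90.0
```

Folding the result into [0, 180] makes the angle independent of which side of the body the camera sees.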

  • Generating Dynamic Graphs:

The next step is tracking the centre of mass throughout the jump and providing a graph plotted in real time for the live stream. I will also provide one on the competition overview page for the athlete’s best attempt.

This is a feature my coach suggested, and if displayed during the live stream, it can give viewers and athletes an idea of how the athlete is moving through space.

In this way, comparing two athletes becomes more straightforward: superimpose the two graphs and label each line with the athlete’s name.
Additionally, adding a breakdown of other details (such as contact times, phase distribution, and length of the last stride) below it could also be a good idea.

For the centre of mass (COM) calculations, the following segment mass distributions will be used:

# mass distribution (%) of different male body segments
male_weights = {
    "head_neck": 6.94,
    "upper_arm": 2.71,
    "forearm": 1.62,
    "hand": 0.61,
    "torso": 43.46,
    "thigh": 14.16,
    "shank": 4.33,
    "foot": 1.37,
}

# mass distribution (%) of different female body segments
female_weights = {
    "head_neck": 6.68,
    "upper_arm": 2.55,
    "forearm": 1.38,
    "hand": 0.56,
    "torso": 42.57,
    "thigh": 14.78,
    "shank": 4.81,
    "foot": 1.29,
}

These are the segment mass percentages from Paolo de Leva’s 1996 paper, referenced below.

Paolo de Leva. “Adjustments to Zatsiorsky-Seluyanov’s segment inertia parameters”. In: Journal of Biomechanics 29.9 (1996), pp. 1223–1230. ISSN: 0021-9290. DOI: 10.1016/0021-9290(95)00178-6.
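The whole-body COM is then the mass-weighted average of the segment centres, with paired segments (arms and legs) counted twice. The sketch below uses the de Leva male percentages from above; the segment-centre coordinates are illustrative stand-ins for values that would be derived from pose-estimation keypoints:

```python
male_weights = {
    "head_neck": 6.94, "upper_arm": 2.71, "forearm": 1.62, "hand": 0.61,
    "torso": 43.46, "thigh": 14.16, "shank": 4.33, "foot": 1.37,
}

def body_com(segment_centres, weights):
    """Mass-weighted mean (x, y) over all listed segment centres.
    segment_centres maps a segment name to a list of (x, y) centres,
    with two entries for paired segments (left and right)."""
    total_w = sum(weights[s] * len(pts) for s, pts in segment_centres.items())
    x = sum(weights[s] * px for s, pts in segment_centres.items() for px, _ in pts)
    y = sum(weights[s] * py for s, pts in segment_centres.items() for _, py in pts)
    return x / total_w, y / total_w

# Illustrative segment centres (metres) for a roughly upright, symmetric pose.
centres = {
    "head_neck": [(0.5, 1.7)], "torso": [(0.5, 1.2)],
    "upper_arm": [(0.3, 1.3), (0.7, 1.3)], "forearm": [(0.25, 1.0), (0.75, 1.0)],
    "hand": [(0.2, 0.8), (0.8, 0.8)],
    "thigh": [(0.45, 0.8), (0.55, 0.8)], "shank": [(0.45, 0.4), (0.55, 0.4)],
    "foot": [(0.45, 0.05), (0.55, 0.05)],
}
x, y = body_com(centres, male_weights)
print(round(x, 3), round(y, 3))
```

Running this per frame gives the COM trajectory needed for the live-stream graph; for a symmetric pose the x-coordinate sits on the body’s midline, as expected.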

Very impressive work! Thanks for the update…

Hello @andyrobinson, I have done some research on SEIKO technologies and, as I understand it, their Jump Management System and VDM (Video Distance Measurement System) are able to measure metrics such as

  • distance from the takeoff board to the landing spot (and effective distance)
  • speed and distance of each phase (hop, step, jump)
  • run-up speed data

I believe their technology was used during the European Championships, and they are also partnered with World Athletics.
Would it be possible to receive this data (Rome 2024: triple jump men and long jump women)? Since the data are not publicly available, I am currently creating a dummy database using results from the 2025 British Championships. However, if possible, I would prefer to use real data.

Thank you.

Unfortunately, we’re unlikely to be able to receive any more data in time for the Hackathon deadline. It took months to get the IsoLynx tracking data. We on the committee have no direct contact with anyone relevant in Seiko, and I am almost certain they will be very busy at the Diamond League final tonight and the forthcoming World Championships.

However, it is certainly something we can raise with World Athletics’ CIO at the conference itself, and point out that interesting research could be done with this later.