Ilario Corna, the International Olympic Committee Chief Technology and Information Officer, and Yiannis Exarchos, Olympic Broadcasting Services (OBS) CEO, recently discussed AI’s involvement in the Olympic Games in separate interviews.
The IOC launched the Olympic AI Agenda in April 2024, saying “it sets out the envisioned impact that Artificial Intelligence (AI) can deliver for sport and how the IOC, as the leader of the Olympic Movement, intends to lead on the global implementation of AI within sport.”
At the launch, IOC President Thomas Bach stressed that AI has the potential to support the athletes, who are at the heart of the Olympic Movement.
“AI can help to identify athletes and talents in every corner of the world,” Bach said. “AI can provide more athletes with access to personalized training methods, superior sports equipment and more individualized programs to stay fit and healthy. Beyond sporting performance, AI can revolutionize judging and refereeing, thereby strengthening fairness in sport. AI can improve safeguarding in sport. AI will make organizing sporting events extremely efficient, will transform sports broadcasting and will make the spectator experience much more individualized and immersive.”
A year later, and after seeing the agenda at work at Paris 2024, Corna and Exarchos reflected and looked ahead to what’s next. The separate interviews were lightly edited for length and clarity:
OlympicTalk: From your role, how was AI most visible at the Paris Olympics?
Corna (IOC): One, it is really supporting the athlete. One aspect that we used AI for was actually cyber abuse prevention. The IOC ran the largest online abuse prevention program that we ever conducted in sports history. AI monitored athletes’ social media during Paris 2024. We analyzed over 2.3 million posts for potential cyber abuse. We identified over 10,200 abusive posts, which were automatically removed from the social media platforms. We also flagged over, I believe, 152,000 posts and comments as being potentially abusive. We actually referred them to the social media platforms to make sure that we followed up on them. We detected over 8,900, if I remember correctly, unique accounts sending abusive messages. So that’s one part of our AI. We actually helped athletes.
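Editor’s note: The monitoring tools used for Paris 2024 are not public. Purely as an illustration, the Python sketch below shows the kind of tiered triage Corna describes: scoring each post, requesting automatic removal of the worst, flagging a borderline band for platform review, and tracking repeat accounts. The scorer, thresholds and field names are assumptions, not the IOC’s implementation.

```python
# Hypothetical triage sketch; not the IOC's system. A real deployment would
# use a trained abuse classifier instead of the keyword scorer below.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Post:
    account: str
    text: str


def abuse_score(text: str) -> float:
    """Placeholder scorer returning the share of flagged words in a post."""
    flagged_words = {"hate", "threat", "abuse"}  # stand-in vocabulary
    words = text.lower().split()
    return sum(w in flagged_words for w in words) / max(len(words), 1)


REMOVE_THRESHOLD = 0.8  # high confidence: request automatic removal
FLAG_THRESHOLD = 0.3    # borderline: refer to the platform for review


def triage(posts):
    removed, flagged, offenders = [], [], Counter()
    for post in posts:
        score = abuse_score(post.text)
        if score >= REMOVE_THRESHOLD:
            removed.append(post)
            offenders[post.account] += 1
        elif score >= FLAG_THRESHOLD:
            flagged.append(post)
            offenders[post.account] += 1
    return removed, flagged, offenders


removed, flagged, offenders = triage([
    Post("fan_account", "great race, congratulations"),
    Post("troll_account", "hate hate threat"),
])
print(len(removed), "removed,", len(flagged), "flagged,", dict(offenders))
```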
Another one is we actually created a chatbot. The IOC, we have a lot of rules — what can be posted on social media and what cannot be posted. So we actually enabled a chatbot for all the 11,500 athletes, where they were able to ask questions through Athlete365 and get instant answers instead of going through all of the documentation.
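Editor’s note: The Athlete365 chatbot’s internals are not public. As a minimal sketch of the idea of answering rule questions from existing documentation, the snippet below retrieves the most relevant rule snippet by word overlap; the snippets and matching logic are placeholders, not the real system.

```python
# Hypothetical retrieval sketch; not the real Athlete365 chatbot. It picks
# the rule snippet with the largest word overlap with the question.
RULES = {
    "venue-video": "Athletes may share short personal clips from venues, subject to the posting guidelines.",
    "sponsor-posts": "Posts promoting personal sponsors must follow the applicable commercial guidelines.",
}


def answer(question: str) -> str:
    q_words = set(question.lower().split())
    best_key = max(
        RULES,
        key=lambda k: len(q_words & set(RULES[k].lower().split())),
    )
    return RULES[best_key]


print(answer("Can I share a personal clip from the venue?"))
```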
The venues were all planned using digital twin technology. Even before being there, we were able to see which camera positions were the best ones, whether there was anything obstructing views, what the weather impact was and so forth.
During the Games we used an AI-powered energy management tool called Energy Expert, where we monitored all of the energy consumption in real time, then optimized and asked questions about how we could optimize the energy during the Games. A small example: we were able to see floodlights left on in the stadiums, and we were actually going to the venue owners and saying, “Can you please shut them off during the night, because there are no competitions?” Now that the Games are done, we were able to actually look at it and say, how do we compare against the previous Games?
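Editor’s note: The Energy Expert platform’s internals are not public. The sketch below only illustrates the kind of overnight check Corna mentions, flagging venues that draw significant power in hours when no competition is scheduled; the quiet window and baseline threshold are invented for the example.

```python
# Hypothetical overnight check; not the Energy Expert platform. Flags venues
# drawing more than an assumed idle baseline during an assumed quiet window.
from datetime import datetime

QUIET_HOURS = range(1, 5)     # 01:00-04:59, assumed no-competition window
IDLE_BASELINE_KW = 50.0       # assumed draw for an empty, lights-off venue


def overnight_alerts(readings):
    """readings: iterable of (venue, timestamp, kilowatts) tuples."""
    return [
        (venue, ts, kw)
        for venue, ts, kw in readings
        if ts.hour in QUIET_HOURS and kw > IDLE_BASELINE_KW
    ]


readings = [
    ("Venue A", datetime(2024, 8, 2, 2, 30), 420.0),   # lights likely left on
    ("Venue A", datetime(2024, 8, 2, 14, 0), 950.0),   # normal daytime load
]
for venue, ts, kw in overnight_alerts(readings):
    print(f"{venue}: {kw:.0f} kW at {ts:%H:%M}, ask whether lights can be switched off")
```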
Editor’s Note: Corna then showed an example of AI use that was tested around competition through a partnership with Alibaba, a technology company, and Omega, the timing company. For the men’s 400m hurdles final, a page displayed race predictions based on past results, real-time data showing athlete speeds and steps between hurdles, and a race visualization. The hope is to develop it further and go live with the data at future Olympics.
Exarchos (OBS): On the front of broadcasting and also digital engagement, we have been working with AI for a number of years. We started working in a very, very focused way after the PyeongChang Games, because we saw huge opportunities there. We have focused our strategy around what I would call our three Es: how we can enhance enablement, how we can improve engagement and how we can create more efficiencies. Because by its sheer magnitude, the Olympic Games is an exercise in efficiencies.
Paris was kind of the first climax of this effort across all three areas. On enablement, doing things that were not possible in the past: in a number of sports we did, for the first time and on a massive scale, multi-camera replays that we always felt added a lot of value to the narrative and especially to the understanding of the sport.
These types of replays were technically possible before, and some efforts had been made in soccer and American football. The problem was that generating these replays took a lot of time, 20 to 25 minutes for one replay. So practically, it made them a little bit useless for live coverage.
Through AI and working with our colleagues from Alibaba, we managed to bring this down to a few seconds. So this was available to the director as a very, very quick replay that added a lot of value to the narrative, the storytelling of the sport. And this is why we used it massively in Paris. We will also be using it a lot in Milano. In the winter sports, I don’t think something like that has ever been used before.
The second thing that was also very visible was the application of stroboscopic analysis that we developed with Omega in some sports. Sometimes, because of the nature of the sport, the movements of the athletes are so fast that people don’t really realize how impressive what the athletes are doing is. So we introduced that in a number of disciplines — in track and field, gymnastics, diving. Very, very quickly after the effort, we could reproduce the actual movements of the athlete, and you could also tell who was doing better and who was doing worse.
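Editor’s note: The Omega/OBS stroboscopic workflow is far more sophisticated than anything shown here. As a rough sketch of the underlying idea, the OpenCV snippet below samples frames at a fixed interval and stamps the moving foreground onto a single composite image; the input file name, sampling interval and background-subtraction choice are all assumptions for illustration.

```python
# Hypothetical stroboscopic composite; not the Omega/OBS pipeline. Samples
# every Nth frame and stamps its moving foreground onto a single image.
import cv2

cap = cv2.VideoCapture("clip.mp4")   # hypothetical input clip
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

composite = None
frame_index = 0
SAMPLE_EVERY = 10                    # assumed sampling interval in frames

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # foreground mask for this frame
    if composite is None:
        composite = frame.copy()               # first frame becomes the backdrop
    if frame_index % SAMPLE_EVERY == 0:
        composite[mask > 0] = frame[mask > 0]  # stamp the athlete's current pose
    frame_index += 1

if composite is not None:
    cv2.imwrite("strobe.png", composite)
cap.release()
```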
Also, we massively used AI-based athlete tracking in Paris. There are some sports with massive participation, like marathons, race walks and sailing, where for a long period of time you cannot really distinguish the athletes. So this helped us identify the different athletes in an easy way.
Also, in collaboration with Omega, we created some solutions where we could show the technical capabilities in some sports. In archery, we managed to show the trajectory that the arrows followed. Because if you’re watching archery, you think it’s a perfect shot, with a straight arrow going straight to a target. Once we showed that, people understood how much more difficult and complex archery is.
The other thing we did — something that we had been trying for many years, and it was very difficult — was to show the spin of the ball in table tennis. For people who know table tennis, they understand that it’s all about the spin. People who don’t play table tennis maybe don’t really realize the crazy amount of spin these athletes are generating. We had been trying to do that with traditional means in the past. It was not possible. Through AI, we managed for the first time to really show the spin. People were shocked to realize the number of revolutions.
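Editor’s note: How the rotation is actually estimated from the video is not public. The snippet below only shows the final arithmetic, converting an assumed per-frame rotation at an assumed camera frame rate into revolutions per second; both numbers are invented for illustration.

```python
# Illustrative arithmetic only; the per-frame rotation estimate itself is the
# hard, AI-driven part and is not reproduced here. Both numbers are invented.
CAMERA_FPS = 300.0        # assumed high-speed camera frame rate
DEG_PER_FRAME = 60.0      # assumed ball rotation between consecutive frames

revs_per_second = (DEG_PER_FRAME / 360.0) * CAMERA_FPS
print(f"~{revs_per_second:.0f} revolutions per second "
      f"(~{revs_per_second * 60:.0f} rpm)")
```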
We did it also in golf with ball tracing, so people could see that golf is not such a straight thing as people sometimes believe. And the serve reaction time in tennis, which is something that was a little bit difficult to measure so fast in the past. Now it was there.
The other element which is important is the one that has to do with efficiencies. In the Games, we produced 11,000 hours of content. There is a massive need for broadcasters to generate very, very fast highlights that they can use. They can push them on their digital channels. They can do their own summaries and so on. But they need these highlights to be custom made for their own audience, whether it’s their country, their athletes, the sports they prefer, the format that they prefer. So we established a quite innovative platform to generate AI highlights — automated — where broadcasters had the capacity to choose by sport, by athletes, to create their own durations, whether they wanted a horizontal video or a vertical video for mobile phones, whether they wanted to integrate commercials and other things. We ended up with 97,000 different clips being generated by broadcasters, all of them customized. Obviously, this is something that we will carry forward into the future.
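Editor’s note: The OBS highlights platform does not expose a public API. The sketch below simply models, as a plain data structure, the options Exarchos lists (sport, athletes, duration, orientation, commercial insertion); all field names and defaults are assumptions.

```python
# Hypothetical request model; not the OBS platform's API. It only captures
# the options described in the interview as a plain data structure.
from dataclasses import dataclass, field
from typing import List


@dataclass
class HighlightRequest:
    sport: str
    athletes: List[str] = field(default_factory=list)
    max_duration_s: int = 90
    aspect_ratio: str = "9:16"          # vertical for mobile, "16:9" for TV
    include_commercials: bool = False


request = HighlightRequest(
    sport="athletics",
    athletes=["Athlete A"],             # hypothetical placeholder
    max_duration_s=60,
    aspect_ratio="16:9",
)
print(request)
```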
OlympicTalk (to Exarchos): Along similar lines as the archery arrow trajectory and table tennis ball spin rate, are there any specific Winter Olympic sport examples that could be showcased during Milan Cortina?
Exarchos: On curling, it’s a sport which has a very strong Olympic presence. For some reason, curling is a sport that people love to spend hours on at the Olympics. But not everybody understands exactly how it works and what happens, which is fascinating, unless you actually play or are a core fan. But the Olympics are an opportunity for non-core fans, for people, to understand. So what we will do for the first time in Milano is apply an AI system that we have already tested that will be able to explain very easily the curling stone rotations, and especially show the exact path that these stones follow. Sometimes we think that these stones follow a more or less linear path and so on. This is nothing like what the sport is really about.
Also, we will employ 360-degree multi-camera systems across practically all winter sports. I believe that in some sports, people will start seeing and understanding things that maybe they didn’t understand before. Not just Canadians, but everybody will be able to understand how three-dimensional ice hockey is through doing that, and really understand the paths and the views and the strategies in the game.
Editor’s note: In hockey, a puck-tracking system is under discussion, and if implemented for Milan Cortina, would be a first for a Winter Olympics. The IOC is working with all Winter Olympic sport federations on possible technological advances before Milan Cortina.
OlympicTalk (to Corna): You mentioned the 400m hurdles example that was in a test stage. Are there any specific examples of something similar for the 2026 Milan Cortina Winter Olympics that could be developed to the point it is available publicly?
Corna: One thing that we’re looking at is actually how you can use AI models to make judging easier and provide the right data for the judges to make decisions. In short track speed skating, the judging (review of contact in races) sometimes naturally takes longer because of the information, because there are touches that happen very fast. So what we are looking at is cameras mounted into the helmets of the short track speed skaters. With AI, we are looking at video analysis to understand if I touch you, if I push you, to understand the penalties that come in.
Editor’s note: The use of cameras on short track skaters’ helmets has not been finalized, though it is a possibility for Milan Cortina, pending further discussions, including those regarding safety.
OlympicTalk (to Exarchos): For LA 2028, is there anything you were close to being able to get for Paris that you think you can implement for LA? Or was there anything you saw in Paris that sparked a new idea for LA 2028?
Exarchos: We did a very quick debrief while we were still in Paris, because in the intensity of the Games you uncover opportunities that may exist. By the way things move now and by the speed of things in technology, it’s probably a little bit too early to say, but I believe that further digging down on a combination of explanatory data with immersive shots is where we would be going. So on one hand, to have the visually rewarding sense of a shot that was very difficult to achieve, to generate that very, very fast, but also to associate that with an immediate graphics explanation. Ideally, to also combine that, if possible, in some sports, with biometric data. So to have an understanding of the visual beauty of what happens, of what is measurably happening, and of what the impact is on the athletes themselves. All these three things coming together. Easier said than done, but we have seen more difficult things happen.
OlympicTalk (to Exarchos): Is there anything else about the use of AI we haven’t covered?
Exarchos: One thing that I keep on repeating to the team — and that we have as a mantra — is that we are not about technology. We’re about telling, in the most compelling way, the stories of the greatest athletes in the world.
For us, the Games is not about showing off technologies. It’s about using technology to showcase the athletes. This is a fine line, and this is why, for me, the ultimate test for technology is how effective it makes the storytelling.
The other thing is maintaining a very, very ethical, responsible and compliant use of AI, because the risks are many and the temptations are many. But for us, because we believe that technology is an enabler of human creativity, we’re not thinking about substituting the creativity of humans. Also, not to take people’s data for granted. Be very, very responsible on that front. Because of the universality of what we do, we need to be careful, to be extremely compliant with the strictest regulations in the world. Because some countries might be a little bit more relaxed with some things, some others more rigid. Ourselves, we will always err on the side of compliance, care and ethical conduct.