Capturing the best shots at a televised golf tournament isn’t easy. That’s why IBM is applying the artificial intelligence of its Watson platform to identifying the best shots at The Masters golf tournament.
For the first time at a sporting event, IBM is harnessing Watson’s ability to see, hear, and learn to identify great shots based on crowd noise, player gestures, and other indicators. IBM Watson will create its own highlight reels.
With 90 golfers playing multiple rounds over four days, video from every tee, every hole, and multiple camera angles can quickly add up to thousands of hours of footage.
IBM Research and IBM iX developed a Cognitive Highlights application to auto-curate individual shot highlights from live video streams. This helps simplify and accelerate the video production process for highlight packages. The system can detect a player celebrating or the start of a golf swing.
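IBM hasn’t published the internals of that detection model, but as a rough illustration of the idea, here is a minimal sketch of how crowd-noise loudness, a player-gesture confidence score, and swing detection might be fused into a single highlight score. The weights, thresholds, and helper inputs are assumptions for illustration, not Watson’s actual pipeline.

```python
# Hypothetical sketch of multimodal highlight scoring. The weights, the 0.3 RMS
# ceiling, and the 0.7 cutoff are illustrative assumptions, not IBM's model.
import numpy as np

def crowd_noise_score(audio_frame: np.ndarray) -> float:
    """Normalized loudness of a short audio window, clipped to 0.0-1.0."""
    rms = np.sqrt(np.mean(audio_frame.astype(np.float64) ** 2))
    return float(min(rms / 0.3, 1.0))  # 0.3 is an assumed "roaring crowd" RMS ceiling

def highlight_score(audio_frame: np.ndarray, gesture_score: float, swing_detected: bool) -> float:
    """Fuse audio excitement, gesture confidence, and swing detection into one score;
    a segment above the threshold becomes a candidate highlight."""
    score = 0.5 * crowd_noise_score(audio_frame) + 0.4 * gesture_score
    if swing_detected:          # a detected swing anchors the start of the clip
        score += 0.1
    return score

# Example: a loud roar plus a fist-pump-like gesture crosses the (assumed) 0.7 cutoff.
audio = np.random.uniform(-0.4, 0.4, 16000)   # stand-in for one second of 16 kHz audio
print(highlight_score(audio, gesture_score=0.8, swing_detected=True) > 0.7)
```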
This is just the foundation on which IBM can build unique, personalized content for producers, media, and, eventually, fans. It also lays the groundwork for myriad machine vision and hearing solutions that could be applied to challenges in other industries.
Media and entertainment is one obvious example. Imagine how much this would help a TV broadcast producer working at scale: Watson can advise the producer and provide a baseline highlight package to work from.
The first Watson Highlights dashboard will be onsite for viewing at the Masters. This won’t factor into the actual TV broadcast for 2017, but highlights will be part of the Masters apps and should be available to share on social media.
By applying optical character recognition (OCR) to on-screen TV graphics, the system automatically gathers the player’s name and the hole number. This metadata is then associated with the detected highlight segments and posted to a dashboard. In the future, this will enable searches like “show me all highlights of player X during the tournament” or be used to create personalized highlight compilations based on favorite players.
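As a rough illustration of that OCR step, here is a minimal sketch assuming the broadcast graphic sits in a fixed region of the frame and that Tesseract (via pytesseract) stands in for whatever OCR engine IBM actually uses; the crop coordinates, regex, and segment structure are illustrative assumptions.

```python
# Hypothetical sketch of extracting player/hole metadata from a broadcast graphic.
# The crop region, regex, and dict layout are assumptions, not IBM's pipeline.
import re
import cv2
import pytesseract

def extract_shot_metadata(frame_path: str) -> dict:
    """Read the on-screen graphic from a video frame and pull out the
    player name and hole number as searchable metadata."""
    frame = cv2.imread(frame_path)
    graphic = frame[850:1000, 100:900]          # assumed lower-third graphic region (1080p)
    text = pytesseract.image_to_string(graphic)
    hole = re.search(r"HOLE\s*(\d+)", text, re.IGNORECASE)
    return {
        "player": text.splitlines()[0].strip() if text.strip() else None,
        "hole": int(hole.group(1)) if hole else None,
    }

# Attach the metadata to a detected highlight segment so it can later answer
# queries like "all highlights of player X during the tournament".
segment = {"start_s": 312.0, "end_s": 327.5}
segment.update(extract_shot_metadata("frame_000312.png"))
print(segment)
```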
IBM is also rolling out a “cognitive room” at the media center in Augusta, where members of the media can see everything IBM is doing behind the scenes at the Masters, including what goes into the apps, and can voice-interact with the Watson-powered video screen walls to pull up on-demand player stats and highlights.