At This Year’s U.S. Open, IBM Wants To Give You All The Insta-Commentary You Need

Tennis fans watching the U.S. Open at home or in person this year will get some extra insights delivered by IBM’s machine learning technologies.

“The way we’re thinking about it is: the first time Watson’s come to the U.S. Open,” says IBM program manager John Kent.

IBM is using its cloud-based data-processing tools, including the Watson machine-learning suite, to enhance the tournament’s online coverage, automatically generating everything from video captions to analyses of matches in progress. During each match, IBM’s online SlamTracker platform provides real-time scores and assessments, pulling in data from officials and on-court speed-tracking radar to evaluate players’ moves as they happen.

“We not only provide the real-time scores, but for each point, try to provide a little bit of insight as to what happened,” Kent says. For example, the system can indicate in which areas of the court each player won the most points, how many feet each player covered over the course of a match, and which points were decided by unforced errors.
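IBM hasn’t published SlamTracker’s internals, but the per-point summaries Kent describes amount to a simple aggregation over point-by-point records. The sketch below is purely illustrative; the record fields (`winner`, `court_zone`, `distance_ft`, `unforced_error`) are assumptions, not IBM’s actual schema.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Point:
    """One point of a match, in a hypothetical SlamTracker-style record."""
    winner: str            # player who won the point
    court_zone: str        # zone of the court where the point was won, e.g. "deuce-baseline"
    distance_ft: dict      # feet covered during the point, keyed by player name
    unforced_error: bool   # True if the point ended on an unforced error

def match_insights(points: list[Point]) -> dict:
    """Aggregate per-point records into the per-match stats described above."""
    zones_won = {}        # player -> Counter of zones where they won points
    distance = Counter()  # player -> total feet moved over the match
    unforced = Counter()  # player -> points gained from unforced errors
    for p in points:
        zones_won.setdefault(p.winner, Counter())[p.court_zone] += 1
        for player, feet in p.distance_ft.items():
            distance[player] += feet
        if p.unforced_error:
            unforced[p.winner] += 1
    return {
        "zones_won": zones_won,
        "distance_ft": dict(distance),
        "points_from_unforced_errors": dict(unforced),
    }
```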

IBM has provided support to the U.S. Open for about 25 years, going back to when speed-tracking radar was new. It has introduced additional digital services as the tournament’s online audience has grown and, more recently, shifted increasingly to mobile devices. This year, the company estimates that about 15 million viewers around the world will use the IBM-powered U.S. Open website and smartphone apps.

And because IBM, which provides technical services to all four of tennis’s Grand Slam events, has years of historical data on individual players’ performance, it can provide instant analyses of “pressure situations,” offering the likelihood that, based on past performance, a player will, say, come back from a particular losing position. The numbers are all crunched using Apache Spark, an open-source big-data processing engine hosted on IBM’s Bluemix cloud platform.
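IBM hasn’t detailed the Spark jobs behind those figures, but a “pressure situation” stat is essentially a conditional frequency computed over historical match data. Here is a minimal PySpark sketch, assuming a hypothetical table of past service games with columns `player`, `trailed_0_30` (the server fell behind 0-30), and `held_serve`; the dataset path and schema are illustrative, not IBM’s.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pressure-situations").getOrCreate()

# Hypothetical historical data: one row per service game across past Grand Slams,
# flagging whether the server fell behind 0-30 and whether they still held serve.
games = spark.read.parquet("s3://example-bucket/grand-slam-service-games.parquet")

pressure = (
    games.filter(F.col("trailed_0_30"))        # only games where the server fell behind 0-30
         .groupBy("player")
         .agg(
             F.avg(F.col("held_serve").cast("double")).alias("hold_rate_after_0_30"),
             F.count("*").alias("sample_size"),
         )
)

pressure.show()  # e.g. how often each player has come back to hold from 0-30
```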

After each match, players are able to use IBM’s data to review their own performance.

“We provide the players with a USB key shortly after their match that has video of their match,” Kent says. “Not just like a DVR kind-of version where they can fast-forward through their match [to specific moments], but we also index it to all the points and statistics.”
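Kent doesn’t specify the format on those USB keys, but indexing a match video “to all the points and statistics” comes down to pairing each point’s stats with a timestamp offset into the video. A hypothetical index entry might look like this (field names and values are invented for illustration):

```python
import json

# Hypothetical point index: each entry ties a point's score line and stats to an
# offset (in seconds) into the match video, so a player can jump straight to it.
point_index = [
    {"set": 1, "game": 3, "point": "30-15", "video_offset_s": 1042.5,
     "stats": {"rally_length": 7, "serve_speed_mph": 118, "unforced_error": False}},
    {"set": 1, "game": 3, "point": "30-30", "video_offset_s": 1071.0,
     "stats": {"rally_length": 3, "serve_speed_mph": 124, "unforced_error": True}},
]

with open("match_index.json", "w") as f:
    json.dump(point_index, f, indent=2)
```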

At the same time, Watson’s language-processing facilities take in highlight clips and player interviews from the tournament and automatically generate subtitles and transcripts for the U.S. Tennis Association to publish. This lets the USTA produce accurate, Americans with Disabilities Act-compliant captions and transcripts faster than a person could by hand. “The transcript goes into the publishing system,” Kent says. “The USTA can make edits to that if they want.”
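IBM hasn’t described exactly how the captioning pipeline is wired together, but the core transcription step resembles a call to the public Watson Speech to Text service. A minimal sketch using IBM’s current Python SDK (the API key, service URL, and file name are placeholders, and the review-and-publish steps are omitted):

```python
from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials; a real deployment would pull these from IBM Cloud.
authenticator = IAMAuthenticator("YOUR_API_KEY")
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("https://api.us-south.speech-to-text.watson.cloud.ibm.com")

# Transcribe a player interview so editors can review it before publishing.
with open("player_interview.mp3", "rb") as audio:
    response = stt.recognize(audio=audio, content_type="audio/mp3").get_result()

transcript = " ".join(
    result["alternatives"][0]["transcript"].strip()
    for result in response["results"]
)
print(transcript)  # hand off to the USTA's publishing system for review and edits
```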

Watson’s visual recognition tools can also recognize players on the court and even celebrities in the stands to generate a searchable database of publishable photos, which spares the USTA’s editorial team from having to toil away for hours indexing photos and searching for famous spectators. “That process used to be manual,” Kent says. However, as with the audio processing, human staffers usually review the files before they are published. Any errors discovered can be used to train Watson to be more accurate in the future.
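The photo-tagging side can be pictured the same way. The sketch below uses the Watson Visual Recognition API (since retired by IBM) with a hypothetical custom classifier standing in for whatever player- and celebrity-recognition models IBM actually trained; the credentials, classifier ID, and file name are placeholders.

```python
from ibm_watson import VisualRecognitionV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and a hypothetical classifier trained on player photos.
authenticator = IAMAuthenticator("YOUR_API_KEY")
vr = VisualRecognitionV3(version="2018-03-19", authenticator=authenticator)

with open("court_photo.jpg", "rb") as image:
    result = vr.classify(
        images_file=image,
        classifier_ids=["usopen_players_classifier"],  # hypothetical custom model
        threshold=0.6,
    ).get_result()

# Keep only confident labels so editors can search photos by who appears in them.
for img in result["images"]:
    for classifier in img["classifiers"]:
        for cls in classifier["classes"]:
            print(cls["class"], cls["score"])
```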

For fans watching the tournament live at the USTA Billie Jean King National Tennis Center in Queens, Watson’s conversational tools will provide navigational and other information, including dining options and directions to Manhattan, through a chat-like interface.
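IBM hasn’t said which pieces of the Watson stack sit behind that chat interface, so the sketch below swaps in a deliberately crude keyword-based intent matcher just to show the shape of the interaction; a real deployment would use a trained conversational service, and the answers here are placeholders.

```python
import re

# Placeholder answers about the grounds; a production system would pull these
# from the USTA's venue data rather than hard-coding them.
VENUE_ANSWERS = {
    "dining": "Example answer: food courts and restaurants are marked on the grounds map.",
    "directions_manhattan": "Example answer: the 7 train at Mets-Willets Point runs back to Manhattan.",
}

# Crude keyword-overlap "intent detection" standing in for a trained NLU model.
INTENT_KEYWORDS = {
    "dining": {"eat", "food", "restaurant", "dining", "lunch"},
    "directions_manhattan": {"manhattan", "subway", "train", "directions"},
}

def answer(question: str) -> str:
    words = set(re.findall(r"[a-z]+", question.lower()))
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return VENUE_ANSWERS[intent]
    return "Sorry, I don't have an answer for that yet."

print(answer("Where can I eat near the stadium?"))  # -> dining answer
print(answer("How do I get back to Manhattan?"))    # -> directions answer
```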

The U.S. Open marks the first time IBM has used many of these Watson features at a sports or entertainment event. But the company, which also provides digital support for the Masters golf tournament and Broadway’s Tony Awards, will likely be working on similar services for spectators at those events in 2017.

 

Fast Company