Man vs. Machine: IBM’s Supercomputer

A couple dozen people gathered in the Student Union Feb. 15 to watch the second televised showdown of man vs. machine on Jeopardy!. IBM’s latest technological advancement, a supercomputer named Watson, was again pitted against human opponents. Watson emerged triumphant over its competitors, Jeopardy!’s greatest champions, Ken Jennings and Brad Rutter. After the three-day contest, Watson won the $1 million grand prize, which IBM will donate to two charities: World Community Grid and World Vision. David Shepler, IBM’s Jeopardy! challenge program manager, presented the showing and answered questions before and after the show.

“Luck of the draw for Watson,” Shepler said after the machine crushed the contestants representing humanity.  “Haven’t seen Watson do this well in practice rounds—it just so happens these questions are Watson’s sweet spots.”

Before the showdown at 7 p.m., Shepler presented an overview of Watson’s historic deep question-answering technology to the sparsely occupied Multi Purpose Room. Watson generates candidate answers and scores them using a combination of thousands of natural language processing, information retrieval, machine learning and reasoning algorithms. Evidence profiles summarize the analysis across many sources, and Watson delivers the answer with the strongest support it can find. Through statistical machine learning, Watson has learned that Jeopardy! categories are only weak indicators of the answer type.
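The pipeline Shepler described — generate candidate answers, collect evidence from many sources, and combine that evidence into a confidence score — can be sketched roughly as follows. This is a toy illustration, not IBM’s actual DeepQA code; all candidate answers, evidence sources, scores and weights here are invented:

```python
# Toy sketch of evidence-based answer scoring, loosely modeled on the
# generate-and-score pipeline described above. Sources and weights are invented.

def combine_evidence(evidence_scores, weights):
    """Weighted average of per-source evidence scores -> confidence in [0, 1]."""
    total_weight = sum(weights[src] for src in evidence_scores)
    return sum(score * weights[src]
               for src, score in evidence_scores.items()) / total_weight

# Hypothetical evidence profiles for two candidate answers.
weights = {"text_search": 0.5, "type_check": 0.3, "geo_check": 0.2}
candidates = {
    "Chicago": {"text_search": 0.9, "type_check": 0.95, "geo_check": 0.9},
    "Toronto": {"text_search": 0.8, "type_check": 0.4,  "geo_check": 0.1},
}

confidences = {ans: combine_evidence(ev, weights)
               for ans, ev in candidates.items()}
best = max(confidences, key=confidences.get)
print(best, round(confidences[best], 2))  # the best-supported candidate
```

The key idea is that no single source decides the answer; many weak pieces of evidence are merged, and the candidate with the strongest overall support wins.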

IBM created the Jeopardy! challenge as a test run for Watson, Shepler explained, because of the game show’s broad domain of knowledge, the complex language of its clues, the high precision it demands, the accurate confidence estimates it requires (knowing how sure it is of an answer and how much to wager) and its high-speed turnaround (answering within about three seconds). IBM used Jeopardy! as a playing field because it represents real language.
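The confidence requirement Shepler mentioned has two practical faces: deciding whether to buzz at all, and deciding how much to wager. A minimal sketch of such decisions might look like the following; the threshold and the wager rule are invented for illustration, not Watson’s actual strategy:

```python
# Illustrative confidence-based game decisions. The threshold value and
# the proportional wager rule are assumptions, not IBM's real strategy.

BUZZ_THRESHOLD = 0.5  # buzz only when confidence clears this bar

def should_buzz(confidence):
    """Attempt to answer only when confident enough to risk being wrong."""
    return confidence >= BUZZ_THRESHOLD

def wager(confidence, bankroll):
    """Bet a fraction of the bankroll proportional to confidence."""
    return int(bankroll * confidence)

print(should_buzz(0.92))    # high confidence: worth buzzing in
print(should_buzz(0.30))    # low confidence: stay silent
print(wager(0.92, 10000))   # size the bet by confidence
```

This separation — one number driving both the buzz decision and the wager — is why accurate confidence, not just accurate answers, mattered for the challenge.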

“We really hope to take this into all sorts of business applications,” said Shepler. “We’re going to take this into the healthcare space, into possible contact centers. We can imagine all sorts of applications that can matter to the government to help deal with their information overload that is so much a problem of our time.”

Real language is ambiguous, metaphoric and implicit, and there are an infinite number of ways to express one sentiment. Open-domain question answering is hard for computers; Watson’s groundbreaking deep analytics, however, achieved champion levels of precision and confidence over a huge variety of expression.

“On average Watson is going to buzz faster than a human being,” said Shepler. “However, it’s also true that human beings can out-buzz Watson because of the anticipatory buzz technique. We saw not infrequently humans buzzing in at zero milliseconds or a fraction of a millisecond.”

A machine capable of this kind of deep analysis is a huge technological advancement, Shepler told the audience, although not all of the kinks are out yet. The Final Jeopardy! category was ‘U.S. Cities’ and the clue was “Its largest airport is named for a World War II hero; its second largest, for a World War II battle.” Watson answered ‘Toronto’ while its two human opponents correctly answered ‘Chicago.’ Shepler attributed the error to Watson’s inability to decipher the semantics of the clue.

Beyond the game show, IBM sees the technology powering business applications in areas such as healthcare and government.

“[The options] you can potentially apply this to are limitless,” Shepler said in an IBM video shown on Jeopardy!.

Additional reporting by Clara Smith