IBM Supercomputer "Watson" to play on Jeopardy against the shows best champions.

Statistical Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 02:15 PM
Original message
IBM Supercomputer "Watson" to play on Jeopardy against the shows best champions.
Edited on Mon Jan-17-11 02:20 PM by Statistical
An IBM computer beat chess champ Garry Kasparov in 1997. Now another one will try to vanquish top Jeopardy! players Ken Jennings and Brad Rutter.

Supercomputer Watson, named for IBM's founder, was designed specifically to compete in the long-running quiz show; it can understand natural human language and search its vast database to find the answer. Late this week, the show will tape a match pitting Watson against Jennings, the player with the longest winning streak in the game show's history, and Rutter, the all-time money winner. The match will air the week of Feb. 14, as a two-game tournament played over three nights.

But on Feb. 9, PBS science series Nova will offer Smartest Machine on Earth: Can a Computer Win on Jeopardy!, detailing the development of Watson's artificial intelligence and preparations for the match. "He" can search for the answer and buzz in, though he lacks fingers, and the entire machine--the size of 10 refrigerators--can't physically be on stage.


To those interested in the evolution of computers, this is a rather interesting exhibition. It is a far more complex challenge than winning at chess. Chess can be modeled cleanly as a math problem, and Deep Blue could simply look deeper (more moves into the future) than any human player and select the best move it found.
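To make that concrete, here is a toy sketch of the kind of search involved: a depth-limited minimax in Python. The evaluate(), moves(), and apply_move() helpers are hypothetical placeholders, and Deep Blue's real search (alpha-beta pruning on custom chess hardware) was vastly more sophisticated.

```python
# Toy sketch of "chess as a math problem": depth-limited minimax.
# evaluate(), moves(), and apply_move() are hypothetical placeholders,
# not Deep Blue's actual interfaces.

def minimax(state, depth, maximizing, evaluate, moves, apply_move):
    """Score `state` by searching `depth` plies ahead."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)              # static score of the position
    child_scores = (
        minimax(apply_move(state, m), depth - 1, not maximizing,
                evaluate, moves, apply_move)
        for m in legal
    )
    return max(child_scores) if maximizing else min(child_scores)
```

A larger `depth` means looking further into the future, which is exactly where raw hardware speed pays off.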



"Watson" will play on Feb 14-16. PBS has a documentary on the development of Watson on Feb 9th.

Ken Jennings (the Jeopardy player with the longest winning streak) doesn't believe Watson can do it: "This is too daunting a task for a computer."
peekaloo Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 02:24 PM
Response to Original message
1. Watson is no Turd Ferguson I tells ya!

:-)
 
cognoscere Donating Member (381 posts) Send PM | Profile | Ignore Mon Jan-17-11 03:16 PM
Response to Reply #1
7. That's your opinion...and I am not wearing an enormous foam cowboy hat.
 
NuclearDem Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 03:26 PM
Response to Reply #1
9. Watson, Turd Ferguson...
...and, for some reason...Sean Connery.
 
hootinholler Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 02:24 PM
Response to Original message
2. Ken, what do you think that thing between your ears is?
It's simply different hardware, or, more apropos, wetware.

Natural Language Processing is difficult. Looking stuff up is pretty easy. The real problem is recognizing when the stuff you look up is related to the question in a meaningful way and satisfies the question, er, answer in this case.
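As a toy illustration of that "is this passage actually about the clue?" step, here is a bag-of-words cosine similarity sketch in Python. This is an assumption for illustration only; Watson's real answer scoring combined far richer evidence than word overlap.

```python
# Toy relevance check: bag-of-words cosine similarity between a clue and
# candidate passages. Watson's real scoring used many more signals.
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

clue = "In 1997 this IBM computer defeated the reigning world chess champion"
passages = [
    "Deep Blue was a chess computer built by IBM that beat Garry Kasparov in 1997",
    "Watson is a question answering system built by IBM to play the quiz show Jeopardy!",
]
# Picks the passage with the most word overlap with the clue.
print(max(passages, key=lambda p: cosine_similarity(clue, p)))
```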

An interesting exercise, and a test of the state of worthwhile research with applications throughout human endeavors.

-Hoot
 
Statistical Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 02:29 PM
Response to Reply #2
3. Exactly, and Jeopardy makes it more difficult with unusually linked questions.
"Not to mention the tricky questions in such categories as "before and after": A candy bar that became a Supreme Court justice? Baby Ruth Bader Ginsburg. Watson got it right."

Having a database on Supreme Court justices is easy. Having a database of candy/foods is slightly more challenging. Having software flexible enough to "understand" that it needs to look for the similarity between the two is the challenging part.
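As a toy illustration of what "looking for the similarity between the two" might mean mechanically, here is a sketch that merges two answers wherever the end of one matches the start of the other. Purely illustrative, and not Watson's actual "before and after" logic.

```python
# Toy "before and after" merge: join two answers on their longest shared
# word span. Illustration only -- not Watson's actual approach.

def before_and_after(first, second):
    a, b = first.split(), second.split()
    for size in range(min(len(a), len(b)), 0, -1):
        if a[-size:] == b[:size]:        # end of `first` == start of `second`
            return " ".join(a + b[size:])
    return None

print(before_and_after("Baby Ruth", "Ruth Bader Ginsburg"))
# -> Baby Ruth Bader Ginsburg
```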

You are right, though, that the human brain is simply running different wetware. Ken's prediction is kinda improbable because even if Watson loses, Moore's law continues to march on. In a couple of years it will be possible to build a system with an order of magnitude more neural connections and even faster searching/indexing ability. More CPU power will make language parsing more accurate, etc.
 
hootinholler Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 02:50 PM
Response to Reply #3
4. Well, by wetware I mean
that the hardware and the software are the same thing. The act of learning physically changes the programming by reconfiguring the hardware.

The real breakthrough will be when we can grow actual wetware rather than just bumping up digital CPU speeds. Faster CPUs help in simulating what we think is happening, but what is really happening is vastly different. Wetware is analog and holistic, not digital. I'm not aware of any working digital network that even approaches the level of synaptic response in the brain, let alone operates in a holistic manner.

There is interesting research using holograms as an image storage/query device. Holograms are stored in the device, and it is queried by putting in a new hologram; the output, returned almost instantaneously, is the closest stored hologram. I'm not sure what the state of this is these days; it was 15 years ago when I heard about it and thought, how cool is that?

-Hoot
 
Statistical Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 03:07 PM
Response to Reply #4
6. Reconfigurable "learning" hardware is likely some ways off.
Edited on Mon Jan-17-11 03:12 PM by Statistical
However, the sheer speed of digital processors does allow a very accurate "simulation". In essence, rather than building adaptable wetware, they are building a "wetware model": a model that is able to evolve, adapt, and learn. The model is then connected to large datastores, and huge amounts of parallel processing power are used to run it. As the model gets more complex, more hardware is added to keep the interaction at "real time".
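As a toy example of a software model that "learns" by adjusting its own parameters from data, here is a single perceptron trained on the AND function. This is an assumption for illustration only; Watson's components were statistical models trained on far larger data, not this toy.

```python
# Toy learning model: a single perceptron that adjusts its weights from
# labeled examples. Illustration only -- not the models IBM actually used.

def train_perceptron(samples, epochs=20, lr=0.1):
    w, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            err = target - out               # 0 when the prediction is right
            w = [w[0] + lr * err * x1, w[1] + lr * err * x2]
            bias += lr * err
    return w, bias

# Learn logical AND from four labeled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(data))
```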

Still, we really are at the Commodore 64 stage. Future applications are limitless. Someday we could have computers that teach students. Each lesson plan could be individualized, as the interactions between student and computer allow the computer to identify the areas where the student is having problems and the areas where further study is not necessary. The lessons would be targeted, letting the student learn at an optimal pace without going too fast or too slow, or spending dozens of boring hours on things the student already understands. The system could branch off into a tangential topic that the student shows interest or promise in, something that is impossible in a one-teacher-to-many-students model. It would use adaptive logic to teach each student in the manner that student learns best: some learn best by verbal interaction, some need more visual interaction, and some learn by experimenting while the system provides feedback, questions, etc.

Rather than boring standardized testing, the system could measure comprehension of topics in real time by asking probing questions.
 
Swede Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 03:04 PM
Response to Original message
5. You Can Play Jeopardy Against Watson
You Can Play Jeopardy Against Watson, IBM's Trivia-Master Supercomputer:

http://www.nytimes.com/interactive/2010/06/16/magazine/watson-trivia-game.html?ref=magazine
 
Lint Head Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 03:24 PM
Response to Original message
8. So, does that mean figuring out the world's critical problems is put on the back burner for Jeopardy? Ironic.
 
Statistical Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jan-17-11 03:30 PM
Response to Reply #8
10. Abstract research is always useful.
Edited on Mon Jan-17-11 03:38 PM by Statistical
The development of Deep Blue led to a better understanding of bottlenecks and unique performance issues in massively parallel systems. Those discoveries helped improve the supercomputing clusters used today to model everything from climate change to genome mapping to whole-earth simulations.
 