The Speculist asks "did the Singularity just happen on Jeopardy?" The answer is "no". Not that Watson isn't incredibly cool, but the Singularity will require self-improving super-human artificial intelligence.
"Self-improving" probably isn't quite the right term. I don't mean that the AI needs to "learn" in the sense that it can improve its score on Jeopardy each time it plays. That sort of improvement is necessary, but not sufficient. To reach the Singularity, an AI will need to be able to create new AIs that are qualitatively more capable than itself. Each new AI generation will need to have a broader domain of capabilities than the previous generation, not just be slightly better at accomplishing the same tasks.
Maybe "self-expanding" is better than "self-improving".