Friday, May 15, 2009

WolframAlpha Performance Degradation

Surprise, surprise! After the big wind-up, it turned out that WolframAlpha wasn't really ready for prime time. In an LA Times interview today, Stephen Wolfram, the creator of the site -- five years in the making -- sheepishly explained that a large-scale traffic simulation test had failed. Oops!

“We ran into a small snag, which hopefully won’t turn into a big snag," he said.

"We have several supercomputer-class compute clusters. One of our tests was to use one cluster to simulate traffic and run it against the other cluster. And when we did that last night, we found that the through-put we got degraded horribly when we increased the amount of traffic that we were pushing from one cluster to the other.

"So we don’t quite understand that, and that would very much degrade the through-put that we could get."


All of which raises the question: Why are you just now doing that level of testing, this close to launch day?

Anyway, the description sounds horribly familiar. Let's throw the USL, expressed in Wolfram's Mathematica, at WolframAlpha performance, shall we?
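A minimal sketch of that USL fit, in Python rather than Mathematica. The α and β values below are illustrative guesses chosen to reproduce the shape described next, not the actual fitted parameters:

```python
import math

# Universal Scalability Law: X(N) = N / (1 + alpha*(N-1) + beta*N*(N-1))
# alpha = contention (serialization), beta = coherency penalty.
# Illustrative values only: beta an order of magnitude worse than alpha.
ALPHA = 0.001
BETA = 0.01

def usl_throughput(n, alpha=ALPHA, beta=BETA):
    """Relative throughput at load N under the USL."""
    return n / (1 + alpha * (n - 1) + beta * n * (n - 1))

# The peak sits near N* = sqrt((1 - alpha) / beta); beyond it the
# coherency term dominates and throughput actively degrades.
n_star = math.sqrt((1 - ALPHA) / BETA)

if __name__ == "__main__":
    for n in (1, 5, 10, 20, 50):
        print(f"N={n:3d}  X(N)={usl_throughput(n):6.2f}")
    print(f"peak near N* = {n_star:.1f}")
```

With these numbers the curve peaks around N ≈ 10 and then falls off, which is the "degraded horribly under more traffic" signature from the quote above.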



In other words, the coherency penalty (β) is an order of magnitude worse than the level of contention (α), in this example, and that produces the severe peak in the throughput curve. See my previous post for more background. I'm not sure why this would be, because I would've expected the Wα workload to be read-intensive rather than write-intensive. Reads can be made highly parallel, if you know what you're doing. So, the coherency problem could be elsewhere in the system.

They claim it will be running in 5 data centers totaling 10,000 CPUs. That's only moderately Google-ish. It will be interesting to see what happens when it finally does go live. These things have a tendency to behave themselves, until you let the public get a hold of them.

4 comments:

  1. The launch didn't go as well as they had hoped. I imagine some investors are less than ecstatic.

    ReplyDelete
  2. Interesting point. Investors. Are there any? I have no idea what the financial situation is wrt backers, et al. Wolfram himself is not poor from MMA revenues and he surely has supported most of this development. He likes control. But I have been wondering, myself, where all this is going from a financial standpoint.

    A lot of what Wα does is based on the way symbolic computation works, but applied to more qualitative data. That's why it needs manual intervention ("curation") to prep all those data. Hence, Wα can probably do many of the things that MMA does. That overlap means a potential hit for future MMA revenue (or it might be viewed as a loss leader). MMA is also up against more and more challenges from similar tools coming out of the FOSS community.

    And will Wα be free, like Google? If so, then there's no revenue stream from Wα. Or does he view it as being more like The Britannica? If he tries to charge for some level (e.g., WolframAlpha Professional--à la the M$ model), that will be anathema to current conventions and trends for web tools. Still, it might not be much of a revenue spinner. Nobody seems to have discussed this aspect of Wα, as far as I know.

    ReplyDelete
  3. 1. I'm curious how you got the alpha and beta input parameters for your model.

    2. If you've followed me on the Internet the past few months, you know that I'm a big fan of Wolfram|Alpha. So I'm not at all sure I'm an "unbiased observer". I'm going to post a review on my web site some time this weekend.

    ReplyDelete
  4. All measurements were carried out here at Performance Dynamics Labs using the best equipment available under hermetically sealed conditions. The sign on the door says "Gedanken Experimentieren".

    I did query WolframAlpha "Why is your throughput degraded?" But eventually, after a very long time, it responded with: "I'm sorry Dave" (which is not my name, btw), "I'm afraid I can't do that... Wolfram|Alpha has temporarily exceeded its maximum test load." (which I knew already--that's why I asked). See http://twitpic.com/5bji0

    ReplyDelete