[Source: Wired magazine]
From another perspective, this is also the holy grail of computer performance analysis: convert monitored performance data directly into performance models, feed those predictions or trends back to the computer, and let it tune itself (so we can all go home).
Having spent more than a few years thinking about such things, I have to say I'm skeptical, to put it mildly. For one thing, you have to be super cautious when interpreting something that is so well understood (like Newton's laws and pendulums) that you may be relying on latent assumptions without realizing it. I would be more impressed if the computer program found a simple law for a phenomenon where none is obvious, e.g., turbulence or genomics. For another thing, how does this approach really differ from, say, well-known ARIMA modeling techniques for time series?
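For concreteness, the well-trodden alternative I have in mind looks something like the following. It's only a sketch: the synthetic series, the (p, d, q) order, and the use of Python's statsmodels are my illustrative choices, not anything taken from the article.

```python
# Sketch: fit an ARIMA model to a time series and extrapolate it.
# The data, model order, and library choice are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
# Synthetic "monitored performance data": a noisy series with drift.
series = np.cumsum(0.1 + rng.normal(scale=0.5, size=500))

# ARIMA(1,1,1): AR(1) and MA(1) terms on the first-differenced series.
fit = ARIMA(series, order=(1, 1, 1)).fit()
print(fit.forecast(steps=10))  # extrapolate the next 10 samples
```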
Keep in mind also that the supercomputer data came from a "double pendulum" and not a simple pendulum (whose periodicity was first noted by Galileo). Why the more complicated coupled pendulums? I won't digress into the subtle distinctions between the angular motion of a pendulum, Hooke's law, and Newton's 2nd law. Instead, I would draw your attention to the fact that their supercomputer program does not actually discover laws at all (in the sense of Newton's laws). Rather, it detects "invariants," which I take to mean constants of the motion. But this is nothing more than a fancy term for something like angular momentum.
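To make "constant of the motion" concrete, here is a minimal numerical sketch of what detecting such an invariant amounts to: integrate a simple (not double) pendulum and check that a candidate quantity, total energy in this case, shows essentially zero drift along the trajectory. The constants, integrator, and step size are all illustrative choices of mine, not details of the program described in the article.

```python
# Sketch: "detecting an invariant" numerically means finding a quantity
# whose time-derivative is zero along the trajectory. Here we integrate a
# simple pendulum and confirm that total energy barely drifts.
import numpy as np

g, l = 9.81, 1.0  # gravitational acceleration (m/s^2), pendulum length (m)

def deriv(state):
    """Equations of motion: d(theta)/dt = omega, d(omega)/dt = -(g/l) sin(theta)."""
    theta, omega = state
    return np.array([omega, -(g / l) * np.sin(theta)])

def rk4_step(state, dt):
    """One classical 4th-order Runge-Kutta step."""
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(state):
    """Total energy per unit mass: kinetic + potential."""
    theta, omega = state
    return 0.5 * (l * omega) ** 2 - g * l * np.cos(theta)

state = np.array([1.0, 0.0])  # initial angle (rad) and angular velocity
E0 = energy(state)
for _ in range(10_000):       # 10 seconds of simulated time
    state = rk4_step(state, 1e-3)

print(f"relative energy drift: {abs(energy(state) - E0) / abs(E0):.2e}")
```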
A spinning wheel conserves angular momentum (ignoring frictional effects), so angular momentum is said to be a constant of rotational motion, meaning simply that it doesn't change with time (it has zero time-derivative). Historically, this invariant was spotted by Kepler in Tycho Brahe's planetary data and enshrined in his beautiful second law of planetary motion: an orbiting planet sweeps out equal areas in equal times. So, roughly speaking, their claim is that their program could be capable (eventually) of gleaning Kepler's 2nd law from a bunch of planetary data. That's already a rather big claim. Moreover, just knowing Kepler's 2nd law is not sufficient to get the big picture. All 3 of Kepler's laws were critical stepping stones for Newton. So, even if we allow that a computer might have been able to discover Kepler's invariant, it's still a long way, both conceptually and historically, from that to Newton's universal law of gravitation.
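The link between those two statements deserves one more step. In polar coordinates, the rate at which an orbiting planet sweeps out area is tied directly to its angular momentum:

```latex
\frac{dA}{dt} = \tfrac{1}{2}\, r^2 \dot{\theta} = \frac{L}{2m},
\qquad L = m\, r^2 \dot{\theta} .
```

A central force like gravity exerts no torque, so dL/dt = 0, and equal areas are therefore swept out in equal times. In other words, Kepler's 2nd law is angular-momentum conservation in disguise, which is exactly the kind of zero-derivative invariant such a program could conceivably latch onto.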
Incidentally, even if Kepler had not lived and Newton had had a supercomputer with the above program installed, he wouldn't be much better off. What most people (including many physicists) fail to appreciate is that the key to Newton's laws, and especially his law of gravitation, is another invariant: the center-of-mass. But this kind of invariant is not a constant of the motion and therefore, in all likelihood, would not be uncovered by the supercomputer program. Without the center-of-mass concept, even Newton couldn't have gotten his act together.
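For reference, the center-of-mass is the mass-weighted average position

```latex
\mathbf{R} = \frac{1}{M}\sum_i m_i\,\mathbf{r}_i,
\qquad M = \sum_i m_i .
```

It is an organizing concept rather than a conserved quantity: R itself generally moves with time (for an isolated system it drifts at constant velocity), so there is no zero time-derivative for a data-driven invariant detector to find.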
So, something more than the detection of trivial invariants has to be demonstrated, and demonstrated more than once or twice. It's possible that the coupled pendulums fit that bill. I think there's no doubt that computer assistance of this type will come to pass; I'm just not convinced that we're there yet. And, as has become all too common on the web, small developments get overexposed and misunderstood.
All this reminds me of the similar hype that surrounded neural nets a while back. Quite unnoticed in those same circles was a paper by Oxford mathematical statistician Brian Ripley (he of S and R fame) pointing out that there were robust statistical techniques that could do better than NNs and were better understood.
Those who do not understand mathematical models are doomed to have computers rediscover them in the data.