It all depends on the standard of measurement. Intel chips (x86 in general) have a higher clock speed but equivalent or lower throughput compared to the PowerPC chip in a Mac of equivalent age (I'm not trying to skew the comparison). Roughly speaking, throughput is clock rate times instructions completed per cycle, so a higher clock alone doesn't settle it; see the quick sketch at the bottom of this post.

What ultimately matters is compatibility. You're starting to see it crop up in the consoles, and it's still a problem on the Macs, but what allowed x86 to flourish in the first place was not its architecture but the fact that a) it got there first (as a commercially produced 16-bit chip), and b) software and compilers were already written for it, and every later generation of the architecture has kept supporting them. If your processor changes its instruction set and all of its software has to be recompiled every time you upgrade, you can't carry forward the old stuff you may still want or need.

How many of us are still playing games that came out on the previous generation of Intel chips? I'd wager most of us. They still work (unless a change in Windows breaks them, which is a separate and largely unrelated issue). Apple doesn't always preserve that; for example, the new Mac OS will only run on processors made after a certain date, and the old one (9.x) will only run on processors made before a certain date. So compatibility is as important a measure in computing as any measure of speed, though few people realize it.
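
To put the clock-vs-throughput point in numbers: effective throughput is roughly clock rate multiplied by instructions per cycle (IPC). Here's a minimal sketch in Python with made-up figures; neither "chip" below stands for a real Intel or PowerPC part. The only point is that a lower-clocked design with higher IPC can come out ahead.

    # Illustrative only: throughput depends on IPC as well as clock speed.
    def throughput_mips(clock_mhz, ipc):
        """Millions of instructions completed per second."""
        return clock_mhz * ipc

    chip_a = throughput_mips(clock_mhz=1000, ipc=1.0)  # higher clock, lower IPC
    chip_b = throughput_mips(clock_mhz=700, ipc=1.5)   # lower clock, higher IPC

    print(f"chip A: {chip_a:.0f} MIPS, chip B: {chip_b:.0f} MIPS")
    # prints: chip A: 1000 MIPS, chip B: 1050 MIPS -> the slower-clocked chip wins

So "which is faster" really does depend on the standard of measurement, which was the original point.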