Neil McAllister over at Developer World posted an interesting article regarding Apple’s forthcoming Grand Central technology. It is a great article (a little dense if you are not a hardcore techie), but I wonder whether his conclusions are right. Here are my thoughts:
First, to summarize the technology at hand, as well as McAllister’s points. Grand Central, in Apple’s own words, is a technology that makes the whole of Mac OS X multi-core aware, and provides frameworks that let third-party developers optimize their code for multi-core systems. In plain English: chips aren’t necessarily getting faster, but their processing capacity is expanding, and programs need new code before they can take advantage of the extra power. McAllister points out that just because multi-core awareness is baked in, there won’t necessarily be any increase in speed for most users. And even if there were, most users don’t need the extra power. He is right—to a point. Several major pieces of software, like Adobe Photoshop, QuarkXPress, or Apple’s own Final Cut Pro and Shake, would definitely benefit from all the power they can possibly get. But they won’t get it until those packages are rewritten or tweaked to take advantage of the frameworks Grand Central opens up. Add to that the fact that the majority of users never touch these types of programs, and you arrive at his conclusion that Apple may be barking up the wrong tree with this technology.
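Apple has not published the details of Grand Central’s API, so purely as an illustration, here is the kind of data-parallel pattern such a framework exposes, sketched in Python with the standard concurrent.futures module. The `sharpen` function is a hypothetical stand-in for a CPU-heavy filter in something like Photoshop, and the numbers are made up:

```python
from concurrent.futures import ProcessPoolExecutor

def sharpen(tile):
    # Hypothetical stand-in for a CPU-heavy image filter
    # applied to one tile of a large image.
    return [pixel * 2 for pixel in tile]

def sharpen_image(tiles):
    # Serial version: one core does all the work, no matter
    # how many cores the machine has.
    return [sharpen(t) for t in tiles]

def sharpen_image_parallel(tiles):
    # Multi-core version: the pool farms tiles out to worker
    # processes, one per available core -- the kind of
    # scheduling a Grand Central-style framework automates.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(sharpen, tiles))

if __name__ == "__main__":
    tiles = [[1, 2, 3], [4, 5, 6]]
    assert sharpen_image(tiles) == sharpen_image_parallel(tiles)
```

The two versions produce identical output, which is exactly McAllister’s point: the extra cores buy nothing until an application is rewritten to use the second form instead of the first.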
McAllister frames an argument (quite well, in fact) that the average user who sends e-mail, banks online, and uses Facebook really does not need that much power. Let’s face it: Pixar, Industrial Light and Magic, or Dreamworks might need ungodly amounts of power to make movie deadlines. Joe Ordinary user, by contrast, is likely to be satisfied with the computational capabilities of the iPhone. It can access Facebook, YouTube, and e-mail. So what on earth could that same user want with a massively powerful, multi-core-aware Mac OS X? The answer is simple, and it is here that McAllister misses a point: the software world average users inhabit is no longer the workstation world it used to be. Many applications are really web apps nowadays—Facebook’s photo album features today look a lot like iPhoto did when it debuted back in 2001. And Facebook’s code has many more potential environments to run in, whereas Apple could develop and optimize iPhoto for a very narrow range of hardware. This means that Facebook must either write one sub-optimal-but-universal codebase or maintain separate code bases for a variety of devices: iPhones, PCs, mobile devices (anybody remember WAP?), Macs (hey, our Flash player is different!).
I have a five-year-old PowerBook G4 that still runs like a champ, but it can barely handle YouTube videos. Videos in high quality or with DRM (like abc.com’s full-episode player) stutter and drop frames at a rate that makes them unwatchable, and yet I demand that services like YouTube or Hulu constantly give me richer programming. That means more code, which in turn means I need more power to watch it. It also means richer client-side applications (Flash Player, DRM decoders for ABC’s episode player, QuickTime H.264 codecs for YouTube videos). In reality, a handful of programs and technologies affect a large number of users; the 80/20 rule at work. As soon as those programs are rewritten to take advantage of Grand Central, Apple’s technology move will have paid off for average users.
Think about it: a 1GHz processor is fast enough for basic e-mail and web surfing, but completely inadequate for watching HD movies. Apple could design a laptop around a quad-core 1GHz processor that uses only one core during basic tasks but ramps up the power as needed (challenging, but not impossible, since the OS would be handling the multi-core scheduling). Can we say power savings? And the benefits of having the most advanced operating system would be great for power users, too, so Apple wins in both arenas. Back to Joe Ordinary user—he may not realize that he wants that power available, but as Web 2.0 turns into Web 3.0, he’s going to be glad he has it.
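To put rough numbers on that intuition, Amdahl’s law gives the best-case speedup when only part of a workload can be parallelized. The percentages below are made-up illustrations, not measurements of any real application:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial fraction of a program caps its
    # speedup no matter how many cores are available.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# E-mail and web surfing are almost entirely serial, so four
# 1GHz cores buy essentially nothing (hypothetical 5% parallel).
print(round(amdahl_speedup(0.05, 4), 2))  # 1.04

# HD video decoding is largely parallelizable across frames or
# slices (hypothetical 90% parallel), so the same four cores
# get much closer to the full 4x.
print(round(amdahl_speedup(0.90, 4), 2))  # 3.08
```

This is why the quad-core 1GHz design makes sense: the mostly serial tasks run happily on one core while the others sleep, and the parallel-friendly tasks get nearly the full benefit when they need it.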
And that is why Grand Central, though it may be overkill in some instances, really is not Apple barking up the wrong tree. It is about Apple seeing where the future is going, and hedging its bets with more options.