TechnologyTell

Apple’s Grand Central: Exactly what consumers need!



Neil McAllister over at Developer World posted an interesting article regarding Apple’s forthcoming Grand Central technology. It is a great read (a little dense if you are not a hardcore techie), but I wonder whether his conclusions are right. Here are my thoughts:

First, to summarize the technology at hand, as well as McAllister’s points. Grand Central, in Apple’s own words, is a technology that makes the whole of Mac OS X multi-core aware, and provides frameworks that let third-party developers optimize their code for multi-core systems. In plain English: chips aren’t necessarily getting faster, but their processing capacity is expanding, and taking advantage of that extra capacity requires new code. McAllister points out that just because multi-core awareness is baked in, there won’t necessarily be any speed increase for most users. And even if there were, most users don’t need the extra power. He is right, to a point. Several major pieces of software, like Adobe Photoshop, QuarkXPress, or Apple’s own Final Cut Pro and Shake, would definitely benefit from all the power they can possibly get. But they won’t get that power until those packages are rewritten or tweaked to take advantage of the frameworks Grand Central opens up. Add the fact that the majority of users never touch these types of programs, and you arrive at his conclusion that Apple may be barking up the wrong tree with this technology.
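For the curious, here is a rough sketch of what “opening up a framework” might look like in practice. It assumes the block-based queue API Apple has been previewing for Grand Central (names like dispatch_group_async are drawn from those previews and could change before release), and apply_filter is a made-up stand-in for real work:

```c
#include <dispatch/dispatch.h> /* Grand Central's C interface, per Apple's previews */
#include <stdio.h>

/* Hypothetical stand-in for an expensive, self-contained chunk of work,
   e.g. one Photoshop-style image filter. */
static void apply_filter(void) {
    /* ... crunch pixels ... */
}

int main(void) {
    /* Ask the system for a shared concurrent queue. The OS, not the app,
       decides how many cores to throw at it and when. */
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* A group lets us wait for the work we hand off. The ^{ } syntax is
       Apple's "blocks" extension to C: a closure capturing the work. */
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_async(group, q, ^{
        apply_filter();
    });

    /* Block until everything queued in the group has finished. */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    printf("filter done\n");
    return 0;
}
```

The interesting part is what’s missing: no thread creation, no locks, no counting cores. The app describes units of work; Grand Central schedules them.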

McAllister frames an argument (quite well, in fact) that the average user who sends e-mail, banks online, and uses Facebook really does not need that much power. Let’s face it: Pixar, Industrial Light and Magic, or DreamWorks might need ungodly amounts of power to make movie deadlines. Joe Ordinary, by contrast, is likely to be satisfied with the computational capabilities of the iPhone. It can access Facebook, YouTube, and e-mail. So what on earth could that same user want with a massively powerful, multi-core-aware Mac OS X? The answer is simple, and it is here that McAllister has missed a point: the software world average users inhabit is not the workstation world it used to be. Many applications are really web apps nowadays; Facebook’s photo album features today look a lot like iPhoto did when it debuted back in 2001. But Facebook’s code has far more potential environments to run in, whereas Apple could develop and optimize iPhoto for a very narrow range of hardware. That means Facebook must either write a sub-optimal-but-universal codebase or maintain separate codebases for a variety of devices: iPhones, PCs, mobile devices (anybody remember WAP?), Macs (hey, our Flash player is different!).

I have a five-year-old PowerBook G4 that still runs like a champ, but it can barely handle YouTube videos. Vids in high quality or with DRM (like abc.com’s full-episode player) stutter and drop frames at a rate that makes them unwatchable, and yet I demand that services like YouTube or Hulu constantly give me richer programming. Richer programming means richer client-side applications (Flash Player, DRM decoders for ABC’s episode player, QuickTime H.264 codecs for YouTube videos), which means more code, which in turn means I need more power to watch it all. In reality, a handful of programs and technologies affect a large number of users: the 80/20 rule at work. As soon as those programs are rewritten to take advantage of Grand Central, Apple’s technology move will have paid off for average users.

Think about it: a 1GHz processor is fast enough for basic e-mail and web surfing, but completely inadequate for watching HD movies. Apple could design a laptop around a quad-core 1GHz processor that uses only one core during basic tasks but ramps up the power as needed (with the OS handling the multi-core scheduling, that would be challenging, but not impossible). Can we say power savings? And having the most advanced operating system would be great for power users, too, so Apple wins in both arenas. Back to Joe Ordinary: he may not realize that he wants that power available, but as Web 2.0 turns into Web 3.0, he is going to be glad he has it.
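That “ramp up as needed” behavior is exactly what a system-scheduled parallel loop would buy. Here is a sketch of what per-frame video work could look like under that model, again assuming the dispatch_apply-style call from Apple’s Grand Central previews; process_frame is hypothetical (think per-frame scaling or filtering, where frames are independent):

```c
#include <dispatch/dispatch.h>
#include <stddef.h>

/* Hypothetical per-frame work: scaling or filtering a single frame.
   Each frame is independent, so iterations can safely run in parallel. */
extern void process_frame(size_t frame_index);

void process_clip(size_t frame_count) {
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* Runs the block frame_count times, fanned out across however many
       cores the OS decides to wake: all four on an idle quad-core,
       maybe just one when the machine is busy or conserving power. */
    dispatch_apply(frame_count, q, ^(size_t i) {
        process_frame(i);
    });
    /* dispatch_apply returns only after every iteration has completed. */
}
```

Note that the core count never appears in the code; that is the whole bet. The same binary idles along on one core for light work and spreads out on bigger iron, which is where the power-savings angle comes from.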

And that is why Grand Central, though it may be overkill in some instances, really is not Apple barking up the wrong tree. It is about Apple seeing where the future is going, and hedging its bets with more options.

  • Scopie

Grand Central will also allow quite powerful applications to run on future very low-power multi-core ARM chips in smaller Apple hardware devices.

  • Ampar

    The word is English.

  • Snafu

As far as I understand, Grand Central is some sort of batch-job manager plus an API to help applications define such jobs. In that regard, my guess is this is Apple's Cocoa equivalent of the multithreaded programming kits from Intel, Microsoft, and others.

So this is about the OS doing better load balancing and prioritizing special cases, and making programmers' lives that bit easier. I wish someone could say how much easier: making multithreaded apps seems to be quite difficult. Getting parallelism out of a traditionally serialized app is a whole field of study with decades of research behind it and little progress at generalizing results… as oftentimes it is impossible.

There are cases in which it is actually quite possible: Google Chrome is one. Every webpage one loads in that browser is a separate task. Most multi-document, processor-intensive apps would benefit from such an approach.

I don't think Grand Central is going to reveal itself as a game-changer: it'll make things run smoother, but not far faster. My guess is OpenCL is where we'll see ostensible results: even in serialized apps, at some stages a GPGPU quickie could do wonders.

Could any developer hint at what's coming without breaking any NDA? Is it going to be great, fine, blah?

  • Cliff

McAllister's argument is built on the premise that no radically new type of software will hit the market. In fact, it's a sure thing that someone will think of an entirely new category of software. Who predicted web browsers? Who predicted search engines? Who predicted YouTube? Who predicted social networks?

    Whatever is around the corner will undoubtedly require more processing power than single processors can now deliver.

    Perhaps OS XI will not be touch-based but voice-based. Reliable voice recognition software controlling the rest of the computer, running concurrently with video conferencing, web browsing, number crunching, and database operation will require that multiple cores work harmoniously.

    Apple is working to future-proof its computers even though the specifics of the future are always unknown.