What's wrong with this picture?
As if it wasn't strange enough that Apple are releasing a two-button mouse, they actually faked an animation to show what the scroll-ball does:
Notice how the document view moves in the same direction as the scrollbar thumb? That never happens in iMovie.
Anyway, I just thought I'd write about how odd it is that Apple are taking a step away from the single-button user interface at just the moment when it looks like the tablet market might be about to open up - a class of device which almost by definition will be operated by a single-click pointer. And I'm surprised they claim it has the elegance and simplicity of a one-button mouse:
"Click on the left side to use Mighty Mouse in its simplest, single-button form."
Yeah! Good one! My seventeen button super hyper pointing device is as simple as your mouse because I can choose to press only one button on it! I can see it now...
"Click on that icon."
(click - a contextual menu pops up)
"No, I meant left-click."
"But there's only one button!"
A mouse which has two buttons but looks as though it has only one is advertised as easier to understand than an ordinary multi-button mouse with a clear distinction between its buttons? What are they thinking?! I'm glad I'm not in the tech support business!
Intel inside?
Everyone and their dog seems to be blogging about Apple's recently announced switch to Intel processors, so I thought I would chime in as well.
It's the user interface which makes a Mac a Mac, and that's all software, so why do I care what type of chip drives the brains of the computer? It should make no difference to the end-user look and feel one way or the other. But anything which affects developers will affect users too, because it's the applications which really drive the platform.
And it is here we come to our first hiccough in the take-up of Mac on Intel: do Apple think every developer will give away the Universal Binary version of their application for free? What about things like Office - I have a perfectly good copy of Office v.X; if I buy an IntelMac, do I get an upgrade to Office 2004 for free? Clearly not. Will Microsoft provide a Universal Binary patch for slightly-obsolete versions of Office? I very much doubt it.
There's Rosetta, of course - Apple's dynamic binary translation technology, which will allow PowerPC applications to run on the Intel CPU. They say it's "Fast (enough)", but I don't believe Rosetta will be sufficient for anything even vaguely heavy-duty. Eventually you just hit the limits of physics.
And then there's the whole issue of producing a Universal Binary version in the first place, which apparently is very easy so long as you use Xcode as your development environment. According to Apple's stats, only 56% of developers are currently doing that - presumably the rest are mostly using Metrowerks CodeWarrior (Metrowerks' stats say that 90% of commercial Mac applications are built with CodeWarrior tools). Have Apple considered the likely reasons for this? They might include:
- The gcc compiler on which Xcode is based still doesn't produce code as well optimised as CodeWarrior's.
- CodeWarrior runs compile jobs faster and in less memory, and its IDE is much less clumsy than Xcode's (especially when running on slower hardware).
- CodeWarrior also supports MacOS 9 and earlier. (Depending on exactly which compiler plug-ins you install, it even still supports 68k.)
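For what it's worth, the mechanics of a Universal Binary build outside the IDE aren't mysterious - here's a sketch, assuming Apple's gcc with both architectures' SDKs installed (the file names are made up for illustration):

```shell
# Compile the same source once per architecture, then glue the
# results into a single "fat" Mach-O binary with lipo.
gcc -arch ppc  -o hello_ppc  hello.c
gcc -arch i386 -o hello_i386 hello.c
lipo -create hello_ppc hello_i386 -output hello

# "lipo -info hello" should then report both architectures.
```

(Apple's driver also accepts both flags at once - `gcc -arch ppc -arch i386` - which is essentially what Xcode does behind the scenes.)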
Steve says: "Metrowerks users? You should switch to Xcode," without really providing a convincing answer to any of these points. I would have thought Metrowerks deserved better, after supporting the Macintosh for the last decade or so, than to have Steve Jobs use a WWDC keynote to hammer nails into their coffin when it isn't even closed yet.
On a different note, how will Apple hardware remain distinctive in the marketplace? When they were so clearly different machines, with a different CPU, a higher price tag could be explained by R&D costs. But now, will consumers accept that the Apple Pentium box is more expensive than another PC containing the same CPU, even if the rest of the board is very different? They might as well shove an ASUS motherboard in for all the difference it will make to the marketing. The fact is that there will be an obvious and direct comparison between the prices of Macs and PCs - a comparison which will make Apple look bad, because it was never just the processor that made the prices higher in the first place.
And yet, it's now abundantly clear that MacOS X could be made to run on standard PC hardware; it's just that Apple choose not to release it. You can download for free, and run on a standard PC, a fully working version of Darwin - MacOS X's core internals: the kernel of the operating system and the BSD layer. And it runs today. Bringing up the window server and GUI applications on top of that base is probably going to be pretty trivial unless Apple go out of their way to block it. Unless, that is, the final hardware line-up contains something other than the bog-standard Pentium 4 which is in the developer transition kits...
This possibility interests me. My considered professional opinion (I work in the semiconductor industry, although not in the desktop CPU space) is that the ia32 instruction set is a crufty piece of legacy poo, which almost everybody in the world of serious computing is trying to ditch as soon as humanly possible. The main reason they can't is that it's so important to be able to run legacy applications: when Apple, by definition, have no legacy applications on the Intel platform, why on earth would they at this point want to give further traction to such a dementedly outdated architecture? I can only hope that Apple intend to use Itanium, not Pentium 4, processors. Or AMD's 64-bit processors, even - although that seems unlikely given the nature of the announcement: a fait accompli based entirely on the perception that Intel's opus of speculative fiction (its roadmap) was better than anyone else's. But realistically it's far from clear that Intel will forever be able to hit a price/power/performance point that beats the rest of the industry.
And, whilst this move means that the performance of Apple computers will now keep pace with Dell's, it also means they've given up any future possibility of ever being faster. Interesting times indeed; I hope Steve knows what he's doing.
(8 June 2005)
Odd warning labels
Spotted on a 9-inch Flying Disc (Frisbee-like thing) in Tesco's:
WARNING! Not suitable for children under the age of 36 months because of small parts - choking hazard
What kind of baby...?
Chirality in unexpected places
Recently I discovered you can get right-handed shower gel.
You don't believe me? The bottle is shaped to be held from one side, but if you hold it in your left hand the cap gets in the way when you try to pour the gel out.
Also, the latest Fairy washing up liquid bottle is so asymmetric you can't hold it properly in your left hand.
What surprises me is that people design products to be gratuitously unfriendly to a minority - and left-handed people aren't that small a minority: about 10% of the U.K. population. I'd be interested to hear whether any left-handed people have noticed other products doing this, especially if anyone has actually bought a competing brand because of this right-handed styling. I can't imagine most company shareholders being too pleased with a marketing department which turns away 10% of its potential audience!
Hair today...
... gone tomorrow.
But sheariously, I'll shave the wooly description and cut to the chase. The "computer-scientist / metal guitarist" look was starting to get a bit old, been there, done that, and anyway it's a pain to keep long hair in good condition (high manetenance, so to speak), so I've got rid of it.
On the one hand I probably won't be playing Samson again any time soon, and Alex will have to find another witty nickname for me when he's talking to his sister...
But as a fringe benefit, I've lost weight! (117g)
Steps towards improved user-friendliness in computers
Reports suggest that a future version of Windows will improve the users' day-to-day experiences by incorporating a new, more friendly blue screen.
Master Chief goes to the movies
For immediate release:
In an exclusive interview today, Steve Jobs revealed that for the past ten years he has been leading an exciting double life. For not only is he the iCEO of Apple Computer, he is also the (allegedly criminal) genius, Doctor Evil.
"How else would I have been able to build a reality distortion field?" he asks. "I'm surprised no one thought it was obvious."
But over the last several years, Steve has grown tired of keeping his alter ego a secret, and now wants the world to know his identity.
"I tried to give people clues," he says. "You know, so people could think they're clever to work it all out." But none of the rumour sites actually seemed to notice. "Next time I'll just fax them an anonymous press release the day before the keynote."
Those clues are more obvious in hindsight, of course. One such detail, the announcement at MacWorld San Francisco 2003 of a cloned PowerBook, was missed by all but the most astute observers. "I'll call it the mini PowerBook G4," he announced in his keynote speech. "It's identical to a regular PowerBook G4 in every way, but five eighths the size."
"I actually wanted my own clone, Mini Me, to appear in the advert," recalls Jobs. "But back then he was still a bit concerned about revealing his secret identity as Hollywood actor Jake Lloyd. Then of course, Number Two had the idea of asking Verne Troyer to appear, and the rest is history."
Troyer played the role of Mini-Me in the Austin Powers series of documentary films, dramatised by and starring Mike Myers. "It's odd how people seem to think those films are as fabricated as a Microsoft switcher advert," says Myers. "I guess truth is stranger than fiction."
Bill Gates was not available for comment today. None of our sources were able to confirm conclusively that he lives a double life as unfashionable stuck-in-the-sixties spy Austin Powers, but let's face it - the teeth are a bit of a giveaway.