Interesting People mailing list archives

Re: Apple's Spat With Google Is Getting Personal


From: David Farber <dave () farber net>
Date: Wed, 17 Mar 2010 05:20:41 -0400



Begin forwarded message:

From: Gene Spafford <spaf () cerias purdue edu>
Date: March 16, 2010 8:52:16 PM EDT
To: Jim Warren <jwarren () well com>
Cc: dave () farber net
Subject: Re: [IP] Re: Apple's Spat With Google Is Getting Personal

Jim, perhaps you're right. You certainly make some good points. But I remain unconvinced.

Apple doesn't have the complete monopoly people claim.  For instance, everyone complained about the iTunes format and 
the iPods.  Yet they played several open formats just fine and still do.

Apple computers use standard interconnects for devices, and Apple even helped define some of them (e.g., FireWire).  
Again, not a huge deal.  Much of the underlying system source (the Darwin kernel and core utilities) is open source, 
with proprietary layers on top.  "Open" has a range of meanings and instantiations.

I liken it to my BMW.  I have no real desire to mod the engine because it works so well as is.   I hardly ever even 
bother to pop the hood, other than to add windshield wiper fluid.  Are there a few engineers who might want to tinker 
with one to see what happens?  Sure, but they could buy something a lot cheaper and less functional to experiment on.   
Is BMW doomed because they keep a tight lid on some of the mods and accessories to their cars?   I don't think so.

My TVs work without being "open" to all changes.  My clocks do.  So does my oven.  My stereo is a closed system with 
well-defined interfaces, and it works the way I want, and I can hook components together into something close enough 
to exactly what I want.

When computers were only sold to those of us who liked to tinker, and some of the features and documentation were less 
than wonderful, it was really necessary to have access.  The same was true in the early days of cars.  You had to be an 
expert mechanic and tinkerer, and if you knew of some ways to add nonstandard parts you could do some nifty mods.  But 
the proportion of owners who needed to do that, or even wanted to, has decreased.   And I remember the days when I 
would do some of my own TV & radio repair by taking the tubes to a test-it-yourself machine at the drugstore!   For 
years I kept one of those TVs that I had rebuilt and modified, and continued to gripe about not being able to 
find a tube tester anymore.... Now I wouldn't even know how to begin testing the TVs I have.    But it really isn't a 
big deal now for consumers, and no major TV vendor is threatened with extinction because owners can't replace the tubes 
or change the tuner circuitry themselves.

Computing is changing too.  The hardware of some systems doesn't need a lot of tinkering to perform well, and as form 
factors have changed some can't easily be taken apart to mod at all.   The software is also trending in that direction 
-- how many people still need to know how to program in assembler vs. Java, HTML, and beyond?  We used to require that 
all students learn to program in assembler; now it is an elective for most paths through the major.   Thirty 
years ago, numerical analysis was required of all our students because they needed to know how to write and maintain 
numerically stable computations.  Now, if those aren't already in the tools they use, they find the libraries they need 
on the Web.  We need far fewer people trained in that kind of programming, and I would guess that the majority of CS 
majors graduating this year don't even realize there could be a problem!
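(To make that concrete, here is a small sketch of my own, not drawn from any particular curriculum: the textbook 
quadratic formula is the classic example.  Applied literally in floating point, it loses nearly all precision in one 
root when b^2 is much larger than 4ac; the standard fix is to compute the larger-magnitude root first and recover the 
other from the product of the roots, c/a.)

```python
import math

def quadratic_naive(a, b, c):
    """The textbook formula: correct on paper, but when b*b >> 4*a*c
    one root suffers catastrophic cancellation (-b and sqrt(d) nearly
    cancel, wiping out most significant digits)."""
    d = math.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def quadratic_stable(a, b, c):
    """The stable rewrite: add sqrt(d) to b with matching sign (no
    cancellation), then recover the small root from c/q, since the
    product of the roots equals c/a."""
    d = math.sqrt(b * b - 4 * a * c)
    q = -0.5 * (b + math.copysign(d, b))
    return q / a, c / q

# x^2 - 1e8*x + 1 = 0 has roots very close to 1e8 and 1e-8.
print(quadratic_naive(1.0, -1e8, 1.0))   # small root comes out badly wrong
print(quadratic_stable(1.0, -1e8, 1.0))  # both roots accurate
```

Any off-the-shelf library gets this right, which is exactly the point: the knowledge is now packaged in the tools.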

This is a standard shift in technology literacy as technology matures and becomes more stable -- as the needs of 
customers become better understood and they move on to needs and interests at a higher level of abstraction.   We have
quite a ways to go in computing, but the pressure to "get under the hood" is shifting to other areas.    It is the 
whole idea behind things like Chrome and building on top of WWW services.  It is a force behind virtualization and 
cloud computing. So long as the stuff "underneath" works, the big issue is what can be developed "on top."

Thus, it is not surprising that older companies such as TI (and Burroughs and Xerox and....) fell by the wayside by 
being too closed when the field was younger.  (There were other reasons, too.)  

Are we at a point where Apple can get away with the amount of control they have been exercising?   I dunno, but I am 
not as certain as you are that they cannot, especially if they continue to be willing to innovate.

That, by the way, is a problem with Microsoft.  They have this incredible property that everything is built on and 
around: Windows.  The problem is, they can enhance it for only so long.   The only way to succeed is to come up with an 
independent successor (they have to kill the Buddha by the side of the road, as it were). It must be the things that 
Windows is not and never has been:  small, secure, minimal, easily customized for different devices and 
environments, etc.   But few big companies can enthusiastically build competitors for their own products and make them 
succeed.  

So, bottom line, I am still not convinced either way.  I agree that recent history supports your argument, but 
computing as we've been doing it is so new that it isn't clear our own field's history is the best guide.

Regards,
--spaf





-------------------------------------------
Archives: https://www.listbox.com/member/archive/247/=now
RSS Feed: https://www.listbox.com/member/archive/rss/247/
Powered by Listbox: http://www.listbox.com

