[BBLISA] "How to Really Scare Microsoft"
Tom Metro
tmetro+bblisa at vl.com
Thu Nov 10 11:56:22 EST 2005
Some comments on Marcus Ranum's "How to Really Scare Microsoft" talk...
But first I'd like to point out a blog entry by Stephen Walli
(Vice President, Open Source Development Strategy at Optaros, Inc., a
Boston-area open source support company) titled "MSFT will not be
Trading in Ten Years":
http://stephesblog.blogs.com/my_weblog/2005/11/msft_will_not_b.html
The article nicely dovetails with Marcus's talk, though it offers an
alternate path to the same conclusion. It was prompted by the recent
debate over the Open Document Format here in Massachusetts. It starts
with the premise that Linux vs. Windows is the wrong battle, and not
where Microsoft has its greatest weakness. Instead, it points out that
transitioning to OpenOffice on Windows has far fewer barriers, and that
it hits Microsoft in ways that will be harder for it to defend against.
Here are some key paragraphs:
In the Innovator's Dilemma, Clayton Christensen talks about
over-delivering on functionality, but in the Innovator's Solution he
gets into what actually happens in the market at that point. When an
incumbent starts over-delivering, and customers can't absorb the new
innovation fast enough (and therefore are unwilling to pay for it),
a call for standardization comes from the marketplace.
[...]
The standard has happened. It's OASIS Open Document Format (ODF).
[...]
Microsoft will react in very predictable ways. ... First, they will
tell the sales force to "not lose to OpenOffice" because they
believe it's about innovation and marketing, not value and solution.
They may even tweak compensation models to enable the sales teams
to rapidly discount around the OpenOffice experiments. This will
accelerate their problem.
[...]
Second, they will continue to hammer away at the message that it's
about innovation, and tightly integrated innovation at that. But
the problem is that they've already innovated past the needs of most
of their users in Office. Delivering more innovation exacerbates
the problem.
[...]
Microsoft Office represents a considerable amount of their revenue
stream. It won't take many OpenOffice experiments (15%?) to impact
that revenue stream. So there will be a down quarter or two and
Wall Street will punish Microsoft through its stock price.
Microsoft will have to behave differently: they won't be able to hire,
compensate, and retain staff the same way. They won't be able to
execute in the same way.
It goes on to discuss how Microsoft's cash reserve won't necessarily
save them.
Now back to the talk...
The concept Marcus described seemed very similar to something far
closer to the present than "Project Athena"[1], namely network computers:
http://en.wikipedia.org/wiki/Network_computer
A network computer is a lightweight computer system that operates
exclusively via a network connection. ... it runs applications off
the network ... During the mid to late 1990s, many commentators, and
certain industry players such as Larry Ellison, predicted that the
network computer would soon take over from desktop PCs, and everyone
would use applications over the internet instead of having to own a
local copy. So far, this has not happened, and it seems that the
network computer "buzz" was either a fad or not ready to happen.
So is Marcus's concept substantially different? Is it less about the
superficial similarities in the hardware/software and more about the
different business model Marcus proposes? Or is the world now ready for
network computing?
Part of the problem is that it isn't obvious why network computing
failed to take hold. My theory is that it arrived on the scene at a time
when the industry was hitting a peak in the complexity of the desktop
environment. If you ignore the increasing complexity of the OS
(Microsoft integrating more "features") and the addition of anti-virus
and anti-spyware tools, I'd say that since the late '90s desktops
probably have fewer custom applications installed on them, as corporate
apps have migrated to the web, leaving desktops with a fairly
standardized complement of applications: an office suite, an email
client, and a web browser.
So perhaps it was a lack of flexibility that killed the idea of network
computers in the late '90s, and perhaps the marketplace has changed
enough that this is no longer a fatal flaw.
Marcus describes an OS built from the ground up for this purpose,
eliminating the mess of supporting vast variations in hardware. But he
didn't really address why eliminating that complexity was beneficial.
He seemed to imply that the ability to write to the "bare metal" would
offer performance benefits. But is that really core to this idea? Is
performance a problem on the typical desktop?
If stability is the concern, wouldn't you get equivalent gains by simply
limiting the scope of supported hardware and qualifying the drivers
used? As a friend pointed out after the talk, the extra layer of
abstraction a driver model provides is of great benefit to programmers
and to the maintainability of the code.
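To make that point concrete, here's a minimal sketch (Python, purely
illustrative; the interface and driver names are invented) of how a
stable driver interface decouples the rest of the system from hardware
specifics:

    from abc import ABC, abstractmethod

    class BlockDevice(ABC):
        """Stable interface the OS exposes; filesystem and
        application code is written against this, not the hardware."""

        @abstractmethod
        def read_block(self, lba: int) -> bytes: ...

        @abstractmethod
        def write_block(self, lba: int, data: bytes) -> None: ...

    class QualifiedATADriver(BlockDevice):
        """One of a small, tested set of drivers. Only this layer
        knows about device registers and hardware quirks."""

        def read_block(self, lba: int) -> bytes:
            # Hardware-specific access would live here.
            return b"\x00" * 512

        def write_block(self, lba: int, data: bytes) -> None:
            pass  # Hardware-specific access would live here.

    def copy_block(src: BlockDevice, dst: BlockDevice, lba: int) -> None:
        # Unchanged no matter which qualified driver is plugged in.
        dst.write_block(lba, src.read_block(lba))

Writing to the bare metal collapses that boundary; qualifying a small
set of drivers behind a stable interface keeps everything above it
unchanged.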
There were a bunch of other implementation details that Marcus mentioned
that seemed like improvements over existing technology, but not
necessarily requirements for making his high-level idea work.
If the objective is getting system administration of desktops as close
to zero as possible, how closely could this concept be approximated by
simply using a CD (or DVD) bootable operating system? This could easily
be done today using existing bootable Linux distributions (such as
Ubuntu[2], for example) that provide a friendly, mostly self-configuring
desktop environment. Any remaining configuration required could be
eliminated by tailoring the distribution to the specific, limited
set of hardware in use. Add to that a standard convention for mounting
network storage for data and applications, as well as the storage of
personal settings and PKI keys on a USB drive, and you're pretty much
there. It isn't 100% the same, but would this 80% solution have the same
business impact?
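As a rough sketch of the glue such a setup would need, assuming an NFS
file server and a pre-mounted USB drive (the server name and all paths
below are invented for illustration; a real distribution would bake
this into its init scripts):

    import subprocess
    from pathlib import Path

    # Hypothetical values -- a deployment would preconfigure or
    # discover these.
    NFS_EXPORT = "fileserver:/export/home"
    HOME_MOUNT = "/mnt/home"
    USB_MOUNT = "/mnt/usb"  # assume the USB drive is mounted here

    def mount_network_home() -> None:
        """Mount the user's data and applications from the LAN
        file server."""
        Path(HOME_MOUNT).mkdir(parents=True, exist_ok=True)
        subprocess.run(
            ["mount", "-t", "nfs", NFS_EXPORT, HOME_MOUNT],
            check=True)

    def load_personal_settings() -> None:
        """Copy personal settings and PKI keys from the USB drive
        into the home directory for this session."""
        for name in ("settings.tar.gz", "pki-keys"):
            src = Path(USB_MOUNT) / name
            if src.exists():
                subprocess.run(
                    ["cp", "-r", str(src), HOME_MOUNT],
                    check=True)

    if __name__ == "__main__":
        mount_network_home()
        load_personal_settings()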
(There are CD bootable versions of Windows as well[3], but it would be
far more of an uphill battle to make that work, not to mention the
licensing complications.)
Marcus hinges his concept on having a big name to promote the idea. And
while that is likely a requirement to make this happen quickly, if the
benefits provided by this model are tangible, shouldn't it also work on
a small scale? Consider a regional VAR selling CD bootable desktops to
medium-sized businesses, bundled with a companion file server. Sure, the
subscription software model doesn't really fly if you don't get industry
buy-in, but if most desktops are only running Linux, OpenOffice,
Firefox, and Thunderbird, then there are no software vendors involved,
and the VAR is free to set their own pricing model for maintenance.
One big downside is that you'd be asking a business to take a leap of
faith on a new desktop model, and a regional VAR might not have the
record of longevity to make that case. Strict adherence to open source
and open standards, though, should insulate against fears of being
orphaned or locked into a vendor.
-Tom
1. http://www-tech.mit.edu/V119/N19/history_of_athe.19f.html
2. http://www.ubuntulinux.com/
3. http://www.ubcd4win.com/
--
Tom Metro
Venture Logic, Newton, MA, USA
"Enterprise solutions through open source."
Professional Profile: http://tmetro.venturelogic.com/