A Matter of Perspective
By pr1 (http://www.blogger.com/profile/10015074312618595113)

Architecture gem (2008-09-17)

After a couple of months of forced inactivity I can now pick up blogging
again. What better way to start than by writing about two of my
favorite subjects:<br />
<ul>
<li>Architecture and its practitioners, and </li>
<li>Government IT. </li>
</ul>
Since
I studied public administration back in the day, I occasionally read
the magazines that are typically read by former public administration
students. I came across an issue of "Digitaal Bestuur" (digital
administration), whose theme was "Government and ICT". This issue
had an article by a lead architect (!) from a government body, with the
illustrious title "IT is difficult. IT within government is extra
difficult". Fascinated by this title, I started reading and came across
the following excerpt:<br />
<em>"It is getting more and more difficult
to control the increasing complexity of IT. Computers have become 1000
to 1500 times more powerful in the past 15 years, as a result the
systems have become 1000 to 1500 times more complex, and to make matters
worse computers nowadays all communicate with each other through
networks."</em><br />
After I had read the above excerpt five times (I wanted to make sure I had really read what I thought I was reading), I finally understood why governments are struggling so badly with IT, and why architects still have trouble getting accepted, or even taken seriously, by other IT and enterprise disciplines.

Modernization 2.0 (2007-12-04)

When I was stuck in traffic again this morning and listening to the
radio, I heard about the news that Belgium is facing tremendous trouble
in forming a new government after the most recent elections. The same
has been true in the past for The Netherlands, where the results of the
elections provided the image of a very divided country, with quite some
votes for the more extreme left-wing and right-wing parties. It took
quite a while to form a government, and the Dutch voters are now stuck
with a government that nobody is really enthusiastic about (except maybe
the families of the new ministers and the members of the three
coalition partners).<br />
But enough about politics; this blog is about
perspective on IT. Still, I think this trend in politics could have
something to do with a growing structural differentiation in society due
to technology. Let me explain why.<br />
<br />
Somehow, I really have the
impression that in many aspects of society and our everyday life we are
experiencing the consequences of what sociologists call "modernization":
a process in which society goes
through industrialization, urbanization and other social changes that
completely transform the lives of individuals. Key elements in
modernization are structural differentiation and cultural
generalization, or in plain English increasing individualism and
mono-culture. Lately, this modernization appears to be happening at warp
speed, not least because of the explosive growth of
technology usage in Western society. The use of technology and new (IT)
concepts, especially those branded as Web 2.0, accelerate the processes
of structural differentiation and cultural generalization.<br />
<br />
I wrote about cultural generalization before in a reaction to Nick Carr's claim that Web 2.0 is amoral. Former Internet boy wonder Andrew Keen goes as far as saying
that today's internet is even killing our culture. Although I have some
second thoughts hearing this from somebody who made a fortune on the
internet (that's not "eat your own dog food", but "spit in your own dog
food"), I have ordered his book for the Christmas holidays that are
coming up. It should be an interesting read.<br />
<br />
On the other hand,
there appears to be some evidence that the explosion of internet usage
is also accelerating the structural differentiation I mentioned earlier.
According to Oxford University Press, societies are seen as moving from
the simple to the complex via a process of social change based on
structural differentiation. The process may be imagined, in its simplest
form, as an amoeba dividing, redividing, then redividing again. Society
is evolving from quite simple structures into separate institutions of
education, work, leisure, government, religion and social contacts. Most
people do not stay with the same employer their entire working life
anymore, some even work for 2 or 3 different employers, churches see
declining attendance, small political parties are attracting
voters from the traditional larger parties, mass markets are being
replaced by niche markets (Long Tail, anyone?). In fact even identities
of single persons are differentiating: John Doe, who works as a loyal
clerk during the daytime, may be SuperVixen666 in his favorite virtual
world at night.<br />
<br />
The media are already heavily influenced by the 2.0
phenomenon (serious newspapers publish videos of incidents
filmed by readers with their mobile phone cameras), politics
arguably also is facing the consequences of the wisdom of the crowd
(some might say the power of the mob) and the collective intelligence.
It is very difficult to predict where this is all leading, but
society is certainly dealing with Modernization 2.0.

Oracle makes offer for BEA Systems (2007-10-12)

<a href="http://www.oracle.com/bea/index.html" rel="nofollow" target="_blank">Press release by Oracle</a>:<br /><br />"Oracle
Corporation (NASDAQ: ORCL) today confirmed that it delivered a letter
to the Board of Directors of BEA Systems, Inc. (NASDAQ: BEAS) on October
9 in which Oracle proposes to acquire BEA for $17.00 per share in cash.
The $17.00 per share offer is a 25% premium over yesterday's closing
price of $13.62."<br /><br />If I did not know anything about acquisitions, I would think that this is Larry Ellison's reaction to the Business Objects acquisition
by arch rival SAP (don't get mad.... get even). But as these
acquisitions and offers usually take quite a long preparation time, I do
not think this is a reaction to the SAP activities.<br /><br />That leaves me
wondering, though, what Oracle's rationale is for making the offer for
BEA. Oracle states in its press release that "the acquisition of BEA by
Oracle will enable an increase in engineering resources that will
in-turn accelerate the development of our [Oracle's, LB - sic]
world-class suite of middleware". This might mean that Oracle is facing
some challenges, maybe even problems with its current Project Fusion
program. Putting it all together is one of the key challenges Oracle
faces in its Fusion endeavor, and the acquisition of the brightest
engineers from BEA could be considered a way to meet that challenge.
This should not come as a surprise as I have said before that one of the key drivers for Oracle's acquisition wave was "acquiring the people who created the technology".<br /><br />Also, the potential acquisition of BEA supports my prediction and observation
that increasingly pure-play vendors are disappearing, as the
functionality of their products is incorporated into enterprise
applications, OS and hardware.<br />Maybe this is the moment to say: integration middleware is now officially commoditized.

Why SaaS will make it and why ASP didn't (2007-10-10)

I picked up blogging again last week when I wrote
that the acronym SaaS does not ring many bells yet with decision
makers. Maybe ASP does, but that is something many decision makers do
not want to get involved with, as that is so '90s.<br />
Still, there are
many people who think that SaaS is the same as ASP. Take this
definition for instance which I found on the web site of the Dutch research firm that concluded that SaaS is not known by many decision makers:<br />
"SaaS
(Software as a Service) is software provided as a hosted service which
is accessed over the Internet, and which is billed through a
subscription structure. This is also called ASP (Application Service
Providing)."<br />
<br />
Let me explain why I think this makes no sense. To
start with, the delivery model is completely different: with ASP, you
pay a monthly fee for a fixed period (sometimes even for the duration of
3 to 5 years), while with SaaS you only pay for the usage ("pay as you
go"). This may sound trivial, but it has consequences for the customer
orientation of the provider: because a subscriber is not tied to a fixed
(and possibly long) contract period, as is the case with ASP,
it is easier to switch to another provider. This implies that
the SaaS provider must do more for client retention, hence deliver great
customer service.<br />
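To make the contrast concrete, here is a toy sketch of the two billing models. The fees and contract lengths are invented for illustration only; this is my own simplification, not any real provider's pricing:

```python
# Toy comparison of the two delivery models described above.
# All numbers are made up for illustration.

def asp_cost(monthly_fee, contract_months):
    """ASP: the fee is owed for every month of the contract term,
    whether or not the customer still uses the service."""
    return monthly_fee * contract_months

def saas_cost(monthly_fee, months_used):
    """SaaS ("pay as you go"): the customer only pays for actual usage."""
    return monthly_fee * months_used

# A customer who walks away after 6 months of a 36-month ASP contract
# is still on the hook for the full term; the SaaS subscriber just stops.
print(asp_cost(100.0, 36))  # 3600.0
print(saas_cost(100.0, 6))  # 600.0
```

The asymmetry is the whole point: the easier it is for customers to leave, the harder the provider has to work on retention.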
The architecture of the products is usually also very
different between an ASP and a SaaS provider, mainly because SaaS
products have been developed from scratch to be used over the Internet
(take Salesforce.com as an example). The ASP versions of products have
usually been "web-enabled", but still have a typical client-server
architecture. In the past, the ASP model therefore never fully delivered
what it promised, because of the lack of bandwidth, or because of a
performance degradation resulting from the web-enablement of the
application.<br />
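A toy latency model (my own illustration, with invented numbers) shows why a web-enabled client-server application tends to degrade over the Internet: its chatty protocol pays the wide-area round-trip time many times per user action, while a web-native design batches the work into few requests:

```python
# Back-of-the-envelope model of protocol chattiness over a WAN.
# The round-trip time and exchange counts are assumptions, not measurements.
WAN_RTT_MS = 100.0  # assumed Internet round-trip time in milliseconds

def action_latency_ms(round_trips, rtt_ms=WAN_RTT_MS):
    """Network latency of one user action needing `round_trips` exchanges."""
    return round_trips * rtt_ms

# A chatty web-enabled client-server screen might need 30 exchanges;
# a web-native (SaaS) screen is designed to need only 1 or 2.
print(action_latency_ms(30))  # 3000.0 -> a 3-second wait per action
print(action_latency_ms(2))   # 200.0
```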
<br />
In short: both the delivery model and the
architecture of SaaS and ASP are different: SaaS solutions are generally
speaking better prepared (optimized) for usage over the Internet, and
its delivery model is more flexible than the ASP delivery model. So, we
are dealing with different animals (that live in the same Zoo though,
named Outsourcing). The key advantages of SaaS over ASP are the reason I
believe that SaaS will succeed: they are exactly the shortcomings that made ASP fail.

Are we dealing with GAF (General Acronym Fatigue)? (2007-10-06)

Two recent news items have made me wonder whether we are dealing with GAF: General Acronym Fatigue.<br /><br />The first news item is from a Dutch IT newspaper, which quotes research from a Dutch IT research organization:
73 percent of all IT decision makers do not spontaneously know what
SaaS (Software as a Service) is. Only after an explanation of what is
meant by SaaS (a formal definition), 69 percent say they do know SaaS.
This leads to the conclusion that only 4 percent of IT decision makers <em>really </em>have
no clue what SaaS is about, and 69 percent are just confused by the
acronym / terminology. An ASP (Application Service Provider, not Active
Server Pages) expert in The Netherlands says this is due to the fact
that companies, vendors and end-user organizations all have different
names for the phenomenon: "Web 2.0", "internet-accounting" or "online
services". Some people have even called it "ASP 2.0", which is really
silly as there are substantial differences in the delivery models as far
as I am concerned.<br /><br />The second news item is not from a single
source, but has been heard across many blogs: more and more
people are seriously questioning whether the ESB (Enterprise Service Bus) is worth using. I answered this question some time ago: it is not!
It is just replacing one low viability solution (CORBA or brokers) with
another (service buses). I have pleaded before for using Apache, for instance,
as a service bus (or rather, as I would name it, a service intermediary), and
obviously more people have come to the same conclusion. But then of course, we are talking about the design pattern, not about the commercial product type that some people considered to be the next-generation application server. I suggest dropping the name ESB and instead calling it what it is: a service intermediary.<br /><br />The
future does not look too bright for people confused by acronyms, in
particular because the aaS letters are pasted to any random technology
or product category you can imagine: PaaS (Platform as a Service), CaaS
(CRM as a Service), AaaS (Accounting as a Service)... well, you get the
idea. The good thing is that we run out of options at 26 acronyms...<br /><br />If
we (yes, everyone in IT is responsible for this GAF situation) are
unable to come up with good names for new concepts, patterns,
technologies or products, we should not blame those poor IT decision
makers for not understanding our value proposition. Nine times out of ten
we are not dealing with an ignorant decision maker, but just with
someone suffering from GAF.<br /><br />OK, I am off for now, I am going for a BLT.

The Second Life world is flat (2007-04-21)

Some folks have gotten really excited about the "news" that Linden Labs is going to open source the back-end of Second Life. The amount of attention this non-news has gained confirms my observation that Second Life is at the peak of its hype. Why? Because there really is nothing new or spectacular in the story most bloggers and authors refer to. It comes down to Joe Miller, VP for platform and technology development at Linden Lab, stating the following at VW07:<br />
<ul>
<li><em>"We’ll be open-sourcing the back end so sims can run anywhere on any machine whether trusted by us or not."</em></li>
</ul>
OK. When? Under which conditions? How? Just this statement is not truly shocking if you ask me.<br /><ul>
<li><em>"We’ll be delivering assets in a totally different method that won’t be such a burden on the simulators."</em></li>
</ul>
If I understand correctly, the move is primarily driven by the lack of capacity for the back end run by Linden Lab. <br />
<ul>
<li><em>"Very soon we’ll be updating simulators to support multiple versions so that we don’t have to update the entire Grid at once."</em></li>
</ul>
Again my question: when? When is very soon?<br /><ul>
<li><em>"We’ll be using open protocols."</em></li>
</ul>
Which
ones? I heard rumors that IBM is having talks with Linden Labs on
protocol development. With the WS-* disaster in the back of our minds,
of which IBM was one of the key creators, we should not really expect any added
value here. In fact, if I were in a very cynical mood I'd say
this is the kiss of death for Second Life, but it's Saturday so I am not
:-).<br /><ul>
<li><em>"SL cannot truly succeed as long as one company controls the Grid."</em></li>
</ul>
The
bottom line, although it should rather read that SL cannot survive as long as
one company controls the Grid (note the uppercase G here, as if
we are talking about the Matrix; it would have been even more striking if
an uppercase T had been used as well).<br /><br />Dana Blankenhorn
wonders out loud if open source can save Second Life. He invites people
to share their insights. My 2 cents is that Second Life is nothing more
than a finger exercise for really successful 3D virtual communities,
which will not be mainstream before 2009-2012. Linden Labs have done
some pioneering work in the area, and they and the users of Second Life (I
refuse to use the term resident), including the big companies that have
jumped on the phenomenon, will have learned from their experiences. But
they will come (or already have come) to the conclusion that the Second
Life world is flat, meaning they can fall off at any moment.

Beyond Kathy Sierra (2007-04-02)

Quite frankly, I had never heard of Kathy Sierra before last week. I could
say here that I have been a loyal reader of her blog for a long, long
time, but the reality is that I first heard about her when all the blog
postings came in about the death threats aimed at her. And obviously a lot of people are interested in this subject:<br />
Most reactions throughout the blogosphere vary from supportive to shocked; in fact James McGovern was one of the first who had a somewhat skeptical reaction
to Kathy's story. I think this was quite brave of James, and although
his later postings about the subject were not as good in my opinion (but
I suspect James is quickly going for his 1000th blog posting so he can
retire as a blogger, as he announced), he has brought up some very
interesting points, that even go beyond the whole Kathy Sierra case.<br />
I
do not know Kathy Sierra so I do not have any opinion on how she has
handled this whole situation, although generally I do feel that you
should never bend to those that use violence. That was my opinion last
year with all the fuss about the cartoons in the Danish newspaper, and
that is my opinion now. But maybe it is different if it happens to you, I
don't know. Nevertheless, I wonder what the motive is of people
threatening a blogger. It wouldn't surprise me if we are "just" dealing
with some bored teenagers craving attention, kind of the same
breed as those people who write computer viruses (I still can't see
what's so gratifying about that).<br />
The key point of the whole
discussion goes beyond Kathy Sierra, as I said earlier. James pointed in
his post to the single largest threat to the Internet in general:<br />
<em>"In cyberspace it is very easy to become someone else and very difficult at times to prove that you aren't really you"</em><br />
This
has proved to be very difficult in the past and present (phishing,
threats on discussion boards, electronic voting etc.), and it will
become an even bigger issue in the future. With the advancement and
expansion of new forms of collaboration, social contacts and
transactions that use the Internet, identity is becoming the key issue.
Obviously there is a growing number of people that have a need to take
on another identity when being on the Internet. How often do you see
nicknames, avatars or email aliases that exactly reflect the name or
appearance of their owner? Even with some very popular blogs, the writers
chose to use a nickname, probably because they were writing about rather
confidential information about their company.<br />
Anonymity, or
namelessness, is very valuable to many people on the Internet, but it
will become a complicating factor in the Internet's evolution.
Especially as virtual spaces and worlds appear to be blending
more and more with the real world, this becomes increasingly problematic,
particularly because legislation in this area is still in progress or simply absent.
In fact, some people will even argue that the Internet should not
be subject to legislation from any country (tell that to China...), and should be subject to the self-cleaning capacity of the World Wide Web and its users.<br />
Which
leaves us with the very interesting observation that on the one hand the
current evolution of the Internet promises maximum
transparency, but on the other hand it is threatened by the obscurity of
its users. Information becomes transparent; users don't. All the
ingredients are there for a dialectic process of change.

Making money with Web 2.0 (2007-03-22)

Loyal readers of my blog will remember that in the past I have written about the possibilities to make money with Web 2.0. Via a Dutch site I ran into a presentation by an Italian consultant who calls himself BizMogeek (Business Model geek, real name Luca Grivet).<br />
<br />
As
I have not found his ideas on other Web 2.0 blogs, I will give a
very short summary of his findings here. Grivet identifies 5 business
models for Web 2.0:<br />
<ol>
<li><b>Free</b>: the best known and near-classic model, in which revenue is generated by advertising</li>
<li><b>Free to use, pay for related service</b>:
the base product can be used free of charge, for example open source
software (Movable Type). However, for additional services you have to
pay (TypePad)</li>
<li><b>Freemium</b>: the base product is
free of charge (like with business model 2) to a certain extent. If you
exceed a specific usage limit (Flickr), want additional service
(LinkedIn) or want to buy extra items (SecondLife), it becomes a charged
service</li>
<li><b>Freedom to pay:</b> it is up to the users
to decide whether or not they are willing to pay to use the service or
to make a donation (Wikipedia)</li>
<li><b>Nothing free:</b>
the service will be charged anyway (iTunes), however you can get a
share of the revenue that is generated (eBay, Google AdSense)</li>
</ol>
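For what it is worth, the five models can be jotted down as a small lookup table. The examples are the ones Grivet names above; the code itself is just my own sketch:

```python
# Grivet's five Web 2.0 business models as a simple lookup table.
# Example services are the ones named in the list above.
WEB20_BUSINESS_MODELS = {
    "Free": ["ad-funded services"],
    "Free to use, pay for related service": ["Movable Type / TypePad"],
    "Freemium": ["Flickr", "LinkedIn", "Second Life"],
    "Freedom to pay": ["Wikipedia"],
    "Nothing free": ["iTunes", "eBay", "Google AdSense"],
}

def models_for(service):
    """Return the model(s) under which a given example service is filed."""
    return [model for model, examples in WEB20_BUSINESS_MODELS.items()
            if any(service in example for example in examples)]

print(models_for("Wikipedia"))  # ['Freedom to pay']
```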
This
is quite a good effort to describe the possible Web 2.0 business
models, in my opinion. Grivet has even created a framework to help
position service providers in the Web 2.0 market, as he has also added the
specific target market (or business area, as he calls it) to this
model: content, application / service, or products / software. I find it
somewhat questionable whether you should draw a distinction
between application / service and products / software, especially as
SaaS is blurring the boundaries between those.<br />
Nevertheless, I
appreciate Grivet's work, and it is one of the best attempts I have seen
so far to describe the different possibilities.<br />
I have one key
objection to his list and framework: there are
quite a few Web 2.0 startups that appear to have no other business model
than attracting the big Web 2.0 players in order to get acquired by those
players. These startups cannot be placed in any category mentioned by
Grivet, as their key objective is to get noticed by Yahoo!, Google, MSN
and [you name it], hoping that these big guys are willing to buy them
out. I would call this business model the <b>Big bucks buy-out</b> (triple B?) model.<br />
Basically,
I think Grivet's model should be extended with this business model, but
not at the same level as the 5 models he mentions. The framework could
be extended with an extra layer, which could be called "Business
intention". Here are 2 choices: <br />
<ul>
<li>being self-supportive through either one of the 5 models mentioned by Grivet, or</li>
<li>going for the <b>Big player buy-out / Triple B</b> model</li>
</ul>
VCs should assess which type they are dealing with before committing their capital to Web 2.0 startups.

Second Life on the hype cycle (2007-03-19)

What if you had to place Second Life on a Gartner Hype Cycle? In which
Hype Cycle would you place it? Hype Cycle for Virtual Worlds? Hype Cycle
for 3D Web UIs? And at which position in that particular hype cycle
would you place it? Technology trigger? Peak of inflated expectations?<br /><br />I
would definitely say at the peak of inflated expectations, as there is hardly a
printed or online publication that has not written about it yet. If even
Dutch grocery stores are opening a complete mall in Second Life, there
is no room anymore for denial: Second Life IS a hype.<br />However, there are already some first signs that the hype may be beyond its peak. Although the State of the Virtual World on the official Second Life blog
still shows some quite impressive growth figures, there are also some
things that indicate that the SL hype is nearly beyond its peak:<br />
<ul>
<li>All
media have already written or reported about it, a second wave of
attention from the media appears unlikely (people will pretty soon show
signs of Second Life fatigue)</li>
<li>Only a fraction of users actually stays. Many people try it, but only a few keep coming back to Second Life</li>
<li>Some companies have already announced that they will close their virtual offices in Second Life after the first instances of child porn were found there.</li>
<li>And,
as my colleague Ray Valdes mentioned: "Do you want to have a virtual
press conference in a world where your public event can be disrupted by
flying animated body parts?"</li>
<li>Some people claim that Second Life
is mainly driven (or should I say: populated) by people looking to
gamble or for some erotic pleasure (hey, that does not sound surprising
for an Internet platform)</li>
</ul>
My 2 cents on Second Life and Hype Cycles is that:<br />
<ul>
<li>Second Life should be placed in a Hype Cycle for Web 2.0, as I consider it a social networking platform / community</li>
<li>Secondly, I think it should be placed just beyond the peak of inflated expectations. </li>
<li>Finally,
"Years to mainstream adoption" will be the red circle with the cross in
it ("obsolete before plateau"), because I think another contender will
take the crown from Second Life and will create a highly successful
"virtual world"<span style="font-size: 78%;">*</span></li>
</ul>
<span style="font-size: 78%;">*
If I had to place "virtual 3D communities" on the Hype Cycle (instead
of its most popular current incarnation, Second Life), I would say it
takes another 2 to 5 years before mainstream adoption. </span><br />
Yes,
I do believe that virtual 3D communities have quite a bright future; I
just don't think that Second Life will be THE future virtual world of
choice. Second Life will suffer from the 'dialectics of progress';
in fact, you could even say that it will become a victim of its own
success.

Adobe Photoshop goes SaaS (2007-03-01)

The news that Adobe is offering its flagship product Photoshop (or should we call Flash its flagship product after the acquisition of Macromedia?) as an online service has caused quite some speculation about the acceleration of SaaS (Software as a Service).<br />I
find this news interesting for 2 reasons: what is Adobe's product
strategy and plan with this move, and what are the consequences of a
move to a true SaaS model?<br /><br />As for Adobe's strategy: I predicted
earlier this year that Adobe will be acquired by Oracle. Although there
are still some rumours that Oracle's first acquisition priorities are
with BI vendor Business Objects, I still believe that an Adobe
acquisition by Oracle is very feasible, not least because
Oracle might consider Adobe the doorway to the new, emerging market
of rich user interfaces and desktop computing (Acrobat, Flash,
Photoshop, video editing stuff). If Adobe manages to leverage its web
presence with a Photoshop SaaS offering, it will probably become an even
more attractive target for Oracle. This move by Adobe has not and will
not go unnoticed, and the exposure Adobe is getting for its SaaS
efforts can create the right momentum for the acquisition by Oracle.<br /><br />In
the second place, I find this news interesting because it is a test
case to see whether the open source business model works for software as
well. We should not be mistaken about Adobe's plans: they amount to nothing
more than offering a limited set of Photoshop capabilities online;
it is not as if all the very sophisticated functions from the desktop
version are incorporated in the online version. Add that to the fact
that Photoshop online is what is called an "ad-supported online
service", and it becomes clear that even with Photoshop as most people
know it, there is no such thing as a free lunch.<br />In line with the Web 2.0 business model
discussed before, Adobe is giving away something expensive but
considered critical, hoping to get something valuable for free that was
once expensive. It could be a way of attracting customers and making them
want the full desktop version of Photoshop, or it could be a true step
towards "good enough" software offered via the Internet. The question
with the latter is: what's in it for Adobe? Will they charge users in
the future for using the online Photoshop? Will they gain income through
support and services (I wonder if this will work: I predict that online
communities will emerge that provide peer-to-peer support for online
Photoshop users)? Are they only doing this for brand recognition or to
win sympathy?<br /><br />Many people applaud the SaaS model, because they
feel it is important to cut down on physical media, and replace this
with online storage. However, the SaaS model requires a much higher
bandwidth, and not only that: it requires a reliable connection to the
software that is consumed as a service. You do not want to be in the
middle of editing your pics from your latest holiday, and all of a
sudden find out that your connection is slow, and the HTTP request from
your browser times out. Another aspect could be the enormous media
exposure that such initiatives can get: are the online services
prepared for the huge peak load when influential media run a story
and all of a sudden everyone wants to check out that cool new
service?<br /><br />According to Adobe's plans, Photoshop should be online
in 6 months. Enough time then to think about some of the above issues,
and I will follow this with great interest.

How much is that Vista in the Windows? (2006-08-30)

<a href="http://www.microsoft.com/canada/default.aspx">Microsoft Canada</a>
has accidentally published the prices for Vista on its web site.
These prices give some insight into what to expect when Vista finally hits
the market, but they have already been taken offline. Although
Microsoft has already announced that the prices for Vista would be
about the same as the prices for Windows XP, it is still interesting to
see what this will really mean. They also announced that they will
publish the prices after "Release Candidate 1" is released, which is
planned for September. For those who cannot wait, here are the
highlights of the list that was taken off the Canadian MSFT site:<br />
<ul>
<li>Windows
Vista Basic Edition, the cheapest basic version of Vista which can be
compared with XP Home edition, will cost 259 Canadian dollars, which is
233 US $ and 182 euro.<br /><em>Side note: Windows XP Home cost 199 US $
in 2001. A difference of 34 US $, or 17%, which is not bad if we take
the effects of inflation (approximately 3.5% per year) into
consideration.</em></li>
<li>Vista Home Premium Edition, that combines
characteristics of Windows Media Center and the Tablet PC-edition, will
have a price raise of 13%, i.e. 299 Canadian $ (269 US $, which is 211
euro).</li>
<li>Vista Business would cost 379 Canadian $ (341 US $, 267 euro). This is 7% cheaper than Windows XP Professional.</li>
<li>Vista Ultimate (what's in a name) will cost twice as much as the Basic Edition: 499 Canadian $ (449 US $, 352 euro).</li>
</ul>
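The inflation side note above checks out. A quick back-of-the-envelope calculation, using the US$ figures quoted in the list and the assumed 3.5% annual inflation over the five years 2001-2006:

```python
# Sanity check on the XP Home vs. Vista Basic price comparison above.
xp_home_2001 = 199.0  # US$ launch price of Windows XP Home (2001)
vista_basic = 233.0   # US$ equivalent of the leaked CAD 259 Vista Basic price

difference = vista_basic - xp_home_2001
print(difference)                              # 34.0
print(round(difference / xp_home_2001 * 100))  # 17 (percent)

# At roughly 3.5% inflation per year, XP Home's 2001 price corresponds
# to about this much in 2006 dollars -- close to the Vista Basic price:
print(round(xp_home_2001 * 1.035 ** 5, 2))     # 236.35
```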
SOA just vendor talk? (2006-04-19)

"‘SOA’ may have meant something once but it’s just vendor bullshit now."<br /><br />Very interesting yet highly questionable statement by Tim Bray. Without giving <em>any</em>
reasonable argument for this bold statement, he dismisses SOA as
something that does not matter anymore and says we should rather
focus on what he calls Web Style. Of course, the highly fashionable term Enterprisey is used in his post, and increasingly people are using this term to dismiss SOA without giving <em>any</em>
argument for dismissing it, other than "SOA is Enterprisey". It is
kind of like what open source zealots throw at you when you talk about
MSFT: "oh, that's FUD!".<br /><br />I do agree with Tim that simplicity is a
virtue when working with the web. After all, the Internet has been such
a big hit because of its simplicity. However, Tim misses the fact that
the services he talks about when discussing Web Style are a
completely different beast from the services we talk about when
discussing SOA. I wrote before
that there is something fundamentally difficult about talking about
"services". There are different kinds of services, and the name
"web service" has probably not been the best choice. The Web Style kind of
services Tim speaks of are fundamentally different from the services you
will usually use within an SOA. I wrote about this
before too: WS-* services and REST services are not competing, they are
complementary. Web Style serves another purpose than SOA.
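To make the contrast concrete, here is a rough sketch of the same question asked in both styles (the endpoint, operation name and service namespace are hypothetical, picked purely for illustration):

```python
# Web Style / REST + POX: the whole request is a URL; the resource and its
# parameters live in the URI itself.
rest_request = "GET /weather/amsterdam?units=metric HTTP/1.1"

# WS-* style: the same question wrapped in a SOAP envelope. The header block
# is where WS-Security tokens, WS-Addressing headers and the like would go --
# heavier, but platform-neutral and policy-aware, which is exactly what
# cross-platform A2A integration needs.
soap_request = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header>
    <!-- WS-Security / WS-Addressing headers would go here -->
  </soap:Header>
  <soap:Body>
    <GetWeather xmlns="http://example.org/weather">
      <City>amsterdam</City>
      <Units>metric</Units>
    </GetWeather>
  </soap:Body>
</soap:Envelope>"""

print(len(rest_request), "bytes vs", len(soap_request), "bytes of envelope")
```

Neither is "better" in the abstract; the weight of the envelope buys you the headers that enterprise integration scenarios need, and the bare URL buys you the simplicity that consumer-facing services need.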
Consumer-facing services have different QoS requirements from high-volume,
cross-platform A2A transaction services. One service needs to be simple
and flexible, while another should integrate seamlessly with
different platforms. Different requirements ask for different services,
which in turn ask for different implementations. That is the reality of
today's IT world.<br /><br />Once again: it is all a matter of perspective.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-72742011931207091482006-04-11T04:39:00.000-07:002012-12-08T04:40:22.876-08:00The .NET boomerang effect Every once in a while somebody posts a blog or article on how the
monopoly of Microsoft / Windows on the desktop can be ended, or at least
be threatened. Of course I am also guilty of this, although I have said before that it will be quite hard to threaten Windows on the desktop. Keith Harrison-Broninski recently had a quite interesting discussion
on his web log, and although I do disagree with him when he states that
Eclipse will end the Microsoft monopoly on the desktop, his viewpoint
sure made me think about other possibilities, stemming from the open
source (.NET) community: Mono.<br /><br />More and more Linux distributions now incorporate the open source .NET framework Mono,
which means that these Linux distributions can run .NET applications
that comply with the ECMA standard for .NET. Although key distributions
like RedHat for instance do not include Mono (yet?), there are still
signs that the Mono framework is becoming an increasingly important
factor for Linux development.<br /><br />I said before
that one of the key barriers for a switch to a Linux desktop is the
tons of applications that will only run on Windows. I also said before that a Windows emulator (such as Wine) could be helpful to run those apps, but the number of supported apps is still <em>very</em> limited.<br />As
it is in Microsoft's best interest to have as many applications running
on the .NET framework as soon as possible (to accelerate the expansion
of the .NET framework and runtime), many organizations will be forced
to migrate their applications to this .NET framework. Which in turn
makes it possible to run those applications on a Linux distribution with
Mono included. So it could turn out that the rapid expansion of .NET
that Microsoft is pursuing, could work as a boomerang as it opens up the
possibilities of Linux (with Mono) ending Windows' desktop monopoly.
And just as with Ajax, it could turn out that Microsoft started this themselves, by submitting their .NET stuff to the ECMA. Ironic again.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-84609512628203405752006-04-10T04:41:00.000-07:002012-12-08T04:41:34.075-08:00JBoss not Fusion-ized Despite Larry Ellison's attempts in the past, it is now safe to say that JBoss will not be Fusion-ized: today the news was announced that RedHat has acquired the open source application server producer for 350 million dollars.<br /><br />When
Ellison attempted to buy JBoss earlier this year, they boldly stated
they were not for sale. However, RedHat succeeded in acquiring the open
source company.<br />I think this news will create some disruption in the open source world, as RedHat competitor Novell
uses JBoss application server software in its Linux distribution Suse, a key
competitor for the RedHat Linux distribution. However, as JBoss is open
source of course Novell will still be able to use it, but the open
source model sure can create some peculiar situations, especially when a
product / project is acquired by a vendor.<br /><br />So it is now up to RedHat to make a tasty fusion of RedHat and JBoss capabilities. And Oracle will probably target Zend even more after failing to acquire JBoss.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-33294327245423765692006-04-07T04:42:00.000-07:002012-12-08T04:43:16.068-08:00Taylorism in IT Recently Gartner started a blog named Unconventional Thinking, and I was particularly triggered by a posting by Richard Hunter. He expects that although the concept of scientific management
dates from the early 20th century, there is great potential in it for
IT to change how executives run the show. He thinks the next big wave is
automation that changes the way managers, not frontline employees,
work.<br /><br />Personally, I do not think this is going to happen because, in fact, it already <em>is</em>
happening. Like with so many changes that involve IT, our IT
organizations show the first signs of this type of Taylorism. The most
prominent example of the practical application of scientific management
or Taylorism has always been the assembly line. And what have we done
the past years in IT departments? Exactly, we have created assembly
lines.<br />What do we want when managing IT departments? Speed and
predictability. Exactly the same keywords that are used throughout
scientific management.<br /><br />This of course applies particularly to the
operational processes of IT (some might argue that this applies for the
people that do the <em>real</em> work). However, at the tactical and
strategic level there has also been an increase in Tayloristic
management principles. Corporate Performance Management (CPM) and
increased data mining capabilities fuel the more quantitative way in
which departments are run. Managers and their superiors agree upon key
performance indicators (KPIs), and are increasingly evaluated against
those.<br /><br />I think technology areas such as business intelligence and
service-oriented applications will benefit greatly from this
development, as they rely heavily on having the right information at the
right time.<br />The increased transparency in organizations and processes
allow an increased emphasis on delivering results at the highest
possible level of efficiency. This will not only be for the normal
workers, but also for management.<br /><br />However, Frederick Taylor, one of the founding
fathers of this approach (the term Taylorism was in fact named after him),
did not hold the most flattering view of mankind:<br /><br /><em>Now
one of the very first requirements for a man who is fit to handle pig
iron as a regular occupation is that he shall be so stupid and so
phlegmatic that he more nearly resembles in his mental make-up the ox
than any other type.</em><br /><br />All I can say is that for some reason I
feel that this perception of people does not fit very well with the
current culture in most organizations. Also, transparency has an
undeniable other effect on organizations: there is no longer an
information monopoly for top management, in fact there are numerous
examples where the people on the work floor outsmart the executives,
because they are better informed. So my prediction is that scientific
management will certainly become an increasingly popular concept in IT,
but the technology it needs for further expansion will also limit its
impact, as the mirror has two faces.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-10756724960890673972006-04-03T04:44:00.000-07:002012-12-08T04:44:41.548-08:00ajaxWrite, LimeWire and the digital divide First there was the news last week that the much talked about web-based word editor ajaxWrite
has nothing to do with Ajax, as it is fully written in
the Mozilla-proprietary language XUL (the fact that it only runs in Firefox
should have raised some suspicion...). Typical case of randomly
selecting a hyped term and incorporating it in the name of your product.
The web service tooling arena has suffered from this before, when every
small vendor incorporated the term SOA in the name of its products, and
now it is time for Ajax to suffer from these effects.<br /><br />Then this morning when I was stuck for 2 hours in a traffic jam, I heard the news on the radio that a Dutch newspaper
took the test, and searched the Internet for confidential
information that people had accidentally put on the web through file sharing
programs such as LimeWire.
It was shocking what they found within the hour: passwords, tax
declarations, application letters, resumes, passport scans and even
proposals for the Dutch national security agency and the Ministry of
Justice and Defense. All this by just searching for the term
"confidential" (but then the Dutch word for it of course).<br /><br />These
two news items, which do not seem to have much in common, show that a lot
of people have no clue what they are doing or what they are dealing
with. As with the Ajax case, a lot of (techie) people will think they
have a cool Ajax app running, while in fact it is just as proprietary as
MSFTs Word for instance. The results of the ignorance of the file
sharing capabilities are worse, however. I am sure that all those
persons who put their confidential data on the web just thought it
would be cool if they could download the latest music or movies just
like the kid next door, and never intended to throw half of their
private life on the net.<br /><br />Every time I hear such news, I get
the feeling that maybe in some aspects we are too far ahead in IT with
our newest technology stuff. A lot of people are having trouble catching
up with the latest developments, and because sometimes they are not
fully aware of the possibilities and power of technology (as with the
file sharing item), the results can be quite bad. I wrote about this before,
and not only within IT can we find the problem of a divide between the
front runners and a large group that has trouble catching up, but this
is also the case on a macro level, in society. This is often called the digital divide,
and with the increasing consumerization of IT, which will be driven by a
young group of very tech-savvy consumers, I think this divide will
become even greater, and so will its consequences.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-79786059570758538092006-01-04T04:47:00.000-08:002012-12-08T04:48:27.085-08:002006 prediction #2: consumerization comes in most unexpected areas Gartner
has recently published some research on how consumer technologies fuel
innovation. Former META (which was acquired by Gartner last year)
CEO Dale Kutnick for instance has published a podcast
on the power shift resulting from consumerization. Consumerization is
everywhere! Web 2.0 is just a container term for all kinds of
innovations / concepts / technologies that are a result of this
consumerization. Very often the results are very surprising, and the
idea of mashup
even adds to these surprises. Major innovation does not only come from
R&D departments of large organizations (either commercial or
academic), but also from clever and unexpected usage by consumers. By
combining (this is what mashup is about!) concepts and technologies new
tools, concepts and technologies evolve.<br /><br />Take this article (in Dutch) in a major Dutch newspaper for instance, about how the Nintendo Gameboy
can be used to increase the engine power of scooters. It writes about
how some clever kids and mechanics found out that with a Nintendo
Gameboy you can easily read and manipulate the software and chips inside
scooters. This makes it possible to erase the speed limitation which is
set within the software, so that scooters can go way faster than the
legally allowed 50 km/hour (in fact they can go as fast as 90 km/hour).<br />That's
not all: not only kids and mechanics are taking advantage of this, but
it has also gained the attention of scooter manufacturers, who are now
developing and selling special cartridges for the Gameboy to read the
scooter's technical data. They also provide special cables to attach the
Gameboy to the scooter.<br /><br />The above Gameboy anecdote is just an
example to show how consumerization works, and that the results of the
evolution can be found in the most unexpected areas. So this is my
second prediction for 2006: we will see that consumerization will
continue to drive innovation, and the results will be very surprising.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-88055492261287991272006-01-04T04:46:00.000-08:002012-12-08T04:46:58.707-08:00Is Web 2.0 amoral? Nicholas Carr has written a quite influential piece
on Web 2.0, already back in October 2005. The article is titled "The
amorality of Web 2.0", and Nicholas saves the beef of his article for
the last part. His point is the following (please read the full article
too, it's well worth it!):<br /><br /><em>The Internet is changing the
economics of creative work - or, to put it more broadly, the economics
of culture - and it's doing it in a way that may well restrict rather
than expand our choices. Wikipedia might be a pale shadow of the
Britannica, but because it's created by amateurs rather than
professionals, it's free. And free trumps quality all the time. So what
happens to those poor saps who write encyclopedias for a living? They
wither and die. The same thing happens when blogs and other free on-line
content go up against old-fashioned newspapers and magazines. Of course
the mainstream media sees the blogosphere as a competitor. It is a
competitor. And, given the economics of the competition, it may well
turn out to be a superior competitor. The layoffs we've recently seen at
major newspapers may just be the beginning, and those layoffs should be
cause not for self-satisfied snickering but for despair. Implicit in
the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for
one can't imagine anything more frightening.</em><br /><br /><em>In
"We Are the Web," Kelly writes that "because of the ease of creation
and dissemination, online culture is *the* culture." I hope he's wrong,
but I fear he's right - or will come to be right.</em><br /><br />This
last part of his point is what sociologists call cultural
generalization. This is one of the aspects of modernization or
modernism, along with structural differentiation (a long-winded way of
saying that the world around us is becoming more complex). Sociologists
have been writing about this modernization process for decades, and from
the works of Marx, Weber, heck even Ritzer (author of the bestseller <em>The McDonaldization of Society</em>),
we can only conclude that modernization and its consequences are
inevitable. Considered from this viewpoint, it is safe to say that Web
2.0 does not cause this cultural generalization, but only accelerates
it. So is Web 2.0 amoral? Nah, not more than other forces driving
modernization.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-66136278649844822212006-01-03T04:49:00.000-08:002012-12-08T04:49:25.770-08:00Movie stars and SOA This morning I read on the sys-con website that in the Harrison Ford movie "Firewall" the main character is a security expert who reads SOA/Web Services Journal. As sys-con proudly writes on their web site:<br /><br /><em>In "Firewall" </em><em>SOA Web Services Journal</em><em>
is Jack Stanfield's favorite trade magazine where you see Harrison Ford
reading the magazine in one scene and on the coffee table in his
office, in two other scenes.</em><br /><em>SYS-CON Media granted
permission to Warner Brothers approximately one year ago for the studio
to have Harrison Ford's character to appear with the magazine in the
movie.</em><br /><em>"Firewall" will open in movie theaters on February 10, 2006.</em><br /><br />Two questions for all of you:<br />
<ol>
<li>Do security experts <em>really</em>
read Web Services Journal (WSJ) from a professional point of view, or
just for fun / leisure? I like WSJ, but quite frankly to me it is not
famous for its articles on security.</li>
<li>Can anyone confirm that Harrison is reading the issue which has my article on BizTalk 2004 in it? That would be quite something for me and a great start of 2006, knowing that Indiana Jones had a look at my BizTalk decision tree ;-).</li>
</ol>
pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-52014537107630924602006-01-02T04:49:00.000-08:002012-12-08T04:50:20.905-08:002006 prediction #1: web service standards Back in September I wrote
that web services appeared to move away from WS-* protocols and
standards. Looking back now I was partly right. It is certainly so that,
especially with the emergence of Ajax and Web 2.0 in the second half of
2005, other non WS-* protocols have gained attention and are used more
widely. In fact, both Ajax and Web 2.0 rely more on POX (Plain Old XML)
and REST,
than on WS-Security, WS-Transaction, the WS-I profiles and the like. What will
further happen with web service protocols and standards in 2006?<br /><br />The
WS-* and more lightweight standards like POX, REST and Ajax will
peacefully co-exist. As being part of an SOA and featuring heavily in
the attempts by major vendors to improve their SOBAs (service-oriented business applications) and make them
future-proof, we will need a robust framework for web services. The
further elaboration of the WS-* stack can provide this, so for
inter-application, inter-organization and any other service-oriented
application, the WS-* stack will still be the best pick.<br />However, for
services that are primarily user-centric, the best pick will be
REST/POX/Ajax. Use these technologies for presenting the information
that is being processed using protocols and standards from the WS-*
stack.<br /><br />My prediction is that both the WS-* stack and its
lightweight counterpart will continue to evolve and mature, and that we
will learn that these are not rival standards, but rather complementary
standards that will feature heavily in organizations pursuing SOA / Web
2.0 success. Whether we like it or not, Ajax and Web 2.0 concepts are
here to stay, and they provide an attractive alternative for
presentation functions for which WS-* is over-bloated.<br /><br />My second
prediction is that the WS-* stack will start to be consolidated in
2006, and the number of standards for web services will decrease
rather than increase. New joint efforts from vendors and standard
bodies will be started to unify competing standards. Also, standards
that overlap heavily will be slammed together to increase the
simplicity of the WS-* protocols.<br />Still the golden rule for WS-*
standards applies: use SDKs and generators as much as possible, and do
not go about adopting and implementing all WS-* standards out there. Do
not even attempt to grasp all of them.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-80437982706503682872005-12-28T04:00:00.000-08:002012-12-08T04:51:36.449-08:00Being average is NOT good Two things I found out the last weeks, have convinced me that being
average is NOT a good thing for IT organizations. The first was a good
piece of research by CSC, which I found by browsing Christopher Koch's blog of CIO Magazine. The research stated that <em>companies that spend according to the average in their industry are the worst performers</em>.
Furthermore, CSC found that companies that spend much less than the
average are three times more successful than those in the middle. And
companies that spend much more than the average are six times more
successful. How about that! I have done some benchmarks, and most of the
time our clients wanted to be at least average, and the first question
that arises when you present the numbers is: "how are we doing compared
to the others?".<br /><br />The second thing I found was the distribution of CMMI certification. The Software Engineering Institute (SEI), inventor and owner of CMMI certification, publishes figures on appraisal results every 6 months. The September 2005 version
provided some interesting and surprising results, as it showed that
more than half of organizations that requested appraisal, are level 3 or
above. This could suggest that level 3 is average for the industry, but
I can guarantee from experience that this is definitely not the case; I
think the average will be at most level 2. In an earlier post
I already wrote that CMMI certification can be approached in different
ways, but what the figures show, I think, is that organizations that
already have a more mature process are more eager to go for
formal appraisal than organizations that are just getting started with
their process. Our own figures indicate that the distribution of CMMI
levels are:<br />
<ul>
<li>Level 1: 50-60%</li>
<li>Level 2: 25-30%</li>
<li>Level 3: 10-15%</li>
<li>Level 4+: <5%</li>
</ul>
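To put numbers on that hunch, here is a quick sketch comparing the midpoints of the estimated distribution above with the published appraisal figures (the "more than half at level 3 or above" comes from the SEI report; the rest are my own rough estimates):

```python
# Midpoints of the estimated industry-wide CMMI level distribution above
# (rough estimates, not SEI data).
estimated = {"level 1": 55.0, "level 2": 27.5, "level 3": 12.5, "level 4+": 2.5}

# Estimated share of *all* organizations at level 3 or above.
estimated_level3_plus = estimated["level 3"] + estimated["level 4+"]

# Share of *appraised* organizations at level 3 or above, per the
# September 2005 SEI figures ("more than half").
appraised_level3_plus = 50.0

# The gap illustrates the self-selection effect: organizations with mature
# processes are far more likely to go for formal appraisal.
print(estimated_level3_plus)                           # 15.0
print(appraised_level3_plus - estimated_level3_plus)   # 35.0
```

A gap of some 35 percentage points between the appraised population and the estimated overall population is hard to explain by anything other than self-selection.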
Earlier I already wrote
that measurement data should always be interpreted within the context
in which the data was collected. This is especially the case for CMMI
certification. But also, the results from the CSC research show that you
should never go for average, as first of all it does not make you
stand out compared to your peers, and second of all, it may indicate
that you are not doing as well as you think you are. So one good plan
for 2006: not being average!pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-57937567879013133022005-11-14T03:02:00.000-08:002012-12-08T03:02:31.822-08:00Ryanair: test case for Web 2.0 business model Last week the statement by Michael O'Leary of Ryanair, saying that within 5 years airlines will offer free flights,
as revenue will come from passengers gambling while on the plane, was
breaking news on some web logs and web sites (yeah I am late again with
this story).<br />O'Leary (it remains unclear whether he had too much
Guinness when making his statement) said that introducing lottery scratch
cards had shown that passengers were bored and looking for
entertainment. O'Leary further believes that "entertainment is where the
real money will be made in the future". So he launched the idea to give
away free airline tickets, as money should be made by offering gambling
through mobile phones.<br /><br />This sounds so much like the investment thesis I discussed in my posting
on a Web 2.0 business model: you give away something for free, hoping
to get back something valuable for free that was once expensive. Ryanair
is expecting to trade seats in their airplanes for extra revenue that
will exceed the revenue that will come from selling tickets for those
seats. Will this work? Will Ryanair, that gained market share by taking
on a different business model than their competitors (low margins, low
service versus high margins, high service), succeed in expanding their
market share?<br /><br />It can go either way I think: either the Ryanair
case will prove that the investment thesis on which Web 2.0 is based can
work in reality, or it will show that in the beginning people will get
interested because something is free, but once they find out it will
cost them money (maybe even more than just buying a ticket) anyhow, they
will lose interest and go for the safe bet (no pun intended ;-) ).
Maybe it will only attract the kind of people you do not want to be on
the plane with, but I must admit that I find it quite interesting what
Ryanair is trying. At least it shows that they dare to be creative when looking for new ways to increase their revenue.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0tag:blogger.com,1999:blog-4262888380793211789.post-21897793530498886322005-10-10T02:54:00.000-07:002012-12-08T02:55:56.552-08:00Should you embrace the Web 2.0 business model?It is always interesting to try to integrate two trends that seemingly
do not have much in common at first glance. I got the idea for this blog
posting when I was reading the very detailed article by Tim O'Reilly on The Web 2.0. On page four, he refers to an investment thesis on Web 2.0 by Paul Kedrosky. What he writes in his thesis is the following:<br /><br /><em>Another
way to look at it is that the successful companies all give up
something expensive but considered critical to get something valuable
for free that was once expensive. For example, Wikipedia gives up
central editorial control in return for speed and breadth. Napster gave
up on the idea of "the catalog" (all the songs the vendor was selling)
and got breadth. Amazon gave up on the idea of having a physical
storefront but got to serve the entire world. Google gave up on the big
customers (initially) and got the 80% whose needs weren't being met.</em><br /><br />Now let's get back to April 2005, when I blogged
about Microsoft and Open Source. Now, 6 months later, not much has
changed in the MSFT / OS situation. There are still some initiatives from
the Open Source community to create tooling for .NET, but it remains
doubtful whether the combo of MSFT and open source will ever really take
off, as there are so many cultural differences between OS developers and
MSFT developers, and there is a lack of Microsoft focus in higher
education research, traditionally a cradle of many OS initiatives /
innovations. Not much news here then.<br />What has changed though, is
that Web 2.0 is in the center of attention, and that for the first time
in a long period Microsoft is not the dominant player it once was and they admit/know it.
Could they turn the tide by fully open-sourcing the C# language for
instance? Should they give the full .NET framework to the IT / OS
community? Have they investigated this possibility? MSFT's arch rivals
(when it comes to development) Sun have taken the strategy to
open-source all their new products / initiatives, so you could say that
they have embraced the Web 2.0 business model. The same goes to a
certain extent for IBM, who are actively participating in OS
initiatives, although they still make a lot of money on licences for
WebSphere for instance.<br /><br />IT companies should ask themselves two key questions:<br />
<ol>
<li>Should
we adopt the Web 2.0 business model (in which the open source business
model fits to a certain degree), where we give up something expensive
but considered critical, hoping to get something valuable for free that
was once expensive? (take into consideration that this could be a huge
risk!)</li>
<li>If we do so, just exactly <em>what</em> should we give
up? Is open-sourcing one of our products (maybe all?) enough, or should
we come up with something different, as more and more companies are open
sourcing their offerings?</li>
</ol>
This in fact could be one of the
main challenges for CEOs and CIOs of IT companies / vendors in the
next couple of years, and it goes beyond the open source discussion.pr1http://www.blogger.com/profile/10015074312618595113noreply@blogger.com0