IT strategy – Oracle perfects the art of the U-turn

The IT industry is well known for its U-turns. But Oracle has taken the art of the hasty retreat to new levels, with its condemnation of a decade of client/server computing as a “colossal mistake”.

Claiming that a significant IT movement – one that it helped to pioneer and from which it made a massive fortune – was an aberration is confusing and unhelpful to users. Oracle et al may be eager to move to the next wave of lucrative technologies, but many companies have spent years moving to a decentralised computing model and feel they have only just achieved their goal.

Speaking at the US Oracle users’ group in April, chief executive Larry Ellison said that he and the rest of the industry made a “colossal mistake” distributing data across multiple systems. He called for a move back to a centralised environment.

He said: “Oracle itself has 70 separate HR databases across the globe all of which have to be backed up. We’ve taken complexity and put it on users’ and consumers’ desktops. This fragmentation is killing us (the business world) because we can’t know what’s going on in our business.” So now he tells us …

Not that Oracle wants us to return to the traditional mainframe. Predictably enough, Ellison believes that the model for the next century is centralised databases accessed over the Internet. Good for Oracle, which largely sells central server tools; bad for its enemy Microsoft.

Client/server required a hefty desktop machine with many applications and data sources; Internet computing – in theory at least – requires only a simple access device which could even be a mobile phone.

But another U-turn lurks in this message – three years ago Oracle all but invented the concept of the network computer, a cut-down PC-type device that would access data and download applications over the Internet rather than storing them locally. Then last year Ellison was backtracking rapidly from the NC – only to return to it now with his claims that the thin client/fat server/Internet model will cut IT costs, boost efficiency, improve customer service and so on. In fact, all the same arguments that were used to convince users that client/server was so much better than mainframe computing.

Of course Oracle isn’t the only company guilty of these twists and turns in strategy. Nearly all IT suppliers go through them – and to an extent they are justified. The business world changes rapidly, and so do its IT needs. Suppliers are always being forced to think up new ideas to meet new challenges and to keep themselves competitive. Some of these are bound to conflict with earlier approaches.

But where the situation becomes murky is when the changes in strategy are dramatic – but serve to sell more of basically the same equipment, software and services for the supplier. Hence Oracle’s move away from client/server will still keep the focus on its core database business, while forcing its rival Microsoft to move to less lucrative areas of business (for client software providers, cheap or free Internet-oriented applications make far less profit than heavy-duty client/server ones).

And users move far more slowly and cautiously than vendors. To listen to the suppliers at the start of the client/server craze, you’d have thought a revolution was occurring and everyone would have distributed their data within a year. In fact, many sites have only just achieved this, and many others have never moved away from the mainframe. Others will embrace the Internet structure, but are likely to move to it gradually, aware of the problems of introducing fairly untried technology that relies on networks that, in many cases, are inherently unreliable.

The consultancies have an important role here – to act as interpreters between the cynical marketing moves of the industry and the business strategies of the users. While the users want to adopt the best of new technology if it helps their business, stability in the infrastructure generally allows companies to be more creative in how they carry out core activities. Oracle and its counterparts would do well to remember this before they annoy their customers one time too many.

Rich Scheffer, vice president of corporate communications at Forte Software, writes:

Client/server was very significant in its time since it broke the dependency on host-based computing, removing IBM’s stranglehold and facilitating open systems. It allowed Microsoft to become dominant. It was also the driver behind the success of the relational database vendors, in particular Oracle. As client/server grew, the market came to better understand the possibilities of application partitioning, distributed systems and multi-tier architectures. As application servers and web servers arrived, so did the idea of the thin client.

If it weren’t for the acceptance of the client/server model, the uptake of the web would have taken much longer. To Microsoft’s great fortune, and Larry Ellison’s disappointment, the dumb terminals of the previous generation simply cannot meet the needs of an end user, however much intelligence is dished up from the servers. Today, client/server is still appropriate for some applications. To imply that it was a mistake is truly foolish.

New technologies such as client/server and the Internet simply increase the options available for building optimised solutions. At the same time, there is a growing recognition of the need for multiple applications and architectures to co-exist and co-operate. This has led to a focus on integration. The mainframe was pronounced dead prematurely: the same is true of client/server. Modern computing will use all models – when appropriate.

Caroline Gabriel is a group editor in VNU’s IT portfolio
