The future of bandwidth: fast, instantly available point-to-point pipes are coming …
As we finish 2000, bandwidth is anything but abundant. Most consumers in the UK still struggle with 56Kbps dial-up modems, while small-to-medium-sized corporates are left fretting over the cost of upgrading from ISDN to a 1Mbps leased line. Short-term partnering with players in other regions, and sharing large quantities of data with them, is a nightmare. Today, the favoured solution to this problem is to put executives on planes with data on disks or tapes in their briefcases.
However, things are really happening to change the rules of the bandwidth game. For a start, the Optical Internet came to life big time in 2000, though you had to be a big player to notice. As Chris Champion, marketing director for the optical equipment manufacturer Corvis Corporation, notes, there is already a huge amount of dark fibre in the ground between tier-one cities around the globe. Advances in laser technology mean that truly staggering amounts of data can now be shoved down a single fibre-optic pair.
Champion points out that Corvis currently has equipment that will allow a telco to send 10 gigabits of data per second down each of 160 separate wavelengths on a single fibre-optic pair (each wavelength is called a lambda). Nortel and Lucent are both experimenting with equipment that can transmit data at 40Gbps and 80Gbps per lambda. The multiples here are enormous.
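The arithmetic behind those multiples is easy to check. A back-of-the-envelope sketch using the figures quoted above (the 500GB data set and transfer-time comparison are illustrative, not from any vendor):

```python
# Capacity of one fibre-optic pair using the figures quoted above:
# 160 wavelengths (lambdas) at 10Gbps each.
GBPS_PER_LAMBDA = 10
LAMBDAS_PER_PAIR = 160

total_gbps = GBPS_PER_LAMBDA * LAMBDAS_PER_PAIR
print(f"Capacity per fibre pair: {total_gbps} Gbps ({total_gbps / 1000} Tbps)")

# Illustrative only: moving a hypothetical 500GB data set over a single
# 10Gbps lambda versus the 56Kbps modem most consumers use today.
data_gigabits = 500 * 8  # 500 gigabytes expressed in gigabits
print(f"Over one lambda: {data_gigabits / GBPS_PER_LAMBDA:.0f} seconds")
print(f"Over a 56Kbps modem: {data_gigabits * 1e9 / 56_000 / 86_400:.0f} days")
```

Which is, of course, why executives currently fly with tapes in their briefcases.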
As he points out, however, it is one thing to have all this capacity in the ground and at the core of long-haul, pan-European carrier backbone networks. It is quite another to be able to give corporates access to wavelengths, or portions of wavelengths, “on demand”. For most carriers today, provisioning bandwidth for new corporate clients involves physical intervention in the network: engineers have to go out and add cards to tele-hub sites along the route, and the process can take anything from weeks to months.
In 2001 we will see a lot more dynamic provisioning, he says. Equipment made by Corvis and by other vendors such as Sycamore allows the carrier to literally “point and click” their way to provisioning new point-to-point connections for customers. New-age telcos such as Storm Telecommunications are looking to enable customers to add and tear down connections at will, via consoles placed at the client’s site. This is a wholly different model for buying communications infrastructure. In a point-and-click world, not only will communications become cheaper, the links will be instantly available – facilitating all kinds of multi-party collaborative teamwork on short-term contracts.
The future of the mobile device: speech recognition is key
Today we have a variety of task-specific devices, from mobile phones to PDAs, palmtop computers and sub-notebook PCs. As Nick Martin, sales director at Psion, points out, the forms are converging and morphing one into another at an alarming pace, and no one knows quite what the most popular format will be by the end of 2001. However, he predicts that mobile devices of one sort or another will rapidly become one of the favourite ways of accessing and sending data through 2001.
Adam Jollans, European marketing manager for IBM Software, makes the obvious point that devices are going to greatly outnumber PCs – and a good proportion of those devices will be Internet-enabled. In fact, as long ago as March this year, IDC predicted that before the end of 2002 more people would be connecting to the Internet via wireless technology than via fixed-wire links. Its best guesstimate is that there will be over 700 million wireless users sending and receiving data over the Internet, versus just over 500 million fixed-wire connections.
The key to a massive take-up of devices, all the experts agree, is not just faster wireless connectivity. The real trick is to give people a way of escaping the clutches of keyboard-driven communications, which means good speech recognition. Size is the other obvious issue for mobile devices.
Getting rid of the need for a keyboard for serious data manipulation frees device vendors to focus on a variety of solutions to the problem of combining a good viewing screen size with a compact shape. The tablet shape currently favoured by Compaq with its iPAQ and by Palm with the Palm device family is neat, portable and very viewable, but as things stand right now it arguably suffers badly from the absence of speech-to-text capabilities.
2001 will see a fierce battle among competing operating systems for control of the device market. Already there are three major contenders: the Symbian operating system (formerly known as EPOC), with backers like Nokia, Ericsson and Psion; the Palm OS, from 3Com’s spun-off Palm division; and Microsoft’s PocketPC. Users probably won’t care too much about which OS wins the largest market share, so this battle will be largely subterranean – both of Microsoft’s competitors have a vested interest in enabling their operating systems to exchange files with Microsoft Office applications anyway.
What we can expect to see as these devices multiply is the real start of mobile commerce, with people buying and selling direct from their mobile phones. The implications of this for traditional business processes are immense.
The future of CRM – contact management in the one-click-and-I’m-gone age
It is no accident that the CRM specialist vendor, Siebel, is currently one of the fastest growing companies in the US and that around 50% of its sales are repeats to existing customers. The more companies get involved with e-business and e-commerce, the more they realise the truth of Siebel’s favourite maxim, that customer loyalty is fast becoming the only differentiator between competing companies. Personalisation is the key to winning hearts and minds in cyber commerce, and you can only have personalisation if you are able to gather data on your customers and if you can use that data to make meaningful decisions about what to show or offer to that customer.
Jason Haine, Arthur Andersen partner in charge of the Advanced Technology Group, points out that one of the defining features of the customer connection, as we move into 2001, will be its “always on” nature.
This is true of ADSL fixed line connections, and it is also true of the new GPRS wireless connectivity. CRM applications will need to take account of this “always on” world, in terms of the opportunities this creates for pushing messages to customers, and of the fact that customers will doubtless start deploying smart agents on their devices and PCs to filter out unwanted junk messages.
Another major factor, as Fredrik Uddegard, mobile product specialist for EMEA at Siebel, notes, is that CRM applications will have to be able to encompass the requirements of collaborating parties to track mutual customers and lead opportunities. Uddegard sees the mobile arena as one where there will be all kinds of collaborative “virtual” consortia. CRM applications in this space have to allow for distributed databases where data entered by one partner or captured by their systems is automatically shared by all the parties, according to whatever business logic and security rules the consortium has put in place.
At the same time, he says, CRM vendors have to respond to the new voice centric channels that converged mobile devices make available to corporates.
Siebel Voice allows users to speak their requests via a mobile phone, for example. The CRM system will then go off and execute the requests on the database and read the answers back to the user. Of course, there are going to be some huge integration issues around inserting multi-language voice servers into existing systems. Siebel hasn’t quite cracked 100% integration of its voice product yet, but total integration is an important future goal, Uddegard says.
The future of wireless technologies: changing civilisation as we know it?
All the major GSM network operators are due to roll out GPRS wireless services in 2001, giving mobile users the equivalent of a dial-up modem connection to the Internet. Anyone who thinks this won’t have much of an impact on the way people work and play, or on life as we know it, is in for a surprise.
However, Richard Lanyon-Hogg, IBM principal consultant for mobile e-business, points out that wireless may get off to a slow start, since most handset vendors have warehouses crammed full of non-GPRS handsets they would dearly like to shift before the new technology gets going. That said, once it starts to roll, wireless data access will be an unstoppable force, he argues.
“You have around 80 UMTS (third generation) wireless licences being awarded across Europe and hundreds of broadband fixed point wireless licences being issued – over 33 in the UK alone. Wireless is going to be massive.”
Having paid billions for the privilege, GPRS wireless operators are going to have to devise new business models to push revenue returns to the requisite levels. One model he suggests is for consortia of retailers, say in the white or brown goods area, to get together with a bank as underwriter, and with a wireless operator to literally buy users.
“You could offer a lump sum £1,000 payment to users who sign up to spend a certain sum a month with the consortia (for goods they would buy anyway from somewhere). If the user fails to meet the spending limits the ‘gift’ would turn into a loan, which is where the bank comes in – it’s just an idea but it shows how radically new business models will flourish in this new environment,” he comments.
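The gift-or-loan rule Lanyon-Hogg describes is simple enough to sketch. A purely illustrative version (the £150 monthly commitment is a made-up figure; the scheme itself is only an idea):

```python
# Hypothetical sketch of the "buy the user" model described above:
# a £1,000 up-front gift that converts into a loan, underwritten by
# the bank, if the user's monthly spend with the consortium falls
# short of the agreed sum.
GIFT = 1000               # up-front payment to the user (£)
MONTHLY_COMMITMENT = 150  # agreed spend per month (£), illustrative

def gift_or_loan(monthly_spend: list) -> str:
    """Decide how the up-front £1,000 is treated at the end of the term."""
    if all(spend >= MONTHLY_COMMITMENT for spend in monthly_spend):
        return "gift"  # commitment met: the consortium absorbs the cost
    return "loan"      # shortfall: the bank underwriter steps in

print(gift_or_loan([150, 200, 175]))  # commitment met every month
print(gift_or_loan([150, 90, 175]))   # one short month converts the gift
```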
Somewhat more conventionally, Sue Witteveen, vice president of mobile communications at the US mobile services specialist 724 Solutions, points out that the “always on” wireless device makes it possible for portal owners to push all kinds of alerts and services out to users. From the user’s perspective, getting bleeped by your mobile device because your boss needs you, your shares have just moved by a pre-specified percentage, or that important e-mail you were waiting for has just been downloaded will become commonplace, she says. Location-based services which respond, say, to your preference for Italian food by activating your device to tell you about the Italian restaurant three doors down the street will also be common enough to need beating off with a stick.
Lanyon-Hogg says that this kind of environment will encourage the sprouting of intelligent software agents of all kinds, from smart filters to keep out unwanted spam, to shopping agents adept at trawling the Internet for bargains. Expect 2001 to be the year smart agents might just start to become interesting …
The future of ASPs: will 2001 be the year they start to actually make money?
While 2000 saw the rise of the Application Service Provider model, it saw precious few ASPs making a profit. Will there be an ASP bandwagon in 2001, or will the rental phenomenon go the way of the 1960s bureaux?
According to Arthur Andersen partner Jason Haine, we can expect the ASP model to undergo something of a shift in 2001 towards a blend of services and software. “Already the software and the services sectors are realising that companies want a total solution, which combines the recommendations and the software, all bundled up for an agreed fee. It is going to be very difficult to sell software without providing a service element.”
Haine reckons that the ASP model will play out against a background of corporates distinguishing between two different types of processes: those that provide critical competitive advantage and those that are routine, commodity processes. “The former will get the lion’s share of the budget, the latter will be shipped out to ASPs to deliver for a flat fee. As corporates start making this distinction between core value-added and non-core commodity activities, ASPs will be the beneficiaries,” he says.
However, ASPs themselves are going to have to think carefully about their business models. No one is going to pay a premium for quality back office processes or office productivity applications. Clients are going to expect quality as well as a substantial price advantage, and making decent returns is going to be tough, he says.
Kenny Fraser, a director in PwC’s Global Risk Management business, is more sceptical about the prospects of the ASP model in 2001. “I would be surprised if they really became popular that quickly. Reliability of bandwidth will continue to be an issue. If a company is running a simple, in-house accountancy package, it just runs and runs and virtually never breaks down. If that company starts renting the software down a link to an ASP, and an invoice takes 10 minutes to enter because of contention on the line, that model is going to struggle to fly,” he notes.
According to Fraser, the most likely candidate for success in the short term is e-mail. Already there are several ASPs offering a price-per-seat service for Exchange, for example. “E-mail servers are typically connected over the web anyway, so there is no real difference in taking this service from an ASP. Companies tend to feel as if they are on familiar ground, and it saves them having to run Exchange servers in-house,” he says.
The future of the enterprise: will 2001 favour flatter structures, more virtualised companies and agile processes?
The market trends watcher Gartner announced recently that the major economies have already moved beyond e-business and into something it calls collaborative commerce. Gartner analyst Bruce Bond says c-commerce is all about extending beyond the walls of the enterprise. “C-commerce achieves dynamic collaboration among internal personnel, business partners and customers throughout a given trading community or market.” In this new expanded, multi-party, collaborative arena, Gartner predicts, enterprises will look to harness the power of the Internet to gain revenue and profit improvements by going beyond supply chain models and information sharing.
“The evolution to c-commerce enables a dynamic ‘virtual enterprise’, something that has been forecast for years but could never come about because there was little technology to support it,” Bond says. Now, with the Internet, component technologies and the likes of XML and XSL, it is becoming much simpler to stitch links between different corporate systems.
In the real world, though, things move at different speeds, and not all parts of the economy are going to be able to live up to the Gartner model instantly. Hubert Tareieu, the Sema Group senior vice president responsible for pushing innovation within the group, sees one of the really crucial challenges of 2001 playing out inside the financial services arena, where the core business systems of almost all the major players have a distinct feel of yesterday’s technology about them.
“What you have to realise is that the Internet and Internet banking present conventional banking with some extraordinary challenges by forcing all transactions to take place in real time,” he says. One of the reasons for the disappointment with first-generation Internet banking, he says, is that the core systems inside banks have simply not been capable of delivering the full range of their services instantly over a real-time medium like the Internet.
However, he reckons that Sema and others are now rolling out second generation banking services for their clients and that the change will be dramatic.
Fund movements will take place in real time, not when overnight batch transaction systems have taken their course. Deals have to be done with zero latency, and while this is easy to say, it requires a huge investment and rethink of core systems.
“We saw the big incumbent telecoms operators showing a great deal of courage when they decided to forget about trying to extend their fixed line billing systems to embrace mobile billing, and instead set about developing wholly new billing systems for this new service. Very few banks have been prepared to be this dramatic, and 2001 will see some very interesting developments as second generation Internet banks emerge with zero latency services,” he predicts.
The future of hardware: what will 2001 bring?
Among all the hoopla over the new device age, it is easy to lose track of the fact that the desktop PC took a substantial step forward at the back end of 2000. The launch of Intel’s Pentium 4 at clock speeds of 1.4GHz and 1.5GHz, running some 45% faster on graphics-intensive tasks than a 1GHz Pentium 3, was clearly a nice step forward in desktop power. The next-generation P4 product, code-named Foster, is scheduled for 2001 and will use Intel’s yet-to-be-released 860 chipset to provide dual-processor capabilities for the workstation market.
However, what is far more significant is that Intel is already talking about shifting its production from the 0.18-micron technology underpinning the present generation of P4 CPUs to a 0.13-micron technology. As Premysl Starovesky, a consultant with Intel’s Workstation division, notes, this gives Intel plenty of room to take the P4 through the same speed development gains as those demonstrated by the Pentium since its launch as a sub-75MHz processor in 1993. Clock speeds of 5GHz and 10GHz lie in front of us in pretty short order; Intel is already talking about a 2GHz P4 by the third quarter of 2001.
However, even Intel is not invulnerable to left-field moves, as Transmeta Corporation demonstrated with its own new departure in 2000, the Crusoe chip aimed at mobile applications. As Transmeta director of brand development Frank Priscaro told a web radio audience in November 2000, his company is picking up mobile orders from major manufacturers, including Sony and Fujitsu. The reason it is winning business from Intel is that Crusoe’s unique processor design, which combines hardware and software elements, runs much cooler than Intel’s chips and uses less than half the battery power. In the mobile world, battery life is everything. Transmeta has plans for a processor that is twice as fast as its current one, at half the power, by 2002. As Priscaro notes, there is no reason why Transmeta’s hardware-and-software approach to CPU design can’t move into both the mobile device and the fixed desktop or server arenas over time. This will be one to continue to watch through 2001.
Graham Sands, PwC assistant director in technology, media and telecoms, reckons that as we move through 2001, the limitations of conventional chip design will start to be felt. Sands believes the time is right for a number of small start-up companies with radically different chip architectures to start to make an impact. “What we are seeing is a number of new start-ups doing very new things, with designs that can handle very high data processing rates,” he says. These start-ups are often highly focused on specific industry applications, and their products sit in the middle ground between conventional architectures and Application Specific Integrated Circuits (ASICs). “If we do get a real step change in processing capabilities, it will probably come from these new-generation, fabless players, but their impact will probably not be noticeable yet in the context of the wider IT market during 2001,” he comments.
The old operating system battlegrounds: whither Linux, Unix and Windows in 2001?
Whatever strides the world takes towards wireless technologies in 2001, the corporate data centre, be it hosted or run in-house, is going to continue to be at the heart of what runs on the wireless networks. The long running struggle between proprietary Unix operating systems, Windows and the new contender, Linux, will continue to dominate in this space.
As Microsoft Windows 2000 product manager Mark Tennant points out, from a pure benchmark perspective, Windows 2000 Datacenter Server, launched in 2000, is already performing to the same levels as, or better than, the top-end Unix boxes. Hardware like the Unisys 32-way server is already a Windows mainframe, to all intents and purposes. “We used to have the likes of Sun quoting benchmarks at us all the time to prove that we were not as scalable, or as reliable, as their hardware. Now they’ve gone very quiet on that front,” he notes.
Adam Jollans, European marketing manager for IBM Software on Linux and Windows 2000, reckons, not surprisingly, that 2001 will see corporates roll out Windows 2000 on the server in a major way. “Through 2000, adoption was mostly restricted to the desktop and to mobile computing, as corporates sought to take advantage of the greater reliability of the Windows 2000 code and its better power management capabilities. Many companies held back on server deployments, though, and we will see them coming forward with implementations and rollouts through 2001.”
He points out that one of the weaknesses of NT in the past was that any application or driver running in kernel mode could trample all over other parts of the operating system, creating the dreaded NT “blue screen of death” effect. With Windows 2000, Microsoft’s practice of carrying out better testing and then digitally signing device drivers has already made for substantial gains in reliability.
However, Jollans adds that it is important to understand that as far as corporates are concerned, discussions over which platform to standardise on now take place in a radically different context to that which pertained a few years ago. “We are no longer in a world where companies can think only about what happens inside their enterprise. What we now have is a massive, collaborative network where companies have to be prepared to have their systems work with all kinds of other systems. Linux, for example, now looks like having the same order-of-magnitude market share as Windows as we go forward.”
The main drivers in the enterprise space through 2001 will be reliability and openness. Architecting for heterogeneous environments, and skills in component technologies such as Enterprise JavaBeans and XML, are already more important than which operating system a company chooses, he concludes.
Solving the local access bottleneck and opening up the home
The shenanigans of BT’s underwhelming efforts at opening up the local loop to competitors notwithstanding, 2001 will be the year the UK starts using Digital Subscriber Line technology in earnest. This, combined with wireless data access at speeds around 56Kbps and fixed-point wireless ISP offerings at a shared 11Mbps, will go a long way towards easing the long-standing bottleneck in local access to the Internet. However, two other technologies, both involving set-top boxes, look likely to play an interesting role in 2001 as far as extending e-commerce services to the consumer is concerned.
The first is cable modem access, which promises megabit speeds rather than kilobit speeds direct to the home. The second is typified by a technology initiative from the Israeli-based company Integra5. As the company’s CEO, Dr Eyal Bartfeld, explains, the idea is to use speech as the basis for the viewer to interact with the TV set.
Integra5 sells not to the public but to the platform owner (the digital channel owner). Its solution aims to enable the operator to provide every subscriber’s home with value-added, Internet-based communications services via a natural speech interface. Services include unified messaging, instant messaging, chat and rich text messages with video. Adding a speech interface opens up previously keyboard-intensive tasks, like sending faxes and e-mails, to non-typists, and makes it much more likely that ordinary consumers will feel comfortable using such services in their homes. Bartfeld sees his system as opening up gateways direct to the Internet, so users can shop and browse from home with their feet up, talking to the set. This style of interaction is much better suited to the living room than crouching over a keyboard. Consequently, this kind of technology will greatly speed the take-up of e-commerce, he argues.
The system offers users a “universal in-box” receiving e-mail attachments with video and other rich text. Faxes go straight into the viewer’s universal in-box and can be read directly on the TV screen; they can be zoomed in and out, rotated and printed. An SMS-like instant text messaging capability will let viewers send messages to friends as they watch television. Bartfeld envisages mobile phones being hooked up to the system as well, so that their messaging capabilities can also contribute to the universal in-box, for viewing on the screen.
“As far as we are concerned, rich text messaging is the killer application of the Internet. The TV has unrivalled graphics capabilities and, once you give it a natural speech interface, it is far more familiar for many people than interacting with a PC,” he says. According to Bartfeld, the company has already had a tremendous reception from cable TV operators and other platform owners across Europe.
The implications of interactive digital services for enterprise applications integration
There are currently more than 34 million households in the UK with at least one TV. The Government plans to shut down analogue TV by the end of 2010, so all these homes will have to migrate to digital if the Government’s plans go ahead on schedule. In fact, Datamonitor forecasts that nearly 10 million UK homes will be digital by 2005, and something like a fifth of that number can be expected to sign up for banking services via their digital TV.
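The arithmetic in that forecast is worth making explicit. Using only the figures quoted above (the one-fifth take-up rate is Datamonitor's forecast, not a certainty):

```python
# Rough arithmetic behind the digital-TV banking forecast quoted above.
uk_tv_households = 34_000_000  # UK households with at least one TV
digital_by_2005 = 10_000_000   # Datamonitor forecast for 2005
banking_take_up = 1 / 5        # "something like a fifth" sign up for banking

digital_banking_homes = int(digital_by_2005 * banking_take_up)
print(f"Digital share of TV households by 2005: {digital_by_2005 / uk_tv_households:.0%}")
print(f"Expected digital-TV banking homes: {digital_banking_homes:,}")
```

In other words, a market of roughly two million banking-enabled living rooms within five years.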
The growth of new rich media channels to market poses some severe challenges for systems integrators, resellers and ISVs. The implications of these channels are still largely unexplored as far as the newer forms, such as interactive digital TV, are concerned.
Shaun Doyle, chairman of the enterprise marketing and automation software specialist house, Intrinsic, reckons that most organisations and most IT shops have not even begun to grasp how huge the challenges are when it comes to building the back end systems that are going to be required to support rich media interactive marketing. He argues that many of these challenges will start to be felt through 2001.
“The first point to grasp is that with digital interactive services, everything happens in real time. If you embed interactivity inside an advert broadcast on a digital TV channel, you have to have systems that can respond instantly when the customer hits the button on his or her remote control,” he notes.
When you add even a minimal amount of personalisation to each advert, based on, say, the analysed viewing preferences of particular households or individuals, the amount of work for back-end systems goes up astronomically. “With digital TV, what you can do today, right now, is beyond the current perception, thought and expectations of most businesses,” he comments. The technology will allow an advertiser to run an ad about a new vehicle model, then have the viewer book their own test drive at a dealer local to them, all from the remote handset. Ads themselves will in future be personalised to specific households, with advertisers beaming multiple versions of ads to different homes.
“This is a nightmare for back end systems integrators. Anticipating the volume of responses when you beam multiple ads with interactive content is itself a huge issue. We already know some of the technologies that are going to be required, since the logistics and fulfilment demands associated with Internet e-commerce have prepared the way. People understand how to do the technical integration into back end fulfilment systems. But there is going to be a huge amount of value added work for systems integrators here,” he comments.