Feature: Customer intelligence – Mining the web

According to Simulation, Modelling and Intelligent Agents, a recent article from Accenture: “Exploiting available data about customers, and making informed decisions with a better understanding of their ramifications are two of the key differentiators for organisations in the current marketplace”.

Marketing intelligence

Internet server logs were originally used to monitor server operation, but they are now scrutinised for the information they contain about customer behaviour. Such data can reveal details of a customer’s movements around a site, down to individual mouse clicks, as well as any personal details the customer enters. Web mining lets e-businesses put the personal data they collect to maximum effect by combining it with operational data, and the resulting customer profiles amount to some of the most detailed marketing intelligence yet developed.

With detailed customer profiling, enterprises can improve the service they offer by identifying and targeting users directly with personalised offers and promotions. The web mining process adds strength to every arm of electronic commerce, improving customer support, informing service strategy decisions, and ultimately making radical changes to the user experience.

Website metrics

Many firms recognise that understanding their users is crucial if they are to make their sites more responsive, easier to navigate, and capable of presenting relevant products to the right audiences.

There are several methods of data collection within a web environment. Some are more useful than others, but to get a full picture it is best to combine data from several sources. HTTP server log analysis involves the study of the log files produced by web servers. Every web server vendor provides logging capabilities with its products, typically generating a log entry for each HTTP request that hits the server. Each entry contains detailed information that can be broken down into basic metrics such as the number of hits, the number of visitors, visit duration, visitor origin, visitor IP address, platform, and browser type and version.
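To make those metrics concrete, here is a minimal sketch that parses log lines in the widely used Combined Log Format (an assumption; your server may log differently) and tallies hits, unique visitors, referrers and browsers. The file name, and the use of IP addresses as a stand-in for visitors, are illustrative only.

    # Minimal sketch: parse web server logs in the (assumed) Combined Log Format
    # and derive the basic metrics mentioned above: hits, unique visitors,
    # visitor origin (referrer) and browser string.
    import re
    from collections import Counter

    # One Combined Log Format line:
    # host ident user [timestamp] "request" status bytes "referrer" "user agent"
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
        r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
    )

    def basic_metrics(log_lines):
        hits = 0
        visitors = set()          # approximated here by distinct IP addresses
        referrers = Counter()
        browsers = Counter()
        for line in log_lines:
            match = LOG_PATTERN.match(line)
            if not match:
                continue          # skip malformed entries
            hits += 1
            visitors.add(match["ip"])
            referrers[match["referrer"]] += 1
            browsers[match["agent"]] += 1
        return {
            "hits": hits,
            "unique_visitors": len(visitors),
            "top_referrers": referrers.most_common(5),
            "top_browsers": browsers.most_common(5),
        }

    with open("access.log") as f:    # hypothetical log file name
        print(basic_metrics(f))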

Server monitors also produce useful data, gathered from the web server through a special API (Application Programming Interface). The server monitor can get hold of unique visitor IDs, referrer pages and other specifics, but it must be very carefully managed if it is not to crash the web server.

Network monitors can see client requests, server responses, cookies and HTML files. They can also pinpoint stop requests, which are generally issued by the browser when a page is taking too long to load, and measure the web server’s response time to different requests. Network monitors differ in sophistication, but some are capable of capturing the form data a visitor transmits when hitting a submit button.

I’ve got all this data. Now what?

Data on its own is not information. It takes a great deal of insight, understanding and skill to extract meaning from the confusing jumble of figures created by your system. Fortunately, cheap computing power and recent developments in specialist website analytics tools make this process easier. Intelligent agent technology is also making it possible to explore techniques that look set to revolutionise the way we deal with large volumes of data.

Old technology comes into play here too, most notably the relational database, which makes comprehensive analysis of your data possible.

You must make sure that your web mining project is able to integrate the many different types of website data at your disposal. If you choose to use a traffic analysis service then make sure it has the flexibility to combine different types of data. The ultimate aim is to create a data warehouse to which new data can be continually added. The system must be relational so that you are able to view the data from multiple perspectives and query the complex relationships it may contain.

As far as data collection is concerned, a combination of visitor metrics and customer data will prove the most powerful, allowing you to make associations between actual revenue data and visitor actions. Basic analysis of web metrics will give you some useful clues, but a more sophisticated analysis will involve setting up more detailed or more specific capture techniques.
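As a rough illustration of that combination, the sketch below loads visitor sessions and order records into a small relational store and queries revenue and conversion by traffic source. SQLite is used purely for convenience, and the table and column names are assumptions rather than any particular product’s schema.

    # Rough illustration only: joining visitor metrics with customer revenue data
    # in a relational store. SQLite and the table/column names are assumptions.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE sessions (session_id   TEXT PRIMARY KEY,
                               visitor_id   TEXT,
                               referrer     TEXT,
                               pages_viewed INTEGER);
        CREATE TABLE orders   (order_id     TEXT PRIMARY KEY,
                               session_id   TEXT,
                               revenue      REAL);
    """)

    db.executemany("INSERT INTO sessions VALUES (?, ?, ?, ?)", [
        ("s1", "v1", "search-engine-a", 12),
        ("s2", "v2", "search-engine-a", 3),
        ("s3", "v3", "email-campaign", 7),
    ])
    db.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
        ("o1", "s1", 59.00),
        ("o2", "s3", 120.00),
    ])

    # Associate actual revenue with visitor actions: revenue and conversion by referrer.
    for row in db.execute("""
        SELECT s.referrer,
               COUNT(DISTINCT s.session_id) AS sessions,
               COUNT(DISTINCT o.order_id)   AS orders,
               COALESCE(SUM(o.revenue), 0)  AS revenue
        FROM sessions s
        LEFT JOIN orders o ON o.session_id = s.session_id
        GROUP BY s.referrer
    """):
        print(row)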

The results can sometimes be surprising. You may find, for example, that a high percentage of visitors leave the site during the purchase process, or that a high percentage of traffic is delivered from a particular search engine. The possibilities are endless, and enable radical changes to be made both to your website and your evolving marketing outlook.
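A finding such as checkout abandonment can be expressed very simply once per-session page paths are available. The step names below are hypothetical; this is a sketch of the calculation, not of any specific tool.

    # Sketch of one such finding: checkout abandonment rate, computed from
    # (assumed) per-session page paths. The step names are hypothetical.
    def abandonment_rate(session_paths):
        """Share of sessions that reach the basket but never reach confirmation."""
        started = sum(1 for path in session_paths if "/basket" in path)
        completed = sum(1 for path in session_paths
                        if "/basket" in path and "/order-confirmed" in path)
        return 1 - completed / started if started else 0.0

    paths = [
        ["/home", "/product/42", "/basket", "/checkout"],                     # abandoned
        ["/home", "/product/7", "/basket", "/checkout", "/order-confirmed"],  # purchased
        ["/home", "/search"],                                                 # never started
    ]
    print(f"{abandonment_rate(paths):.0%} of purchase attempts abandoned")    # 50%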

Web mining services

There are numerous third-party services or application service providers (ASPs) now appearing on the market that will happily offer to find the nuggets of gold in your data, and there are certain advantages to outsourcing the job. The ASP takes full responsibility for managing the databases and will regularly deliver web analytics data to customers in secure reports. Many can produce reports in real time, which means you can watch the live reaction to a change on your site.

Outsourcing will save you from securing and maintaining the hardware, investigating web analytics packages, and investing in expertise. But you may find that you are limited to a particular type of analysis and reporting. If you want to target individual customers, you may well need the flexibility of an in-house system. This will be more customisable, allowing reporting on particular website activities and the development of more sophisticated business intelligence techniques.

Early days

Customer retention is becoming a priority for growing clicks-and-mortar organisations, and in this respect web mining provides the ultimate opportunity for developing detailed information on customers’ buying patterns. Not everyone has realised its power yet, but the growing market for web analytics tools proves that the need is real. Companies with the vision and creativity to combine traditional data sources with web mining techniques will certainly be in the best position to optimise their sites for maximum commercial impact.

Brian Tomz, solutions manager for Surfaid

Surfaid, IBM’s web traffic analysis service, specialises in the collection and integration of very high-volume data and was used during the Sydney Olympics, where it managed more than one billion hits per day. Tomz explains the process: “Customers transfer web log information and potentially other data sources to the IBM Datacentre. Surfaid reconstructs web server hits into session groupings to ensure that specific records are associated with specific users, segments and actions. The information is then parsed into a relational database. Customers are able to access standard reports as well as an ad hoc query environment for comprehensive exploration and analysis.”
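The session reconstruction Tomz describes can be sketched roughly as follows: raw hits are grouped per visitor and split wherever there is a long gap in activity. The 30-minute inactivity timeout is a common industry convention, not a documented Surfaid parameter, and the data layout is assumed.

    # Hedged sketch of session reconstruction: group raw hits into per-visitor
    # sessions, starting a new session after a long gap in activity.
    from collections import defaultdict

    SESSION_TIMEOUT = 30 * 60  # seconds; a common convention, not Surfaid's setting

    def sessionise(hits):
        """hits: iterable of (visitor_id, unix_timestamp, url), in any order."""
        by_visitor = defaultdict(list)
        for visitor, ts, url in hits:
            by_visitor[visitor].append((ts, url))

        sessions = []
        for visitor, events in by_visitor.items():
            events.sort()                     # order each visitor's hits by time
            current = [events[0]]
            for prev, nxt in zip(events, events[1:]):
                if nxt[0] - prev[0] > SESSION_TIMEOUT:
                    sessions.append((visitor, current))   # gap too long: close session
                    current = []
                current.append(nxt)
            sessions.append((visitor, current))
        return sessions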

According to Tomz, the careful analysis of website data enables companies to experiment with changes to their sites. “They are continually reinventing their sites to meet strategic objectives,” he says. A new website marketing campaign, he explains, produces immediate tangible results which allow you to ask such questions as: “How many first time visitors did my campaign generate?” and “Of those people how many purchased?”

The data used to answer these kinds of questions is playing a bigger role in determining business strategy. “It is used to solidify business cases by showing trends to investors. Many customers require traffic metrics as justification of growth and success to their business partners,” explains Tomz. However, he adds: “Very few companies today are truly utilising the data available to them to make better website business decisions. Unfortunately, even fewer have a grasp of how visitors behave on their website and how that correlates to sales, content viewing and site applications.”

Rodrigo Dauster, director of business strategy at Metrius

Metrius, a consulting firm owned by KPMG, provides e-business expertise to help companies remain competitive in the “networked economy”. According to Metrius, there is no “new” economy, “but there has been a profound and pronounced change in the way people, businesses, brands and resources relate to each other”.

It’s in this context that Dauster explains the art of gathering web traffic data. “You need very specific business objectives for your website. You need to know what you are looking for so that you capture the relevant detailed information,” he says. Many companies wrongly believe that all they need to do is collect data from their log files and then mine it later. “Data does not necessarily come in a form which is addressable,” he says. “Basic web metrics can give you clues, but to really gain from your data you need to have a hunch or a hypothesis. Then you can test it. But the data should be captured with an issue to be addressed in mind.”

He concludes: “Companies need to build a business team whose purpose it is to analyse web traffic, create prototypes and test potential solutions. This testing can be done live, but should really be tried out on user focus groups.”
