Popular model helps meet most user needs

BUSINESSES everywhere, particularly in the hi-tech industry, are under mounting pressure from rivals and must sharpen their ability to react.

The winners are those who are able to provide more and better information faster.

Over the years, computing technology has been evolving to support these needs.

First, some of the power was moved out of the central mainframe systems to mini-computers, then to networks of personal computers (PCs).

The reality today is that most companies are operating with a mix of old and new systems - many of which do a valuable job, but cannot easily communicate with each other.

'With the advent of the PC in the early 80s, people became witness to a new philosophy, one computer talking to or relating to or communicating with another computer over a network. And that has become known as client-server [computing],' said Brooks Freeman, general manager of client-server computing for IBM World Trade Asia's Asia-Pacific server division.

There is nothing mysterious about client-server computing. In fact, it is another name for business computing today. But it opens the door to multiple benefits for a company and its staff.

Client-server computing is a structure that splits the applications, data or services between two or more systems. The system that initiates the work is called the client. The server responds to the client's requests and performs some or all of the work. Clients and servers can be any size of computer. Clients can be PCs, notebook computers or even wireless devices.
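The request-and-response pattern described above can be sketched in a few lines of Python using its standard socket library. The port number, the threading setup and the trivial "service" (the server merely uppercases the text it receives) are illustrative assumptions, not details from the article:

```python
# A minimal sketch of client-server computing: the client initiates the
# work, the server responds to the request and performs it.
import socket
import threading

ready = threading.Event()  # signals that the server is listening

def run_server(port):
    """The server waits for a client's request and performs the work."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    ready.set()                                # safe for a client to connect now
    conn, _ = srv.accept()
    request = conn.recv(1024).decode()         # receive the client's request
    conn.sendall(request.upper().encode())     # do the work, return the result
    conn.close()
    srv.close()

def client_request(port, text):
    """The client initiates the work by sending a request to the server."""
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(text.encode())
    reply = cli.recv(1024).decode()
    cli.close()
    return reply

# Start the server, wait for it to listen, then let a client ask for work.
t = threading.Thread(target=run_server, args=(9090,))
t.start()
ready.wait()
reply = client_request(9090, "quarterly sales")
print(reply)
t.join()
```

In a real installation the two halves would run on different machines, and the client could equally be a notebook computer or a wireless device, as the article notes.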

'Often a server will be a powerful PC linked to a network of less powerful PCs on users' desks. Also, the server could be a large mainframe serving a network of clients that might include other mainframes as well as groups of PCs,' said Casey Poon, managing consultant of Hewlett-Packard Hong Kong's professional services organisation.

Many organisations run a number of essential, but isolated computer systems that do not communicate with each other. The companies do not want to lose the value of their investment in these systems, but they want the data to be made available to those who need it.

To solve the problem, the information technology (IT) industry has developed a set of rules to provide a channel through which these systems can connect. These are 'open system standards'.

'Organisations need to be able to develop applications that support the way they work today, but can change to meet new needs tomorrow. Also, they want to be able to provide better information and computing capability so that their people can work more productively,' said Albert Li, director of software business at Digital Equipment.

Many in the industry have their own definition of client-server computing, but most agree that it covers three main functions.

Presentation, which is the 'client' end of the structure, including the visual display of information on the screen and the means by which users interact with the system.

Application, which covers the 'server' area, including the business logic of the application (what users actually do with the data).

Data, which is the storage and transmission of information.
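The three functions can be pictured as separate layers that hand work to one another. The sketch below is a loose illustration in Python; the class names, the order numbers and the in-memory stand-in for a database are all assumptions for the example, not details from the article:

```python
# A sketch of the three functions of client-server computing as layers.

class DataLayer:
    """Data: the storage and retrieval of information."""
    def __init__(self):
        # A dictionary standing in for a real database on a server.
        self._orders = {"A-100": 3, "A-200": 7}
    def fetch(self, order_id):
        return self._orders.get(order_id, 0)

class ApplicationLayer:
    """Application: the business logic applied to the data."""
    def __init__(self, data):
        self.data = data
    def units_on_order(self, order_id):
        return self.data.fetch(order_id)

class PresentationLayer:
    """Presentation: how information is displayed to the user."""
    def __init__(self, app):
        self.app = app
    def display(self, order_id):
        return f"Order {order_id}: {self.app.units_on_order(order_id)} units"

# In client-server computing, each layer could live on a different machine:
# presentation on a desktop PC, application and data on one or more servers.
ui = PresentationLayer(ApplicationLayer(DataLayer()))
print(ui.display("A-100"))
```

Here all three layers run in one program, which corresponds to the traditional single-computer model the article describes next; distributing them across machines is what client-server computing adds.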

In traditional computing, a single computer performed all of these functions, but often not the way the users wanted, Mr Freeman said.

Client-server computing distributes these functions so that they are shared among a number of computers, introducing flexibility and choice.