Monday, 24 August 2015

Introduction to Client Server Computing


Contents:

Client/Server computing
Mainframe-centric client/server computing 
Downsizing and client/server computing
Client/server development tools.
Advantages of client/server computing
Connectivity
Reduction in network traffic
Faster delivery of systems



Client/Server Computing

Client/server describes the relationship between two computer programs in which one program, the client, makes a service request of another program, the server, which fulfills the request.

Forces that drive client/server computing

The general forces that drive the move to client/server computing are:

  • The changing business environment.
  • The growing need for enterprise data access.
  • The demand for end user productivity gains based on the efficient use of data resources.
  • Technological advances that have made client/server computing practical.
  • Growing cost/performance advantages of PC based platforms.

Client/Server Architecture

The client/server architecture is based on hardware and software components that interact to form a system. This system includes three main components:
  1. Clients
  2. Servers
  3. Communication middleware
The client is any computer process that requests services from the server. The client is also known as the front-end application, reflecting the fact that the end user usually interacts with the client process.

The server is any computer process providing services to the clients. The server is also known as the back-end application, reflecting the fact that the server process provides the background services for the client process.

Communication middleware is any computer process through which clients and servers communicate. The communication middleware, also known as middleware or the communication layers, is made up of several layers of software that aid the transmission of data and control information between clients and servers.
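The request/response relationship between client and server can be sketched in a few lines. This is a minimal illustration, not a production design: the socket layer stands in for the communication middleware, and the one-line "protocol" is an invented placeholder.

```python
# Minimal sketch of the client/server request/response cycle.
# The socket layer plays the role of the communication middleware.
import socket
import threading

def server(sock):
    conn, _ = sock.accept()               # wait for a client connection
    request = conn.recv(1024).decode()    # receive the service request
    conn.sendall(f"served: {request}".encode())  # fulfill the request
    conn.close()

# Back-end (server process): listen on a local port for service requests.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))           # port 0 = let the OS pick a free port
listener.listen(1)
threading.Thread(target=server, args=(listener,), daemon=True).start()

# Front-end (client process): request a service and read the reply.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(listener.getsockname())
client.sendall(b"get customer list")
reply = client.recv(1024).decode()
client.close()
print(reply)  # served: get customer list
```

The same division of labor holds whether client and server run on one machine, as here, or across a LAN.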

The features of client/server computing include:
  • Distributed processing environment.
  • Distributed database environment.
  • Comprehensive communications networks.
  • Open Systems for information sharing.
  • Friendly User Interfaces.
  • Standardized communications protocols.
  • Shared responsibility.
  • Client processes request services.
  • Server processes provide services.
  • Clients and servers can negotiate the terms and conditions of service.
Mainframe-centric client/server architecture

Three approaches to organization of information sharing are:
  1. Mainframe-centric
  2. PC Server centric
  3. Client Server
Mainframe centric:
  • Use terminal emulators or hardwired terminals.
  • Non-GUI, proprietary interface.
  • Asynchronous.
  • Tight administrative control.
The features are:
  • Uses the presentation capabilities of the workstation to front-end existing applications.
  • The data is displayed or entered through the use of pull down lists, scrollable fields, check boxes and buttons.
  • The user interface is easy to use and information is presented more clearly.

Downsizing and client/server computing

Downsizing means replacing expensive mainframe computers with more cost-effective networks of personal computers that achieve the same or even better results.

The other potential benefits of downsizing are:
  • improved response time
  • decreased system development time
  • increased flexibility
  • greater control
  • implementation of strategic changes in workflow processes. 
Client/Server computing is open computing. Mix and match is the rule. Development tools and developmental environments must be created with both openness and standards in mind.

Client/Server development tools

The client/server development tools include:
  • GUI based development.
  • A GUI builder that supports multiple interfaces.
  • Object Oriented development with a central repository for data and applications.
  • Support for multiple databases.
  • Data access regardless of data model.
  • Complete SDLC (System Development Life Cycle) support from planning to implementation and maintenance.
  • Team development support.
  • Support for third party development tools.
  • Prototyping and Rapid Application Development (RAD) capabilities.
  • Support for multiple platforms.
  • Support for middleware protocols.
  • Multiple network protocol support.
The fundamental ideas underlying object-oriented technology (OOT) are:
  1. Abstraction
  2. Objects
  3. Encapsulation
  4. Classes and Instances
  5. Inheritance
  6. Message
  7. Methods

Abstraction
  • One form of abstraction is data abstraction.
  • Abstraction is "to represent the essential features without representing the background details."
  • Abstraction lets you focus on what the object does instead of how it does it.
  • Abstraction gives you a generalized view of your classes or objects by presenting only the relevant information.
  • Abstraction is the process of hiding the working style of an object and showing its information in an understandable manner.
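The points above can be illustrated with a small sketch: callers rely on what a shape does (it reports an area), never on how each kind of shape computes it. The `Shape`/`Circle`/`Square` names are illustrative choices, not from the original text.

```python
# Sketch of abstraction: the essential feature (area) is declared once;
# the background details live in the subclasses, hidden from callers.
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self):
        """What the object does; not how it does it."""

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2   # implementation detail

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2          # a different implementation detail

# Caller code works with the generalized view only.
total = sum(s.area() for s in [Circle(1), Square(2)])
print(round(total, 5))  # 7.14159
```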

Objects
  • An object is anything, real or abstract, about which we store data and those methods that manipulate the data.
  • It is a software package which contains related data and procedures.
  • An object type is a category of objects.
  • An object is an instance of an object type.

Encapsulation

Packaging data and methods together is called encapsulation. Its advantages are:
  • Unnecessary details are hidden.
  • Unintentional data modification is avoided, which provides security and reliability.
  • Prevents interference with the internals and also hides the complexity of the components. Thus, encapsulation is important because it separates how an object behaves from how it is implemented.
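A short sketch of the packaging described above: the data (a balance) and the methods that manipulate it travel together, and callers cannot accidentally corrupt the internal state. The `Account` example is an illustrative assumption, not from the original text.

```python
# Sketch of encapsulation: data and methods packaged together, with
# unintentional modification prevented behind a controlled interface.
class Account:
    def __init__(self, opening):
        self._balance = opening        # internal detail, hidden behind methods

    def deposit(self, amount):
        if amount <= 0:                # edit logic guards the data
            raise ValueError("deposit must be positive")
        self._balance += amount

    @property
    def balance(self):                 # read-only view of the internal state
        return self._balance

acct = Account(100)
acct.deposit(50)
print(acct.balance)  # 150
```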

Classes

A class is an implementation of an object type and is defined by a class description that defines both the attributes and the messages for objects of that class.
  • A class is a template that helps us create objects.
  • Classes have names that indicate the kind of objects they represent.
  • Classes may be arranged in a hierarchy, with subclasses representing more specific kinds of objects than their superclasses.

Inheritance

Inheritance allows the developer to create a new class of objects from an existing one by inheriting its behavior and then modifying or adding to it. It provides the ability to create classes that automatically model themselves on other classes. When a class inherits properties from more than one superclass, this is called multiple inheritance.

Its advantages are:

  • Code reusability
  • Avoidance of code duplication
  • Reduced maintenance cost
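Both forms of inheritance described above can be sketched briefly. The class names here are invented for illustration: `Manager` models itself on `Employee` and then adds to the inherited behavior, while `TeamLead` shows multiple inheritance from two superclasses.

```python
# Sketch of single and multiple inheritance.
class Employee:
    def __init__(self, name):
        self.name = name
    def describe(self):
        return f"{self.name} (employee)"

class Mentor:
    def mentees(self):
        return 3

class Manager(Employee):            # single inheritance: reuse, then extend
    def describe(self):
        return super().describe() + ", manages a team"

class TeamLead(Manager, Mentor):    # multiple inheritance: two superclasses
    pass

lead = TeamLead("Asha")
print(lead.describe())  # Asha (employee), manages a team
print(lead.mentees())   # 3
```

Nothing in `TeamLead` is written by hand; all of its behavior is inherited, which is the reuse advantage listed above.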

Message

  • A message is a specific symbol, identifier or keyword, with or without parameters, that represents an action to be taken by an object.
  • The only way to communicate with an object is through message passing.
  • An object knows what intrinsic properties and capabilities another object has, but not how they are implemented.
  • A message is not restricted to one recipient object; it may be sent to multiple objects.

Methods

Methods are often called selectors, since calling them by name allows the system to select which code is to be executed.
  • Methods are descriptions of operations.
  • Methods appear as components of objects.
  • There is a one-to-one correspondence between messages and the methods that are executed when a message is received by a given object.
  • The same message sent to objects of different classes might result in different methods being executed.
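The last point above is the essence of polymorphism and can be shown in a few lines. The `Dog`/`Cat` classes are illustrative assumptions: the same message (`speak`) selects a different method depending on the receiving object's class.

```python
# Sketch of message passing: one message, different methods selected
# by the class of the receiving object.
class Dog:
    def speak(self):
        return "woof"

class Cat:
    def speak(self):
        return "meow"

# Sending the same message to different recipients:
replies = [animal.speak() for animal in (Dog(), Cat())]
print(replies)  # ['woof', 'meow']
```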

Advantages of client/server computing

  • Enhanced data sharing
  • Integrated Services
  • Sharing Resources among diverse platforms
  • Data interchangeability and interoperability
  • Centralized Management
  • Helps organizations downsize from mainframes and minicomputers to networks that provide an enterprise wide data communication platform.
  • Multiple systems can get involved in parallel processing, in which they cooperate in the completion of a processing task.
  • Data is stored close to the servers that work on that data, minimizing the amount of information sent over the network.
  • A large percentage of information is cached once into the server's memory rather than the memory of every workstation that needs it.
  • Network traffic is reduced because the server only gives the client the information requested.
  • Larger server systems can offload applications that are better handled by personal workstations.
  • Data is safe and secure in one location.
  • With centralized data, administrators can apply security controls to restrict data access and use tracking mechanisms to monitor data access.

Connectivity

With the introduction of the IBM PC in 1981, users could run spreadsheets, word processing and basic database services on their personal data. Within a few years there was a need for high-quality printers, backup tapes, high-capacity hard disks and software products on the desktop, all of which required a high investment per machine. The LAN solved this problem by letting workstations share these resources; Novell was a popular LAN environment.

Workstations emulate corporate systems

In most large organizations, desktop workstations provide personal productivity and some workgroup functions, while host services provide the other business functions. The lack of desktop access to those host functions led to the addition of terminal emulation services on the workstation. This emulation connects the workstation directly to the corporate system; the connection is made directly to the host server or its controller.

Another step in connectivity is the implementation of specialized servers to provide database and communications services. The LAN cabling provides the necessary physical connection, and the communications server provides the necessary controller services.

Full-fledged client/server applications

With the addition of communication and database servers, an organization is ready for the next step: moving from presentation-services-only client/server applications to full-fledged client/server applications. These new applications are built on the architecture defined as part of the system development environment.

User Productivity

The ease of use promoted by client/server computing is obtained through the use of windowing environments such as DOS-based Microsoft Windows, IBM's OS/2-based Presentation Manager, and UNIX-based Motif and OpenLook. These graphical user interfaces are mouse-driven, take advantage of color, and present groups of data in boxes, tasks in windows, and choices in menus of icons.

Ways to improve performance

Database and communication processing are frequently offloaded to a faster server processor. The advantage of offloading is realized when the processing power of the server is significantly greater than that of the client workstation.

Database searches, extensive calculations, and stored procedure execution can be performed in parallel by the server while the client workstation deals directly with the current user needs.

As workstation users become more sophisticated, the capability to be simultaneously involved in multiple processes becomes attractive. Independent tasks can be activated to manage communications processes such as electronic mail, electronic feeds from news media and stock exchange and remote data collection.

Reduction in network traffic

Excessive network traffic is one of the most common causes of poor system performance.

Minimize network requests

The SQL syntax is very powerful, and when combined with server trigger logic it enables all selection and rejection logic to execute on the server. This approach minimizes the amount of traffic between the server and the client on the LAN. Without it, the performance advantages available from the client/server model of SQL services can be lost.

Online Transaction Processing (OLTP) in the client/server model requires products that use views, triggers and stored procedures. Products such as Sybase, Ingres, Oracle and Unify use these facilities at the host server to perform joins, apply edit logic prior to updates, calculate virtual columns, or perform complex calculations. The use of OLTP reduces the traffic between client and server and uses the powerful CPU capabilities of the server. The use of application and database servers to provide the answer set required for client manipulation will reduce network traffic.
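The effect of pushing selection logic to the server can be sketched with SQL. The example below uses Python's embedded `sqlite3` engine, which is not a network server, but the principle it illustrates is the same: let the database apply the `WHERE` clause so that only the answer set, not the whole table, crosses the wire. The table and values are invented for illustration.

```python
# Sketch: server-side selection vs. client-side rejection logic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 20.0), (2, 500.0), (3, 75.0), (4, 900.0)])

# Wasteful: fetch every row, then reject most of them on the client.
all_rows = conn.execute("SELECT id, total FROM orders").fetchall()
big_client_side = [r for r in all_rows if r[1] > 100]

# Better: selection executes at the database; only matching rows return.
big_server_side = conn.execute(
    "SELECT id, total FROM orders WHERE total > 100").fetchall()

print(big_server_side)  # [(2, 500.0), (4, 900.0)]
```

Both paths produce the same answer set, but the second sends two rows instead of four over the (hypothetical) network, which is exactly the traffic reduction described above.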

Faster delivery of systems

The workstation environment, powerful multitasking CPU availability, single-user databases and integrated testing tools all combine to provide the developer with considerable productivity improvements in a lower cost environment. Client/server application development shows considerable productivity improvement when the software is implemented within an SDE.

Reuse of the server application functionality, database and network services is transparent and almost automatic. Because developers need not build standard front-end functionality themselves, many features come from the standard GUI and are automatically reused.

