Data Management Guide IBM COBOL

Distributed Applications
The designers of distributed applications must determine the best placement of the application's programs and data, weighing the quantity and frequency of the data to be transmitted along with data management, security, and timeliness considerations. There are three client–server models for the design of distributed applications:
- The file transfer model, typified by the File Transfer Protocol (FTP), copies or moves whole files or database tables to each client so they can be operated on locally (see the first sketch after this list). This model suits highly interactive applications, such as document and spreadsheet editors, where each client has a copy of the corresponding editor and sharing of the documents is generally not a concern.
- Thin client applications present an application's interface to users while its computational parts are centralized with the affected files or databases. Communication then consists of remote procedure calls between the thin clients and a server, in which purpose-designed messages specify a procedure to be called, its associated parameters, and any returned values (see the second sketch after this list).
- Fat client applications perform all user interface and processing tasks on the client systems, but data is centralized on a server so that it can be managed, so that any authorized client application can access it, so that all client applications work with up-to-date data, and so that only the records, stream sections, or database tables an application affects are transmitted (see the third sketch after this list). Client application programs must be distributed to all clients that work with the centralized data.
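A minimal sketch of the file transfer model in Python, using the standard ftplib module; the host name, credentials, and file name below are placeholders, not details from the original text:

```python
from ftplib import FTP

HOST = "files.example.com"    # hypothetical FTP server
REMOTE_NAME = "budget.csv"    # hypothetical shared document

# File transfer model: copy the whole file to the client, then edit locally.
with FTP(HOST) as ftp:
    ftp.login(user="editor", passwd="secret")    # placeholder credentials
    with open(REMOTE_NAME, "wb") as local_copy:
        # RETR transmits the entire file; all subsequent work is local.
        ftp.retrbinary(f"RETR {REMOTE_NAME}", local_copy.write)
```

Once the transfer completes, the client edits its private copy with its own editor; nothing coordinates concurrent edits, which is why this model fits documents that are not actively shared.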
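A sketch of the thin client model, using Python's built-in xmlrpc modules to stand in for the purpose-designed RPC messages the text describes; the procedure name, data, and port are illustrative assumptions:

```python
from xmlrpc.server import SimpleXMLRPCServer

# Thin client model: data and computation stay on the server.
LEDGERS = {"A-100": [12.5, 30.0], "A-200": [7.25]}    # hypothetical data

def total_for_account(account_id: str) -> float:
    # The whole computation runs server-side, next to the data.
    return sum(LEDGERS.get(account_id, []))

server = SimpleXMLRPCServer(("localhost", 8000))
server.register_function(total_for_account)
server.serve_forever()
```

The thin client only renders the interface; it sends a procedure name with its parameters and receives the returned value:

```python
import xmlrpc.client

proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
print(proxy.total_for_account("A-100"))    # prints 42.5
```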
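A sketch of the fat client model's record-level access, again using xmlrpc as a stand-in for a record-oriented protocol such as DDM's; the record layout and the read_record/write_record names are assumptions for illustration:

```python
from xmlrpc.server import SimpleXMLRPCServer

# Fat client model: data is centralized, and only the records an
# application touches cross the network.
RECORDS = {1: "Smith,Accounting", 2: "Jones,Shipping"}    # hypothetical data

def read_record(number: int) -> str:
    return RECORDS[number]

def write_record(number: int, data: str) -> bool:
    RECORDS[number] = data    # every client sees the update
    return True

server = SimpleXMLRPCServer(("localhost", 8001))
server.register_function(read_record)
server.register_function(write_record)
server.serve_forever()
```

The client does all of its processing locally but transmits only the affected record:

```python
import xmlrpc.client

data_server = xmlrpc.client.ServerProxy("http://localhost:8001/")
record = data_server.read_record(2)                  # only record 2 travels
record = record.replace("Shipping", "Receiving")     # processing is client-side
data_server.write_record(2, record)                  # only the change returns
```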
The DDM architecture was initially designed to support the fat client model of distributed applications; it also supports whole-file transfers.
Benefits provided by DDM Architecture
DDM architecture provides distributed applications with the following benefits:
- Local/remote transparency. Application programs can be easily redirected from local data to remote data; specialized programs for accessing and managing data on remote systems are not needed (see the first sketch after this list).
- Reduced data redundancy. Data need be stored in only one location in a network.
- Better security. By eliminating redundant copies of data, access to the data in a network can be better limited to authorized users.
- Data integrity. Updates by concurrent local and remote users are not lost because of conflicts (see the second sketch after this list).
- More timely information. Users of multiple computers in a network always have access to the most recent data.
- Better resource management. The data storage and processing resources of a network of computers can be optimized.
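As a minimal illustration of local/remote transparency, the hypothetical open_data helper below hands the application the same read(record_number) callable whether the records live in a local file or behind the record server sketched earlier; the helper and its locations are assumptions, not part of DDM itself:

```python
import xmlrpc.client

def open_data(location: str):
    """Return read(record_number) for a local file or a remote server."""
    if location.startswith("http://"):
        # Remote data: delegate to the server's record-level interface.
        return xmlrpc.client.ServerProxy(location).read_record
    # Local data: one record per line in a plain file.
    with open(location) as f:
        records = dict(enumerate(f.read().splitlines(), start=1))
    return records.__getitem__

# The application code is identical for either source:
read = open_data("http://localhost:8001/")    # or open_data("records.txt")
print(read(2))
```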
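And a sketch of the data-integrity point: conflicting concurrent updates can be serialized with a server-side lock so that no update is lost; the lock and the read-modify-write function below are illustrative, not DDM's actual mechanism:

```python
import threading

RECORDS = {1: 100}                   # hypothetical account balance
record_lock = threading.Lock()

def add_to_record(number: int, amount: int) -> int:
    # The lock makes the read-modify-write atomic with respect to other
    # updaters, so two simultaneous deposits cannot overwrite each other.
    with record_lock:
        RECORDS[number] += amount
        return RECORDS[number]
```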