File Processing System:
File processing systems were an early attempt to computerize the manual filing system. A file system is a method for storing and organizing computer files and the data they contain, so that they are easy to find and access. Before the DBMS was invented, information was stored using a file processing system. In this system, data is stored in permanent system files on secondary storage. Different application programs are written to extract data from these files and to add records to them. File systems may use a storage device such as a hard disk or CD-ROM and involve maintaining the physical location of the files.
Here is a list of some important characteristics of a file processing system:
§ It is a group of files storing the data of an organization.
§ Each file is independent of the others.
§ Each file is called a flat file.
§ Each file contains and processes information for one specific function, such as accounting or inventory.
§ Files are designed using programs written in programming languages such as COBOL, C, and C++.
§ The physical implementation and access procedures are written into the database application; therefore, physical changes result in intensive rework on the part of the programmer.
§ As systems became more complex, file processing systems offered little flexibility, presented many limitations, and were difficult to maintain.
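As a minimal sketch of what a flat file looks like, the snippet below parses a hypothetical fixed-width inventory file in Python (the field names and widths are assumptions for illustration, not from the text):

```python
# Hypothetical fixed-width record layout for an inventory flat file:
# item code (4 chars), item name (12 chars), quantity (4 chars).
RECORD_LAYOUT = [("code", 4), ("name", 12), ("qty", 4)]

def parse_record(line):
    """Split one fixed-width line into a dict according to RECORD_LAYOUT."""
    record, pos = {}, 0
    for field, width in RECORD_LAYOUT:
        record[field] = line[pos:pos + width].strip()
        pos += width
    return record

# Two lines of the flat file, exactly as an application program would read them.
flat_file = [
    "I001pen         0050",
    "I002notebook    0120",
]
records = [parse_record(line) for line in flat_file]
print(records[0]["name"])  # pen
```

Every application program that touches this file must repeat this layout knowledge, which is why physical changes to the file force rework in every program.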
Disadvantages of File System:
1. Data Redundancy and Inconsistency:
The same information may be duplicated in different files. This results in data redundancy and inconsistency.
Consider the following two data files:
Saving account data file: stores information about the customer. {acc_no, name, social_security, addr, teleph_no}
Checking account data file: stores information about the customer. {acc_no, name, social_security, addr, teleph_no}
The fields {name, social_security, addr, teleph_no} are the same in both files, i.e. the data is duplicated, which results in data redundancy.
Data redundancy increases the cost of storing and retrieving the data. If the various copies of the same data contain different values, the result is inconsistency of data, and it creates the risk of outdated data values.
For example: if you change a customer's name in the saving account data file, then his name should also be changed in all other files related to that customer.
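The inconsistency above can be sketched in a few lines of Python, using two in-memory "files" (lists of records) whose field names follow the example in the text:

```python
# Two flat files that duplicate the same customer fields.
saving_file = [{"acc_no": "S1", "name": "Ravi", "addr": "Pune"}]
checking_file = [{"acc_no": "C1", "name": "Ravi", "addr": "Pune"}]

# The customer's name is updated in the saving-account file only...
saving_file[0]["name"] = "Ravi Kumar"

# ...so the two copies of the same fact now disagree: inconsistency.
print(saving_file[0]["name"] == checking_file[0]["name"])  # False
```

Nothing in the file system itself links the two copies, so keeping them in sync is entirely the programmer's burden.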
2. Difficulty in Accessing the Data:
A file processing system does not allow needed data to be retrieved in a convenient and efficient manner. If a user wants information in a specific form, he must write a program for it.
For example, consider a saving account data file with fields {acc_no, name, social_security, addr, balance}. Application programs are written to access the data. But if the user wants to display only those records whose balance is greater than Rs. 10,000, and that program has not been written, then it is difficult to access that data.
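With no query language available, each such question needs its own hand-written program. A minimal sketch of the balance query above (the record values are made up for illustration):

```python
# A flat file represented as a list of records.
saving_file = [
    {"acc_no": "S1", "name": "Asha", "balance": 25000},
    {"acc_no": "S2", "name": "Ravi", "balance": 8000},
    {"acc_no": "S3", "name": "Meena", "balance": 12000},
]

def accounts_over(records, threshold):
    """The whole 'query' must be coded by hand as a scan over the file."""
    return [r["acc_no"] for r in records if r["balance"] > threshold]

print(accounts_over(saving_file, 10000))  # ['S1', 'S3']
```

In a DBMS the same question is a one-line declarative query; here it is a new program every time.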
3. Data Isolation:
Because data are scattered in various files, and the files may be in different formats, it is difficult to write new application programs to retrieve the appropriate data. If you want to extract data from two files, you need to know which part of each file is needed and how the files are related to each other.
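A minimal sketch of this isolation, assuming two hypothetical files in different formats (one comma-separated, one fixed-width) that must be related by account number through hand-written parsing code:

```python
# Two related files in different formats.
accounts_csv = ["S1,Asha", "S2,Ravi"]        # acc_no,name
balances_fixed = ["S1  25000", "S2   8000"]  # acc_no (4 chars) + balance

def join_files(csv_lines, fixed_lines):
    """Relate the two files by acc_no; both formats must be known here."""
    names = dict(line.split(",") for line in csv_lines)
    result = {}
    for line in fixed_lines:
        acc_no, balance = line[:4].strip(), int(line[4:])
        result[names[acc_no]] = balance
    return result

print(join_files(accounts_csv, balances_fixed))  # {'Asha': 25000, 'Ravi': 8000}
```

The "join" logic, trivial in a DBMS, must be reinvented for every pair of file formats.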
4. Integrity Problems:
A collection of data is integrated if it meets certain consistency constraints. A programmer enforces these constraints by adding code to the application programs. In a file processing system, poor data integrity often arises, and it becomes very difficult to add new constraints later.
For example: the maximum marks of a student can never be more than 100.
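A minimal sketch of the marks constraint living in application code rather than in the data store (the function and file names are assumptions for illustration):

```python
# The marks <= 100 rule exists only here; every program that writes the
# student file must repeat (and remember) this check.
MAX_MARKS = 100
student_file = []

def add_student(name, marks):
    if not 0 <= marks <= MAX_MARKS:
        raise ValueError("marks must be between 0 and 100")
    student_file.append({"name": name, "marks": marks})

add_student("Asha", 95)       # accepted
try:
    add_student("Ravi", 120)  # violates the constraint
except ValueError:
    pass                      # rejected, but only because this program checks
print(len(student_file))  # 1
```

Any other program that appends to the file without this check can silently violate the constraint, which is exactly the integrity problem described above.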
5. Atomicity Problem:
Atomicity is required when saving data values: information must be entered completely or not at all. Any system may fail at any time, and when that happens the data should remain in a consistent state. A computer system is subject to failure. In many applications, it is crucial to ensure that, once a failure has occurred and has been detected, the data are restored to the consistent state that existed prior to the failure. It is difficult to ensure this property in a conventional file processing system.
For example: suppose you are buying a railway ticket and your internet connection drops in the middle of the money transaction. Either you have paid, in which case your ticket is booked, or you have not, in which case you are not charged anything. That is a consistent state: you have either paid or you have not.
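A minimal sketch of the atomicity problem: a transfer is two separate file updates, and a failure between them leaves the data inconsistent. The crash here is simulated with an exception; the account names and amounts are made up for illustration:

```python
# An in-memory stand-in for two account records in a file.
accounts = {"A": 1000, "B": 500}

def transfer(src, dst, amount, crash=False):
    accounts[src] -= amount          # first update hits the file...
    if crash:
        raise RuntimeError("system failed mid-transfer")
    accounts[dst] += amount          # ...but the second one never runs

try:
    transfer("A", "B", 200, crash=True)
except RuntimeError:
    pass
# A was debited but B was never credited: 200 has vanished.
print(accounts)  # {'A': 800, 'B': 500}
```

A DBMS would wrap both updates in a transaction and roll the first one back on failure; a plain file system offers no such mechanism.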
6. Concurrent-Access Anomalies:
If multiple users update the same data simultaneously, the result can be an inconsistent data state. In a file processing system it is very difficult to handle this in program code. This results in concurrent-access anomalies.
For example, a student wants to borrow a book from the library. He searches for the book in the library file and sees that only one copy is available. At the same time, another student who wants to borrow the same book also checks and sees that one copy is available. The first student opts to borrow and gets the book. But the count in the file is not yet updated to zero, so the second student also opts to borrow, even though no copies are left. This is the problem of concurrent access in a file system.
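The library example above can be sketched as a classic lost update: both students read the copy count before either writes it back, so the second write is based on a stale read (the interleaving is written out sequentially here to make it deterministic):

```python
# An in-memory stand-in for the library file: one copy available.
library_file = {"book": 1}

# Both students read the file before either writes it back.
read_by_student_1 = library_file["book"]   # sees 1 copy
read_by_student_2 = library_file["book"]   # also sees 1 copy

library_file["book"] = read_by_student_1 - 1   # student 1 borrows
library_file["book"] = read_by_student_2 - 1   # student 2 "borrows" too

print(library_file["book"])  # 0, yet two books were handed out
```

A DBMS prevents this with locking or transaction isolation; with plain files, each program would have to implement its own locking correctly.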
7. Security Problems:
Poor data security is the most threatening problem in a file processing system. There is very little security, since anyone can easily modify the data stored in the files. Each user should be restricted to accessing data only up to a certain level.
For example: if a student can access his data in the college library file, he can easily change the issue dates of his books, or change his fine details to zero.
Comparison of File Management and DBMS:
-profshardulp.patil@gmail.com