Saturday, June 26, 2004

(Database DB) Database Normalization

Put simply, normalization is an attempt to make sure you do not destroy true data or create false data in your database. Errors are avoided by representing each fact in the database one way, one time, and in one place. Duplicate data is a problem as old as data processing. Efficient and accurate data processing relies on minimizing redundant data and maximizing data integrity. Normalization and the Normal Forms (NF) are efforts to achieve these two core objectives. This article will examine the concept of normalization in depth.
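
To make the "one fact, one way, one place" idea concrete, here is a minimal SQL sketch. The table and column names are hypothetical, invented for illustration. The flat table repeats a customer's address on every order row; the normalized pair of tables stores it exactly once:

    -- Unnormalized: the customer's address is repeated on every order
    -- row, so the same fact lives in many places and can drift apart.
    CREATE TABLE orders_flat (
        order_id      INT PRIMARY KEY,
        customer_name VARCHAR(50),
        customer_addr VARCHAR(100),
        order_date    DATE
    );

    -- Normalized: each fact appears one way, one time, in one place.
    CREATE TABLE customers (
        customer_id   INT PRIMARY KEY,
        customer_name VARCHAR(50),
        customer_addr VARCHAR(100)
    );

    CREATE TABLE orders (
        order_id    INT PRIMARY KEY,
        customer_id INT REFERENCES customers (customer_id),
        order_date  DATE
    );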

Database design is typically the result of data modelling. To borrow from Joe Celko's "Data & Databases", normalization encourages solid data modelling and seeks to structure the database in such a way that it correctly models reality and avoids anomalies. When we consider the basic ways in which a user or application interacts with a database, we can conclude that there are three basic kinds of anomalies (each is illustrated in the sketch after this list):

Insertion anomalies
Update anomalies
Deletion anomalies
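
Against the hypothetical orders_flat table sketched above, each anomaly shows up as follows:

    -- Insertion anomaly: a new customer's address cannot be recorded
    -- until they place an order, because addresses live only on order rows.
    INSERT INTO orders_flat (order_id, customer_name, customer_addr, order_date)
    VALUES (1001, 'Ann Smith', '12 Oak St', '2004-06-26');

    -- Update anomaly: changing Ann's address means updating every one
    -- of her order rows; miss one and the database now holds false data.
    UPDATE orders_flat
    SET customer_addr = '9 Elm Ave'
    WHERE customer_name = 'Ann Smith';

    -- Deletion anomaly: deleting Ann's last order also destroys the
    -- only record of her address, so true data is lost.
    DELETE FROM orders_flat WHERE order_id = 1001;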
