
Normalizing Data

Now that you understand what a table is ... how do you design one? Keep it simple ... but the simple way is not always easy to see! Tips to get you looking at your data from a different angle. (PDF file; install Acrobat Reader to read this tutorial.)

Similar posts...


First Normal Form (1NF) - Normalising Your Database

Database design theory includes design standards called normal forms. The process of making your data and tables match these standards is called normalizing data or data normalization. By normalizing your data, you eliminate redundant information and organize your table to make it easier to manage the data and make future changes to the table and database structure. This process removes the insertion, deletion, and modification anomalies you may see. In normalizing your data, you usually divide large tables into smaller, easier to maintain tables. You can then use the technique of adding foreign keys to enable connections between the tables.
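The table-splitting and foreign-key technique described above can be sketched in a few lines. This is a minimal illustration using Python's sqlite3 as a stand-in for Access; the table and column names (Customers, Orders, etc.) are hypothetical, not from the original post:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Hypothetical normalized schema: customer details live in one place,
# and each order references its customer via a foreign key.
conn.executescript("""
CREATE TABLE Customers (
    CustomerID INTEGER PRIMARY KEY,
    Name       TEXT NOT NULL,
    City       TEXT
);
CREATE TABLE Orders (
    OrderID    INTEGER PRIMARY KEY,
    CustomerID INTEGER NOT NULL REFERENCES Customers(CustomerID),
    OrderDate  TEXT
);
""")

conn.execute("INSERT INTO Customers VALUES (1, 'Acme', 'Boston')")
conn.execute("INSERT INTO Orders VALUES (10, 1, '2024-01-05')")
conn.execute("INSERT INTO Orders VALUES (11, 1, '2024-02-09')")

# Changing the customer's city is now a single update, not one per
# order row -- this is the "modification anomaly" normalization removes.
conn.execute("UPDATE Customers SET City = 'Chicago' WHERE CustomerID = 1")
rows = conn.execute("""
    SELECT o.OrderID, c.Name, c.City
    FROM Orders o JOIN Customers c USING (CustomerID)
""").fetchall()
print(rows)  # every order reflects the single updated City value
```

In a flat table the city would be repeated on every order row, and the update would have to touch each of them.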

Did I Normalize this correctly?

But I do have experience working with VB. I'm a high school science teacher trying to help out the phys ed department by creating a database to help keep track of the athletes, their teams, fee collection and jacket ordering.
After my first attempt resulted in failure (thanks Galaxiom for informing me I needed to normalize my data), I spent the next few hours reading a large number of posts trying to figure out what normalizing meant.

So, here's my attempt at normalizing the data. Any advice about my tables and relationships would be hugely appreciated, as I now realise the importance of getting this step correct before spending hours making the forms.

Making spreadsheets work in MS Access

I have noticed that some of the problems people experience in constructing a database are caused because they have not normalized their data.

Partly as an experiment to test some new software I found, I have constructed this presentation [LINK] which goes some way to explain the reason you should seriously consider normalizing your data.

Insurance Certificates Database

Things have slowed down in my department, so I'm back to working on my Contracts Administration database. When I left off, I was mired in designing tables for insurance certificates, unable to decide how far to go with normalizing. Under-normalizing results in large spreadsheet-like tables, which I am reliably informed are anathema to Access. On the other hand, I believe it is possible to over-normalize almost anything until the data is so scattered into many interrelated tables that performance is compromised, not to mention the unguessed sorrows to be had in writing VBA procedures under such conditions.

I believe the path forward must lie more in understanding our business needs, what we want our application to be capable of doing, rather than slavishly following the mantra of Normalize, Normalize, Normalize.

I have searched this site without much success. Does anyone out there have any experience designing an Access database for tracking insurance certificates for compliance with subcontract requirements?

Relational database concept example

I have been tasked with (what I hope to be) a fairly simple relational database problem. I'd like to: 1. get some feedback on whether or not this is a simple enough task for someone new to Access, and 2. understand if this process seems correct.

I have data for 100 survey studies in Excel that have already been broken down into 3 datasets. Study# uniquely identifies each survey and is the same value across the 3 datasets. I was hoping to be able to link all three datasets based on Study#.

Below, I outline what is in each dataset, how I think I should be normalizing the data, and at the end describe how I think to combine the 3 data sets:

Dataset 1:
Each study can have multiple responses for country, audience and database. In Excel it would look something like:
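The flat layout isn't shown here, but the repeating groups described (multiple countries per study) are the classic case for a child table keyed on Study#. A sketch using sqlite3 as a stand-in for Access, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Studies (StudyNum INTEGER PRIMARY KEY, Title TEXT);
-- One row per (study, country) instead of Country1, Country2, ... columns.
CREATE TABLE StudyCountries (
    StudyNum INTEGER REFERENCES Studies(StudyNum),
    Country  TEXT,
    PRIMARY KEY (StudyNum, Country)
);
""")

conn.execute("INSERT INTO Studies VALUES (101, 'Survey A')")
for c in ("US", "UK", "DE"):
    conn.execute("INSERT INTO StudyCountries VALUES (101, ?)", (c,))

# A study can now have any number of countries without changing the schema.
n = conn.execute(
    "SELECT COUNT(*) FROM StudyCountries WHERE StudyNum = 101"
).fetchone()[0]
print(n)
```

The same pattern would apply to the audience and database columns: one child table each, all linked back through Study#.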

Convert Excel Flat File Into Relational Database

I have been tasked to convert an Excel flat file into a relational database (sort of) to keep track of shipments.

So I began grouping the headings into related fields; the idea is to create a table based on the groupings and link them together in Access (Relationships).

This is what I have come up with so far, normalizing the headings. By the way, it might be worth mentioning: when I was looking at the data in the Excel file, I noticed that the Batch Number and Container columns hold multiple values, meaning the shipment on a given day contained not only multiple batches of an item but also multiple containers.

I am really hoping someone here can kindly help me put this together.

"Normalizing" imported data

I have a situation where I'm importing substantial amounts of data (30,000 to 120,000 rows at a pop) from three external sources through a text file or an Excel spreadsheet into three different data tables. I've established lookup tables for those fields that can reasonably be normalized between and amongst the data tables.

The process I'm going through is:

1. Import the raw data into an "import" table that matches the structure of
the source data. Also included in the import table are columns for foreign keys of 'normalizable' fields, which are set to 0 when the source data is imported.

2. Append any new lookup data that may be present in the source file to the
lookup tables.

3. Run a series of update queries on the import table to update the foreign
key fields with the keys of the lookup data. Depending on the source data file, there are between 3 and 7 of these update queries.

4. Append new records into the data table using only the foreign key values
where applicable.

I'm discovering that the update queries in step 3 are taking a LONG time to run (several minutes each), so:

- are there other, better processes or data structures to use?
- is there a way of optimizing update queries?
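The step-3 updates described above can be sketched as a set-based update that swaps a text value for its lookup key. This is a sqlite3 stand-in for the Access queries, with hypothetical table and column names; one common optimization (an index on the matched text column, so each correlated lookup is a seek rather than a full scan) is shown as an assumption, not as the poster's actual fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE LookupRegion (RegionID INTEGER PRIMARY KEY, RegionName TEXT);
CREATE TABLE ImportRaw (RowID INTEGER PRIMARY KEY, RegionName TEXT,
                        RegionID INTEGER DEFAULT 0);
""")
conn.executemany("INSERT INTO LookupRegion (RegionName) VALUES (?)",
                 [("North",), ("South",)])
conn.executemany("INSERT INTO ImportRaw (RegionName) VALUES (?)",
                 [("North",), ("South",), ("North",)])

# Index the text column being matched; without it, every imported row
# forces a scan of the lookup table.
conn.execute("CREATE INDEX idx_lookup_name ON LookupRegion (RegionName)")

# Step 3: replace the text value with its foreign key in one update.
conn.execute("""
    UPDATE ImportRaw
    SET RegionID = (SELECT RegionID FROM LookupRegion
                    WHERE LookupRegion.RegionName = ImportRaw.RegionName)
""")
keys = [r[0] for r in conn.execute(
    "SELECT RegionID FROM ImportRaw ORDER BY RowID")]
print(keys)
```

With 30,000 to 120,000 rows per import, indexing the join/match columns on both the import table and the lookup tables is usually the first thing to try before restructuring the process.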

Normalizing an Existing Database

A followup to the earlier article 'Modifying an Existing Program,' this article examines the process involved in converting an existing database into a normalized representation.

Highly Normalized Data and Wizards

I've been making my systems more and more normalized. My threshold for "something occurring more than once" is infinity: the question when normalizing is, will it ever happen, and can there ever be two?

I am now storing all companies (whether vendor, customer or any other) in one list.

In other words there is no longer a "tblVendors" or a "tblCustomers"; there are now Vendor Agreements and Customer Agreements that each reference the mother table "tblCompanies".

Likewise, all contacts are in one list. And in both cases all of them have daughter tables for any data that can ever change, so the table tblContacts can only have "FirstName", "LastName", "DateOfBirth", and if needed SS#; that's it. Everything else is in daughter tables.

There are a lot of problems with this.
1. Every time you are going to add a new customer or a new vendor, you must check both tblCompanies and tblContacts to make sure those names don't already exist, and if they do, whether they are duplicates, etc. The list goes on and on...
2. You must find a way to limit queries to one phone number, one email, one address, etc. etc.

I can't tell you how much extra work this has caused and how many unforeseen data management issues; and users finding ways to do things you would have not dreamed possible.
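Problem 2 above (limiting a query to one phone number per contact) is usually solved by picking one row per parent from the daughter table, e.g. the most recently recorded one. A sketch using sqlite3 as a stand-in; the daughter-table name tblContactPhones and its Recorded column are assumptions, not the poster's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tblContacts (ContactID INTEGER PRIMARY KEY, FirstName TEXT,
                          LastName TEXT);
-- Daughter table: anything that can change over time gets its own rows.
CREATE TABLE tblContactPhones (PhoneID INTEGER PRIMARY KEY,
                               ContactID INTEGER, Phone TEXT,
                               Recorded TEXT);
""")
conn.execute("INSERT INTO tblContacts VALUES (1, 'Ann', 'Lee')")
conn.executemany("INSERT INTO tblContactPhones VALUES (?, ?, ?, ?)",
                 [(1, 1, '555-0100', '2023-01-01'),
                  (2, 1, '555-0199', '2024-06-01')])

# One row per contact: keep only the most recently recorded phone number.
row = conn.execute("""
    SELECT c.FirstName, p.Phone
    FROM tblContacts c
    JOIN tblContactPhones p ON p.ContactID = c.ContactID
    WHERE p.Recorded = (SELECT MAX(Recorded) FROM tblContactPhones
                        WHERE ContactID = c.ContactID)
""").fetchone()
print(row)
```

The same "latest row per parent" subquery pattern covers one email, one address, and so on, at the cost of a correlated subquery per daughter table.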

Make table query against a normalizing select query.

If I import a table from Excel, normalize it in Access (which results in a select query) and then run a make table query against this new select query, have I negated the normalization I performed by turning the normalizing select query back into a table?
The purpose of the make table query would be so that I could append the newly created table to an existing table.
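As a general point (not specific to this poster's database): materializing a normalizing select query into a table does not undo the normalization, because the new table holds exactly the query's already-normalized output; it would only be "negated" if the query re-joined the tables back into a flat shape. A sketch using sqlite3 as a stand-in, with hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ImportFlat (Item TEXT, Category TEXT);
CREATE TABLE Categories (CategoryName TEXT PRIMARY KEY);
""")
conn.executemany("INSERT INTO ImportFlat VALUES (?, ?)",
                 [("pen", "office"), ("ink", "office"), ("tea", "pantry")])

# The "normalizing select query": distinct categories from the flat import.
# Make-table step: materialize that query's output into a new table.
conn.execute("""CREATE TABLE NewCategories AS
                SELECT DISTINCT Category AS CategoryName FROM ImportFlat""")

# Append step: the data stays normalized, because the query's output was
# already one row per category; skip any that already exist.
conn.execute("""INSERT INTO Categories
                SELECT CategoryName FROM NewCategories
                WHERE CategoryName NOT IN
                      (SELECT CategoryName FROM Categories)""")
count = conn.execute("SELECT COUNT(*) FROM Categories").fetchone()[0]
print(count)
```

In Access terms, an append query straight from the select query to the existing table would skip the intermediate make-table entirely.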