Wednesday, May 11, 2011

How to handle huge data in and out of a DB2 database?

I often come across situations where I need to export or import huge amounts of data into and out of databases. So I decided to collect all the information and steps regarding this in one place. Here it is.

1. CSV files: If you want to create sample or test data, you can use the RAND() function in Excel to generate random numbers, then copy and paste to build huge files.
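If you would rather skip Excel, a large test CSV can also be generated directly from the command line. This is a minimal sketch; the column layout (an integer id followed by two random values) and the row count are assumptions for illustration, not the format of the author's actual file:

```shell
# Generate a test CSV with 100,000 rows of random data using awk.
# Columns: sequential id, then two random floats in [0,1).
awk 'BEGIN {
    srand();
    for (i = 1; i <= 100000; i++)
        printf "%d,%.4f,%.4f\n", i, rand(), rand();
}' > datfile.csv

# Quick sanity check on the row count.
wc -l datfile.csv
```

Scale the loop bound up to get multi-million-row files without any copy-and-paste.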

Say I have created a CSV file with more than 4 million rows (datfile.csv).
The following SQL loads this data into a table, say myschema.my_temp_table (assuming the table has already been created to match the CSV file):
--
IMPORT FROM '/home/data/datfile.csv' OF DEL MESSAGES '/tmp/messages.txt'
INSERT INTO myschema.my_temp_table ;
--
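For the other direction, getting huge data out of the database, DB2's EXPORT utility mirrors the IMPORT syntax above. Here is a sketch; the output path and the SELECT list are examples of mine, not part of the original script:

```sql
-- Hypothetical export counterpart to the IMPORT above;
-- writes the table out as a delimited (DEL) file.
EXPORT TO '/home/data/datfile_out.csv' OF DEL MESSAGES '/tmp/export_messages.txt'
SELECT * FROM myschema.my_temp_table ;
```

Like IMPORT, this can be saved in a .sql file and run through `db2 -tvf` inside the same kind of shell script shown below.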

Embed this SQL in a shell script, which looks like this:
##
db2 connect to InstanceName user username using passwd
db2 -tvf import_temp_table.sql # shown above
db2 "RUNSTATS ON TABLE myschema.my_temp_table WITH DISTRIBUTION AND INDEXES ALL ALLOW WRITE ACCESS"
db2 connect reset
db2 terminate
##

Easy? Try it.
