
Dynamically manage indexes and triggers when loading large volume of data

I have to deal with large volumes of data during the ETL process; 5 million to 25 million records is the norm for certain tables.

Being able to dynamically manage the indexes and triggers is a key factor in load performance.

For a straight insert, I first drop all the indexes and triggers on the table, load the data, and then re-create them.

To dynamically drop the indexes and triggers, query the sys catalog views to find out whether the index actually exists. Note that this script only checks whether an index with that name exists. If you want to find all the indexes on a column, you will need to write a while-loop to do that.

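The original screenshot of the drop script is not available; the sketch below shows the same idea, assuming hypothetical names (`dbo.FactSales`, `IX_FactSales_Date`, `trg_FactSales_Audit`) — substitute your own. It would typically run in an Execute SQL Task before the data flow.

```sql
-- Drop the index only if an index with that name exists on the table.
-- (Table, index, and trigger names here are hypothetical examples.)
IF EXISTS (SELECT 1 FROM sys.indexes
           WHERE name = 'IX_FactSales_Date'
             AND object_id = OBJECT_ID('dbo.FactSales'))
    DROP INDEX IX_FactSales_Date ON dbo.FactSales;

-- Drop the trigger only if a trigger with that name exists on the table
IF EXISTS (SELECT 1 FROM sys.triggers
           WHERE name = 'trg_FactSales_Audit'
             AND parent_id = OBJECT_ID('dbo.FactSales'))
    DROP TRIGGER trg_FactSales_Audit;
```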

To dynamically re-create the indexes after the load, use a similar script. Again, the script only checks whether an index with that name exists.

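The original screenshot of the create script is also missing; a minimal sketch follows, using the same hypothetical names as above. It would typically run in an Execute SQL Task after the data flow completes. The `NOT EXISTS` guard makes the script safe to re-run if the package restarts.

```sql
-- Re-create the index only if no index with that name already exists.
-- (Table, index, and column names here are hypothetical examples.)
IF NOT EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = 'IX_FactSales_Date'
                 AND object_id = OBJECT_ID('dbo.FactSales'))
    CREATE NONCLUSTERED INDEX IX_FactSales_Date
        ON dbo.FactSales (SalesDate);
```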

Categories: SSIS Best Practices