Dynamically manage indexes and triggers when loading large volumes of data
I have to deal with large volumes of data during the ETL process; 5 million to 25 million records is the norm for certain tables.
Being able to dynamically manage indexes and triggers is a key factor in load performance.
For a straight insert, I drop all the indexes and triggers on the table first, load the data, and then re-create them.
To dynamically drop the indexes and triggers, query the sys catalog views (sys.indexes, sys.triggers) to check whether the object actually exists. Note that this only checks whether an index with that name exists; if you want to find every index on a given column, you will need a while-loop (or cursor) over sys.index_columns.
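A minimal T-SQL sketch of the existence-check-then-drop pattern, assuming SQL Server and hypothetical names (table dbo.Staging, index IX_Staging_CustomerId, trigger trg_Staging_Audit) — substitute your own objects:

```sql
-- Drop the index only if an index with that name exists on the table.
IF EXISTS (SELECT 1 FROM sys.indexes
           WHERE name = N'IX_Staging_CustomerId'
             AND object_id = OBJECT_ID(N'dbo.Staging'))
    DROP INDEX IX_Staging_CustomerId ON dbo.Staging;

-- Drop the trigger only if a trigger with that name exists on the table.
IF EXISTS (SELECT 1 FROM sys.triggers
           WHERE name = N'trg_Staging_Audit'
             AND parent_id = OBJECT_ID(N'dbo.Staging'))
    DROP TRIGGER trg_Staging_Audit;
```

The check is by name only, matching the caveat above; enumerating all indexes on a particular column would instead mean looping over sys.index_columns joined to sys.indexes.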
To dynamically create indexes, the same pattern applies: the script only checks whether an index with that name already exists before creating it.
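The re-create step after the load can be sketched the same way, again with the hypothetical names from above:

```sql
-- Re-create the index after the load, but only if it is not already there.
IF NOT EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = N'IX_Staging_CustomerId'
                 AND object_id = OBJECT_ID(N'dbo.Staging'))
    CREATE NONCLUSTERED INDEX IX_Staging_CustomerId
        ON dbo.Staging (CustomerId);
```

Wrapping both the drop and create statements in a stored procedure makes it easy to call them from the ETL job before and after the bulk insert.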