I don’t know what SQL tools other people are using on Windows for DB2. I’ve used the plain Command Editor from IBM.
It’s not a fancy Windows application, but it does what it is designed to do. It is hardly a standard Windows SQL tool, though, and it looks a little odd on Windows.
Another tool I’ve used is Toad. I occasionally get jokes from co-workers when I open it.
It’s not a very well-liked tool either.
Now the team is pushing us to use Rapid SQL.
After about two weeks, developers started to complain about freezing issues, the same issues we had with Toad.
Fortunately, we were told that it’s just a matter of changing a few settings.
Here they are.
1. Turn off auto-complete: File > Options > ISQL > Code Assist, and unselect "Enable Code Complete".
2. To prevent table locks, change the isolation level to uncommitted reads: File > Options > ISQL > DB2 > Set Isolation Level, and select Uncommitted Reads from the drop-down.
3. Finally, run the following to set the IBM DB2 client to uncommitted reads:
db2cmd -i -w -c db2 UPDATE CLI CFG FOR SECTION COMMON USING TXNIsolation 1
db2cmd -i -w -c db2 UPDATE CLI CFG FOR SECTION COMMON USING ReadCommonSectionOnNullConnect 1
db2cmd -i -w -c db2 GET CLI CFG FOR SECTION COMMON
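As an aside, the settings above change the default for the whole client. DB2 also lets you request uncommitted reads for a single statement with the WITH UR clause, which can be handy when you only want one ad hoc query to avoid locks. A minimal sketch (the table and column names here are illustrative, not from a real schema):

```sql
-- DB2: read without waiting on row/table locks, for this statement only.
-- staging.employees is a hypothetical table used for illustration.
SELECT emp_id, emp_name
FROM   staging.employees
WITH UR;
```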
I’ve never done a SQL Server database upgrade in a production environment where hundreds or thousands of SQL objects reside in the database engine, Integration Services, Reporting Services, and SQL Agent.
The team I am currently working with is about to take on this upgrade journey from 2005 to 2008. Although we are well into 2011 now, this reluctant journey is a welcome and necessary step.
I wrote a blog post, SSIS – Enable Package Configuration for Deployment.
It showed how you can set up the package configuration files on your development PC and enable them in BIDS.
If you are a release coordinator, you would follow the same steps to create the configuration files on the server.
In this blog, I’ll show you the last simple step to use the package configuration files when you create a SQL job for your SSIS package.
When you are setting up the SQL job for your SSIS package, go to the Configuration tab.
Click the Add button to add the configuration files.
1) All package configuration files have the extension .DTSConfig.
2) Each connection in your SSIS package should have a corresponding configuration file. For example, a connection named IDS should have a configuration file called IDS.DTSConfig.
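For reference, a connection configuration file is just a small XML document. The sketch below shows roughly what an IDS.DTSConfig generated for a connection named IDS might look like; the server name, database name, and provider in the connection string are placeholders, not values from a real environment:

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- One Configuration element per configured property.
       The Path points at the connection's ConnectionString property. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[IDS].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

When the SQL job runs the package with this file listed on the Configuration tab, the connection string above overrides whatever was saved in the package at design time.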
Reporting for a financial institution can be very difficult due to complex business rules. It is even harder when the data is about paying employees incentives/bonuses, where the rules are more complex still.
I have gotten into the habit of writing a report/ETL definition document for every report or ETL process I build. Excel and Visio are my formats of choice.
I don’t call this document a “Requirement” or a “Specification”. To me, “Requirements” should come from the requester, and a “Specification” should come from a business analyst.
Since I am the author of the document, my focus is on what I have done in the report/ETL and what business rules I have used.
Incremental loading is not an easy topic. I will concentrate on a narrow scenario here.
Here is the context of this blog.
- Data Source: a relational DB2 database
- Data Destination: a table in a SQL Server database
- Reporting: an SSRS report that pulls directly from the above table
- ETL: implemented in an SSIS package with some business rules built in
- Daily Process: newly added data from the data source needs to be loaded into the destination table
Challenges of the ETL process:
- Needs to be restartable: without any manual intervention
- Needs to be fast: the source can contain a large number of records
- Needs to be self-correcting: if source data is corrected and back-dated, the destination data should correct itself
What this really means is choosing among the following options for the incremental daily load:
- truncate and re-populate the destination table daily
- start from the last date in the destination table: I would need to pass the last date from my SQL Server table. I can either read it into a user variable in my SSIS package and pass it to my query, or I can “ETL” the last date over to the source DB2 database.
- just pull the previous day’s data from the source
- pull the previous 3 days’ data from the source
The first option is simple enough, but suffers from poor performance.
The second option can be a little messy, but it sounds like a good choice.
The third option is simple enough, but not a good one considering that the SQL job can potentially fail on any given day, in which case data would be missing for the failed days.
The last option is also simple, and it will still pull the data even if the job failed on the previous two days. It should be fast as well. Conveniently, SQL jobs run on a fixed Monday-to-Friday schedule in our environment.
For the sake of simplicity, I picked the last option for some of my ETL processes. So far, the 3-day rule has worked pretty smoothly.
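The destination side of the 3-day rule can be sketched in T-SQL. This is a minimal illustration, not the actual package: the table names (dbo.IncentivePay, staging.IncentivePay) and the date column (LoadDate) are made up, and in the real package the source rows would come from the DB2 extract rather than a staging table.

```sql
-- "Previous 3 days" incremental load, destination side.
-- Deleting the window first makes re-runs and back-dated
-- corrections within the window self-correcting.
DECLARE @WindowStart date = DATEADD(day, -3, CAST(GETDATE() AS date));

-- 1) Clear the overlapping window in the destination.
DELETE FROM dbo.IncentivePay
WHERE  LoadDate >= @WindowStart;

-- 2) Re-insert the same window from the freshly extracted source rows.
INSERT INTO dbo.IncentivePay (LoadDate, EmpId, Amount)
SELECT LoadDate, EmpId, Amount
FROM   staging.IncentivePay
WHERE  LoadDate >= @WindowStart;
```

Because the delete and insert cover the identical window, running the job twice in one day, or after a failed day, leaves the destination in the same state as a single successful run.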
Adding to the job description collection.
This is the first time I’ve ever seen this: “MANDATORY PERSONAL INTERVIEW at candidates expense.”
Subject: SQL Server Developer (SSIS, SSRS, Data Warehouse) opportunity
SQL Server Developer (SSIS, SSRS, Data Warehouse) opportunity
Location: West Chester, PA
Duration: 1 year, possible extension to 2 years
Interview Process: Phone Interview followed by MANDATORY PERSONAL INTERVIEW at candidates expense.
Bill Rate: $DOE/hr. Will entertain candidates at a higher level, but they must be senior and not just an inflated rate.
When I showed a draft version of the report to a supervisor, he said that he preferred one heading per column.
After some thinking, I feel he has a point.
So I rearranged the columns to avoid hijacking a column for a different heading.