Wednesday, May 28, 2014

With some magic through the cloud(s)

A couple of years back I was on a trip to Spain. During the taxi to the runway in Frankfurt, our aircraft was diverted to a parking position to get a technical problem fixed. So I grabbed my book, Harry Potter and the Deathly Hallows, and started to read from where I had left off the day before. Horcrux after horcrux is discovered and destroyed, and the final duel is about to start when I notice that everybody around me is leaving the aircraft. I had to reconfirm with a look outside, but to my surprise we had already arrived in Madrid. Without noticing I had been in and out of the clouds, with some "magic" taken from Germany to Spain.

A similar experience is possible today when using IT services. Usually you don't notice what is going on in the background. Web pages and their components, data and scripts, all could come from a "local" server or from "the cloud". In my recent post I showed how to sign up for DB2 with BLU Acceleration in the cloud and how to catalog a cloud-hosted database locally. Once the database is known locally, you can use the DB2 CLP, db2batch, and your favorite scripts to work with the database, even though it is located "somewhere".

But why would I use an analytics database in the cloud instead of locally? In his post "Cloud is the New Normal", Adam Ronthal answers this question and also recalls which other services used to run locally, like email or VoIP servers. If you are interested but have questions about data security in the cloud, Walid Rjaibi's series on "Data Security in the Cloud" (the link is to part 6, with links to the other parts provided on that page) is a good read.

That's enough for today. Looking up, I notice that most of my train ride is done. Time for the hotel and some sleep...

Wednesday, May 21, 2014

Clouds ahead: Playing with DB2 and Cognos (FREE)

By now everybody should have heard about IBM BLU Acceleration for Cloud. The tagline is "Data Warehousing and Analytics for Everyone" which caught my eye. So this morning I wanted to find out how easy it is to get started. My conclusion: Almost too easy for an IBM product... ;-)

First you have to visit the official BLU for Cloud website. There you click the button "Try BLU Acceleration now" and you are taken to an overview of the currently four different usage plans. The plan I chose is the free trial plan, which is hosted on SoftLayer, but there are also metered plans available on SoftLayer or Amazon Web Services (AWS), and a managed service on IBM BlueMix.

After signing up by providing my Google ID (or alternatively name and email address), I was provided with my new account information within seconds and the system stated that everything was ready to go:
BLU Cloud account created successfully
After clicking on the "Start BLU Acceleration" link as shown in the screenshot above, the web-based management console came up. It allows working with DB2 database objects, querying and analyzing the data, and running reports against the two sample databases. Of course you can upload your own data and try some of the analytic tools on it. My interest was Cognos, and I tried the drill-down reports:
Cognos drill down in BLU Cloud
In the graphical report based on the sample database you can click on the regions, product categories, etc. and then continue to subcategories and subregions ("drill down"). Of course you could try out Cognos or Industry Models with your own data.

What I did next was to explore how well the Cloud offering integrates with my local tools. So I opened a shell on my Linux-based ThinkPad and launched the DB2 command line processor. First I added the Cloud-based DB2 server to my local node directory using the CATALOG TCPIP NODE command. Next I added the remote database using CATALOG DATABASE. Finally I connected to the database, providing the username and, when prompted, the password. Yeah, connected! That was easy!
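For reference, the three steps could look like this on the local machine. The hostname, port, node name, and database name below are hypothetical placeholders, not the actual values from the signup e-mail:

```shell
# Make the remote DB2 server known locally; hostname, port, and
# the node name "blunode" are placeholders for the real service values
db2 "CATALOG TCPIP NODE blunode REMOTE bluacc.example.com SERVER 50000"

# Catalog the remote database under a local alias at that node
db2 "CATALOG DATABASE bludb AS bludb AT NODE blunode"

# Refresh the directory cache so the new entries are picked up
db2 TERMINATE

# Connect with the provided account; the password is prompted for
db2 "CONNECT TO bludb USER myuser"
```

After that, the remote database behaves like any locally cataloged one, so db2batch and your own scripts work against it unchanged.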

Catalog the Cloud-based DB2 server and database on local machine
From start to finish it took me about 5 minutes for the signup process, logging into the Cloud service and adding the remote DB2 server to my local system. Have you tried it? It is free and fun. And I can tell my boss that I know my way around in the Cloud.

Tuesday, May 20, 2014

SQL Quiz: Which command did I run?

I have a small DB2 test database which I wanted to clean up. I ran a command which has the output below. Which command was it?

Table/View                      Schema          Type  Creation time            
------------------------------- --------------- ----- --------------------------
FOO                             HLOESER         T     2014-03-19-
INUSE                           HLOESER         T     2014-05-08-

  2 record(s) selected.

BTW: The command can be found in the IBM Knowledge Center, which is replacing and integrating the DB2 Information Center.

Tuesday, May 6, 2014

Tuning your DB2 CLP environment: Customize appearance and editor

Tuning World Bodensee via Wikipedia
Over the last weekend, the annual Tuning World Bodensee took place at the Messe Friedrichshafen (exhibition center and fairgrounds). More than 100,000 people interested in car tuning traveled to Friedrichshafen. "Tuning" can mean trying to get more performance out of the engine or customizing the car to one's personal style. With DB2, you can customize the command line processor to your personal style and preferences. Let's have a look at the available tuning options.

All the recent versions of DB2 provide three environment variables to tune the editing experience in the interactive DB2 CLP: DB2_CLP_EDITOR, DB2_HIST_SIZE, and DB2_CLPPROMPT. The first variable, DB2_CLP_EDITOR, is used to specify an external editor to be used for editing SQL statements. On my Linux system, I did the following:

export DB2_CLP_EDITOR=gedit

Now you can edit previous statements using the EDIT command: "EDIT 1" would call the editor with the first statement in the command history, and the short form "E 1" would do the same. To see which statements are available, use the HISTORY command or its short version "H". The maximum number of stored commands is determined by the variable DB2_HIST_SIZE, which accepts numbers from 1 to 500.

export DB2_HIST_SIZE=100

To reduce the number of statements listed with the HISTORY command, you can limit it: "H 10" would return the last 10 statements in the history, "H R 5" the last five in reverse order. Instead of the option "R" you could also use the full word "REVERSE", e.g., "H REVERSE" or "HISTORY REVERSE". Editing commands is fun, but actually executing them is probably why they were edited. To execute a specific statement from the history, you can utilize RUNCMD. The short version is just "R", and a valid parameter would be the number of a "historic" statement.

Both RUNCMD and EDIT, if not invoked with a number, will pick the newest statement in history. Both also accept negative numbers with "-1" being the most recent statement.
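Put together, a quick reference of these interactive CLP commands might look like this (a sketch of an interactive session, not actual output; the statement numbers are made up):

```
db2 => history      -- list the statements in the history buffer (short: h)
db2 => h 10         -- show only the last 10 statements
db2 => h r 5        -- the last five, newest first
db2 => edit 3       -- open statement 3 in DB2_CLP_EDITOR (short: e)
db2 => e            -- edit the most recent statement
db2 => runcmd 3     -- re-execute statement 3 unchanged (short: r)
db2 => r -1         -- re-execute the most recent statement
```

Note that these commands only work in the interactive CLP (the "db2 =>" prompt), not when invoking db2 with a statement from the shell.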

What is left is to "decorate" the command line processor in your personal style. DB2_CLPPROMPT is used to modify the command prompt. It accepts different tokens and most characters. Here is my version which prints the current database name followed by "=> ":

export DB2_CLPPROMPT="DB: %d => "

Here is a small sample session with the new prompt:

DB:  => connect to hltest

   Database Connection Information

 Database server        = DB2/LINUXX8664 10.5.3
 SQL authorization ID   = HLOESER
 Local database alias   = HLTEST

DB: HLTEST => values 'Good Morning'

Good Morning

  1 record(s) selected.

DB: HLTEST => e -1
DB: HLTEST => values 'That''s it, good bye!'
Do you want to execute the above command ? (y/n) y

That's it, good bye!

  1 record(s) selected.

DB: HLTEST => h r
4    h r
3    values 'That''s it, good bye!'
2    values 'Good Morning'
1    connect to hltest

Monday, May 5, 2014

New DB2 BLU video - Analytics on IBM POWER: the game is changing

Do you like playing "Connect Four" (4 gewinnt, Four in a Row, Fire på stribe, Connecta 4, ...)? Well, there is a new video talking about DB2 with BLU Acceleration, SAP BW, and DB2 on POWER. And it has an ending that is typical for IBM :) The customer wins. See for yourself...