Tuesday, March 1, 2016

Mom, I joined the cloud! (or: Use old stuff with new stuff - DB2 federation)

Everybody is talking about hybrid clouds: combining on-premises resources like database systems and ERP systems with services in the public or dedicated cloud. Today I am showing you exactly that, namely how I combined my on-premises DB2 with a cloud-based DB2 that I provisioned via Bluemix. The interesting thing is that really old technology can be used for that purpose: database federation. So relax, sit back, and follow my journey of joining the cloud...
Database Services in the Bluemix Catalog

For my small adventure I used a local DB2 10.5 and a Bluemix-based SQLDB service. The steps I followed are an extended version of what I wrote in 2013 about using three-part names in DB2 to easily access Oracle or DB2 databases. Similar to that entry, I started by enabling my DB2 instance for federation (FEDERATED is the configuration parameter). Note that FEDERATED is not updatable online; the instance needs to be restarted (db2stop/db2start) for the change to take effect.
[hloeser@mymachine] db2 update dbm cfg using federated yes
DB20000I  The UPDATE DATABASE MANAGER CONFIGURATION command completed
successfully.




Thereafter, I followed the steps described in the documentation to configure access to a remote DB2 data source. Basically, this means registering the remote server ("tcpip node") and the database with the local DB2 instance, then enabling access to that remote database from within the local database. Knowing the right DB2 statements is one part; having the needed parameters such as IP address, user name, and password is the other. That information can be obtained from the service dashboard in Bluemix (see the screenshot below). If no published credentials are present, they can be added.
SQLDB Service Credentials
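Such published credentials are essentially a small JSON document. A sketch with placeholder values follows; the exact field names may differ slightly, this is just to show what to look for:

```json
{
  "hostname": "75.xxx.xxx.153",
  "port": 50000,
  "db": "SQLDB",
  "username": "userXXXX",
  "password": "servicepassword"
}
```

The hostname and port go into the node directory entry, the database name into the database directory entry, and the username and password into the server definition and user mapping shown below.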
Using the hostname and port information I added the server to my local DB2 directory:
[hloeser@mymachine] db2 catalog tcpip node db2bm remote 75.xxx.xxx.153 server 50000
DB20000I  The CATALOG TCPIP NODE command completed successfully.
DB21056W  Directory changes may not be effective until the directory cache is
refreshed.


I named the node (or server) "db2bm", short for DB2 on Bluemix. It is identified by an IP address and a port number. Next, the remote database "SQLDB" needs to be cataloged. I named it "CLOUDDB", and the database can be found on the just-added node "db2bm":

[hloeser@mymachine] db2 catalog db sqldb as clouddb at node db2bm
DB20000I  The CATALOG DATABASE command completed successfully.
DB21056W  Directory changes may not be effective until the directory cache is
refreshed.


Once this is completed, I connected to my local DB2 database for the next steps. Within the database, the DRDA wrapper needs to be created:

db2 => create wrapper drda
DB20000I  The SQL command completed successfully.


The next step is to make the remote database "CLOUDDB" available for access within the local database. The username "userXXXX" and password are again taken from the credentials information.
db2 => create server mycloud type db2/cs version 10.5 wrapper drda authorization userXXXX password "servicepassword" options (dbname 'CLOUDDB')
DB20000I  The SQL command completed successfully.


The last step in our setup is to tell DB2 who may access the remote database. This is a user mapping from local users to remote user names. For my testing I only created a mapping for myself:
db2 => create user mapping for hloeser server mycloud options (remote_authid 'userXXXX', remote_password 'servicepassword')
DB20000I  The SQL command completed successfully.
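As an alternative to the three-part names used in the rest of this post, federation also supports nicknames: a local alias for a single remote table that then behaves like a local object. A sketch, using the orders table that appears later in this post:

```sql
-- define a local alias for the remote table
create nickname myorders for mycloud.userXXXX.orders;

-- from now on the remote table can be queried like a local one
select count(*) from myorders;
```

Nicknames are convenient when tools or applications cannot be changed to use three-part names.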


Fortunately, the setup only needs to be done once, regardless of how many tables are used. Now my actual goal, combining the local and cloud databases and their data, can begin. How about a small access test?

db2 => select distinct tabschema from mycloud.syscat.tables

TABSCHEMA                                                                                                                      
--------------------------------------------------------------------------------------------------------------------------------
SYSCAT                                                                                                                         
SYSIBM                                                                                                                         
SYSIBMADM                                                                                                                      
SYSPUBLIC                                                                                                                      
SYSSTAT                                                                                                                        
USERxxxx                                                                                                                      

  6 record(s) selected.



If I had used my cloud database as the backend for a web application, I could easily copy over the latest order data from the cloud to my local database:

db2 => insert into orders select * from mycloud.userXXXX.orders where orderdate>'2016-02-25'
DB20000I  The SQL command completed successfully.
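A word of caution: rerunning such an insert would copy the same rows again. A sketch that skips rows already present locally, assuming the table has a primary key column OID (the column name is my assumption, adjust it to your schema):

```sql
insert into orders
  select * from mycloud.userXXXX.orders r
  where r.orderdate > '2016-02-25'
    and not exists (select 1 from orders l where l.oid = r.oid);
```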


It is also possible to combine cloud-based and local tables in queries, i.e., to join the cloud:
db2 => select c.cid,o.status,o.total from mycloud.userXXXX.orders o, customers c where o.cid=c.cid and o.status=2

CID       STATUS  TOTAL                                              
--------- ------- -----------
        1       2      222.30                                            

...
     5612       2     1085.99


  215 record(s) selected.



Once the system is set up and configured, the remote, cloud-based database feels like a local one. It is great to see how database federation technology adds value in a hybrid cloud scenario. So don't be scared and join the cloud yourself!
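And should you ever want to leave the cloud again: the setup can be removed in reverse order. A sketch, using the object names from above:

```sql
-- inside the local database
drop user mapping for hloeser server mycloud;
drop server mycloud;
drop wrapper drda;
```

followed by removing the directory entries from the command line:

```sql
-- DB2 CLP commands, not SQL statements
uncatalog db clouddb;
uncatalog node db2bm;
```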