Tuesday, April 24, 2018

Automated, regular database jobs with IBM Cloud Functions (and Db2)

IBM Cloud Functions and Db2
Yesterday, I blogged about the latest tutorial I wrote. The tutorial discusses how to combine serverless and Cloud Foundry for data retrieval and analytics. That scenario came up when I looked into regularly downloading GitHub traffic statistics for improved usage insights. What I needed was a mechanism to execute a small Python script on a daily or weekly basis. After looking into some possible solutions, IBM Cloud Functions was the clear winner. In this blog post, I am going to discuss how simple it is to implement regular, automated activities, such as maintenance jobs for a cloud database.

Code your action

An action is the part that is executed. IBM Cloud Functions supports several programming languages for coding an action: JavaScript, Swift, Python, and others can be used, or even a Docker image can be provided. In my case, I implemented a Python action that fetches the GitHub account information and the list of repositories from Db2, then retrieves the traffic data from GitHub and, finally, merges it into Db2. The code for that particular action can be found in this file on GitHub.
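The real code is in the linked file; as an illustration only, here is a heavily simplified sketch of how such a Python action could be structured. The helper names and placeholder bodies are my own assumptions; what follows Cloud Functions conventions is the `main(args)` entry point receiving a dict of parameters, with bound service credentials arriving under the `__bx_creds` key.

```python
def fetch_repositories(dsn):
    """Placeholder: read the GitHub account and repository list from Db2.
    A real implementation would use the ibm_db driver that ships with
    the Cloud Functions Python runtime."""
    return [{"org": "some-org", "repo": "some-repo"}]

def fetch_traffic(org, repo, token):
    """Placeholder: call the GitHub traffic API for one repository."""
    return {"views": [], "clones": []}

def main(args):
    # Cloud Functions invokes main() with a dict of parameters;
    # "bx wsk service bind" injects credentials under __bx_creds.
    creds = args.get("__bx_creds", {}).get("dashDB", {})
    dsn = creds.get("ssldsn") or creds.get("dsn")
    results = []
    for entry in fetch_repositories(dsn):
        traffic = fetch_traffic(entry["org"], entry["repo"],
                                args.get("githubToken"))
        results.append({"repo": entry["repo"],
                        "views": len(traffic["views"])})
    # The real action would MERGE the traffic data into Db2 here.
    return {"processed": len(results)}
```

An action must return a dict; whatever it returns shows up as the activation result, which is handy for checking the job after each scheduled run.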

Create action, trigger, and rule

Once the code is ready, it can be used to create a Cloud Functions action. The available runtime environments already include drivers for several database systems, including Db2. The zip file "ghstats.zip" includes extra files for modules that are not part of the standard environment. The second step is to bind the action to the database service; that way, the database credentials are automatically made available to the action.

# Create the action to collect statistics
bx wsk action create collectStats --kind python-jessie:3 ghstats.zip

# Bind the service credentials to the action
bx wsk service bind dashDB collectStats --instance ghstatsDB --keyname ghstatskey 

# Create a trigger for firing off weekly on Sundays at 6am
bx wsk trigger create myWeekly --feed /whisk.system/alarms/alarm --param cron "0 6 * * 0" --param startDate "2018-03-21T00:00:00.000Z" --param stopDate "2018-12-31T00:00:00.000Z"

# Create a rule to connect the trigger with the action
bx wsk rule create myStatsRule myWeekly collectStats 

A trigger emits an event on the given schedule. The trigger definition above uses cron syntax to fire every Sunday at 6 am. Finally, a rule connects the trigger with the action, causing the action to be executed on a weekly schedule.
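To make the schedule explicit, the five cron fields of the expression used above can be broken down like this (a small illustration; note that the alarms package interprets the schedule in UTC unless a timezone is configured):

```python
# Break down the cron expression from the trigger definition above.
cron = "0 6 * * 0"
fields = dict(zip(
    ["minute", "hour", "day_of_month", "month", "day_of_week"],
    cron.split()))
# minute 0, hour 6, any day of month, any month, day-of-week 0 (Sunday):
# the trigger fires every Sunday at 06:00.
print(fields)
```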


Using IBM Cloud Functions, it is easy to implement automated, regular maintenance jobs. This could be cleaning up data in a database, calling the APIs of web services, summarizing activities and sending out a weekly report, and much more. For my use case it is the ideal tool for the problem. It is inexpensive ("cheap") because it only consumes resources for a few seconds once a week. Read the full tutorial in the IBM Cloud documentation.

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.