Monday, January 8, 2018

DeDUG-Treffen in Karlsruhe bei Fiducia GAD (Db2 User Group Meeting)

DeDUG - Deutsche Db2 User Group
Happy New Year! I hope you had a great start to 2018. How about meeting in person soon? Next week I am going to speak at the German Db2 User Group (DeDUG) meeting in Karlsruhe, hosted by Fiducia GAD IT AG. You can find the details and register for the Db2 user group event here. Topics include database security using trusted contexts, the Db2 catalog and metadata, lots of SQL performance tips and tricks, and how Db2 is used in banking infrastructure.

See you in Karlsruhe!

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.

Tuesday, December 12, 2017

News on IBM Cloud and Db2 - December 2017 Edition

Another month and a lot of news
In the middle of November I reported about significant changes to Bluemix and IBM's cloud offerings. A month has passed, and I want to give you an update on some news I am excited about.

I try to regularly read over the "What's New in IBM Cloud" section of the IBM Cloud documentation. There were two significant announcements.
  • The new Resource Groups allow simpler management of everything in your account (a.k.a. "resources"). You can now group apps, services, virtual machines, Kubernetes-based container services, and more, and easily assign access privileges.
  • Along with Resource Groups came IBM Cloud Identity and Access Management. It enables fine-grained access control utilizing API keys, service IDs, and more.
The two above should keep you busy, but there is more. If you are a Db2 aficionado like me, then you have probably subscribed to "What's New in IBM Db2 Warehouse on Cloud, IBM Db2 Warehouse, and IBM Db2 on Cloud". Did you know that MySQL and PostgreSQL are now available in the web console as data sources for federation? That makes it easy to access data coming from a LAMP stack.

As you might know, I use and write about the IBM Watson Conversation service. What I liked from its "Release Notes" is a new beta feature to directly call actions from within a dialog node; IBM Cloud Functions are supported. I put it to the test and wrote a Slack bot backed by Watson Conversation that directly queries a Db2 database. I still need to clean up the code and write it up (and submit it to IDUG).

I wrote a tutorial about how to generate, access and analyze application logs on IBM Cloud. You can find it in the Solution Tutorials as part of the IBM Cloud documentation.


Thursday, November 30, 2017

IBM Cloud: Some fun with Python and Cloud Foundry Logging


IBM Cloud: Turn Log Data into Donut
Last month, after receiving user questions, I blogged about how to decipher Cloud Foundry log entries. Today, I want to point you to a small Cloud Foundry Python app I wrote. It helps to better understand Python and Cloud Foundry logging. You can also use it to test the IBM Cloud Log Analysis service which provides an easy-to-use interface to logs generated by applications running in the IBM Cloud. In the premium plans, external log events can also be fed into the service for consolidated storage and analysis.

As usual, the code for my app is available on GitHub: https://github.com/data-henrik/application-log-analysis/. Once deployed to IBM Cloud, the app can be used to send messages at a chosen log level back to the server. The server-side log level, i.e., the threshold for which log messages are processed, can also be set. The app produces diagnostic output on "stdout" and "stderr"; the two are treated differently by Cloud Foundry. Here is a screenshot of the logging app:
Test app for Cloud Foundry logging
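The app itself lives in the GitHub repository above; the core idea, though, can be sketched with Python's standard logging module. This is a minimal illustration (not the actual app code): the logger's level acts as the adjustable server-side threshold, while two handlers split output between stdout and stderr, which Cloud Foundry collects as separate log streams.

```python
# Minimal sketch of level-based logging split across stdout/stderr.
# Function and logger names here are illustrative, not from the app.
import logging
import sys

logger = logging.getLogger("app-log-demo")
logger.setLevel(logging.DEBUG)  # start permissive; the threshold is changed below

# INFO and below go to stdout, WARNING and above to stderr,
# mirroring how Cloud Foundry treats the two streams differently.
out_handler = logging.StreamHandler(sys.stdout)
out_handler.addFilter(lambda record: record.levelno < logging.WARNING)
err_handler = logging.StreamHandler(sys.stderr)
err_handler.setLevel(logging.WARNING)
logger.addHandler(out_handler)
logger.addHandler(err_handler)

def set_threshold(level_name):
    """Change the server-side log level, e.g. driven by a web form."""
    logger.setLevel(getattr(logging, level_name))

set_threshold("WARNING")
logger.info("suppressed: below the WARNING threshold")
logger.warning("this message reaches stderr")
```

Raising or lowering the threshold at runtime is what lets you experiment with which messages show up in the collected Cloud Foundry logs.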
The produced log entries can also be used to try out the IBM Cloud Log Analysis service. Diagnostic logs are automatically forwarded to the Log Search component of that service. The messages are fed into Elasticsearch and can be analyzed using Kibana. I wrote some search queries (one shown below) and then built visualizations, like the "Donut" shown above, based on those queries. I will write more about that in a future blog post.
Search Query for Elasticsearch / IBM Cloud Log Analysis

An official tutorial using that app and Log Analysis is available in the IBM Cloud docs.

