Another day, another DB2 quiz. Below is the partial output produced by a DB2 tool. Which tool is it? It is part of every DB2 LUW installation.
COLUMN NAME : ID
Length of column name : 2
Column id : 0
Col type {hex, type} : 0x0001 , INTEGER
Col dist. type {hex, type}: 0x0018 , INTEGER
Column length : 4
Blob length : 4
Statistic offset : 480
Column String Units : N
Flags : 0x20000090
- SQLRG_MIXED
- SQLRG_NULLIND
Flags2 : 0x00000000
Codepage : 0
Collation name : NOT_APPLICABLE
Collation ID in hex : B'FF FF FF FF FF FF FF FF
Length of user default : 0
Logged : 0
Compact : 0
Inline Length : 0
Datalink Features : 000000
Ref. row type desc. offset: 0
No. unique values in col. : 2
Average column length : 5
Expected Change Rate : 0.000781
Percentage Encoded : -1
Page Variance Ratio : -1
Source column ID : 0
High2key and Low2key len. : 8
Avg column length in chars: -1
Delimiter length : -1
Number of Subelements : -1
Number of nulls : 0
High2key : 2
Low2key : 1
Statistic Descriptor :
colid : 0
fd offset : 16
hd offset : 64
nmostfreq : 2
mtiles : 4
Histogram Descriptor : Count Distcount Value_len Value
---------------------------------------------
0 0 4 1
1280 0 4 1
1280 0 4 2
2560 0 4 2
Frequency Descriptor : Count Value_len Value
---------------------------------
1280 4 1
1280 4 2
Security Label ID : 0
COLUMN NAME : S
Length of column name : 1
Column id : 1
Col type {hex, type} : 0x0101 , VARCHAR
Col dist. type {hex, type}: 0x0038 , VARCHAR
Column length : 2000
Blob length : 2000
Statistic offset : 504
Column String Units : S
Flags : 0x20000090
- SQLRG_MIXED
- SQLRG_NULLIND
Flags2 : 0x00000000
Codepage : 1208
Collation name : IDENTITY
Collation ID in hex : B'00 00 00 02 FF 00 FF FF
Length of user default : 0
Logged : 0
Compact : 0
Inline Length : 0
Datalink Features : 000000
Ref. row type desc. offset: 0
No. unique values in col. : 2
Average column length : 86
Expected Change Rate : 0.000781
Percentage Encoded : -1
Page Variance Ratio : -1
Source column ID : 0
High2key and Low2key len. : 66
Avg column length in chars: 81
Delimiter length : -1
Number of Subelements : -1
Number of nulls : 0
High2key : {'This is a long teeeeeeeeeeeeeeee}
High2key in Hex Format : 27546869732069732061206c6f6e67207465656565656565656565656565656565
Low2key : {'Another }
Low2key in Hex Format : 27416e6f7468657220202020202020202020202020202020202020202020202020
Statistic Descriptor :
colid : 1
fd offset : 16
hd offset : 96
nmostfreq : 2
mtiles : 4
Histogram Descriptor : Count Distcount Value_len Value Hex(value)
---------------------------------------------
0 0 8 {'Another} 27416e6f74686572
1280 0 8 {'Another} 27416e6f74686572
1280 0 33 {'This is a long teeeeeeeeeeeeeeee} 27546869732069732061206c6f6e67207465656565656565656565656565656565
2560 0 33 {'This is a long teeeeeeeeeeeeeeee} 27546869732069732061206c6f6e67207465656565656565656565656565656565
Frequency Descriptor : Count Value_len Value Hex(value)
---------------------------------
1280 8 {'Another} 27416e6f74686572
1280 33 {'This is a long teeeeeeeeeeeeeeee} 27546869732069732061206c6f6e67207465656565656565656565656565656565
Security Label ID : 0
Henrik's thoughts on life in IT, data and information management, cloud computing, cognitive computing, covering IBM Db2, IBM Cloud, Watson, Amazon Web Services, Microsoft Azure and more.
Wednesday, July 30, 2014
Monday, July 28, 2014
What's Going On? - DB2 Workload Management: Identification of Activities
In an earlier blog post I wrote about why Workload Management is needed. It is not just something for the database system or the operating system level; it is genuinely useful and practiced in "real life". But what is managed in the system, and how can you identify activities in a DB2 database system? That is what I am going to explain today.
Before I dig deeper into the identification, we first need to clarify what is meant by "activity". It can be almost anything going on in the database system that is related to a single database, covering both user- and system-related tasks. The important distinction is that it is on the database level, not the level of the DB2 instance. Identification of activities deals with three questions: WHO is doing WHAT on my database, and WHERE is that data located?
The WHO can be answered by looking at the connection properties such as:
- Who is the user and which group does the user belong to?
- Is the user operating in a special role?
- From where is the user connecting? Does the machine have a name? Is the connection made from a specific application?
By specifying a "data tag" for a work class, the work class can be related to storage groups or tablespaces and their priority. This is how an activity can be identified by WHERE the data is processed.
Because multiple work classes in a work class set could identify the same activity, the individual work classes can be ordered/positioned within the set. That way a work class with several properties could pick a very specific activity whereas other activities would be mapped to more general work classes.
Using the concepts of WORKLOAD and WORK CLASS SET it is possible to identify an activity. They help to understand what is going on in the DB2 database system. It is the prerequisite for actively controlling and managing the activities in the system by assigning resources.
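As a rough sketch of how these concepts appear in DDL: a WORKLOAD captures the WHO via connection attributes, while a WORK CLASS SET describes the WHAT (the type of work) and, via a data tag, the WHERE. All object names and attribute values below are invented for illustration, and the statements are abbreviated; consult the DB2 documentation for the full syntax. The statements are kept as Python strings so the snippet is self-contained like the other scripts on this blog:

```python
# Hypothetical DDL sketch; all names below are invented for illustration.

# WHO: connections coming from a specific application, run by a user group
create_workload = (
    "CREATE WORKLOAD reporting_users "
    "APPLNAME('reports.exe') "
    "SESSION_USER GROUP('ANALYSTS')"
)

# WHAT and WHERE: read-only work, and work touching data tagged with 1;
# POSITION orders the work classes so more specific ones can match first
create_work_class_set = (
    "CREATE WORK CLASS SET example_classes ("
    "WORK CLASS read_work WORK TYPE READ POSITION AT 1, "
    "WORK CLASS tagged_work WORK TYPE ALL DATA TAG LIST CONTAINS 1 POSITION AT 2)"
)
```

A connection matching the workload attributes is then mapped to that workload, and each statement it submits is classified by the first matching work class in the set.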
Quiz? State of Affairs...!
This is not really a quiz (if you followed my advice to try out IBM Bluemix), but merely shows the state of affairs:
What does it say to the trained eye? Where was the screenshot taken, and what does it indicate? Any takers?
Friday, July 25, 2014
The Hunt for the Chocolate Thief (Part 2) - Putting IBM Bluemix, Cloudant, and a Raspberry Pi to good use
I am still on the hunt for the mean chocolate thief, kind of. In the first part I covered the Raspberry Pi side and uploading data to Cloudant. I showed how to set up an infrared motion sensor and a webcam with the RPi, capture a snapshot, and secure the image and related metadata in a Cloudant database on the IBM Bluemix Platform-as-a-Service (PaaS) offering. In this part I am going to create a small reporting website with Python, hosted as an IBM Bluemix service.
The setup phase includes reading in access data for the Cloudant database server. Either that information is taken from a Bluemix environment variable or provided in a file "cloudant.json" (similar to what I did on the RPi). The main part of the script defines three routes, i.e., how to react to certain URL requests. The index page (index()) returns an overview of all recorded incidents, an incident detail page (incident(id)) fetches the data for a single event and embeds the stored webcam image into the generated page, and the last route (image(id)) redirects the request to Cloudant.
Looking at how the index page is generated, you will notice that a predefined Cloudant view (secondary index) named "incidents/incidents" is evaluated. It is a simple map function that emits the timestamp and document ID as a composite key, so the incidents come back sorted by time:
function(doc) {
  if (doc.type == "oc")
    emit({"ts" : doc.timestamp, "id" : doc._id}, 1);
}
Then I access the timestamp information and generate the list as shown in the screenshot above.
The incident detail page has the document ID as parameter. This makes it simple to retrieve the entire document and print the details. The webcam image is embedded. So who got my chocolate? Take a look. It looks like someone who got a free copy of "Hadoop for Dummies" at the IDUG North America conference.
Maybe another incident will shed light on this mystery. Hmm, looks like someone associated with the "Freundeskreis zur Förderung des Zeppelin Museums e.V." in Friedrichshafen. I showed the pictures to my wife and she was pretty sure who took some chocolate. I should pay more attention when grabbing another piece of my chocolate and watch more closely how much I am eating/enjoying.
Similar to an earlier weather project, I use Python as the scripting language. On Bluemix, which is based on Cloud Foundry, this means "bring your own buildpack". I already described the necessary steps: telling Bluemix how to create the runtime environment and install the needed Python libraries. So how do I access the incident data, i.e., the webcam snapshots taken by the Raspberry Pi when someone is in front of the infrared motion sensor? Let's take a look at the script:
import os
from flask import Flask,redirect
import urllib
import datetime
import json
import couchdb

app = Flask(__name__)

# couchDB/Cloudant-related global variables
couchInfo=''
couchServer=''
couch=''

# get service information if on Bluemix
if 'VCAP_SERVICES' in os.environ:
    couchInfo = json.loads(os.environ['VCAP_SERVICES'])['cloudantNoSQLDB'][0]
    couchServer = couchInfo["credentials"]["url"]
    couch = couchdb.Server(couchServer)
# we are local
else:
    with open("cloudant.json") as confFile:
        couchInfo=json.load(confFile)['cloudantNoSQLDB'][0]
        couchServer = couchInfo["credentials"]["url"]
        couch = couchdb.Server(couchServer)

# access the database which was created separately
db = couch['officecam']

@app.route('/')
def index():
    # build up result page
    page='<title>Incidents</title>'
    page +='<h1>Security Incidents</h1>'
    # list the recorded incidents
    page += '<h3>Requests so far</h3>'
    # We use an already created view
    for row in db.view('incidents/incidents'):
        page += 'Time: <a href="/incident/'+str(row.key["id"])+'">'+str(row.key["ts"])+'</a><br/>'
    # finish the page structure and return it
    return page

@app.route('/incident/<id>')
def incident(id):
    # build up result page
    page='<title>Incident Detail</title>'
    page +='<h1>Security Incident Details</h1>'
    doc=db.get(id)
    # Gather information from database about the incident
    page += '<br/>Incident at date/time: '+str(doc["timestamp"])
    page += '<br/>reported by "'+doc["creater"]+'" at location "'+doc["location"]+'"'
    page += '<br/>Photo taken:<br/><img src="/image/'+id+'" />'
    # finish the page structure and return it
    return page

@app.route('/image/<id>')
def image(id):
    # redirecting the request to Cloudant for now, but should be hidden in the future
    return redirect(couchServer+'/officecam/'+id+'/cam.jpg')

port = os.getenv('VCAP_APP_PORT', '5000')
if __name__ == "__main__":
    app.run(host='0.0.0.0', port=int(port))
Have a nice weekend (and remember to sign up for a free Bluemix account)!
Catching the mean chocolate thief with Raspberry Pi, Bluemix, and Cloudant
I always try to have some chocolate in my office, as a kind of mood enhancer. But how can I be sure that nobody else is going to plunder and pilfer my hidden treasures? So it was great that last week at the Developer Week conference in Nuremberg I got my hands on a Raspberry Pi (thank you, Franzis Verlag and Christian Immler) and that I know a little about IBM Bluemix. Here is the plan: hook up my IBM-sponsored webcam to the RPi, take a snapshot whenever the motion sensor is triggered, and upload the picture and metadata to a Cloudant NoSQL database. With a Bluemix-based application I could then have worldwide access to the "incident data" and catch the mean chocolate thief...
Next I logged into IBM Bluemix, the platform-as-a-service (PaaS) offering for developers, and created a Cloudant data store. This is done similarly to how I described it in my previous article on using Cloudant for some statistics for a weather webpage. The account data for the Cloudant database can be obtained in JSON format. I copied that information into a file "cloudant.json" and placed it in my project directory on the Raspberry Pi. With that, we are already at the software part of this project.
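For reference, the layout I assume for "cloudant.json" mirrors the VCAP_SERVICES structure that the scripts parse; the service name and credentials below are made-up placeholders, not real account data:

```python
import json

# Placeholder contents of "cloudant.json"; the structure mirrors the
# VCAP_SERVICES environment variable so the same parsing code works locally.
conf_text = """
{
  "cloudantNoSQLDB": [
    {
      "name": "cloudantWeather",
      "credentials": {
        "username": "someuser",
        "password": "somepass",
        "url": "https://someuser:somepass@someuser.cloudant.com"
      }
    }
  ]
}
"""

# the same lookup the scripts perform on the parsed file
couchInfo = json.loads(conf_text)['cloudantNoSQLDB'][0]
couchServer = couchInfo["credentials"]["url"]
```

The extracted URL is what gets handed to couchdb.Server() later on.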
In the following you see the Python script I used for the prototyping. It performs some setup work, which includes reading in the access information for the Cloudant account. The main part is a simple loop waiting for the thief to appear, i.e., for the motion sensor to be activated:
import datetime
import time
import subprocess
import RPi.GPIO as io
import json
import couchdb

io.setmode(io.BCM)
pir_pin = 18
scriptPath='/home/pi/projects/officeCam/takeSnap.sh'
imgFile='/home/pi/projects/officeCam/office.jpg'

# couchDB/Cloudant-related global variables
couchInfo=''
couchServer=''
couch=''

with open("cloudant.json") as confFile:
    couchInfo=json.load(confFile)['cloudantNoSQLDB'][0]
    couchServer=couchInfo["credentials"]["url"]
    couch = couchdb.Server(couchServer)

# access the database which was created separately
db = couch['officecam']

io.setup(pir_pin, io.IN) # activate input

while True:
    if io.input(pir_pin):
        # take a snapshot with the webcam
        subprocess.call([scriptPath])
        f=open(imgFile,'rb')
        # basic doc structure
        doc= { "type" : "oc",
               "creater" : "RPi",
               "location" : "office",
               "city" : "Friedrichshafen"
        }
        doc["timestamp"]=str(datetime.datetime.utcnow())
        # and store the document
        db.save(doc)
        db.put_attachment(doc,f,filename='cam.jpg')
        f.close()
        print("Alarm processed")
    time.sleep(1)
Once some motion has been detected, the Python script invokes a shell script. It is printed below. Its only action is to execute the fswebcam program, which takes a snapshot with the webcam. Thereafter, back in Python, I create a JSON document, stuff the current timestamp and some other information into it, and store it in the cloud-based NoSQL database. As a last step I attach the picture to that document, so that even if the mean chocolate thief notices the trap, the image is secured in the cloud.
#!/bin/sh
fswebcam -q -c /home/pi/projects/officeCam/fswebcam.conf
With that I am done with the Raspberry Pi. What is left is to work on the reporting. See how it is done in Python on Bluemix and Cloudant.
Wednesday, July 23, 2014
Watch this! Move your DB2 monitoring to the in-memory interface (WLM monitoring)
Since its first days as a database management system, DB2 has continuously changed. It has been extended with new features to serve customer requirements and adapted to the state of the art in hardware and software technology. One major addition was the introduction of DB2 Workload Management in version 9.5 and the related, more comprehensive monitoring with finer granularity (in-memory metrics monitoring) in version 9.7. As with many product changes, it takes a while for customers to really put them to use and reap the benefits, especially when the existing functionality still works.
Thus I was happy when I saw a new article on IBM developerWorks describing how to move off the (old) snapshot monitoring interfaces in DB2 and onto the in-memory metrics monitoring. The article includes an overview of the advantages of the newer interface, which should get you motivated to read the rest of it (and then to migrate, if not done yet). It contains a side-by-side comparison of old and new interfaces and has many sample SQL queries. The queries demonstrate how to obtain DB2 runtime metrics using the old and new interfaces for some popular monitoring tasks. You can find the documentation of the SQL interface to the in-memory metrics in the DB2 Knowledge Center in this overview. Most of the pages in the manual have further SQL samples to get you started.
So take a look, it will also help you with one of the upcoming DB2 quizzes on this blog.
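To illustrate the kind of migration the article covers, here is a hypothetical before/after pair for one metric. The queries below are my own sketch from memory of the two interfaces, not taken from the article, and column choices are illustrative:

```python
# Old interface: administrative view fed by snapshot data.
# New interface: in-memory metrics via a monitoring table function.
# Both statements are illustrative sketches, not the article's queries.
old_query = "SELECT ROWS_READ FROM SYSIBMADM.SNAPDB"
new_query = ("SELECT SUM(ROWS_READ) AS ROWS_READ "
             "FROM TABLE(MON_GET_WORKLOAD(NULL, -2)) AS T")
```

The new table functions return one row per workload and member, which is why the sketch aggregates across them.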
Tuesday, July 8, 2014
DB2 Quiz: Processes and CPU
In today's DB2 quiz the focus is on DB2 processes and CPU consumption. Which SQL statement did I run in DB2 for the following output? What function is used?
NAME CPU_USER CPU_SYSTEM
----------------------------------- -------------------- --------------------
db2fmp ( 13 4
db2fmp ( 9 10
db2vend (PD Vendor Process - 1) 0 5 2
db2ckpwd 0 0 13
db2ckpwd 0 0 13
db2ckpwd 0 0 9
db2sysc 0 18561 30282
db2wdog 0 [hloeser] 6 44
db2acd 0 6997 11550
9 record(s) selected.
A statement similar to the one I used can be found in the DB2 Knowledge Center. The statement makes use of a special table function.
Wednesday, July 2, 2014
Nice Cloud, no rain: Using Cloudant/couchDB with Python on Bluemix
My last two blog entries were about getting started with Python on IBM Bluemix and how to use a custom domain with my Bluemix weather application. Today I am going to show how I added Cloudant and couchDB to my application, both locally and on Bluemix.
Storing the weather data locally doesn't make sense because I can query much more historical data on OpenWeatherMap. So I am going to use a database to log which city the data was requested for and when. That information, in aggregated form, could then be reported as a fun fact to each user of the app. I chose Cloudant because it is simple to use, adequate for the intended purpose, has free usage plans on Bluemix, and I can develop and test locally with couchDB.
The code itself is relatively simple and commented; it is shown at the end of the article. The interesting parts are how to add a Cloudant service to my application on Bluemix, how to bind the two together, and the preparation work for the database itself. So let's take a look at those steps.
Cloudant is offered as one of several services in the "Data Management" category on Bluemix. While on the Dashboard, you simply click on the "Add a service" button as shown on the right. Navigate to the Data Management section and choose Cloudant.
This brings up a screen showing information about the service itself and its usage terms and, on the right side, a dialog "Add Service" for adding the service to your account. Here you can already bind the new database service to your application by selecting an existing application from a dropdown list. I did that and gave my new Cloudant service the name "cloudantWeather" as shown:
Once the service is added you can bring up the Cloudant administration interface. I have used Cloudant and couchDB before, so that isn't anything new. To avoid dealing with creation of a database as part of the actual program I decided to create a "weather" database through the administration interface for the hosted Cloudant and my local couchDB servers. An interesting but not too tricky part is how to access both servers depending on where the application is running. Information with the username, password, server address and other details is provided in an environment variable VCAP_SERVICES when run on Bluemix. Thus, in the program I am testing for the presence of that variable and then either retrieve the server information from it or access my local couchDB:
#get service information if on Bluemix
if 'VCAP_SERVICES' in os.environ:
    couchInfo = json.loads(os.environ['VCAP_SERVICES'])['cloudantNoSQLDB'][0]
    couchServer = couchInfo["credentials"]["url"]
    couch = couchdb.Server(couchServer)
#we are local
else:
    couchServer = "http://127.0.0.1:5984"
    couch = couchdb.Server(couchServer)
Storing new documents is simple and is shown in the full code listing. For the queries I am using the MapReduce feature of couchDB. In a "map" function I return the city name (and just the integer value 1); in the reduce function I aggregate (sum up) the values by city. Both functions could be defined in the Python script and passed into Cloudant as part of the query, or predefined for better performance. I chose the latter. So I created a so-called "secondary index" in my Cloudant database; in couchDB it is called a "view". They are stored as part of a "design document" (shown in Cloudant):
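Reconstructed from the view name "weatherQueries/cityCount" used in the code and the map/reduce behavior described above (not copied from the original post), the design document would look roughly like this:

```json
{
  "_id": "_design/weatherQueries",
  "views": {
    "cityCount": {
      "map": "function(doc) { if (doc.type == 'city') { emit(doc.city, 1); } }",
      "reduce": "_sum"
    }
  }
}
```

Using the built-in `_sum` reduce instead of a hand-written JavaScript function is the usual choice here, since the server can apply it more efficiently.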
With that I finish my Python application, add some calls to the couchDB Python API (which I needed to add to the file "requirements.txt" as dependency) and test it locally. The final step is to deploy the application to Bluemix using the Cloud Foundry tool "cf push". Done, seems to work:
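For reference, a minimal requirements.txt for this app might list the following (package names as published on PyPI; no versions pinned here):

```
Flask
CouchDB
```

Note that the PyPI package is named "CouchDB" even though the module imported in the scripts is lowercase "couchdb".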
Last but not least, here is the code I used for my little app:
import os
from flask import Flask,redirect
import urllib
import datetime
import json
import couchdb

BASE_URL = "http://api.openweathermap.org/data/2.5/weather?q="
BASE_URL_fc ="http://api.openweathermap.org/data/2.5/forecast/daily?cnt=1&q="
app = Flask(__name__)

# couchDB/Cloudant-related global variables
couchInfo=''
couchServer=''
couch=''

#get service information if on Bluemix
if 'VCAP_SERVICES' in os.environ:
    couchInfo = json.loads(os.environ['VCAP_SERVICES'])['cloudantNoSQLDB'][0]
    couchServer = couchInfo["credentials"]["url"]
    couch = couchdb.Server(couchServer)
#we are local
else:
    couchServer = "http://127.0.0.1:5984"
    couch = couchdb.Server(couchServer)

# access the database which was created separately
db = couch['weather']

@app.route('/')
def index():
    return redirect('/weather/Friedrichshafen')

@app.route('/weather/<city>')
def weather(city):
    # log city into couchDB/Cloudant
    # basic doc structure
    doc= { "type" : "city",
           "c_by" : "bm",
    }
    # we store the city and the current timestamp
    doc["city"]=city
    doc["timestamp"]=str(datetime.datetime.utcnow())
    # and store the document
    db.save(doc)
    # Time to grab the weather data and to create the resulting Web page
    # build URIs and query current weather data and forecast
    # JSON data needs to be converted
    url = "%s%s" % (BASE_URL, city)
    wdata = json.load(urllib.urlopen(url))
    url_fc = "%s%s" % (BASE_URL_fc, city)
    wdata_fc = json.load(urllib.urlopen(url_fc))
    # build up result page (temperatures are converted from Kelvin to Celsius)
    page='<title>current weather for '+wdata["name"]+'</title>'
    page +='<h1>Current weather for '+wdata["name"]+' ('+wdata["sys"]["country"]+')</h1>'
    page += '<br/>Min Temp. '+str(wdata["main"]["temp_min"]-273.15)
    page += '<br/>Max Temp. '+str(wdata["main"]["temp_max"]-273.15)
    page += '<br/>Current Temp. '+str(wdata["main"]["temp"]-273.15)+'<br/>'
    page += '<br/>Weather: '+wdata["weather"][0]["description"]+'<br/>'
    page += '<br/><br/>'
    page += '<h2>Forecast</h2>'
    page += 'Temperatures'
    page += '<br/>Min: '+str(wdata_fc["list"][0]["temp"]["min"]-273.15)
    page += '<br/>Max: '+str(wdata_fc["list"][0]["temp"]["max"]-273.15)
    page += '<br/>Morning: '+str(wdata_fc["list"][0]["temp"]["morn"]-273.15)
    page += '<br/>Evening: '+str(wdata_fc["list"][0]["temp"]["eve"]-273.15)
    page += '<br/><br/>Weather: '+wdata_fc["list"][0]["weather"][0]["description"]
    page += '<br/><br/>'
    # Gather information from database about which city was requested how many times
    page += '<h3>Requests so far</h3>'
    # We use an already created view
    for row in db.view('weatherQueries/cityCount',group=True):
        page += row.key+': '+str(row.value)+'<br/>'
    # finish the page structure and return it
    page += '<br/><br/>Data by <a href="http://openweathermap.org/">OpenWeatherMap</a>'
    return page

port = os.getenv('VCAP_APP_PORT', '5000')
if __name__ == "__main__":
    app.run(host='0.0.0.0', port=int(port))
Tuesday, July 1, 2014
Using a custom domain for my Bluemix Python-based weather app
Last week I played with IBM Bluemix, Python, and JSON to deploy a (very) simple weather application. Now, as an enhancement, I wanted to use my own subdomain for that app. I succeeded, here is how.
One of my own domains is "4loeser.net" which I use, e.g., for this blog. To avoid collisions with some other services linked to this domain, I created a subdomain "bm" (as in BlueMix). Next I added a CNAME entry to route all requests for "bm.4loeser.net" to "mybluemix.net".
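In generic zone-file notation (the actual entry form depends on the DNS provider's interface), such a CNAME record would look like:

```
bm.4loeser.net.   IN   CNAME   mybluemix.net.
```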
That was the easy part. The next is more complex because it requires two simple steps, not just one. In my Bluemix account I have to click on my email address in the upper right corner. The pulldown menu has an entry "Manage Organizations". Then, in the admin panel, you select "domains" and add your custom domain.
The second step is to make your application available under your domain. For that you have to route it to that additional name. It can be done by clicking on the button for application settings in the upper right corner (circled in red) and choosing "edit routes":
That brings up a panel where you can define the routes to your application. In my case I added http://weather.bm.4loeser.net as a second option. I could have removed the mybluemix-based route.
What is left is some testing and writing up my experience. Done.
Update: There is a follow-up article describing how I added a Cloudant / couchDB database to my Python application on Bluemix.