HANA Zeppelin Query Builder with Map Visualization

SAP HANA Query Builder On Apache Zeppelin Demo


In working with Apache Zeppelin, I found that users wanted a way to explore data and build charts without needing to know SQL right away. This post is an attempt to build a note in Zeppelin that lets a new data scientist get familiar with the data structure of their database, build simple single-table queries, and turn the results into charts and maps quickly. It also shows the SQL used to perform the work.


This video will demonstrate how it works. I leveraged Randy Gelhausen’s query builder post for the where-clause builder, along with Damien Sorel’s jQuery QueryBuilder. These were used to make a series of paragraphs that look up tables and columns in HANA and allow the user to build a custom query. The resulting data can be quickly graphed using the Zeppelin Helium visualizations.

The Code

This is for those data scientists and coders who want to replicate this in their own Zeppelin.

Note that this code is imperfect, as I have not worked out all the issues with it, and you may need to make changes to get it to work. It only works on Zeppelin 0.8.0-SNAPSHOT, and it is made to work with SAP HANA as the database.

It supports only one type of aggregation, sum, and it has no way to express a HAVING clause. But these features could easily be added.
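For instance, a HAVING clause could be appended after the GROUP BY whenever the generated SQL aggregates. The helper below is a hypothetical sketch of that string assembly, not code from the note:

```scala
// Hypothetical sketch: build the note's aggregate query with an optional HAVING clause.
def buildAggregateQuery(table: String, groupCols: Seq[String], sumCol: String,
                        where: String, having: Option[String]): String = {
  val base = "select " + groupCols.mkString(",") + ", sum(" + sumCol + ") " + sumCol +
    "_SUM from " + table + " where " + where + " group by " + groupCols.mkString(",")
  // Append the HAVING condition only when one is supplied.
  having.map(cond => base + " having " + cond).getOrElse(base)
}
```

Calling it with `Some("sum(AMOUNT) > 100")`, for example, would filter the grouped rows the same way the note's WHERE builder filters raw rows.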

This Zeppelin note is dependent on code from a previous post. Follow the directions in Using Zeppelin to Explore a Database first.

Paragraph One

//Get the list of columns on a given table
def columns1(table: String) : Array[String] = {
 sqlContext.sql("select * from " + table + " limit 0").columns.map(x => x.asInstanceOf[String])
}

def columns(table: String) : Array[(String, String)] = {
 sqlContext.sql("select * from " + table + " limit 0").columns.map(x => (x, x))
}

def number_column_types(table: String) : Array[String] = {
 var columnType = sqlContext.sql("select column_name from table_columns where table_name='" +
    table + "' and data_type_name = 'INTEGER'")
 columnType.map {case Row(column_name: String) => (column_name)}.collect()
}

// set up the tables select list
val tables = sqlContext.sql("show tables").collect.map(s=>s(1).asInstanceOf[String].toUpperCase())
z.angularBind("tables", tables)
var sTable = "tables"
z.angularBind("selectedTable", sTable)

z.angularWatch("selectedTable", (before:Object, after:Object) => {
 println("running " + after)
 sTable = after.asInstanceOf[String]
 // put the id for paragraph 2 and 3 here
})

var col = columns1(sTable)
col = col :+ "*"
z.angularBind("columns", col)
// hack to make the where clause work on initial load
var col2 = columns(sTable)
var extra = ("1","1")
col2 = col2 :+ extra
z.angularBind("columns2", col2)
var colTypes = number_column_types(sTable)
z.angularBind("numberColumns", colTypes)
var sColumns = Array("*")
// hack to make the where clause work on initial load
var clause = "1=1"
var countColumn = "*"
var limit = "10"

// setup for the columns select list
z.angularBind("selectedColumns", sColumns)
z.angularWatch("selectedColumns", (before:Object, after:Object) => {
 sColumns = after.asInstanceOf[Array[String]]
 // put the id for paragraph 2 and 3 here
})
z.angularBind("selectedCount", countColumn)
z.angularWatch("selectedCount", (before:Object, after:Object) => {
 countColumn = after.asInstanceOf[String]
})
// bind the where clause
z.angularBind("clause", clause)
z.angularWatch("clause", (oldVal, newVal) => {
 clause = newVal.asInstanceOf[String]
})

z.angularBind("limit", limit)
z.angularWatch("limit", (oldVal, newVal) => {
 limit = newVal.asInstanceOf[String]
})

This paragraph is Scala code that sets up the functions used to query the table that lists tables and the table that lists columns. You must have the tables loaded into Spark as views or tables in order to see them in the select lists. This paragraph also performs all of the Angular binding so that the next paragraph, which is Angular code, can read the data built here.

Paragraph Two

<link rel="stylesheet" href="https://cdn.rawgit.com/mistic100/jQuery-QueryBuilder/master/dist/css/query-builder.default.min.css">
<script src="https://cdn.rawgit.com/mistic100/jQuery-QueryBuilder/master/dist/js/query-builder.standalone.min.js"></script>

<script type="text/javascript">
  var button = $('#generateQuery');
  var qb = $('#builder');
  var whereClause = $('#whereClause');

  // copy the built where statement into the hidden input when the button is clicked
  button.click(function(){
    whereClause.val(qb.queryBuilder('getSQL').sql);
    whereClause.trigger('input'); //triggers Angular to detect changed value
  });

  // this builds the where statement builder
  var el = angular.element(qb.parent('.ng-scope'));
  angular.element(el).ready(function(){
    var integer_columns = angular.element('#numCol').val()
    //Executes on page-load and on update to 'columns', defined in first snippet
    window.watcher = el.scope().compiledScope.$watch('columns2', function(newVal, oldVal) {
      //Append each column to QueryBuilder's list of filters
      var options = {allowEmpty: true, filters: []}
      $.each(newVal, function(i, v){
        if(integer_columns.split(',').indexOf(v._1) !== -1){
          options.filters.push({id: v._1, type: 'integer'});
        } else if(v._1.indexOf("DATE") !== -1) {
          options.filters.push({id: v._1, type: 'date'})
        } else {
          options.filters.push({id: v._1, type: 'string'});
        }
      });
      qb.queryBuilder(options);
    });
  });
</script>
<input type="text" ng-model="numberColumns" id="numCol">
<form class="form-inline">
 <div class="form-group">
 Please select table: Select Columns:<br>
 <select size=5 ng-model="selectedTable" ng-options="o as o for o in tables"></select>
 <select size=5 multiple ng-model="selectedColumns" ng-options="o as o for o in columns">
  <option value="*">*</option>
 </select>
 Sum Column:
 <select ng-model="selectedCount" ng-options="o as o for o in columns">
  <option value="*">*</option>
 </select>
 <label for="limitId">Limit: </label> <input type="text" class="form-control" 
       id="limitId" placeholder="Limit Rows" ng-model="limit">
 </div>
</form>
<div id="builder"></div>
<button type="submit" id="generateQuery" class="btn btn-primary" 
       ng-click="z.runParagraph('20180109-132517_167004794')">Run Query</button>
<input id="whereClause" type="text" ng-model="clause" class="hide">

<h3>Query: select {{selectedColumns.toString()}} from {{selectedTable}} where {{clause}} 
   with a sum on: {{selectedCount}} </h3>

Paragraph two uses JavaScript libraries from jQuery and jQuery QueryBuilder. In the z.runParagraph command, use the paragraph id from paragraph three.

Paragraph Three

The results of the query show up in this paragraph. Its function is to generate the query and run it for display.

import scala.collection.mutable.ArrayBuffer

var selected_count_column = z.angular("selectedCount").asInstanceOf[String]
var selected_columns = z.angular("selectedColumns").asInstanceOf[Array[String]]
var limit = z.angular("limit").asInstanceOf[String]
var limit_clause = ""
if (limit != "*") {
 limit_clause = "limit " + limit
}
val countColumn = z.angular("selectedCount")
var selected_columns_n = selected_columns.toBuffer
// remove the aggregated column from the list of grouping columns
selected_columns_n -= selected_count_column

if (countColumn != "*") {
 val query = "select "+ selected_columns_n.mkString(",") + ", sum(" + selected_count_column +
     ") "+ selected_count_column +"_SUM from " + z.angular("selectedTable") + " where " + 
      z.angular("clause") + " group by " + selected_columns_n.mkString(",") + " " + limit_clause
 z.show(sqlContext.sql(query))
} else {
 val query2 = "select "+ selected_columns.mkString(",") +" from " + z.angular("selectedTable") + 
      " where " + z.angular("clause") + " " + limit_clause
 z.show(sqlContext.sql(query2))
}
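The branching in paragraph three can also be written as a pure function, which makes the string assembly testable outside Zeppelin. This is my own sketch with hypothetical names, not code from the note:

```scala
// Sketch: the same query-assembly logic as a pure function, so it can be
// exercised without a SparkContext or the Zeppelin z context.
def buildQuery(table: String, columns: Seq[String], sumColumn: String,
               clause: String, limit: String): String = {
  val limitClause = if (limit != "*") "limit " + limit else ""
  if (sumColumn != "*") {
    // Aggregate branch: group by every selected column except the summed one.
    val groupCols = columns.filterNot(_ == sumColumn)
    "select " + groupCols.mkString(",") + ", sum(" + sumColumn + ") " + sumColumn +
      "_SUM from " + table + " where " + clause + " group by " +
      groupCols.mkString(",") + " " + limitClause
  } else {
    // Plain select branch.
    "select " + columns.mkString(",") + " from " + table + " where " + clause + " " + limitClause
  }
}
```

For example, `buildQuery("T", Seq("A","B"), "B", "1=1", "10")` yields a grouped sum over column A with a limit of 10 rows.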

Now, if everything is just right, you will be able to query your tables without writing SQL. This is a limited example, as I have not provided options for other types of aggregation, advanced grouping, or joins across multiple tables.


Please follow us on our website at https://volumeintegration.com and on Twitter at volumeint.

Transforming Sound Into Sight


The simple idea that sound can be transformed into shapes fascinates me.

How Does It Work?

An oscilloscope provides a visual representation of sound or electronic waves. Typically, you see a wave pattern running from left to right.

Heartbeat Sound

However, when you put the oscilloscope into x-y mode, one channel of sound produces a line on the x-axis, and the other channel of sound makes a line on the y-axis. By creating a stereo signal, one can represent the sound in two visual dimensions.
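To make that concrete, here is a small sketch (my own illustration, not from any oscilloscope software) of one cycle of a stereo signal that traces a circle in x-y mode: a cosine on the left channel drives the x axis while a sine on the right channel drives the y axis.

```scala
// Sketch: one cycle of a stereo signal that draws a circle in x-y mode.
// The left channel (x axis) is a cosine; the right channel (y axis) is a sine.
val sampleRate = 44100
val toneHz = 441                          // chosen so 44100 / 441 = 100 samples per cycle
val samplesPerCycle = sampleRate / toneHz
val circle: Seq[(Double, Double)] =
  (0 until samplesPerCycle).map { i =>
    val t = 2 * math.Pi * i / samplesPerCycle
    (math.cos(t), math.sin(t))            // (left, right) = (x, y), both in [-1, 1]
  }
```

Written out as a two-channel WAV and looped into the scope, these pairs would hold a steady circle on screen, since every sample lies on the unit circle.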


Oscilloscopes Through Time

This visual representation is the basis of the vector graphics monitors used in the first computer displays in the late 1950s. In fact, these displays were oscilloscopes. Before the pixel was invented, it was easier for a computer to draw with vectors, since they take up far less memory than bitmap images.

Some of the earliest computers were built to defend against attacks during the Cold War. The first, called Whirlwind, was built by MIT for the Navy in 1951. Later, the Air Force assumed ownership and used it to intercept incoming bombers.

Some of Whirlwind’s output interfaces included oscilloscopes, typewriters, speakers, and lights. The Whirlwind was featured on See It Now with Edward R. Murrow with oscilloscope output.

Following Whirlwind, a system called SAGE (Semi-Automatic Ground Environment) was built with IBM AN/FSQ-7 computers. It was used to display maps and incoming aircraft so that they could be targeted by use of a light gun. Wikipedia says, “The AN/FSQ-7 had 100 system consoles, including the OA-1008 Situation Display (SD) with a light gun, cigarette lighter, and ash tray.“ This is one of the earliest computers I have seen with a map drawn on the screen.

The following 1956 clip, which shows the Display Scope in action, is from _On Guard!_, the film that tells the story of SAGE:

You can see even more of SAGE in this commercial from IBM and in an Air Force film.

SAGE Vector Graphics Display

This technique of making images with circuits and an oscilloscope was also used in the title sequence of Alfred Hitchcock’s Vertigo.

My Oscilloscope Adventure

When I first found a program called Rabiscoscopio, I immediately began shopping for an oscilloscope on eBay.

Hardware Shopping

My first purchase of a Hitachi oscilloscope for $17 was a bust, since it arrived with only one channel working. The key is to get an oscilloscope that has at least two channels and supports x-y mode display. The Hitachi 20 MHz would have worked great if only it had two working channels.

My next purchase of a Leader 20 MHz dual-channel oscilloscope for $39.99 was more successful. I tried out the image of the umbrella provided by Alex, and it worked!

Using Rabiscoscopio

Since my goal was to use the oscilloscope to generate the Volume Integration logo, I proceeded to the next step of attempting to draw my own pictures with Rabiscoscopio. I found that some BNC Male Plug to RCA Female Jack Adapters were the best way to connect my computer to the oscilloscope by plugging the headphone jack into an RCA cable.

Mini plug to RCA cable

BNC to RCA Adaptor

At first, I tried to take a standard Scalable Vector Graphics (SVG) file and convert it to sound with Rabiscoscopio. This caused it to throw an error. Apparently, I did not read the instructions where it says to only use straight lines and only use one continuous line.

I found Inkscape to be the easiest free tool for creating graphics. My experiments also led me to discover that drawing lines that cross each other also causes problems. Here is a gallery of my early experiments:

Stonehenge Wave Form by Bradley L. Johnson

Face by Bradley L. Johnson

Rabbit by Bradley L. Johnson

Square Spiral by Bradley L. Johnson

US Flag by Bradley L. Johnnson

A by Bradley L. Johnson

This is my refined process:

  1. Find image to trace.
  2. Open Inkscape.
  3. Import original image to trace over it.
  4. Use the pencil tool to trace a single line around the image in straight segments without crossing. Finish without joining the beginning and end of the line.
  5. Delete the original traced image.
  6. Save as SVG file.
  7. Open Rabiscoscopio and SVG file.
  8. Rabiscoscopio will generate the WAV sound file automatically.
  9. Plug in sound output from headphone jack to oscilloscope.
  10. Turn on oscilloscope and play sound file.
  11. Watch and enjoy, or go back and refine drawing.
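Step 8 is where the interesting conversion happens: the traced polyline becomes audio by resampling each straight segment into a run of left/right amplitudes. Below is a rough sketch of that idea, my own hypothetical version; Rabiscoscopio's actual algorithm may differ.

```scala
// Sketch: resample a traced polyline into stereo (x, y) samples by linear
// interpolation along each segment; n is the sample count per segment.
def polylineToSamples(points: Seq[(Double, Double)], n: Int): Seq[(Double, Double)] =
  points.sliding(2).toSeq.flatMap { case Seq((x1, y1), (x2, y2)) =>
    (0 until n).map { i =>
      val t = i.toDouble / n
      (x1 + (x2 - x1) * t, y1 + (y2 - y1) * t)   // left channel = x, right channel = y
    }
  }
```

Played back in a loop at audio rate, the resulting samples retrace the drawing fast enough that it persists on the scope's screen.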

My next feat was to draw the Volume Integration logo. It turned out to be more difficult than I expected because of all the intersecting lines and 3D-like shapes. The SVG file ends up looking like this:


The most difficult part was drawing a continuous line without crossing previous lines. After multiple attempts, a picture of the final product emerged.

Oscilloscope with Volume Logo

Volume Logo On Oscilloscope by Bradley L. Johnson

I have also included the wav file for your listening enjoyment. If you have an oscilloscope, you can watch it appear! You can also use a software-based oscilloscope by loading the program Oscilloscope! on your computer.

I hope you’ve enjoyed following my geeky artistic endeavor. If you would like to see more of my oscilloscope art, take a look at Temple of Tech, Tumblr, and SoundCloud.


Check out more of our work at Volume Integration and follow us on Twitter.

Experimenting with Google+ Photos

One reason I enjoy working for Volume Integration is the people and the way they care for each other. A recent addition to our benefits package is an allowance for public transit with SmartBenefits loaded onto my SmarTrip card.

I have been using this incentive and enjoy having someone else drive me to work while I work on my company-provided laptop. Blogging and taking pictures on my way to the office is preferable to the stress of navigating traffic. Plus, I’m doing my part to reduce my carbon footprint.

When I enrolled in SmartBenefits, I expected all of the perks above, but I have found unexpected benefits too. Now I am able to watch the sunrise every morning as I transfer from the bus to the Metro, take pictures with my iPhone, and experiment with Google+ Photos.

I am running the Google+ app on my phone, which I have set to back up my photos automatically. Once Google+ gets the photos, it changes them in interesting ways. They call it AutoAwesome.

Into the Sunrise
Tysons Corner Metro Station – Brad Johnson / httptot.wowak.com


Tysons Corner Moving Train

Google+ Photos analyzes the photos and performs various enhancements. I’ve been surprised by what it can produce with automated algorithms: animated GIFs from a series of shots, a panorama from adjacent pictures (with a humorous result), and an enhanced image made by applying a series of filters.

But sometimes the “enhancements” do not turn out so well:

Not Like the Other


Shifting Perspective

Vertigo on the Metro

Warped Sunrise

So I keep taking pictures to see what Google is going to do with them. The photos have become a great resource to enhance these blog posts.

Technology Inspiring Art


Technologists routinely get pegged in the geek category, but our roles also require us to come up with creative solutions to technical challenges. This creativity can help us extend into the realm of art. Recently, I used technology as a medium for artistic expression while creating a sculpture entitled The Technologist.


Inspiration is so often fused from many memories, emotions, ideas, and events. My inspiration for The Technologist occurred while on vacation at a resort called Twin Farms in Vermont.

I was dancing with my young daughter to salsa music from the jukebox in a recreation room full of witty art. The room contained a strange, playful set of old televisions playing old-school MTV graphics. It happened to be one of the twelve installations called Internet Dweller by Nam June Paik, from his exhibition Electronic Super Highway: Nam June Paik in the ‘90s.

Internet Dweller Nam June Paik


Nam June Paik’s exhibitions were meant to be participatory, as many of his pieces allow the audience to manipulate the sound or video to make their own art. As a one-time member of the Fluxus movement and a performance artist, Paik passionately encouraged everyone to participate in events that create art.

One of Paik’s pieces called Random Access allowed the viewer to create sounds with a wand that read magnetic tape attached to a wall. Another, entitled TV Crown, enabled the viewer to change the artistic patterns of lines on a TV screen. Other installations just put the viewer’s face right into the piece with a closed loop camera, like the Electronic Superhighway exhibit.

Creating The Technologist

Based on my encounter with Internet Dweller in Vermont and exposure to other work by Nam June Paik, I created The Technologist. My sculpture is composed of a simple male wig head with CPU and memory chips, some carefully placed and others smashed into the surface. It also includes parts from a Flip camera and wireless routers. The Technologist’s eye plays a five-minute video on an embedded iPod nano (4th generation) with sound from attached speakers or headphones.

Two angles of the Technologist

The video playing in The Technologist‘s eye is the first thing that pulls you into the sculpture. It includes footage from Paik’s piece with sights and sounds from the day I encountered it in Vermont. I was so busy recording the artwork that my daughter was begging me to dance more, “Daddy…dance after this picture. Dance!” So my daughter’s voice is also preserved for posterity in the piece.

I have been working on _The Technologist_ for over a year as I refined the video and the installation into the head. At first, I attempted to use a Microsoft Zune since it has WiFi capability. The original vision was to produce a series of pieces that could be connected into a network in a future exhibition titled Temple of Technology. Unfortunately, the small Zune does not have a way to loop videos, and I did not want to invest the time in programming a video player for it.

Most iPod nanos have video playback ability and allow looping in a playlist. The iPod nano also places the power cord and headphone jack in a convenient position that allows the wires to run through the middle of the head. So it seemed like a much better solution.

Views of The Technologist — top row: Wild Side, Third Eye, Antenna & AMD CPU; bottom row: Processing Steps, Video Eye, Heads Up Controls

Layers of Meaning

The video in The Technologist‘s eye also contains QR codes, generated with QR Code Generator, that invite the viewer to explore further layers. This leads us to ask: what is the boundary of this art now that it has jumped into your smart phone?

This piece is partly an expression of the relationship between human individuality, spirituality, and natural rhythm and their conflict with the drive of technology. It contains multilayered ideas on this theme, including building loving relationships in the midst of ever increasing demands for efficiency. There is an emotional paradox expressed in the piece on the role of technique versus creativity and love.

The piece is filled with layers of meaning and emotion. So I leave it to you to discover what you see and feel in The Technologist. Please let me know what you find.


To learn more about Volume Labs and Volume Integration, please follow us on Twitter @volumeint and check out our website.