Geometry Micro-Service


Recently I put myself to the challenge of automating a port of ESRI's geometry-api-java library from Java to C#. The geometry micro-service demonstration is an application built using the resulting geometry-api-cs library.

The automated port of the library is done using Sharpen, which builds on the Eclipse Abstract Syntax Tree library, and a whole mess of Python scripting. It isn't a pretty thing. Any time there is a change in the ESRI Java code, a build process is triggered and the results are committed to the C# repo. I hosted a TeamCity Continuous Integration server for the automation and I gotta say, I really love it.
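
The conversion scripting itself isn't shown here, but the flavor of it can be sketched. Below is a toy Python illustration of the kind of token-level rename a Java-to-C# port involves; the real pipeline walks the Eclipse AST rather than regexing source text, and the mapping table here is purely illustrative.

```python
import re

# Purely illustrative token renames -- the real port uses AST rewriting,
# not regexes, and handles far more than keyword mapping.
JAVA_TO_CS = {
    r"\bboolean\b": "bool",
    r"\bString\b": "string",
    r"\bfinal\b": "readonly",
}

def naive_port(java_source):
    """Apply each rename across the source text."""
    for pattern, replacement in JAVA_TO_CS.items():
        java_source = re.sub(pattern, replacement, java_source)
    return java_source

print(naive_port("final boolean flag; String name;"))
# -> readonly bool flag; string name;
```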

It took a lot of work to get the geometry library to compile and pass tests, and a few operators still don't work. Among the non-working are the JSON import and export methods, which means the geometry.fogmodel.io demo relies on WKT for importing and exporting geometries. Import and export using the ESRI Shape binary format is working, but that's not really too fun for interactive geometry editing in Leaflet.
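
Since the demo leans on WKT strings for geometry exchange, here's a minimal Python sketch of what parsing a WKT point involves. A real implementation (shapely, or the geometry API itself) covers the full grammar, so treat this as illustration only.

```python
import re

# Minimal WKT point round-trip -- illustration only; real WKT covers
# polygons, multi-geometries, optional Z/M coordinates, and more.
_POINT_RE = re.compile(r"POINT\s*\(\s*(-?[\d.]+)\s+(-?[\d.]+)\s*\)")

def parse_wkt_point(wkt):
    match = _POINT_RE.fullmatch(wkt.strip())
    if not match:
        raise ValueError("not a WKT point: %r" % wkt)
    return float(match.group(1)), float(match.group(2))

def to_wkt_point(x, y):
    return "POINT (%g %g)" % (x, y)

print(parse_wkt_point("POINT (30 10)"))
# -> (30.0, 10.0)
```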

The client side of the demo uses React, Ampersand, Leaflet and websockets to allow for submitting requests to a Node server. That Node server then places requests on a RabbitMQ message queue server. A geometry micro-service is connected to the RabbitMQ message queue and processes geometry requests using the geometry-api-cs library. Those processed requests then find their way back to the client via RabbitMQ and websockets. It's pretty groovy stuff.
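
The round trip only works if a reply coming back off the queue can be matched to the websocket that asked for it. The demo's actual wire format isn't spelled out here, so the snippet below is an assumed sketch of the usual correlation-id pattern for RPC over a message queue:

```python
import json
import uuid

# Assumed envelope shape -- the demo's actual wire format isn't shown.
# A correlation id lets the Node server match a RabbitMQ reply back to
# the websocket client that made the request.
def make_request(operation, wkt):
    return json.dumps({
        "correlationId": str(uuid.uuid4()),
        "operation": operation,
        "geometry": wkt,
    })

def match_reply(request_json, reply_json):
    return (json.loads(request_json)["correlationId"]
            == json.loads(reply_json)["correlationId"])

req = make_request("buffer", "POINT (30 10)")
reply = json.dumps({**json.loads(req), "geometry": "POLYGON ((...))"})
print(match_reply(req, reply))
# -> True
```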

Docker to Azure from Ubuntu, Part Deux

Alrighty. Now for getting this Docker VM up on Azure.

I'm mostly following Microsoft's Azure-Docker tutorial for the CLI for the Windows Azure part of this post, and for the azure-cli install I used Squillace's Azure CLI tutorial.

Making some certificates and permissions as described by Microsoft:

$ openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout mycert.pem -out mycert.pem
$ openssl pkcs12 -export -out mycert.pfx -in mycert.pem -name "My Certificate"
$ openssl x509 -inform pem -in mycert.pem -outform der -out mycert.cer

After creating the mycert.cer file it was just like Microsoft described in their tutorial. Scroll down to Settings in the Azure table of contents, and under Settings click on Management Certificates. There should be an upload button at the bottom of the screen. Upload the certificate you just created.

Now I need to be using the Azure CLI. According to this blog post from Microsoft you need Node.js and npm to install it. Also, it sounds like you might need some legacy Node.js for azure-cli to load properly. I'm doing this all on an Ubuntu VM, so if it is a screwed up version of Node, I don't give a rip. It just has to work. Here are the commands:

$ sudo apt-get install nodejs-legacy
$ sudo apt-get install npm
$ sudo npm install -g azure-cli

Now to login:

$ azure login
info:    Executing command login  
info:    To sign in, use a web browser to open the page https://aka.ms/devicelogin. Enter the code DJAJJS49PP to authenticate. If you're signing in as an Azure AD application, use the --username and --password parameters.

Open up the browser for authentication goodness. For some reason I had to do the above authentication twice before I could get azure account list to work.

List the accounts associated with your ID:

$ azure account list
info:    Executing command account list  
data:    Name      Id                                    Current  State  
data:    --------  ------------------------------------  -------  -------  
data:    BizSpark  8c2345q5g-345r-2345f-345d-as455dfsdf  true     Enabled  
info:    account list command OK  

Now for installing docker-machine. First we install curl:

$ sudo apt-get install curl

WARNING, I think the below call wiped out my previous docker images...
Then, as described on docker's install instruction page for docker-machine:

$ curl -L https://github.com/docker/machine/releases/download/v0.5.0/docker-machine_linux-amd64.zip >machine.zip && \
unzip machine.zip && \  
rm machine.zip && \  
mv docker-machine* /usr/local/bin  

Because I haven't chmod'd /usr/local/bin I had to run the mv with sudo:

$ sudo mv docker-machine* /usr/local/bin/

Now to create the vm on Azure with all the good docker bits and the credentials we've made. Azure has the process described on their web page as follows (but this didn't work for me):

$ docker-machine_linux-amd64 create \
-d azure \
--azure-subscription-id="<subscription ID acquired above>" \
--azure-subscription-cert="mycert.pem" \

Instead I needed to replace docker-machine_linux-amd64 with docker-machine. The below sample uses the ID we collected from the call to azure account list (although this one is totally false). You'll want to replace geometry-client with the name of the VM you'll be creating:

$ docker-machine create \
-d azure \
--azure-subscription-id="8c2345q5g-345r-2345f-345d-as455dfsdf" \
--azure-subscription-cert="mycert.pem" geometry-client

The build process will take a while, but once it's done you can make a call to azure vm list and you'll get a printout of all your VMs, including the one you've just created with docker-machine. Here's my printout:

info:    Executing command vm list  
+ Getting virtual machines                                                     
data:    Name             Status     Location    DNS Name                      IP Address  
data:    ---------------  ---------  ----------  ----------------------------  -------------  
data:    geometry-client  ReadyRole  West US     geometry-client.cloudapp.net  
info:    vm list command OK

There is some discussion of environment variables in the Microsoft blog. The docker-machine env command prints the export statements (DOCKER_HOST, DOCKER_CERT_PATH and friends) that point your local Docker client at the new remote host; you eval its output in your shell to apply them. I don't know if it's strictly necessary here, but I did it anyways:

$ docker-machine env geometry-client
$ eval "$(docker-machine env geometry-client)"

Now to play with different Docker VMs!! But first I need to run stop and start to get Docker to behave:

sudo service docker stop  
sudo service docker start  

Now, now is the time for play! Microsoft suggests running a quick nginx test:

$ docker run --name machinenginx -P -d nginx
$ docker ps -a

The PORTS column of the docker ps printout (something like 0.0.0.0:49153->80/tcp) shows the host port Docker assigned, which lets you map the incoming TCP traffic appropriately:

$ azure vm endpoint create machine-name 80 49153

After the above command I was able to visit my azure site and see the nginx welcome page.

Next to try with a rabbitmq worker on Azure!

Docker to Azure from Ubuntu (Parallels)

Playing with Docker on Ubuntu (Parallels) from my Mac. The commands below are mostly from Docker's Ubuntu install instructions:


$ sudo apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
$ sudo touch /etc/apt/sources.list.d/docker.list

Now I got tired of sudo making me sandwiches so I chmod'd the directory. Not sure if this is kosher.

sudo chmod -R 777 /etc/apt  
nano /etc/apt/sources.list.d/docker.list  

God. Everything requires sudo...

$ sudo apt-get update
$ apt-cache policy docker-engine
$ sudo apt-get install linux-image-extra-$(uname -r)

Now to install Docker Engine (I'm not sure this is necessary):

$ sudo apt-get install docker-engine

I went to the bathroom and my Parallels Ubuntu machine locked me out, and now I had to restart my Docker service. Most of you won't have to do that...

$ sudo service docker status
$ sudo service docker start

Now I pull my monoazure build:

$ docker pull davidraleigh/monoazure

And I test to make sure it's running:

$ docker run -t -i davidraleigh/monoazure /bin/bash
root@a4cae838c950:/# echo "Words"  
root@a4cae838c950:/# exit  

After this I need Mono on the host for building my application:

sudo apt-get install mono-complete  

Git was installed as part of the docker build, so I grab my source code from bitbucket and build it for release:

$ git clone https://davidraleigh@bitbucket.org/davidraleigh/geometry-server-cs.git

My docker file in this project looks like the following:

FROM ubuntu:14.04

MAINTAINER David Raleigh <david.raleigh@gmail.com>

#based on dockerfile by Jo Shields <jo.shields@xamarin.com>

RUN apt-get update \  
    && apt-get install -y curl \
    && rm -rf /var/lib/apt/lists/*

RUN apt-key adv --keyserver pgp.mit.edu --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF

RUN echo "deb http://download.mono-project.com/repo/debian wheezy/snapshots/ main" > /etc/apt/sources.list.d/mono-xamarin.list \  
    && apt-get update \
    && apt-get install -y mono-devel ca-certificates-mono fsharp mono-vbnc nuget \
    && rm -rf /var/lib/apt/lists/*

ADD bin/Release/ .

CMD [ "mono", "geometry-client.exe" ]  

The above Dockerfile relies on my having already built my project to bin/Release/. Now I build the Docker geometry client and start it running to see if it all works:

$ docker build -t geomclient .

Now, I just test again that my client is working:

$ docker run -i -t geomclient mono geometry-client.exe
 [x] Requesting fib(5)
 [.] Got '5'

I'm doing the Fibonacci tutorial from RabbitMQ and it looks like everything is golden (although I changed from 30 to 5, because, really, who has time for 30!?!?!).
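
The RabbitMQ RPC tutorial's worker computes Fibonacci with naive recursion, which is exponential in its argument, so trimming 30 down to 5 is a sound impatience move. That worker function, sketched in Python:

```python
# Naive recursive Fibonacci, in the style of the RabbitMQ RPC tutorial's
# worker. Runtime grows exponentially with n, which is why fib(30) drags
# while fib(5) answers instantly.
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(5))
# -> 5, matching the "Got '5'" in the log above
```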

Next post is getting this image up on Azure.

Compiling Sharpen with Java 7

Java logo so pretty

Sharpen is a conversion utility for taking code from Java to C#. I've been playing around with converting ESRI's geometry library to C#, but there are a number of for loops with multiple declarations in the initialization section (e.g. for (int i = 0, j = count; i < j; i++)) and that's causing me all sorts of trouble. If there were more documentation for Sharpen I might figure this out easily. Instead I'm going to jump into the source code and see what's going on.


Originally I had uninstalled Java 8 from my Mac and installed Java 7. But with this most recent OS I decided to use jenv instead. Following the instructions in this Stack Overflow post allowed me to install Java 8 and 7 with Homebrew.

$ brew tap caskroom/versions
$ brew cask search java
$ brew cask install java7
$ brew cask install java

After that it was easy enough to follow the instructions on the jenv site.

Installing is a breeze:

$ brew install jenv

Updating the ~/.bash_profile means adding two lines:

export PATH="$HOME/.jenv/bin:$PATH"  
eval "$(jenv init -)"  

Then you add in all your jenv jdks:

$ jenv add /Library/Java/JavaVirtualMachines/jdk1.8.0_74.jdk/Contents/Home
oracle64- added added  
1.8 added  
$ jenv add /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home
oracle64- added added  
1.7 added  

In my case I needed a local version of Java 7 for use with Sharpen. The following commands set up Java 7 for the directory I am running Sharpen from:

$ jenv local oracle64-
$ java -version
java version "1.7.0_80"  
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)  
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)  

Perfect. Now whenever I run Java from that directory it will be Java 7. If you need to remove the local setting just call jenv local --unset.

ORIGINAL POST (this post shows how to remove other versions of Java from the system so that Java 7 is the only one installed):

In order to do that I must remove my Java 8 and install Java 7. Maybe I'm supposed to save Java 8 to the side, download Java 7 after I've done that, and do some linking, but it seems easier to follow a few tutorials instead. My favorite is this gist.

Right now I have Java 8 installed:

$ java -version
java version "1.8.0_45"  
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)  
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)  

Instead of stashing it and messing with linking I'm just going to dump Java 8:

$ sudo rm -rf /Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk
$ sudo rm -rf /Library/PreferencePanes/JavaControlPanel.prefPane
$ sudo rm -rf /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin

Now when I check the version I'm stuck with the system-installed Java 6:

$ java -version
java version "1.6.0_65"  
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-466.1-11M4716)  
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-466.1, mixed mode)  

After downloading and installing the Java 7 JRE and the Java 7 JDK (those are really important to this whole process) I can see that I'm getting closer to running a system that uses Java 7:

$ /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin/Contents/Home/bin/java -version
java version "1.7.0_80"  
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)  
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)  

I stash Java 6 and then link my newly installed Java 7 to the /usr/bin/java and all things are good in the world:

$ sudo mv /usr/bin/java /usr/bin/java-1.6
$ sudo ln -s /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin/Contents/Home/bin/java /usr/bin/java
$ java -version
java version "1.7.0_80"  
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)  
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)  

Now to install Maven:

$ brew update
$ brew install maven

Holy crap. Now to try and build Sharpen? I'm not sure what I'm doing now. I've downloaded the https://github.com/imazen/sharpen fork of the repository, as all its tests are passing (the mono/sharpen repo has some inconclusive test results). Now to run the tests and the build tools:

$ mvn clean test
$ mvn install

I've successfully built a sharpen jar file: sharpencore-0.0.1-SNAPSHOT-jar-with-dependencies.jar. Using that file I've done a few runs of the conversion utility and I'm getting used to some of the options available. In the below example I'm trying to convert the geometry library while referencing the various jar files necessary:

$ java -jar sharpencore-0.0.1-SNAPSHOT-jar-with-dependencies.jar ~/Downloads/geometry-api-java-master/src/ -cp ~/Downloads/geometry-api-java-master/DepFiles/public/jackson-core-asl-1.9.11.jar -cp ~/Downloads/geometry-api-java-master/DepFiles/unittest/junit-4.8.2.jar -cp ~/Downloads/geometry-api-java-master/DepFiles/public/java-json.jar -junitConversion

Now when you're in IntelliJ 14 you can go to File->Project Structure and edit the Java SDK to use the 1.7 version.

Pushing MongoDB from localhost to MongoLab

For the bard-chatter project I collected all of Shakespeare's works and split them up by punctuation. All of that data was collected in a local database. Now I'm in the process of pushing that database to MongoLab. Below are the terminal commands for exporting the localhost database to BSON and then importing to my MongoLab database from BSON.
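
The punctuation splitting itself can be sketched in a few lines of Python. The exact character class the bard-chatter preprocessing used isn't shown here, so treat this one as an assumption:

```python
import re

# Split a text into phrases on punctuation. The character class is an
# assumption -- the actual bard-chatter splitting rules aren't shown here.
def split_phrases(text):
    parts = re.split(r"[.!?;:,]+", text)
    return [p.strip() for p in parts if p.strip()]

print(split_phrases("To be, or not to be: that is the question."))
# -> ['To be', 'or not to be', 'that is the question']
```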

First the localhost mongo server has to be up and running:

$: mongod

Then we open up another terminal window and start pumping all the database collections to local BSON files:

$: mongodump -h localhost:27017 -d bard -o /Users/davidraleigh/code/temp/bard-dump/

Now for pushing all those local BSON files to MongoLab:

$: mongorestore -h <yourMongoLabURI>.mongolab.com:55642 -d bard-db -u <username> -p <password> /Users/davidraleigh/code/temp/bard-dump/bard/

If you only needed to update one collection, in this case the 'phrases' collection, you would go about it using the below commands:

$: mongodump -h localhost:27017 -d bard -c phrases -o /Users/davidraleigh/code/temp/bard-dump
$: mongorestore -h <yourMongoLabURI>.mongolab.com:55642 -d bard-db -c phrases -u <username> -p <password> /Users/davidraleigh/code/temp/bard-dump/bard/phrases.bson

Moving Files from One git Repo to Another Repo

In trying to get a Node.js application up and running on Azure I came up against a problem. My git repository had two separate Node.js projects in it, so I couldn't use git publishing. In order to allow git publishing I needed to separate out my projects, but I didn't want to lose my git history for each project.

After some searching I found Greg Bayer's "Moving Files from One Git Repository to Another, Preserving History." It seemed to work up until the point where I was moving files. It seems maybe he had a typo when it came to the mv * <directory 1> line (maybe it should be git mv * <directory 1>?). Or maybe I was just doing it wrong.

Looking into the stackoverflow question he referenced I found the solution I needed.

The bard-chat repo had multiple projects in different directories. From the bard-server sub-directory I needed to pull the node.js project files and make those the basis for the bard-chatter repo.

$: git clone https://github.com/davidraleigh/bard-chat.git
$: cd bard-chat/
$: git filter-branch --subdirectory-filter bard-server/ -- --all
$: git remote rm origin
$: cd ..
$: git clone https://github.com/davidraleigh/bard-chatter.git
$: cd bard-chatter/
$: git remote add orig ../bard-chat
$: git fetch orig
$: git branch orig remotes/orig/master
$: git push origin master