Sunday, November 1, 2015

Setting-up OpenGeo and TileCache Servers


Installing OpenGeo Suite

In my previous post I talked about the impact that Boundless is having on my GGS 692: Web-based GIS course.  It turns out the impact is even more than I reported, since our final project for the course will be based on the Building a GeoNames Heat Map tutorial found on the Boundless website.

We were given a VirtualBox hard drive image at the beginning of the course, with Ubuntu 12.04 and all the software we would need for class, including the OpenGeo Suite. While I've been using this image, I want to be able to set up everything myself, so I'm going to try to install the OpenGeo software on an Ubuntu 14.04 virtual machine.  I'll be following the instructions for installing OpenGeo Suite 4.7 on Ubuntu 14.04 here.  Following those instructions, I did:
$ sudo su -
# wget -qO- | apt-key add -
# echo "deb trusty main" > /etc/apt/sources.list.d/opengeo.list
# apt-get update
# apt-get install opengeo-server
This installed 244 new packages, including each of the components listed here. The OpenGeo Suite is certainly free software, but it uses both Java and Mono, and I'm a Python fan, so while I'll learn to use GeoServer for class, I'm going to pursue another, lighter map server stack at the same time.


Installing TileCache

TileCache is an implementation of the Tile Map Service (TMS) specification written in Python.  I set up an Ubuntu 15.10 server for this project and installed TileCache with:
$ sudo aptitude install tilecache
It installed 17 dependencies, including Python 2.7, along with the tilecache package.  I am beginning to wrap my head around how this map server works.  It is built on three layers:
The geospatial data objects are stored in a PostGIS database.  Mapnik turns the data into images (tiles), which TileCache serves through the web. TileCache implements the Tile Map Service (TMS), a specification developed by OSGeo.

On the client side, a JavaScript library like Leaflet assembles the tiles for viewing.
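To make the tile-addressing idea concrete, here is a small sketch of computing which tile covers a given point. It uses the familiar XYZ ("slippy map") Web Mercator scheme; the TMS specification that TileCache implements is the same idea with the y axis counted from the bottom instead of the top. The sample coordinates are just illustrative.

```python
import math

def tile_for(lat, lon, zoom):
    """Return the (x, y) tile indices covering (lat, lon) at a zoom level,
    using the XYZ scheme (TMS would use 2**zoom - 1 - y for the row)."""
    n = 2 ** zoom                      # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# At zoom 0 a single tile covers the whole world:
print(tile_for(38.88, -77.10, 0))   # (0, 0)
# Each additional zoom level quadruples the number of tiles:
print(tile_for(38.88, -77.10, 10))
```

A client like Leaflet performs exactly this calculation to decide which tile URLs to request as you pan and zoom.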

Continuing with the server installation:
$ sudo aptitude install python-mapnik
$ sudo aptitude install postgresql-9.4-postgis-2.1
With the software for the two servers installed, my next task will be to load the data for my GGS 692 class onto the OpenGeo server and the Arlington County map data onto the TileCache server.

Saturday, October 31, 2015

Boundless Geospatial Opportunities

A few weeks ago in GGS 692: Web-based GIS we were assigned a set of exercises writing spatial SQL queries on a spatially enabled database. The reading material for the lesson consists mainly of sections from an Introduction to PostGIS tutorial produced by a company named Boundless.

It wasn't until I began reading the tutorial that I realized just how much of the curriculum of our course comes directly from the Boundless materials.  By providing high-quality, classroom-ready learning materials, Boundless has changed the curriculum in the GIS program at GMU from one using proprietary software to one built on free software tools.

I only wish other projects in the free software movement could take this valuable lesson from Boundless and find a way to get their tools into classrooms. If we want to change the world, this is an important part of how we have to do it!

Sunday, October 4, 2015

A Geospatial Werewolf

I've been contemplating a post with the title "Seduced by the Dark Side - Or Why I Love My MacBook Air".  I decided not to devote a whole post to such evil things, so I'll restrict myself to a few paragraphs and then move on to better things.  Let me just admit it.  I really like using the MacBook Air that I was given at work.  I didn't buy it.  I know it's evil (really, really evil, in fact, since it is sooo seductive), and I will certainly do all I can to resist its evil temptations.  I've already removed all the paid Apple Store applications that were on it, and I've installed Homebrew and a host of free software (things like QGIS and pgAdmin3).  I also installed VirtualBox and set up a Lubuntu 14.04 VM on it, which I use most of the time. In spite of all that, I have to admit that I really like using it.  I like the way it looks.  I like the way it feels in my hands and under my fingers as I type on it.  I like the way it responds so smoothly and quickly. There, I've come clean and got that out of my system.  The last thing to mention is that when I told my friend Kevin Cole that I had been seduced by the dark side, he knew without being told what I was going to say, and sent me this link, which pretty much hits the nail on the head...

Lubuntu 15.10 - A Geospatial Werewolf

Moving back into the realm of freedom, the upcoming release of Lubuntu 15.10, which I am running now on a few different machines but which will have its official release on October 22nd, offers new Python 3 versions of the geospatial libraries I described in a post back on April 19th. It also comes with QGIS version 2.8, which is a good thing indeed, since version 2.4, which came with previous releases, did not work properly (I couldn't get it to create layers using the DB Manager).  Here are the Python 3 GIS packages I installed on this new version, code-named Wily Werewolf:
  • python3-gdal
  • python3-pyproj
  • python3-shapely
  • python3-mapnik
The last one in particular was not available in previous releases. And as if to signal its strong GIS support, the default background of this Lubuntu release features a TIN pattern.  Here is a screenshot with QGIS running:
I've been tasked this week with looking into how to use mapnik.  Thanks to Wily, I'll be able to do this in Python 3.

Saturday, October 3, 2015

Getting an Arlington Bounding Box of PBF Data

The best source for a current, relatively small data set for OSM PBF data I could find is the Geofabrik website, specifically the Virginia page, which has a PBF file for the state that weighs in at less than 230 MB. After downloading the virginia-latest.osm.pbf file, my next task will be to use osmosis to extract data from within a bounding box containing Arlington County.  The web application BoundingBox provides a nice visual way to do this.

Here is a screenshot of the Arlington County bounding box:
An interesting little curiosity I observed from looking at this bounding box is that little Andrew Ellicott Park lies within Arlington County, Fairfax County, and the city of Falls Church:
After Googling it, I found that it is listed as the original western cornerstone of the District of Columbia:
Anyway, enough distraction, back to work.  With osmosis installed and the Virginia data on hand, I ran this command to get just the Arlington data:
$ osmosis --read-pbf file=virginia-latest.osm.pbf --bounding-box top=38.9342803955 left=-77.1723251343 bottom=38.8272895813 right=-77.032081604 --write-pbf file=arlington.osm.pbf
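The --bounding-box filter simply keeps data whose coordinates fall inside that rectangle. A tiny sketch of the same test in Python, using the exact bounds from the osmosis command (the sample points are made up for illustration):

```python
# Bounding box from the osmosis command above (Arlington County).
TOP, LEFT = 38.9342803955, -77.1723251343
BOTTOM, RIGHT = 38.8272895813, -77.032081604

def in_bbox(lat, lon):
    """True if the point lies inside the Arlington bounding box."""
    return BOTTOM <= lat <= TOP and LEFT <= lon <= RIGHT

print(in_bbox(38.88, -77.10))  # a point near central Arlington: True
print(in_bbox(39.29, -76.61))  # Baltimore, well outside the box: False
```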
Only a few more steps to create a database and load this data into it:
$ createdb map_arlington
$ psql -U postgres -d map_arlington -c 'CREATE EXTENSION postgis'
$ imposm --read arlington.osm.pbf
$ imposm --write -d map_arlington -U postgres
A database user other than postgres should be used, but this at least documents the process.

Thursday, September 24, 2015

Importing OSM Data into PostGIS - Part 1

The PV Viability Map project seems to really be picking up some steam!  Thanks to our good fortune in having David Winslow join us, we now have someone with the skills and experience to move us forward -- thanks, David!  In our meet up last Thursday we had a first discussion of the requirements for the project, which I'll do my best to summarize here.

PV Viability Map Requirements

  • Display map in web page.
  • Search an address and display the building there.
  • Standard map mouse navigation (drag to pan, mouse wheel to zoom, etc.)
  • Click on building to identify.
I have been tasked with seeing how far I can get before next Thursday's meet up with the first task - displaying a map in a web page.  Actually, I've been tasked with the first part of this process - getting OpenStreetMap (OSM) data, trimming it to Arlington, and loading it into PostGIS.

Process Overview

Before diving into looking for a solution, a pause to consider the process is in order.  From the Wikipedia page for OpenStreetMap we learn:
  1. The main copy of the OSM data is stored in OSM's main database. The main database is a PostgreSQL database, which has one table for each data primitive, with individual objects stored as rows. All edits happen in this database, and all other formats are created from it.
  2. For data transfer, several database dumps are created, which are available for download. The complete dump is called planet.osm. These dumps exist in two formats, one using XML and one using the Protocol Buffer Binary Format (PBF).
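To see those data primitives concretely, here is a sketch that parses a minimal hand-written fragment shaped like the planet.osm XML format, using only the standard library (the ids, coordinates, and tags are made up):

```python
import xml.etree.ElementTree as ET

# A made-up fragment in the shape of OSM XML: nodes are points, ways
# reference nodes by id, and both can carry free-form key/value tags.
osm_xml = """
<osm version="0.6">
  <node id="1" lat="38.88" lon="-77.10"/>
  <node id="2" lat="38.89" lon="-77.09"/>
  <way id="10">
    <nd ref="1"/>
    <nd ref="2"/>
    <tag k="highway" v="residential"/>
  </way>
</osm>
"""

root = ET.fromstring(osm_xml)
# Index the node primitives by id, as the OSM database does one table per primitive.
nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
         for n in root.iter("node")}
for way in root.iter("way"):
    refs = [nd.get("ref") for nd in way.iter("nd")]
    tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
    print(way.get("id"), refs, tags)
```

Tools like osmosis and imposm are, at heart, doing this same walk over nodes, ways, and relations, just at planet scale and with the compact PBF encoding.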
From what I can gather (and the notes I took at our last Code for NoVA meet up), I will be using a combination of two tools to grab the data from OSM's server and then push it into my local database:
  1. osmosis
  2. imposm
I was able to install both of those with:
$ sudo aptitude install osmosis imposm
I'm not clear when one would use imposm and when one would use osm2pgsql, which seems to serve the same purpose. I installed the latter as well with:
$ sudo aptitude install osm2pgsql
To be honest, I find the OSM wiki documentation to be very challenging to read. It appears thorough and is well written, but it assumes a wealth of background I lack.  Hopefully with time I'll be able to make better use of it.  For now I'll wait for our meet up this evening to see if David can help me figure out how to use osmosis to get data for Arlington County.

Tuesday, September 22, 2015

Creating a Lubuntu Custom Install Disk for the School Lab

I'm a high school teacher in a lab full of Windows 7 machines, and it's just not
working for me.  I'm teaching Web Page Design I, which is a course in HTML and CSS, and by using WinSCP and Notepad++, the Windows workstations are adequate for the task.  But for my Computer Science class, where we will be exploring mathematics with Python and will need to install lots of Python libraries and tools, I need Ubuntu.  Actually, I'll be using Lubuntu, since it is lightweight, supports the same software, and works much better with the old NVIDIA GeForce 200 graphics cards in the machines (the Unity desktop wants to make use of the 3D graphics features of the card, which looks awful and crashes with both the free driver and the proprietary ones from NVIDIA; Lubuntu is perfectly happy in 2D, and the free driver works very well with it).

The Plan

  1. Create a VirtualBox VM with Lubuntu and all the software I plan to use in the lab installed on it.
  2. Install Remastersys on this VM and use it to create a custom installation disk.
  3. Use the custom installation disk thus created to install Lubuntu alongside Windows on the lab workstations.

Step 1: The VM

I created a new VirtualBox VM and installed Lubuntu 14.04 (64 bit) on it.  After running all the updates, I did the following:
  • Installed virtualbox-guest-utils. This package will not be needed on the lab workstations, but without it screen resolution on the VM is limited to 640x480, making it too difficult to work with.
  • Added this PPA and then installed remastersys-gtk. I figured it better to test this early, since if it doesn't work there is no point in doing the rest of the setup.
  • Installed gnome-screenshot and gimp. I added these two now to be able to take screenshots of the remastersys screens and edit them.
  • From Synaptic package manager I enabled the "Canonical Partners" repository and installed adobe-flashplugin.
  • Added the ubuntugis-unstable PPA.
After this I tried using remastersys, and I'm glad I did so early in the process, since I encountered a problem after launching System Tools -> Remastersys and selecting dist:
I found an Ubuntu forum post here, which suggested creating an empty lightdm.conf file in /etc/lightdm would fix the problem. It did! The process completed and I found a custom-dist.iso file in the /home/remastersys/remastersys directory.  I installed successfully with the custom-dist.iso, with one caveat -- when I tried to select "Encrypt my home folder" during the install, it crashed.  Encrypting the home directory does not appear to work with the Remastersys created iso, so I'll just make sure not to select that option, and to use the process described here to encrypt home directories later.

The last thing I needed by way of infrastructure on this VM is Grub Customizer, which will make it easy to edit the bootloader menu to have it boot to Windows by default (it pains me to do that, but I'm sharing the lab this year so I have to play nice ;-)


Step 2: Adding Software

I know I'll continue to add other software as the year goes on and I find other things I need, but for the first go round, here is a list of packages I know I'll want, that I have just installed on the VM:
  • python3-pip
  • idle3
  • python3-matplotlib
  • python3-sphinx
  • python3-pep8
  • python3-bs4
  • python3-w3lib
  • python3-scipy
  • python3-pyqt5
  • ipython3
  • ipython3-notebook
  • python3-termcolor
  • python3-cairo
  • python3-paste
  • python3-cherrypy3
  • python3-flask
  • python3-bottle
  • spyder3
  • inkscape
  • gftp
  • vim
  • vim-gtk
  • most
  • openjdk-7-jdk
  • sqlite3
  • spatialite-bin
  • spatialite-gui
  • qgis
  • grass
  • pgadmin3
  • postgresql-client
I also installed SymPy, using the command: 'sudo pip3 install sympy', since sympy is not in the package repository. Finally, I installed Google Chrome using the installer from here and Opera using the installer here.

With all this software installed, I ran Remastersys again to make an installation iso disk image.  Tomorrow I'll try it out in the lab...

Sunday, September 20, 2015

Web-based GIS Assignment 2

With pgAdmin running on my Ubuntu 14.04 desktop, and PostGIS set up on an Internet virtual machine (running Ubuntu 14.04 server), I connected pgAdmin to the server by double-clicking on the server name (NYC) in the list.  It connected fine, but not without giving me a warning:
A quick Google search brought me to this Ask Ubuntu link:
which says I need to install postgresql-contrib.  Here goes:
$ sudo aptitude install postgresql-contrib
It installed a single package without incident.  After clicking the "Fix It!" button, I can now connect to the remote server without warning.

To install pgAdmin on my desktop at home, I ran:
$ sudo aptitude install pgadmin3
it returned that:
The following NEW packages will be installed:
  pgadmin3 pgadmin3-data{a} pgagent{a} postgresql-client{a}
  postgresql-client-9.3{a} postgresql-client-common{a}
While it is no surprise that pgAdmin would depend on postgresql-client (which has the command line psql program), it is convenient. For one thing, it makes it easier to test out the connection to the remote database server, since the psql command:
$ psql -h [server name] [database name]
is easy to remember.  I still don't know off the bat how to fill in the fields of the pgAdmin connection screen without looking things up.  Here is a screenshot of the New Server Registration screen:
I know what to put in the Host, Username, and Password fields, but I'm not sure about Name and Maintenance DB.  Is Name the name of the database, or is that Maintenance DB?

This documentation page proved most helpful:
I put "nyc" (the name of the database we are using for class) in both the Name and Maintenance DB fields, and it connected to the server without incident.

Setting Up the Local Virtual Database Server for Testing

My next task, assigned to me at our last Code for NoVA meet up, is to look into loading OpenStreetMap (OSM) data into a PostGIS database. As cool as it is having a remote server out on the web running a database server 24/7 that I can connect to whenever I want, I'm not going to run new "experiments" on that machine.  That's where local VirtualBox VMs come in handy.

I described my first attempt at setting up a PostGIS server in a previous post. As I've learned since, starting out with the command:
$ sudo aptitude install postgresql-9.3-postgis-2.1
is the best way to get going, since it installs postgresql itself and most everything else you need to get started. I still have the VM I made back then, so now I'm going to copy over the nyc database to it and configure it for remote access.

I'll repeat the process I used to move the nyc database from the VirtualBox VM we were given in our Web-based GIS class, described in my previous post, only this time I'll export it from my database server, so that I can skip the steps where I had to rename the owner of the database.

Here is what I did:
username@local_machine:~$ ssh [database server name]
username@dbserver:~$ pg_dump -c nyc > nyc.sql
username@dbserver:~$ exit
username@local_machine:~$ scp [database server name]:nyc.sql .
I could export the database as me, since I've been set up as a database user. I don't remember whether I did that on the VirtualBox VM, so now is a good time to learn some more PostgreSQL administration.  Here is a documentation page which has what I need:
Trying to start psql on the VirtualBox server shows that I didn't add my user as a database user:
$ psql
psql: FATAL:  role "[username]" does not exist
Time to fix that:
username@postgis:~$ sudo -i
[sudo] password for username:
root@postgis:~# su - postgres
postgres@postgis:~$ createuser --superuser username
postgres@postgis:~$ exit
root@postgis:~# exit

username@postgis:~$ createdb nyc
username@postgis:~$ psql nyc < nyc.sql
It successfully ran the script and populated the nyc database. I could then run:
username@postgis:~$ psql nyc
and connect to the nyc database and run queries.

Now to enable remote connections:
username@postgis:~$ sudo vi /etc/postgresql/9.3/main/pg_hba.conf
and changed this line:
host  all  all  md5
host  all  all     md5
username@postgis:~$ sudo vi /etc/postgresql/9.3/main/postgresql.conf
and changed this line:
#listen_addresses = 'localhost'
listen_addresses = '*'
username@postgis:~$ psql nyc
nyc=# alter user [user] with password '[password]';
This is mostly a repetition of information in my previous post, but since I want to learn it, it bears repeating.

Finally, I'll connect from my desktop machine to the "remote" VirtualBox server, which has IP address on my home network:
username@localmachine:~$ psql -h nyc
psql (9.3.9)
SSL connection (cipher: DHE-RSA-AES256-GCM-SHA384, bits: 256)
Type "help" for help.

Excellent!  I'm all set to explore importing OSM data into my database.

Saturday, September 19, 2015

Two Old Draft Posts from My LIDAR Study

Note: I had two draft posts from my Lidar study last Summer that are incomplete, but contain useful information I will want to refer to in the future.  Since time does not allow me to complete them now, I'll just publish them both "as is" here...

Visualizing the Loudoun County Lidar Data


For the PV Viability Map project, we are using Lidar data from Loudoun County, Virginia.  Loudoun County was chosen because the goal of the initiating organization, the Northern Virginia Regional Commission (NVRC), is to produce a map for all the NVRC members, but complete Lidar data is currently only available for Loudoun County.  Currently available data can be obtained from the USGS EarthExplorer website (see previous post, Getting Started with LIDAR Data), but it is also available in smaller distribution units from the VirginiaLidar site.  A few other websites I came across in searching that might be useful later include:
To get the data we'll be using, I went to this folder on Virginia Lidar's Google drive.  I downloaded the PDF, Shapefile zip, and KMZ files.  This is the first time I've encountered KMZ files, which are zipped Keyhole Markup Language files.  KML is an open standard for expressing annotations and markers on web based 2D and 3D maps.  It appears you can work with them in OpenStreetMap.

The Lidar data on the Virginia Lidar site is divided into blocks.  The region covered by each block is described in the LoudounCo_Ref.pdf PDF file in the Virginia Lidar Google drive location linked above.  Here is what it looks like:
To begin learning to work with the data, I set myself the task of downloading a single block of the data and then exploring tools to visualize it.  I chose 18STJ7733.laz since it contains the intersection of Routes 15 and 7 in downtown Leesburg (see this Google map), and would thus, I hoped, have recognizable features that would aid in testing the visualizations. Since the data is provided in compressed LAZ format, and many of the tools require the uncompressed LAS format, I had to first uncompress the file. I described how I did this in an earlier post.


After spending a reasonable amount of time searching on the web for Lidar visualization tools, it became clear to me that I will have to overcome a number of obstacles to be able to work with this data.  I'm going to need to understand the LAS file format in some detail to use the lower level tools that are available for GNU/Linux systems.  I may explore some of the free software tools that are only available for Windows, but I'll only do that after I've exhausted what I can do on a free software platform.

I found a website that offers in-browser visualization of LAS files.  This screenshot shows the website with the default data being visualized:


Manual of Airborne Topographic Lidar - Chapter 3 Notes

Chapter 3 of the Manual of Airborne Topographic Lidar is titled Enabling Technologies, and discusses the Global Navigation Satellite Systems (GNSS) and inertial navigation systems (INS) that together enable airborne laser scanning (ALS).  Both these technologies were mentioned frequently in the chapter 2 discussion of elements of ALS technologies, and it is clear from the discussion why ALS could not emerge as a viable commercial technology until the 1990s, when the GPS system became available.

Global Navigation Satellite Systems

There are currently two operational GNSS systems, the US Global Positioning System (GPS), and the Russian GLObal NAvigation Satellite System (GLONASS), and two systems under development, the EU Galileo system, and the Chinese BeiDou Navigation Satellite System (BDS), both expected to be completed by 2020.

How Does a GNSS Work? 

The text describes GNSS as "a constellation of satellites carrying atomic clocks that broadcast time and an arbitrary number of receivers each of which computes its own position on the Earth from measured signal propagation times from the visible satellites to the receiver." (p. 99).  What I didn't understand was how the receiver could compute the propagation time of the signal (and thus find the distance from the satellite) without having a clock that was synchronized with the atomic clock on the satellite.  A post titled How does GPS receiver synchronize time with GPS satellites? provides a nice explanation:
"The time value isn't used to tell the receiver what time it is (at least not directly, although that is helpful later). It's used so that the receiver can tell relatively what the distance is to each satellite.
If you hear Sat A say that the time is 0.00000 and Sat B says the time is 0.00010, then if they are in sync, you must be closer to B than to A. You can tell exactly how much closer you are by the specific time difference.
Repeat the calculations with a few other satellites and you will find that there is only one place (and time) that the receiver can be located.
The GPS receiver computes a solution that simultaneously provides Position, Velocity, and Time (PVT). It's not that one is calculated first, then the other is. They all fall out simultaneously."
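The numbers in that quoted example translate directly into a distance difference: a difference in signal-arrival time, multiplied by the speed of light, tells you how much closer one satellite is than another. A quick check:

```python
C = 299_792_458.0          # speed of light in vacuum, m/s

# From the quoted example: Sat A says the time is 0.00000 s,
# Sat B says 0.00010 s, so B's signal spent 0.1 ms less in flight.
dt = 0.00010 - 0.00000
closer_by = C * dt         # how much closer the receiver is to B
print(round(closer_by, 1)) # roughly 30 km
```

This is why the time differences alone, repeated over several satellites, constrain the receiver to a single position even though its own clock is cheap and unsynchronized.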
A bit later in the post the following equation is listed:
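The equation itself appears as an image in the original post; judging from the surrounding discussion of the four unknowns, it is presumably the standard pseudorange relation, which in LaTeX notation reads:

```latex
% Pseudorange \rho_i measured from satellite i at known position (x_i, y_i, z_i)
% to a receiver at unknown position (x, y, z) with unknown clock bias \delta t:
\rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\delta t,
\qquad i = 1, \dots, 4
```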

Looking at the 4 unknowns, x, y, z, and t, it makes sense why 4 satellites are needed to provide a location (and time).


Tuesday, September 15, 2015

First Assignment in Web-based GIS Class

My first assignment for the Web-based GIS class has two parts: 1. Reading the first chapter of Spatial Databases: A Tour, by Shashi Shekhar and Sanjay Chawla, answering several conceptual questions related to the reading, and 2. A "lab" project designed to get us familiar with our VirtualBox VM, pgAdmin, and QGIS.

The questions involved understanding what spatial data is, comparing file systems to databases, looking at a brief history of spatial database management systems (SDMSs), defining abstract data types (ADTs), and understanding an SDMS as a specific example of an object-relational database management system (ORDBMS).
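The ADT idea can be illustrated with a toy sketch (purely illustrative, not from the reading): an abstract data type bundles a value's internal state with the operations permitted on it, which is exactly what an ORDBMS does when it registers geometry types alongside spatial functions.

```python
import math

class Point:
    """A toy spatial ADT: internal state (x, y) plus spatial operations,
    analogous to the geometry types and functions PostGIS adds to PostgreSQL."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def distance(self, other):
        # One of the operations the type exposes, like ST_Distance in PostGIS.
        return math.hypot(self.x - other.x, self.y - other.y)

print(Point(0, 0).distance(Point(3, 4)))  # 5.0
```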

The lab portion of the assignment involved running pgAdmin and QGIS on a VirtualBox VM to connect to a PostGIS database. I've been using VirtualBox for years, so I wanted to use this assignment to push my PostGIS administration knowledge by setting up a PostGIS database on the web and migrating the database from our class VM to this server, and then connecting to it from pgAdmin and QGIS clients running on local machines.

Moving a Database

Logging into the VirtualBox VM given to us in class, I ran the following commands:
ggs $ sudo -i
# su - postgres
postgres $ pg_dump -c nyc > nyc.sql
(the file can be downloaded from here).  I then made the following substitutions in the nyc.sql file, since I want the database owner to be my username and not the postgres user:
  1. change ' TO postgres' to ' TO [user]'
  2. change ' FROM postgres' to ' FROM [user]'
I then used scp to copy the nyc.sql file to my on-line database server.
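Those two substitutions are simple enough to script. Here is a sketch of the same edit in Python, run against a made-up fragment of a dump file and a placeholder role name:

```python
# A made-up fragment shaped like pg_dump ownership/privilege statements.
dump = """\
ALTER TABLE public.nyc_subway_stations OWNER TO postgres;
REVOKE ALL ON SCHEMA public FROM postgres;
GRANT ALL ON SCHEMA public TO postgres;
"""

new_owner = "myuser"   # placeholder: substitute your own database role
fixed = (dump.replace(" TO postgres", " TO " + new_owner)
             .replace(" FROM postgres", " FROM " + new_owner))
print(fixed)
```

In practice the same thing is often done with sed over the whole nyc.sql file; the point is just that only the " TO postgres" / " FROM postgres" clauses need to change.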

Picking up on the database server where I left off in my previous post, I now need to make my regular user a database admin who can create databases.
$ sudo su - postgres
$ createuser --superuser [user]
$ exit
Now I can create the database and import the data:
$ createdb nyc
$ psql nyc < nyc.sql
Let me run a query from the lesson to make sure it works:
$ psql nyc
psql (9.3.9)
Type "help" for help.

nyc=#  select name from nyc_subway_stations where name like 'Broad%';
 Broadway Jct
 Broadway Jct
 Broad St
 Broadway-Lafayette St
 Broadway-Nassau St
 Broadway Jct
 Broad Channel
(9 rows)

To get the postgres server to accept outside connections, I needed to make the following changes to config files:
$ sudo vi /etc/postgresql/9.3/main/pg_hba.conf
and changed this line:
host  all  all  md5
host  all  all     md5
$ sudo vi /etc/postgresql/9.3/main/postgresql.conf
and changed this line:
#listen_addresses = 'localhost'
listen_addresses = '*'
I still couldn't connect remotely, since I got a password error for the user I created for the database, so I had to set the password by running:
nyc=#  alter user [user] with password '[password]';
After that I could connect remotely to the server!

Monday, September 7, 2015

GGS 692: Web-based GIS

I've just started my Fall semester graduate GIS course at George Mason University, GGS 692: Web-based GIS. This program continues to be extremely rewarding and just what I was looking for in a graduate program, since I am learning real skills that allow me to apply my previous background in mathematics and computer science to solving "real world problems".

According to the syllabus, this course will:
[P]rovide the students with the knowledge to curate, store, manage and query geospatial data by means of powerful database management systems. Moreover, to communicate the data, the students will learn how to build Web mapping applications on top of a database and so communicate and interact with the data using nothing more than a Web browser. The course will cover a variety of open source software packages for web mapping and will provide pointers to commercial solutions where appropriate.
The specific goals are
  • To enable students to develop a good understanding of the principles and techniques of spatial databases.
  • To design and build a spatial database.
  • To perform various common types of queries and spatial analyses.
  • To design, develop, and implement custom web mapping applications using open standards and open source software.
The course involves a large final project, which I hope to use to develop the Photovoltaic Viability Map web application that will allow Northern Virginia residents to look at their homes on a map and get information about the cost and benefit of putting solar panels on their roof.

The specific technologies we will be learning about include:
All of these are free software GIS tools, so I am delighted at the opportunity to be compelled to learn about them.  I will be adding Mapnik and GeoDjango to the list, since my goal is to learn to be a Python GIS web application developer.

Getting Started

I've already been learning some of these technologies as part of my previous two courses, so this semester the goal is to really begin to master them.  Since a geospatial database is something I'm going to need on a regular basis, I'm going to install PostGIS on an Internet VM that I already have available, so that I'll be able to connect to it whenever I need to.

Referring back to the post I made on July 8, PostGIS Installation, I ran:
$ sudo aptitude install postgresql-9.3-postgis-2.1
My July 9 post, Adminning a PostGIS Server, has details for setting up remote connectivity and creating a database, but I think before I do that I'll go through this tutorial:
to get a broader overview of PostgreSQL administration.

Sunday, August 30, 2015

Mobile Happiness with Firefox OS!

As I described in a previous post back in mid June at the start of the LibriFox "Summer of Code", I am not as a rule an early adopter of new technologies, and am now making regular use of a mobile phone for the first time in my life.

I just purchased a Nexus 5 and so far I am delighted with it.

Here is why:
  • I was able to easily root the device, install MultiROM Manager, and then make it dual-boot (triple boot, actually, since I installed Ubuntu as well) with Firefox OS.  I will be using Firefox OS as my main OS, and I'm happy to say that the version v2.2 pre-release image I found here is working great.  The only app that doesn't work is the FM radio, since the Nexus 5 unfortunately doesn't have an FM chip. What it does have that I really wanted was 4G connectivity. Browsing the web or loading a location map through the mobile data connection are quite pleasant now.
  • Alex Hirschberg completed a wonderfully successful "Summer of Code", delivering the LibriFox app to the Firefox OS Marketplace.  For me this is a "killer app" that alone makes having the phone worthwhile.  That's why I setup the "Summer of Code" in the first place, and I am completely satisfied with the results that Alex delivered.  I would not hesitate to fund "Summer of Code" again next year, provided I can find another graduate at least half as good as Alex. Here is a screen shot of LibriFox running on my phone:
  • As a regular Capital Bikeshare user, I thought I was going to have to boot into Android to use BikeShare! Not so. DC Bike Finder will serve my needs nicely, and keeps me from having to reboot. Here is a screenshot of DC Bike Finder running on my phone:
What drew me to Firefox OS in the first place was a developer environment that lowered the barrier to entry (see my April 16 post, Firefox OS and Lowering the Barrier to Entry into ICT) and would thus be a platform friendly to student learning. With a mobile device that I like to use and feel closely connected with thanks to contributing to it through LibriFox, I am excited about the prospect of further developing the Firefox OS curriculum inroads we began last year and seeing both how far we can progress in the new year and what surprises our journey will bring.

Sunday, August 16, 2015

Setting Up GeoDjango II

In my previous post on this topic, I ended up stuck with a database-does-not-exist error. My good friend Kevin Cole very politely (only implying that I'm an idiot, while refraining from directly saying so ;-) pointed out that the documentation I'm using contains the instructions I need to create the database:
(env)$ createdb -T template_postgis geodjango

Unfortunately, running that gave me the following error:

createdb: database creation failed: ERROR:  template database "template_postgis" does not exist
Let me try a modified version (modified because my user doesn't have the privileges needed to create extensions) of the steps laid out here:
(env)$ createdb geodjango
(env)$ sudo su - postgres
$ psql geodjango
psql (9.3.9)
Type "help" for help.

geodjango=# CREATE EXTENSION postgis;
geodjango=# \q
$ exit
Now let me resume where I left off before the error:
(env)$ python manage.py sqlmigrate world 0001
CREATE TABLE "world_worldborder" ("id" serial NOT NULL PRIMARY KEY, "name" varchar(50) NOT NULL, "area" integer NOT NULL, "pop2005" integer NOT NULL, "fips" varchar(2) NOT NULL, "iso2" varchar(2) NOT NULL, "iso3" varchar(3) NOT NULL, "un" integer NOT NULL, "region" integer NOT NULL, "subregion" integer NOT NULL, "lon" double precision NOT NULL, "lat" double precision NOT NULL, "mpoly" geometry(MULTIPOLYGON,4326) NOT NULL);
CREATE INDEX "world_worldborder_mpoly_id" ON "world_worldborder" USING GIST ("mpoly" );

(env)$ python manage.py migrate
Operations to perform:
  Synchronize unmigrated apps: gis, messages, staticfiles
  Apply all migrations: world, admin, contenttypes, auth, sessions
Synchronizing apps without migrations:
  Creating tables...
    Running deferred SQL...
  Installing custom SQL...
Running migrations:
  Rendering model states... DONE
  Applying contenttypes.0001_initial... OK
  Applying auth.0001_initial... OK
  Applying admin.0001_initial... OK
  Applying contenttypes.0002_remove_content_type_name... OK
  Applying auth.0002_alter_permission_name_max_length... OK
  Applying auth.0003_alter_user_email_max_length... OK
  Applying auth.0004_alter_user_username_opts... OK
  Applying auth.0005_alter_user_last_login_null... OK
  Applying auth.0006_require_contenttypes_0002... OK
  Applying sessions.0001_initial... OK
  Applying world.0001_initial... OK
Progress!  Let me keep going and see if my good fortune holds:
(env)$ python manage.py shell
Python 3.4.0 (default, Jun 19 2015, 14:20:21)
[GCC 4.8.2] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> import world
>>> world_shp = os.path.abspath(os.path.join(os.path.dirname(world.__file__),
...                             'data/TM_WORLD_BORDERS-0.3.shp'))
>>> from django.contrib.gis.gdal import DataSource
>>> ds = DataSource(world_shp)
>>> print(ds)
/home/[user]/geodjango/geodjango/world/data/TM_WORLD_BORDERS-0.3.shp (ESRI Shapefile)
>>> print(len(ds))
>>> lyr = ds[0]
>>> print(lyr)
>>> print(lyr.geom_type)
>>> print(len(lyr))
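As an aside, the path-building idiom at the top of that session is plain Python and can be tried in isolation.  Here is a small sketch, where the module path is a hypothetical stand-in for world.__file__:

```python
import os

# Build an absolute path to a data file relative to a module's location,
# mirroring the os.path idiom from the shell session above.
# 'module_file' is a hypothetical stand-in for world.__file__.
module_file = '/home/user/geodjango/world/__init__.py'
world_shp = os.path.abspath(os.path.join(os.path.dirname(module_file),
                                         'data', 'TM_WORLD_BORDERS-0.3.shp'))
print(world_shp)
```

This keeps the data path correct no matter which directory the shell was started from, since it is anchored to the module's own location rather than the current working directory.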
The tutorial continues with several other interactive examples showing how to use GeoDjango's pythonic interface to the GDAL library.  I began experimenting with python's GDAL wrapper back in April as part of the Introduction to GIS Programming and Algorithms course I took at George Mason University.  I documented the installation of these tools in a post at that time.  The ability to "play" with data at run time is one of the many things I love about Python, and this tutorial, like most python tutorials, is making good use of that powerful pedagogical feature of the language.  There is no need for me to recount the other examples here, however, so I'll skip over them.

The next step in the tutorial is to create a file in the world app named load.py that contains the following:
import os
from django.contrib.gis.utils import LayerMapping
from .models import WorldBorder

world_mapping = {
    'fips' : 'FIPS',
    'iso2' : 'ISO2',
    'iso3' : 'ISO3',
    'un' : 'UN',
    'name' : 'NAME',
    'area' : 'AREA',
    'pop2005' : 'POP2005',
    'region' : 'REGION',
    'subregion' : 'SUBREGION',
    'lon' : 'LON',
    'lat' : 'LAT',
    'mpoly' : 'MULTIPOLYGON',
}

world_shp = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data/TM_WORLD_BORDERS-0.3.shp'))

def run(verbose=True):
    lm = LayerMapping(WorldBorder, world_shp, world_mapping,
                      transform=False, encoding='iso-8859-1')
    lm.save(strict=True, verbose=verbose)
Note: The tutorial lists: from models import WorldBorder, which will cause an import error.  models needs to be .models for this to work.

After making that change, I was able to:
(env)$ python manage.py shell
>>> from world import load
and watch as the countries of the world were loaded into the database.
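Before running a load like this, the mapping dict itself can be sanity-checked with plain Python.  This is a hypothetical check of my own, not part of the tutorial:

```python
# Hypothetical sanity check (not part of the tutorial): confirm that the
# keys of world_mapping are exactly the WorldBorder field names we expect,
# so a typo in the mapping fails fast rather than partway through a load.
world_mapping = {
    'fips': 'FIPS', 'iso2': 'ISO2', 'iso3': 'ISO3', 'un': 'UN',
    'name': 'NAME', 'area': 'AREA', 'pop2005': 'POP2005',
    'region': 'REGION', 'subregion': 'SUBREGION',
    'lon': 'LON', 'lat': 'LAT', 'mpoly': 'MULTIPOLYGON',
}
expected_fields = {'fips', 'iso2', 'iso3', 'un', 'name', 'area', 'pop2005',
                   'region', 'subregion', 'lon', 'lat', 'mpoly'}
assert set(world_mapping) == expected_fields
print('world_mapping keys look good')
```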

Creating the github repo

Now would be a good time to create a github repo.  First, I'll create a .gitignore file inside the top-level geodjango directory (where manage.py is located) that lists the things I don't want in the repository:
env/
__pycache__/
*.pyc
.*.swp
This will tell git not to include the virtual environment, the python byte code files, and any vim swap files.  Next install git and initialize the repository:
(env)$ sudo aptitude install git
(env)$ git init
Now I'll check to see what a git add . would add:
(env)$ git add -n .
Since it looked good, I'll do it, after configuring my git email and user name:
(env)$ git config --global user.email [github email address]
(env)$ git config --global user.name [github name]
(env)$ git add .
(env)$ git commit -a
Initial commit.
Now push it to github (after adding an ssh key and creating a learn_geodjango project on github):
git remote add origin git@github.com:[user]/learn_geodjango.git
git push -u origin master
With the github repo now created, I'll continue the tutorial in a future post.

Thursday, August 6, 2015

Setting Up GeoDjango I

According to the documentation website,
"GeoDjango intends to be a world-class geographic Web framework. Its goal is to make it as easy as possible to build GIS Web applications and harness the power of spatially enabled data."
Since my goal is to become a free software GIS application developer, and since Python and Django are two of the core technologies I hope to utilize, GeoDjango seems like a no-brainer as something I should learn.

In this post I'm going to document the beginning of the process of setting up a GeoDjango server on a local VirtualBox VM.  In a later post, I'll look into installing it on the WebFaction application hosting service. Throughout this process, I'm going to modify the steps described in the tutorial to facilitate creating a github repo and migrating the app to WebFaction.

Since GeoDjango comes with Django, I'm going to start by creating a virtualenv on the same virtual machine I used to set up a django CMS virtualenv, the process for which I described in a previous post, Installing Django CMS on Ubuntu 14.04.  I'll install this new virtualenv right alongside the other one.  I'll be following the installation instructions from the GeoDjango Installation page, together with my earlier blog posts documenting specific installation steps in my Ubuntu 14.04 VM. Installing GeoDjango requires installation of:
  1. Django
  2. Spatial database
I'll tackle each in turn.

Installing Django

$ mkdir geodjango
$ cd geodjango
$ virtualenv env
$ source env/bin/activate
(env)$ pip install django
(env)$ django-admin --version
(env)$ deactivate

Installing a Spatial Database

According to the documentation, "PostGIS is recommended, because it is the most mature and feature-rich open source spatial database." It also recommends that Ubuntu installations use packages.  Since I'm new to this, I'll follow the recommendation.

$ sudo aptitude install postgresql-9.3 postgresql-9.3-postgis-2.1 postgresql-server-dev-9.3 python3-psycopg2
Next, set up a database user who can create databases:
$ sudo su - postgres
$ createuser --createdb [user]
$ exit
Now follow the tutorial to see if it works (note: the startproject command below has been modified from the tutorial with the aim of keeping the virtual environment directory (env) inside the project directory):
$ cd geodjango
$ source env/bin/activate
(env)$ django-admin startproject geodjango .
(env)$ python manage.py startapp world
(env)$ vi geodjango/settings.py
Change the DATABASES section to match the following:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'geodjango',
        'USER': '[user]',
    },
}
Also add the last two items to INSTALLED_APPS so that it looks like this:
INSTALLED_APPS = (
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.gis',
    'world',
)
For the next step, we will need some gdal packages (and I'll grab unzip while I'm at it):
(env)$ sudo aptitude install gdal-bin python3-gdal unzip
(env)$ mkdir world/data
(env)$ cd world/data
(env)$ wget
(env)$ unzip
(env)$ cd ../..
(env)$ ogrinfo world/data/TM_WORLD_BORDERS-0.3.shp
INFO: Open of `world/data/TM_WORLD_BORDERS-0.3.shp'
      using driver `ESRI Shapefile' successful.
1: TM_WORLD_BORDERS-0.3 (Polygon)

The tutorial includes another ogrinfo example command that provides more detail about the world borders shape file, which I'll skip in the interest of space.  I will include the excellent short description of each of the component files in the shapefile:
.shp: Holds the vector data for the world borders geometries.
.shx: Spatial index file for geometries stored in the .shp.
.dbf: Database file for holding non-geometric attribute data (e.g., integer and character fields).
.prj: Contains the spatial reference information for the geographic data stored in the shapefile.
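Since these component files always travel together, a tiny helper (illustrative only, not part of the tutorial) can enumerate the expected sibling files for a given .shp path:

```python
import os

def shapefile_parts(shp_path):
    """Return the paths of the sidecar files that accompany a .shp file
    (illustrative helper; assumes the standard four-file shapefile layout)."""
    base, _ = os.path.splitext(shp_path)
    return [base + ext for ext in ('.shp', '.shx', '.dbf', '.prj')]

for part in shapefile_parts('world/data/TM_WORLD_BORDERS-0.3.shp'):
    print(part)
```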
Next edit the world/models.py file so that it looks like this:

from django.contrib.gis.db import models

class WorldBorder(models.Model):
    name = models.CharField(max_length=50)
    area = models.IntegerField()
    pop2005 = models.IntegerField('Population 2005')
    fips = models.CharField('FIPS Code', max_length=2)
    iso2 = models.CharField('2 Digit ISO', max_length=2)
    iso3 = models.CharField('3 Digit ISO', max_length=3)
    un = models.IntegerField('United Nations Code')
    region = models.IntegerField('Region Code')
    subregion = models.IntegerField('Sub-Region Code')
    lon = models.FloatField()
    lat = models.FloatField()
    mpoly = models.MultiPolygonField()
    objects = models.GeoManager()

    def __str__(self):              # __unicode__ on Python 2
        return self.name
The default spatial reference system is WGS84.  New to me from the documentation was the Open Geospatial Consortium (OGC) Spatial Reference System Identifier (SRID), which in the case of WGS84 is 4326.
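One place the SRID shows up concretely is in extended WKT (EWKT), where it prefixes the geometry text.  A toy parser (a hypothetical helper of my own, not a GeoDjango API) makes the format clear:

```python
def parse_ewkt(ewkt):
    """Split an EWKT string like 'SRID=4326;POINT(-77.3 38.8)' into its
    SRID integer and plain WKT geometry (toy helper, not a GeoDjango API)."""
    srid_part, wkt = ewkt.split(';', 1)
    return int(srid_part.split('=', 1)[1]), wkt

srid, wkt = parse_ewkt('SRID=4326;POINT(-77.30 38.83)')
print(srid, wkt)   # -> 4326 POINT(-77.30 38.83)
```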

The next step in the instructions was to run:
(env)$ python manage.py makemigrations
This didn't work for me, giving me the following error:
ImportError: No module named 'psycopg2'
I fixed this with:
(env)$ pip install psycopg2
(env)$ python manage.py makemigrations
Migrations for 'world':
    - Create model WorldBorder
But when I ran the next command:
(env)$ python manage.py sqlmigrate world 0001
I got a database error:
psycopg2.OperationalError: FATAL:  database "geodjango" does not exist
Anticipating that I might need knowledge of PostgreSQL database administration, I began looking into that in an earlier post.  I'd better look into it further before returning to the present task.