Using Free Geospatial Tools and Data Part 3: Understanding Geospatial Data

Before going into what tools and data are available, I feel a quick introduction to geospatial data is in order. I’m going to gloss over some things and greatly oversimplify others. If you want to know more, I would suggest checking out books such as Map Projections: A Working Manual by Snyder or some excellent tutorials on map projections and GIS that can be found via your favorite web search engine.

To begin, there are two main types of geospatial data: raster and vector. Raster data are usually things such as aerial or satellite photographs, scanned paper maps, and so on. Vector data are mathematical vectors that describe features such as lines for roads and polygons for areas like parks. As is, aerial photographs or vectors do not fully describe the Earth’s surface: there is no one-to-one correspondence between a point on the map and a location on the ground. They have to be run through a process called georectification to mathematically translate them into a coordinate system known as a map projection.

USGS DOQQ of the Washington DC Area. Courtesy of Wikipedia

Map projections are mathematical descriptions of the surface of the Earth. As most of you (hopefully) know, the Earth is not flat. It’s a large, roughly spherical object with dents (canyons) and bumps (mountains). Since it’s not a perfect sphere, it’s impossible to model it with 100 percent accuracy. Map projections describe the Earth in terms of a series of equations that define a spheroid with various characteristics. These characteristics are useful for different types of representations, from maps of continents to maps of cities. I suggest you look at websites such as Map Projections: From Spherical Earth to Flat Map for a more in-depth discussion. Basically, think of a map projection as establishing an average elevation of the Earth’s surface, with areas where it’s more accurate than others.

USGS Transverse Mercator Projection. Courtesy Wikipedia

Georectification then takes the flat two-dimensional image or vector data and warps it so that it fits the three-dimensional surface of the Earth as approximated by the map projection. Once transformed, each pixel in a map image or each point on a vector matches its corresponding point on the Earth’s surface. Inside a GIS, you can then move the cursor around and see the matching latitude and longitude on the ground. If you wanted, you could take a GPS and even go there.

Axonometric Projection of a 3D object to a 2D surface. Courtesy Wikipedia

GIS file formats then allow the mathematical parameters that went into georectification to be encoded into the file as well. These are usually a large set of numbers that describe the equations used and the inputs to those equations. Often you’ll see output like the listing below if you use a tool such as listgeo, which can parse this information out of the file format. Note that most formats assume a Cartesian coordinate system running from (0,0) in the upper left corner to some (X, Y) matching the size of the data set.

Cartesian Coordinate System. Courtesy Wikipedia

listgeo o39102g6.tif
Geotiff_Information:
Version: 1
Key_Revision: 0.2
Tagged_Information:
ModelTiepointTag (2,3):
0 0 0 
691331.977 4417194.85 0 
ModelPixelScaleTag (1,3):
2.4384 2.4384 0 
End_Of_Tags.
Keyed_Information:
GTModelTypeGeoKey (Short,1): ModelTypeProjected
GTRasterTypeGeoKey (Short,1): RasterPixelIsArea
ProjectedCSTypeGeoKey (Short,1): PCS_NAD27_UTM_zone_13N
PCSCitationGeoKey (Ascii,25): "UTM Zone 13 N with NAD27"
End_Of_Keys.
End_Of_Geotiff.

PCS = 26713 (NAD27 / UTM zone 13N)
Projection = 16013 (UTM zone 13N)
Projection Method: CT_TransverseMercator
ProjNatOriginLatGeoKey: 0.000000 ( 0d 0' 0.00"N)
ProjNatOriginLongGeoKey: -105.000000 (105d 0' 0.00"W)
ProjScaleAtNatOriginGeoKey: 0.999600
ProjFalseEastingGeoKey: 500000.000000 m
ProjFalseNorthingGeoKey: 0.000000 m
GCS: 4267/NAD27
Datum: 6267/North American Datum 1927
Ellipsoid: 7008/Clarke 1866 (6378206.40,6356583.80)
Prime Meridian: 8901/Greenwich (0.000000/ 0d 0' 0.00"E)
Projection Linear Units: 9001/metre (1.000000m)

Corner Coordinates:
Upper Left ( 691331.977, 4417194.851) (102d45'44.74"W, 39d53' 6.48"N)
Lower Left ( 691331.977, 4400540.579) (102d46' 2.23"W, 39d44' 6.68"N)
Upper Right ( 704806.576, 4417194.851) (102d36'17.89"W, 39d52'55.15"N)
Lower Right ( 704806.576, 4400540.579) (102d36'36.61"W, 39d43'55.41"N)
Center ( 698069.276, 4408867.715) (102d41'10.37"W, 39d48'31.03"N)
Output of the listgeo command.

From the above output, we can see a few things that describe this image. It came from a USGS DRG that was mathematically warped to a projection (which is why some of them look “tilted” if you’ve looked at a lot of DRGs). Under the ModelTiepointTag key we see several zeros and then the numbers 691331.977 and 4417194.85. If you look at the DRG as a two-dimensional grid, those numbers are the projected (x, y) coordinates of the upper left pixel in the image. The projection of the DRGs uses meters, so the x and y here represent meters in the coordinate system (here, UTM easting and northing), not latitude and longitude. Under the ModelPixelScaleTag we see another set of numbers, including 2.4384 listed twice. This is the per-pixel scale of the image in meters: each pixel you count in the x or y direction of the image equals 2.4384 meters on the ground. Moving from pixel (0,0) to (10,10) only moves you by a value of 10 on the grid, but moves you 24.384 meters on the ground in each direction.

Scale is another issue that may seem counter-intuitive to many people. There are two scales of data that you may hear about: large and small scale. Large scale actually means data that is more detailed than small scale data. The scales do not refer to the physical size of the map but to the ratio of the map to the Earth. Obviously a 1:1 scale map of the Earth is not feasible, since it would have to be as large as the planet, so maps are scaled down to make them much smaller. A common large map scale is 1:25,000 and a common small map scale is 1:100,000. The 1:25,000 map is called large scale because its fractional size is larger than that of the 1:100,000 map (think of the scales as fractions: 1/25,000 is larger than 1/100,000). A 1:25,000 large scale map does not show as much surface area as a 1:100,000 small scale map, but it shows its area in much more detail. Think of large scale maps as maps of things like cities, while small scale maps are more along the lines of maps of entire countries.

With the scale discussion out of the way, we must turn to the usability of geospatial data. Much digital map data was originally collected from paper maps, so the digital data has a scale at which it is most accurate and is less accurate at others. Data collected at 1:25,000 will be more accurate than data collected at 1:100,000. A GIS allows you to zoom in to the data and, in some cases, over-zoom past its actual accuracy. Consider data such as a USGS Digital Elevation Model that was collected on a ten meter grid: you can zoom to a view where each pixel on the screen is smaller than ten meters on the ground, but your measurements will not be any more accurate than the ten meter grid you are viewing. As another example, take two road maps, one collected at 1:25,000 and the other at 1:250,000. If you overlay both on a georeferenced aerial photograph, the 1:25,000 lines will follow the actual roads more closely than the 1:250,000 lines. You cannot expect the 1:250,000 data to be as accurate as the larger scale data, and zooming in to it will not make it any more accurate than the scale at which it was captured.

With this out of the way, next time we will start looking at Open Source tools to use the data.

Mono 3.2.3 and Fedora 19 and a KeePass fix

If anyone is interested in installing Mono 3.2.3 on Fedora 19, here is what I did.  Note that if something breaks and your favorite app quits working then it’s all on you 😉

I found that someone out there had built Mono 3.2.3 RPMs for Fedora and created a repository.  However, since I was unable to find a .repo file, I went ahead and created this one:

[mono_linet]
name=Mono 3 repository
baseurl=http://yum.linet.dk/opensource/fedora-19-x86_64/RPMS/
enabled=1
gpgcheck=0

Save this as a .repo file under /etc/yum.repos.d/, then do a yum update and you should be good to go.

However, I noticed I could no longer run KeePass on my system.  I had followed the instructions found here and use KeePass on my computer, phone, and tablet.  With Mono 3.2.3, KeePass wouldn’t work.  The simple fix is to change to /usr/lib and run the following command:

sudo ln -s /usr/lib64/libgdiplus.so libgdiplus.so

This should make KeePass run again.  Note you’ll get some errors on close if you run it from a console.  These appear to be harmless.

My Experience with the Spotsylvania County Juvenile Court System

Ok, so I have to go on a bit of a rant here.  Last year, a couple of young delinquents threw a large rock at my car as we were driving home in our neighborhood.  Had the top window been open, the rock would have struck me in the head and could have caused a crash.  I tried to chase them down but they got away.  However, we found out who they were from some young Mexican kids whom they normally bullied.  These poor kids were just happy that someone would FINALLY do something about them, since they were bullied and afraid to speak to the authorities.  We confronted the mom and the kids admitted to it, so we called the police and charges were pressed.

My wife went to court over it and we found that it would cost $500 in repairs to fix the vehicle.  The kids pled guilty and were found guilty.  They were ordered to do community service and pay the $500 in restitution.  The judge apparently said that even if they moved they still had to do community service and pay the restitution.

We had never heard anything and contacted the courts, since the kids had a year to comply.  What we eventually found out was that another trial had been scheduled because they never sent any money to the courts and never reported any community service.  They never showed up, of course, and we were told that since the court couldn’t find them, they were simply found guilty with a felony on their record and the case was closed.  I spent five minutes online and found out that they’re currently living in St. Louis, MO.  We were told that since it’s just kids and only $500, the case was closed because the state couldn’t do anything any more.

So let’s recap here.  What happened is that a couple of kids caused damage to a vehicle.  They were found guilty.  They were ordered to pay us back for the damage they caused.  However, all they had to do was move out of Virginia, and they don’t have to suffer any consequences for their actions.  We were told “but they will have a felony on their permanent record.”  This means jack.  People with felonies can easily find almost any kind of job in this day and age.

I realize now why there are so many young punks in this world.  They can basically get away with anything because it’s “too expensive” to prosecute them.  They can get off scot-free without repercussions from their actions.  And then they’re free to continue being young punks wherever they are.  It seems to me like this is just another example of how broken our legal system is.

A Few of My Research Publications

When I was at the USGS I did a lot of research and published several papers.  Now that I have my own website, I figure I can put them online now.  Here’s the first list of papers.  Some of them are final versions of the open file reports, others fell between the cracks when I transferred from the USGS to NGA.

  1. A New Method of Edge Detection for Object Recognition
  2. A Parallel-Processing Approach to Computing for the Geographic Sciences
  3. A Parallel-Processing Approach to Computing for the Geographic Sciences: Applications and Systems Enhancements
  4. Directed Edge Detection – A New Method for Identifying Objects in Imagery
  5. Distributed Processing of Projections of Large Datasets: A Preliminary Study
  6. Extending Beowulf Clusters
  7. A Distributed System for Fast Reprojections of Geospatial Data for The National Map
  8. Model Development and Extraction from Neural Networks: Final Report
  9. Processing Large Remote Sensing Image Data Sets on Beowulf Clusters
  10. Using Mosix for Wide-Area Computational Resources
  11. Beowulf Distributed Processing and the United States Geological Survey
  12. An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study

OK, I’ll Give Microsoft Credit

They’ve come a long way from the “bad old days” of doing GUI apps with MFC under Visual C++ 6 where you had the wonderful “black box” message framework you weren’t supposed to touch but always had to anyway.  I managed to make a GUI app today with Windows Forms to read data off a CAC card so I don’t have to keep relying on the command-line driven app I wrote for testing a while back.  No black boxes, no hand editing message maps, no hand driving message pumps to get things done.  And the cool thing is it might run under Mono too.

While I still prefer Qt for most everything, Visual Studio now isn’t bad for the times I have to write source under Windows.

Hello!

Welcome to my new blog.  I got tired of Blogger/Google and now that I have my own domain, well, it’s time for a change 🙂  I’ll be filling more here soon.

Changes, They Are a Comin’

I’m going to try to start posting more here I think.  I’d like to get more into blogging (even podcasting).  This blog will keep its theme, just hopefully more content 🙂  I’m working on a long series of posts about using GIS tools and data since that’s still near and dear to my heart.

I’m also going to keep posting other technical things of interest here as time permits.  I want to get into reviews and discussing technology as well.

Stay tuned!

Fun with OpenStreetMap and Open Source

So as part of my pushing OpenStreetMap to people, I made some sample screen shots for a friend of mine who is wanting to move away from paying Google for their maps.  Thought I’d share in case other people want to see what’s possible using OpenStreetMap data for rendering.

For my setup, I have PostgreSQL/PostGIS running on a server here at home, into which I’ve imported several data sets, including OpenStreetMap extracts, the US TIGER 2011 Counties data set, and the USGS GNIS file.

On top of that, I’m also running the Geoserver 2.2 beta to provide all of the data as WMS, WCS, and WFS OGC services.  The OSM data was imported using the excellent imposm tool.  I used the styles from OSMinaBox for WMS styling, with some additions I made to the SLDs to make sure additional features were rendered.
The following screen shots are samples I’ve made using some of this data.  I threw them together for Doug, so I didn’t really do any tweaking to label placements and the like.

Screenshot 1 – OSM overview

In this screen shot, I used the US TIGER 2011 Counties data set for the background layer; it also helped me locate and zoom into my areas of interest.  The OSM* layers are all WMS renderings from my local Geoserver install using the above-mentioned SLDs.  OSM_Roads is actually a database view that imposm makes with all of the road layers merged together.

This second screenshot is zoomed in a level so that Geoserver does more labeling based on the SLD rules.

Screenshot 2 – Zoomed in example

On this third screenshot, I overlaid the USGS GNIS file with my own rendering rules inside QGIS.  I basically used SJJB icons under QGIS to make GNIS look a little nicer.  However, the bluish-green dots show I’m still not done with the style in QGIS; once it’s done there, I’ll move it over to Geoserver to do the WMS rendering there as well.

Screenshot 3 – GNIS overview

Here’s my quick post for now.  Just wanted to do a quick showing of how you can make decent looking maps from completely free data.  I apologize in advance since I have no artistic skills whatsoever though 🙂

A Random Post: Am I on a 32- or 64-bit OS?

So in response to a question on IRC this morning (and it’s early and I’m not fully awake), here’s a quick program I wrote to show how to detect whether you’re on a 16-, 32-, or 64-bit platform by checking the size of an int* in C/C++.  Strictly speaking, it reports the pointer size the program was compiled for, so a 32-bit binary will report 4 even on a 64-bit OS:

#include <iostream>

int main (int argc, char* argv[])
{
  int foo = sizeof(int*);

  switch (foo)
  {
  case 2:
    std::cout << "Size = 2: 16 bit OS" << std::endl;
    break;
  case 4:
    std::cout << "Size = 4: 32 bit OS" << std::endl;
    break;
  case 8:
    std::cout << "Size = 8: 64 bit OS" << std::endl;
    break;
  default:
    std::cout << "Size = " << foo << std::endl;
  }

  return(0);
}

My Latest Odd Project

So since the USGS is releasing a bunch of their old scanned paper maps at their Historical Topographic Map Collection, I thought it would be interesting to take the GeoPDFs there and georeference them to GeoTIFFs to compare to modern maps.  I’ve always had a thing about history, so thought this would be a fun side project to get my cartography back on.

Since QGIS doesn’t yet import GeoPDFs, I’m first loading them into the GIMP (at around 300 dpi from the PDF), clipping the collar out, and saving them as LZW-compressed TIFFs.  Then I’m importing the TIFFs into QGIS and using the Georeferencing plugin to mark the grid points on the map.  I’ve read that the old maps were based on the Clarke 1866 ellipsoid, so in QGIS I’m setting the source projection to NAD27, which is based on Clarke 1866.  Yes, I know that technically this isn’t fully correct in the cartographic sense, but then again georeferencing old paper maps like this won’t exactly put out a highly accurate GIS product either 😛  I’m outputting them to WGS84 from the georeferencing plugin.  Times on my older Dell E1705 are around 5 minutes using a polynomial transformation with Lanczos resampling.  Then again, I remember back in the mid to late 1990s when DRG production at the USGS on the old Data Generals took a whole lot longer, so I wasn’t going to complain 🙂

The output isn’t so bad really.  Here’s a sample of the output draped on top of Yahoo Street Maps (Google Maps and on-the-fly reprojection don’t seem to work so well in QGIS right now).

Setting the Fredericksburg 1889 map to 50 percent transparency in QGIS and zooming in to old town, you can see that the map fits to a modern map surprisingly well.

I’ll probably play around with this some more and maybe upload the georeferenced maps to archive.org or something along those lines.