Map Sandbox Project

A sandbox for designing and implementing comparative thematic maps.

Mapping Points and Polygons

In this post I give a brief summary of some core mapping concepts and demonstrate how to draw geographical points and areas using Mapbox/Leaflet.

Working with geolocation data

By now, thanks to the ubiquity of GPS on mobile devices, most people are familiar with the concept of assigning a coordinate point to a location on the surface of the Earth. This point is given by a pair of spherical angular coordinates (λ, φ), also referred to as geographic coordinates, called longitude and latitude in a geographic coordinate system. Longitude is the east/west angle from the prime meridian, while latitude is the north/south angle from the equator.

Datums

Because the Earth is not a perfect sphere, geospatial calculations must take into account the deviations of the Earth’s ellipsoidal surface from a spherical surface. Such a model of the Earth’s surface is referred to as a datum. The predominant geographic reference system used for this is the World Geodetic System, the latest revision of which is WGS 84. Another datum one might encounter is NAD 83, which is based on the GRS 80 reference ellipsoid. Transforming between these reference models can result in slight coordinate shifts (from a few centimeters to a meter).
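To make the deviation concrete, here is a quick back-of-the-envelope calculation using the two defining shape constants of the WGS 84 ellipsoid (the semi-major axis and the flattening, both from the WGS 84 specification); the values in the comments are rounded:

// The two defining shape constants of the WGS 84 reference ellipsoid.
var a = 6378137.0;               // semi-major (equatorial) radius, meters
var f = 1 / 298.257223563;       // flattening

var b = a * (1 - f);             // semi-minor (polar) radius
console.log(b.toFixed(1));       // 6356752.3
console.log((a - b).toFixed(1)); // 21384.7 -- the equatorial bulge, ~21 km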

Projections

In order to visualize a region of the globe on the flat surface of a map, one must project the region from the ellipsoidal surface of the Earth, where a point is specified in spherical coordinates (λ, φ), to a planar surface, where a point is specified in rectangular coordinates (x, y). Such a map projection distorts one or more of the following properties: distance, area, shape, or direction. There are dozens of established projection schemes to choose from, each optimized for a particular type of problem. Two of the most common, both conformal (angle-preserving), are the Mercator and transverse Mercator projections.
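As a concrete illustration, the forward map for the spherical (‘Web’) Mercator projection used by most web maps can be written in a few lines. This is a sketch of the standard textbook formulas, not a full projection library (it ignores datum subtleties and clamping near the poles):

// Spherical Mercator forward projection: (λ, φ) in degrees -> (x, y) in meters.
// EPSG:3857 uses the WGS 84 semi-major axis as the radius of the sphere.
var R = 6378137.0;

function webMercator(lonDeg, latDeg) {
    var lam = lonDeg * Math.PI / 180;   // λ in radians
    var phi = latDeg * Math.PI / 180;   // φ in radians
    return {
        x: R * lam,
        y: R * Math.log(Math.tan(Math.PI / 4 + phi / 2))
    };
}

console.log(webMercator(-90.1994, 38.6270));  // downtown St. Louis
// -> roughly { x: -10041000, y: 4668000 }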

Coordinate systems

To keep track of the various coordinate systems and simplify transforming between them, registries were created that assign a unique spatial reference identifier (SRID) to each one; the EPSG codes below are the most widely used. Each coordinate system must specify the projection and the associated datum (e.g. WGS 84) upon which it is based. Here are a few common coordinate systems:

  • EPSG:4326 (uses WGS 84 datum): This is the default geographic (spherical angular) coordinate system for the WGS 84 datum. It is sometimes referred to as the WGS 84 coordinate system.

  • EPSG:3857 (uses WGS 84 datum): Commonly referred to as the ‘Web Mercator’ system (also known as EPSG:900913), this coordinate system is based on a Mercator projection and is used by many mapping applications (e.g. Google Maps, Bing, OpenStreetMap, Mapbox/Leaflet). This specification has a somewhat curious (and confusing) history, outlined well in an article by Alastair Aitchison. In practice, when using an API based on this system, one specifies the coordinates in spherical angular coordinates (EPSG:4326), and the service then carries out the projection into the EPSG:3857 system for rendering.

  • State Plane Coordinate System (SPCS) (uses NAD 83 datum): This is a collection of coordinate systems resulting from a local projection (typically either the transverse Mercator or the Lambert conformal conic projection). The St. Louis Metropolitan Police Department uses this system to report its crime incidents; in particular, it uses the EPSG:26996 Missouri East SPCS. Coordinates are given as distances (typically in feet, sometimes meters) from an origin unique to that projection, which lies outside the state.

Esri has a concise overview of all of these concepts that I found useful.
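In code, these SRIDs are what you hand to a projection library. Here is a minimal sketch using the open source proj4js library, which ships with built-in definitions for EPSG:4326 and EPSG:3857. Note that state-plane systems like EPSG:26996 are not built in; they must first be registered with proj4.defs() using a definition string from a registry site such as spatialreference.org:

// Assumes the proj4js library (proj4js.org) has been loaded.
var lonLat = [-90.1994, 38.6270];  // EPSG:4326 (lon, lat), downtown St. Louis

// Spherical angular coordinates -> Web Mercator meters:
var xy = proj4('EPSG:4326', 'EPSG:3857', lonLat);
console.log(xy);  // should agree with the hand-rolled webMercator() above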

GeoJSON data format

GeoJSON is a commonly used specification for encoding geospatial data using the JSON data format. The specification allows for several different types of features, such as a “Point”, a “LineString”, and a “Polygon”. The GeoJSON spec does not require the points within a polygon to be wound in a particular direction (see this thread), but many applications that import GeoJSON do require polygon points to be ordered consistently (either clockwise or counter-clockwise). The spec also allows each feature to be assigned a set of user-defined properties (such as a description, or perhaps a numerical value used to assign a color).

As an example, consider the following GeoJSON data describing a point and a polygon (note that the polygon’s linear ring is closed by repeating the first position as the last, as the spec requires):

{ "type": "FeatureCollection",
  "features": [
    { "type": "Feature",
      "geometry": {"type": "Point", "coordinates": [-90.193378, 38.631263]},
      "properties": {"myProperty": "val1"}
      },
    { "type": "Feature",
       "geometry": {
         "type": "Polygon",
         "coordinates": [
                [ [-90.196158, 38.62781], [-90.193031, 38.627043],
                [-90.193368, 38.626247], [-90.19641, 38.626997] ]
           ]
       },
       "properties": {
         "myProperty": "val2",
         "anotherProperty": 42
         }
       }
     ]
   }
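Since winding order matters to many consumers of GeoJSON, it is worth being able to check it. The sketch below applies the shoelace formula to a closed linear ring in (longitude, latitude) order; only the sign of the sum matters, and the test is reliable only for small polygons, since it treats the angular coordinates as planar:

// Returns true if a closed linear ring is wound counter-clockwise.
// The sum below equals -2x the shoelace signed area, so a negative
// sum means a positive area, i.e. counter-clockwise winding.
function ringIsCounterClockwise(ring) {
    var sum = 0;
    for (var i = 0; i < ring.length - 1; i++) {
        var p = ring[i], q = ring[i + 1];
        sum += (q[0] - p[0]) * (q[1] + p[1]);
    }
    return sum < 0;
}

var ring = [[-90.196158, 38.62781], [-90.193031, 38.627043],
            [-90.193368, 38.626247], [-90.19641, 38.626997],
            [-90.196158, 38.62781]];
console.log(ringIsCounterClockwise(ring));  // false: this ring is wound clockwise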

Interactive maps with Mapbox

The GeoJSON format can be used to encode data for use in Mapbox. The Mapbox API for displaying maps is built on the Leaflet library. The Mapbox examples page shows many different use cases of loading GeoJSON data. Also, check out this Leaflet example. Below I show an example where I draw a few features on a map of St. Louis: a polygon showing the outline of City Garden, and the old and new locations of the tech incubator T-REX.

Example of drawing points and a polygon (JavaScript source code).
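The linked source draws the features via the Mapbox/Leaflet API. As a rough sketch of what that looks like, here is a minimal version using Leaflet’s core L.geoJson layer (available through mapbox.js), assuming the FeatureCollection from the previous section is stored in a variable named data and map is a map object like the one created in the embedding example in the next section:

// Add the GeoJSON features to the map, styling paths (the polygon) and
// attaching a popup that displays each feature's custom property.
L.geoJson(data, {
    style: { color: '#36c', weight: 2, fillOpacity: 0.3 },
    onEachFeature: function (feature, layer) {
        layer.bindPopup('myProperty: ' + feature.properties.myProperty);
    }
}).addTo(map);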

Base Map

A base map is the canvas on which we will draw our geoanalytic visualizations. This post gives a brief summary of base map design using Mapbox.

Mapbox

In order to build a thematic map to visualize crime statistics and other neighborhood attributes, we’ll need a base map to provide the background context that shows basic geographical information like streets, buildings, parks, and waterways. In this project, we’ll tap into the powerful mapping tool Mapbox. It provides tools for styling and deploying maps that are based on OpenStreetMap data, with APIs allowing maps to be easily embedded in web pages and mobile apps.

The service operates under a freemium model, so you can sign up for free and get started immediately. Subscription rates are primarily based on map views, with the free plan allowing for 3000 views/month. They provide an online map editor for very basic features (e.g. choosing the colors of streets, land areas, waterways, etc.) and a desktop app called TileMill for advanced map design. So far the online editor has been sufficient for my needs, but it’s nice knowing the advanced customization capability is there if I need it.

Base map design

The base map is meant to provide a geographical context for the thematic data being visualized. The base map styling should use neutral colors, and its features should be somewhat subdued, because we’ll be adding colored features to this base layer that encode the statistics we’re interested in, and we want those features to stand out.

A base map I designed using the Mapbox online editor.

Embedding a map

Mapbox provides a JavaScript library mapbox.js that is integrated with the leaflet.js library for building interactive maps. I’m a JavaScript newbie at the moment, so my coding is largely based on finding pre-existing examples to use as a starting point and then adding my own tweaks and customizations.

It’s straightforward to embed a map into a web page. First you need to add a reference to the library and the Mapbox CSS in the <head> element:

<script src='//api.tiles.mapbox.com/mapbox.js/v1.5.2/mapbox.js'></script>
<link href='//api.tiles.mapbox.com/mapbox.js/v1.5.2/mapbox.css' rel='stylesheet'/>

Somewhere in your HTML source, you also need to provide the actual JavaScript code that embeds the map into the correct element. The key mapbox.js function is L.mapbox.map(element_id, map_id, options), where element_id is the id of the HTML element and map_id is the unique identifier of your base map, e.g.:

<script type='text/javascript'>
    // Create the map in the element with id 'mbmap', using my hosted base map.
    var map = L.mapbox.map('mbmap', 'jamieinfinity.g7e1c0h2', {
                scrollWheelZoom: false
              });
    // Center on St. Louis at zoom level 13.
    map.setView(new L.LatLng(38.632, -90.24), 13);
</script>

Finally, wherever you want the map to appear in your content, you’ll need to place the <div id='element_id'> tag. For the map above, this is:

<div style='height:360px' id='mbmap' class='mapboxmap'></div>

Resources

Over the past few months I’ve been collecting useful resources as I study crime mapping and learn how to build maps. This post contains an updated list of web links, books, and tools.

Crime mapping references

Crime mapping is an active area of research in the broader field of criminology. There are a couple of really nice review articles that have become standard references (as far as I can tell, as an outsider). Both of these were published by the National Institute of Justice (NIJ):

K.D. Harries, ‘Mapping crime: principles and practice’, U.S. Dept. of Justice, National Institute of Justice, Washington D.C. (1999). [pdf]

J.E. Eck, S. Chainey, J.G. Cameron, M. Leitner, R.E. Wilson, ‘Mapping crime: understanding hot spots’, U.S. Dept. of Justice, National Institute of Justice, Washington D.C. (2005). [pdf]

I also found the following article by J. Ratcliffe useful:

J. Ratcliffe, ‘Crime Mapping: Spatial and Temporal Challenges’, in Handbook of Quantitative Criminology (Eds. A.R. Piquero & D. Weisburd), Springer (2010). [pdf]

S. Chainey and J. Ratcliffe have co-authored one of the definitive books on crime mapping:

S. Chainey and J. Ratcliffe, ‘GIS and Crime Mapping’, Wiley (2007). [amazon]

The book I’ve found most useful, however, is an ArcGIS tutorial on crime mapping by W.L. Gorr and K.S. Kurland. One of the challenges with learning the core standard practices of this field as an outsider is finding reference data that one can use to reproduce the calculations being described. This tutorial is the only source I found that provides such data (unfortunately, the Esri license agreement prohibits me from redistributing the data).

W.L. Gorr and K.S. Kurland, ‘GIS Tutorial for Crime Analysis’, ESRI Press (2011). [amazon]

Finally, another useful reference is a book on infographics by I. Meirelles. It’s an excellent book, and has a great chapter on map visualizations:

I. Meirelles, ‘Design for Information’, Rockport Publishers (2013). [amazon]

Software / tools

One of the most widely used geographical information systems is ArcGIS. It has turnkey solutions for all of the common geographical mapping use cases, but it also has the toolset for tackling advanced problems. I’ve only just begun to use it by walking through the ‘GIS Tutorial for Crime Analysis’ mentioned above, which provides a six-month license key. Esri offers a home use license for $100. They have also integrated Python scripting to help automate and scale up the overall GIS workflow (see their Python blog and GitHub repo for more info).

Although I intend to keep diving deeper into ArcGIS, its main role for me at the moment is as a reference against which to compare my own calculations and visualizations. Standard geographical data operations are available in ArcGIS, but these canned routines are essentially black boxes. I’m interested in understanding (at least partly) the underlying math and science of the geoanalysis routines. As such, I will be ‘reinventing the wheel’ to some degree by implementing my own solutions to some of these standard problems (like binning and smoothing data points).

Starting out, I will be using the scientific computing platform Mathematica, which is a much more general-purpose tool than ArcGIS and has many built-in GIS features (such as importing shapefiles and transforming between geo projections). I was a developer at Wolfram for about six years, and so know my way around Mathematica pretty well. They also have a home use license, for $295 (or an annual subscription of $149).

ArcGIS and Mathematica provide a springboard to get up and running doing calculations and making visualizations. However, I’m also interested in developing a suite of tools in Python, which is open source, if for no other reason than that I’ve been wanting to learn Python for a while, and a project like this provides the context and motivation to do so. There are several open source GIS tools and libraries I can tap into. QGIS is an open source framework that aims to offer many of the same features as ArcGIS. Like ArcGIS, it has been integrated with a number of related Python libraries (see the PyQGIS page and the PyQGIS cookbook). Another useful Python library is GDAL, which is mainly focused on raster data manipulations (as opposed to vector/polygon operations).

For rendering maps, there are some awesome open source tools available. Leaflet is a JavaScript library for building interactive maps on the web. It has been integrated into Mapbox, which provides additional features such as styled base maps, built from OpenStreetMap data, that can be hosted on their servers. I will be making heavy use of Mapbox in subsequent posts. They even have a mobile app SDK for iOS and Android, which I may tap into. You can also combine the power of D3.js with Leaflet, which was done to great effect in a widget tool called Crosslet.

The domain of GIS and mapping has its own Stack Exchange site, which is a valuable resource for working with any of the above tools.

Crime data

Data sources

The best unified source of crime incident data that I know of is the Socrata service. It is an open data platform that many city governments are using to host their data. It has a feature-rich web interface, where a user can query the data, generate and save visualizations (including maps), and export data. Socrata also offers a developer API. In the table below, I provide a list of URLs I curated several months ago (it may be slightly out of date).

City / County Socrata Portal
Austin http://data.austintexas.gov/
Baltimore http://data.baltimorecity.gov/
Belleville, IL https://data.illinois.gov/belleville
Boston https://data.cityofboston.gov/
Champaign, IL https://data.illinois.gov/champaign
Chicago http://data.cityofchicago.org/
Honolulu https://data.honolulu.gov/
Kansas City https://data.kcmo.org/
Madison https://data.cityofmadison.com/
New Orleans http://data.nola.gov/
New York City https://data.cityofnewyork.us
Oakland https://data.oaklandnet.com
Raleigh https://data.raleighnc.gov
Redmond, WA https://data.redmond.gov
Rockford, IL https://data.illinois.gov/rockford
Salt Lake City https://data.slcgov.com
San Francisco https://data.sfgov.org
Seattle http://data.seattle.gov/
Somerville, MA http://data.somervillema.gov
Wellington, FL https://data.wellingtonfl.gov/
Cook County, IL https://datacatalog.cookcountyil.gov


Having a Socrata data portal, however, doesn’t guarantee the city is making crime incident data available. For example, the city of Saint Louis does not (to my knowledge) have a Socrata portal, but the St. Louis Metropolitan Police Department does provide monthly crime reports containing tabulated incident data. In future posts, I’ll be focusing on crime data for Chicago, St. Louis, San Francisco, and Seattle:

City Crime data link
Chicago https://data.cityofchicago.org/Public-Safety/Crimes-2001-to-present/ijzp-q8t2
St. Louis http://www.slmpd.org/Crimereports.shtml
San Francisco https://data.sfgov.org/Public-Safety/SFPD-Reported-Incidents-2003-to-Present/dyj4-n68b
Seattle https://data.seattle.gov/Public-Safety/Seattle-Police-Department-Police-Report-Incident/7ais-f98f
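As an illustration of the developer API, here is a hedged sketch that pulls a few Chicago incidents through Socrata’s SODA endpoint for the dataset listed above. The $limit parameter is standard SoQL; the field names date and primary_type are assumptions based on the portal’s column names and should be verified against the dataset’s metadata:

// Fetch five rows from the Chicago crime dataset via the SODA API.
var url = 'https://data.cityofchicago.org/resource/ijzp-q8t2.json?$limit=5';

fetch(url)
    .then(function (resp) { return resp.json(); })
    .then(function (rows) {
        rows.forEach(function (r) {
            console.log(r.date, r.primary_type);  // assumed field names
        });
    });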


Data access

For a broad overview of current trends in how city governments are making crime data available to the public, check out the following two-part article the Sunlight Foundation published earlier this fall:

A. Green, ‘The Landscape of Municipal Crime Data’, The Sunlight Foundation blog, 9/10/13. [link]

A. Green, ‘The Impact of Opening Up Crime Data’, The Sunlight Foundation blog, 9/12/13. [link]

In the first article, the author summarizes the broad range of levels of effort and quality of open data practices in U.S. cities, while the second highlights some of the notable success stories of citizens putting available data to good use. Even more valuable is the raw set of research notes the author has made available via Google docs, which includes additional links to data sources.

An important issue to be aware of when looking for crime data is the question of data ownership and the potential legal pitfalls stemming from it. A few years ago, the company Public Engines (which operates CrimeReports) sued ReportSee (which operates SpotCrime) for programmatically scraping its website; the federal case eventually reached a settlement. The details of the lawsuit can be found in this article:

M. Masnick, ‘Who Owns Public Crime Data?’, TechDirt blog, 6/14/10. [link]

and a summary of the settlement and its ramifications is given in these two articles:

J. Ellis, ‘How public is public data? With Public Engines v. ReportSee, new access standards could emerge’, Nieman Lab, 2/17/11. [link]

A. Hochberg, ‘Disputes over crime maps highlight challenge of outsourcing public data’, Poynter news, 5/11/13. [link]

To briefly summarize: Public Engines had entered into agreements with certain city police departments, acting as a third party taking in data from the agencies and providing a public interface to the data. ReportSee had scraped data from the CrimeReports web site operated by Public Engines.

In the settlement, ReportSee is barred from using data from CrimeReports and also cannot ask for data from agencies that are contracted with Public Engines. That second restriction does not apply to everyone, however, since apparently the contracts do not forbid the participating police departments from making their data available to others. But in practice many departments are treating their contracts with third-party data providers as exclusive, which likely violates the open data laws that many states have legislated. At first glance, the settlement also seems to contradict the landmark Feist v. Rural ruling, which established that raw facts cannot be copyrighted. However, according to the Hochberg article, nowhere in its site licensing or in the lawsuit does Public Engines make any copyright claims on the data.

Administrative divisions

There are two main types of administrative division needed for this project: city neighborhoods and census blocks. In the table below I provide links to neighborhood boundaries for several cities:

City Neighborhood boundaries link
Chicago https://data.cityofchicago.org/Facilities-Geographic-Boundaries/Boundaries-Neighborhoods/9wp7-iasj
St. Louis XXX
San Francisco https://data.sfgov.org/Service-Requests-311-/Neighborhoods/ejmn-jyk6
Seattle https://data.seattle.gov/dataset/Neighborhoods/2mbt-aqqx


The United States Census Bureau provides socioeconomic statistics for regions at varying levels of granularity: nation, state, county, etc. The finest granularity is the so-called census block. These blocks are of interest for this project because the Census Bureau assigns demographic data, such as population, to them, which will allow me to calculate a ‘per capita’ crime rate. Census block boundary files and demographic data can be downloaded directly from the Census Bureau site; the links are provided below. Note: obtaining demographic data such as population is a multistep process, outlined in this pdf.

Resource link
TIGER/Line® shapefiles http://www.census.gov/cgi-bin/geo/shapefiles2013/main
Demographic data http://factfinder2.census.gov/


Motivation and Goals

This project is an attempt to build a tool for visualizing a side-by-side comparison of neighborhood livability metrics. My initial focus is on crime statistics, but what I’m building should be extensible to other kinds of socioeconomic attributes. In this post I describe my motivation and goals for the project.

Backdrop

I became interested in crime mapping about a year ago when I was relocating to St. Louis for a new job and had done a bit of research weighing the pros & cons of different neighborhoods. St. Louis has a reputation for having a higher crime rate than most other cities in the U.S., consistently placing toward the top of nationwide rankings, and this certainly played on my mind at the time. I ended up moving into a downtown loft, which is a fifteen-minute walk from where I work. After living here for nearly a year, I’ve grown to love this diverse city and all it has to offer: it has a rapidly growing, supportive entrepreneur community, a vibrant tech community with a wide range of active meetup groups, a great selection of restaurants and microbreweries, the beautiful historic Forest Park filled with top-notch museums, frequent neighborhood festivals, a vital sports culture, and on and on….

St. Louis, then, seems to embody a glaring contradiction: how can such a vibrant city rank as one of the most dangerous cities in the U.S. to live in? It challenges the natural tendency of popular media to present an overly simplified, cut-and-dried narrative that is easy to comprehend. St. Louis residents, both long-time natives and recent transplants, generally feel the predominant national conversation about St. Louis is unfairly dominated by crime statistics, overshadowing all of its other redeeming qualities; this can have a tangible impact on the local economy by deterring businesses and individuals from moving here.

There is a recent push by community leaders to try to change the conversation by focusing on a critical flaw in how the statistics are reported: the numbers calculated for St. Louis are aggregated only at the core city level and exclude the surrounding suburbs of the larger metropolitan area, in sharp contrast to most other cities, where the crime rate is calculated for the entire metropolitan area. City and county officials and community activists are exploring the possibility of an eventual city-county merger, but in the short term, regional police leaders are considering formally combining city and county crime reporting when compiling Missouri’s statistics for the FBI, whose data underlie the annual city crime rankings. Such a merger would have lowered the city’s rank in 2012 from 3rd to 8th.

To some, combining city and county statistics may seem like a statistical sleight of hand to improve the overall numbers without actually addressing the underlying problems leading to elevated crime. It’s true that such a measure is aimed at the perception of crime, which is only a small part of law enforcement’s overall strategy; they are also actively exploring and deploying a variety of measures to lower crime, such as hot spot and neighborhood policing, and grassroots community initiatives such as SiRV that aim to reduce the factors that produce a culture of violence in communities.

A different approach

My interest in crime mapping is limited to the scope of addressing how crime statistics are calculated and conveyed: I believe that better tools for visualizing and understanding crime rates can be built and made freely available, and that people would benefit from such an effort.

The core problem I’d like to solve is: how can one meaningfully compare crime in one region to crime in some other region? If a national ranking is to be meaningful, then the parameters used for each city, such as the defined boundary, must be programmatically determined in a way that ensures the resulting statistics are properly normalized across all cities. It shouldn’t be possible for an individual city to alter the way it reports crime in order to affect the results of the calculation. The effort in St. Louis to combine city and county reporting is an attempt to correct for this lack of normalization, which is a good thing. But it shouldn’t be necessary if the underlying calculation is properly normalized to begin with.

I feel crime rates aggregated over an entire city or metro area are not of much value. It’s much more useful to be able to compare areas on a human scale of several blocks within one’s neighborhood. Being able to compare neighborhoods between cities side by side is useful because it provides a relational context to better interpret the meaning of the numerical values.

So, one of my central goals is to build a tool that allows the user to choose a neighborhood in one city and compare it to a neighborhood in another city, and to do this comparison in a statistically meaningful, ‘apples-to-apples’ way, so that they can judge for themselves the relative safety of a neighborhood. Such a tool could also be extended to other types of data, such as socioeconomic data and livability metrics, to gain a more holistic profile of a neighborhood.

This type of tool does not exist freely on the web or in a mobile app, and so, as a software engineer and physicist, I feel I am uniquely qualified to do the end-to-end groundwork required to build such a thing. For me, it’s critical to be very transparent about how the data is being manipulated: ideally, this type of tool would allow users to trace the full computational path if they so desired, to find out what assumptions and choices lead to the final visualization.

There are several sites that provide crime maps on the web: CrimeReports, CrimeMapping, RAIDS Online, and SpotCrime. As far as I can tell, the sole visualization mode offered by these sites is to show the crime incident data as scatter points on a map, which doesn’t provide much insight. The key service these sites provide, which is by no means trivial, is that they aggregate the crime data from many different city municipalities into a unified interface. Simply showing the raw incident data has the virtue of not introducing any subjective biases into the data, which can sometimes happen when building a thematic map (e.g. a heat map or choropleth map) if one isn’t careful.

On the other end of the spectrum, there are several feature-rich GIS systems capable of advanced geographical data analysis, but these have a fairly steep learning curve that makes them more suitable for experts than for the average web or mobile app user. The leader in this space (by a long stretch) is the ArcGIS software, but there are some open source tools out there as well, such as QGIS.

I believe it’s possible to find a balance between these two extremes: one that provides the user more insight than can be gleaned from the raw incident data alone, but at the same time offers a streamlined, elegant interface that doesn’t require expert-level skill to interpret the information being visualized.

There are a handful of innovative sites that have taken a step in this direction by offering more informative thematic maps. The real estate site Trulia provides a density map of crime aggregated over census blocks (as of Dec. ‘13, they don’t have data for St. Louis). The open government site AxisPhilly has created an interactive choropleth map showing how crime is changing over time in Philadelphia, aggregated by neighborhood, and the independent news site MinnPost provides a similar interactive map showing crime statistics broken down by neighborhood.

In the next section I’ll sketch an outline of what I hope to build.

Project goals

I’m doing this project ‘just for fun’, purely as an evening/weekend side-project. Aside from the reasons I outlined above, I’m drawn to this project because there is a mix of problems to be solved that engage different parts of my brain: data wrangling/modeling/analytics, infographics and UI/UX design, and software engineering.

My long term goal for this project is to build an iOS app for side-by-side comparison of the crime density between two different neighborhoods (either within the same city, or between two different cities). In due course, I will write out a set of user stories (probably using waffle.io, which is integrated with GitHub issues). However, realistically I am still a month or two away from embarking on that.

In the short term, this project will be a series of exploratory spikes as I make my way through the fundamentals of geographical analytics and figure out how to best achieve the desired features. I have already begun working my way through various articles and books – in a later post I’ll summarize these resources.

Here is a rough sketch of features and related issues that I’ll be exploring:

  • obtaining and working with crime incident data: be able to translate between shapefiles and GeoJSON files. Also understand the hierarchy of crime categories.
  • transform between map projections: do computations in a planar distance-preserving projection, but then project into whatever projection is appropriate for map visualization (typically spheroidal lat/long coordinates).
  • binning of points on a discrete spatial grid: try to minimize subjectivity in the following (i.e. make determinations algorithmically if possible)
    • choice of grid geometry: square, hexagonal, Voronoi, or irregular polygons like census blocks.
    • choice of grid element size (e.g. 100 feet, 300 feet, 1000 feet, etc).
    • be able to project between grids (i.e. the Census Bureau has various indicator data assigned to block polygons, like population, so one needs to be able to translate these values onto other grids).
  • applying Gaussian smoothing to estimate incident density: what is the best choice of smoothing radius? (a first sketch of binning and smoothing appears after this list)
  • time binning and smoothing: what are good time interval values for binning and for smoothing (moving average)?
  • determine day/night for an incident, given time of day and lat/long.
  • be able to apply spatial clustering algorithms, such as computing the nearest neighbor index and the Getis-Ord Gi* statistic. As part of this, understand the notion of a random distribution as a null hypothesis.
  • also be able to cluster based on a crime-profile similarity metric
  • choice of color scale for visualizing density: probably the best choice will be to base it on quantiles of the density distribution for the region(s) of interest
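As a starting point for the binning and smoothing items above, here is a first sketch in JavaScript. It assumes the incident points have already been projected into planar coordinates in meters (per the projections item), bins them on a square grid, and smooths the counts with a truncated Gaussian kernel; the cell size and smoothing radius are exactly the free parameters the list asks about:

// A first pass at binning incident points (already projected to planar x/y in
// meters) onto a square grid and smoothing the counts with a Gaussian kernel.
// Free parameters: cellSize (meters per grid cell) and radius (the smoothing
// bandwidth sigma, in meters).
function binAndSmooth(points, minX, minY, nx, ny, cellSize, radius) {
    // 1. Bin: count the points falling in each grid cell.
    var counts = new Array(nx * ny);
    for (var k = 0; k < counts.length; k++) counts[k] = 0;
    points.forEach(function (p) {
        var ix = Math.floor((p.x - minX) / cellSize);
        var iy = Math.floor((p.y - minY) / cellSize);
        if (ix >= 0 && ix < nx && iy >= 0 && iy < ny) counts[iy * nx + ix]++;
    });

    // 2. Smooth: convolve with a Gaussian kernel truncated at 3 sigma,
    //    renormalizing at the grid edges so border cells are not biased low.
    var sigma = radius / cellSize;   // bandwidth in cell units
    var half = Math.ceil(3 * sigma);
    var smoothed = new Array(nx * ny);
    for (var iy = 0; iy < ny; iy++) {
        for (var ix = 0; ix < nx; ix++) {
            var acc = 0, norm = 0;
            for (var dy = -half; dy <= half; dy++) {
                for (var dx = -half; dx <= half; dx++) {
                    var jx = ix + dx, jy = iy + dy;
                    if (jx < 0 || jx >= nx || jy < 0 || jy >= ny) continue;
                    var w = Math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma));
                    acc += w * counts[jy * nx + jx];
                    norm += w;
                }
            }
            smoothed[iy * nx + ix] = acc / norm;
        }
    }
    return smoothed;   // nx*ny row-major array of smoothed densities
}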

Blog goals

This blog is primarily for my own personal record keeping: it’s a running account as I figure out how to build this tool. At times my notes may have the feel of a tutorial, but anyone happening across this site should keep in mind that I’m learning this material as I go and do not pretend to be an expert in crime mapping or geoanalytics. On the other hand, sometimes it’s useful to see how someone else was able to make sense of a subject they are learning for the first time.

This blog also accompanies a GitHub repository where I plan to make my code available. Initially I will be tapping into the scientific computing platform Mathematica (recently rebranded as the Wolfram Language), which is my go-to tool for these kinds of projects (I worked at Wolfram for nearly six years before pivoting my career into mobile app development). Eventually I’ll be transitioning over to coding everything in Python, since there are so many open-source geo-processing tools available. As a side effect, I think it will be useful to have two distinct implementations to help verify and validate as I go along.