
Creating a TRULY Interactive Map of Craft Breweries in VA Using the leafletR Package (Guest Blog Post by Dr. Keegan Hines)

It’s a good feeling when a great friend who is smarter than you offers to write a blog post, for your blog, that’s better than anything you’ve written so far. Friends, colleagues, people who’ve not yet realized they are at the wrong site: please allow me to introduce to you the awe-inspiring Dr. Keegan Hines. He got his PhD in neuroscience from the University of Texas at Austin in 2014 and is now a data scientist doing some super-secret-James-Bond-machine-learning work for a DoD contractor near D.C. When he is not breathing life into spatially-centered instructional R blogs, he is part of an improv comedy troupe, does some consulting work, and serves a mean campfire omelet. Without further ado…

Microbreweries and Interactive Maps With Leaflet

This is a guest post from Keegan Hines; he’s a neat fella, and you can follow him on the internet.

This post is about interactive data visualizations and some powerful new tools that are available to the R community. Moving beyond static data graphics toward interactive experiences allows us to present our audience with much more complex information in an easily digestible way. To power these interactive graphics, we’re going to utilize tools such as HTML and JavaScript, technologies that drive the web-based interactive experiences you have every day. But the best part is that we’ll benefit from these technologies without having to learn anything about web development. We’re going to create some amazing things using only R!

As a guiding example, let’s return to a previous blog post where Paul visualized the locations of microbreweries in Virginia. In that post, Paul introduced Plotly, a super cool company that allows you to create and deploy interactive graphics on their web-based service. Here, we’re going to do this all ourselves, with help from a new R package called leaflet. So let’s jump right in.

Here’s some boilerplate stuff. We need to install the package from GitHub and then load it.


# install the development version from GitHub, then load the packages
devtools::install_github("rstudio/leaflet")
library(leaflet)
library(ggmap)
    

So first thing, let’s grab a location that we might want to put on a map. I’ll use a function from the ggmap package.

      
somePlace <- ggmap::geocode("Washington,DC")
somePlace  # a one-row data frame with lon/lat columns (roughly -77.04, 38.91)

So we have a data frame (with one row) containing lat/lon coordinates for some arbitrary point in Washington, DC. We’re going to use functions from the leaflet package to generate a map around this point.

      
leaflet(somePlace) %>% addTiles() %>% addMarkers()
      

Now we have this draggable, zoomable, interactive map with a single line of R!
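As an aside, these maps are just HTML and JavaScript under the hood, so you can also save one as a standalone file to share or embed. Here’s a minimal sketch, assuming you have the htmlwidgets package installed (the file name is arbitrary):

library(htmlwidgets)

# store the map in an object, then write it out as a self-contained HTML file
m <- leaflet(somePlace) %>% addTiles() %>% addMarkers()
saveWidget(m, "dc_map.html")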

A little explanation of what we just did. In case it’s unfamiliar, I’ll first point out that we’re using the forward pipe %>% thing. The forward pipe was introduced in the magrittr package and has now been adopted in lots of places. The idea is that we can pass the output of a function as the input to the next function. This allows us to write code that reads left to right and is more aligned with our logic. For example:

      
# this is nested and confusing; we have to read it inside-out
sqrt(sum(c(1, 2, 3)))

# this is sequential and awesome; it reads left to right
c(1, 2, 3) %>% sum() %>% sqrt()

So back to leaflet. The first function we use is called leaflet(), and it returns a base leaflet object, sort of the starting point for everything we might do. We passed our data frame as an argument to leaflet(), and so any later functions that might require data will look to this data frame.

We then sent the output of leaflet() to another function, addTiles(). This is because the output of leaflet() doesn’t have enough visual information to actually create a map – we haven’t provided enough detail yet about what we want. The function addTiles() updates the leaflet object by providing the visual look and feel through different “tiles”. In fact, there are many different styles of map we could make, just by choosing different tiles. Here are some examples:


leaflet(somePlace) %>% addProviderTiles("Stamen.Watercolor") %>% addMarkers()


leaflet(somePlace) %>% addProviderTiles("Stamen.Toner") %>% addMarkers()

The full list of available tiles is here.

And so the third function in this simple example is addMarkers(). This function’s purpose is pretty obvious and results in the big blue marker thing on the map. What it does is look through the provided data frame for any columns with names similar to “lat” or “lon” and then plot them. And it’ll do so for every row in the data frame, so it’s effortless to put lots of points on a map, as we’ll see below. There are also a few other functions that are similar but plot slightly different things. You might be able to guess what addCircles() or addPolylines() are capable of, but as an example:

          
leaflet(somePlace) %>%
    addProviderTiles("Stamen.Toner") %>%
    addCircles(radius = 400, color = 'firebrick')
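And since I mentioned addPolylines(), here’s a quick sketch that connects our point to a second, made-up point nearby (the second set of coordinates is purely hypothetical):

# two points to connect; the second is invented just for illustration
twoPlaces <- data.frame(lng = c(-77.04, -77.09),
                        lat = c(38.91, 38.88))
leaflet(twoPlaces) %>%
    addTiles() %>%
    addPolylines(lng = ~lng, lat = ~lat)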

So let’s move on to our more interesting example – the breweries. I’ve scraped a list of microbreweries in Virginia and gotten their names, websites, addresses, and so on. Since I also want lat/lon info, I’ve used ggmap::geocode() to estimate those coordinates. The result is a dataframe called ‘breweries’ that has 106 rows and looks like this:

            
> names(breweries)
[1] "Name"    "Address" "Phone"   "Website" "lat"     "lng"    		

> head(breweries[,c(1,4:6)])
	 Name                         Website                 lat             lng
1  Wolf Hills Brewing Co    www.wolfhillsbrewing.com     36.71231    -81.96560
2  Blue Mountain Brewery www.bluemountainbrewery.com     37.96898    -78.83499
3 Quattro Goomba Brewery       www.goombabrewery.com     38.98597    -77.61748
4   Hops Grill & Brewery          www.hopsonline.com     38.83758    -77.05116
5   Port City Brewing Co     www.portcitybrewing.com     38.80800    -77.10137
6  Loose Shoe Brewing Co    www.looseshoebrewing.com     37.56500    -79.06352
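(In case you’re curious, that geocoding step looks something like the following sketch, which assumes the street addresses live in the Address column:)

# geocode() accepts a vector of addresses and returns a data frame of lon/lat
coords <- ggmap::geocode(as.character(breweries$Address))
breweries$lng <- coords$lon
breweries$lat <- coords$lat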

So let’s put ’em on a map.

      
 leaflet(breweries) %>% addTiles() %>% addMarkers()
      

Pretty effortless, I’d say! This is great, except we don’t know which brewery is which; it’s just anonymous points on a map. We could try to add some text to the map, but remember, our goal is to utilize web-based interactivity. So we’re going to take advantage of a click-based popup by inserting the Name column of the data frame.

      
leaflet(breweries) %>% addTiles() %>% addMarkers(popup=breweries$Name)
      

And one final trick to make it just a little smoother. We want to add a hyperlink to each brewery’s website in the popup. Since our data frame has a column for all the websites, we could do this easily in a similar way to what we just did with the Name column. But we can take it a step further. Now, I promised you that we don’t need to know any web development stuff in order to make these maps (and we don’t!). But if you happen to have a little side knowledge, you can embed any HTML or JavaScript that you want. In this case, I’m going to use HTML’s <a> tag for hyperlinks, so that each brewery name actually links out to its website.

popup_style <- paste0("<a href=http://", breweries$Website, " target='_blank'>", breweries$Name, "</a>")

leaflet(breweries) %>% addTiles() %>% addMarkers(popup = popup_style)

Now we can easily zoom around and explore Virginia’s thriving craft-brew industry! You have to admit that’s a pretty awesome thing we were able to create with just a couple lines of R. And the interactivity allows us to encode a lot of information (locations, names, and websites of all the breweries) in a simple experience that any viewer can explore at their own pace. As you might guess, this is just the beginning of what we can do with leaflet, and there’s a great guide at RStudio’s site.

If you’re like me, you’re very excited about incorporating web-based interactivity in your data analyses with R. This general idea of wrapping JavaScript-based experiences into an easy-to-use R package is something that’s gaining a lot of traction lately. To me, this is one of the most exciting innovations in the R community in the last couple of years, and it’s taking off in many exciting directions. If you want to learn more, I’m developing a course for DataSociety entitled “Advanced Visualization With R”. In the course, we’ll explore many of these web-based technologies, including Leaflet, rCharts, Shiny and more, so look to sign up later this summer!

 

Adding Google Drive Times and Distance Coefficients to Regression Models with ggmap and sp


Space, a wise man once said, is the final frontier.

Not the Buzz Aldrin/Buzz Lightyear, Neil deGrasse Tyson kind (but seriously, have you seen Cosmos?). Geographic space. Distances have been finding their way into metrics since the cavemen (probably). GIS seems to make nearly every science way more fun…and accurate!

Most of my research deals with spatial elements of real estate modeling. Unfortunately, “location, location, location” has become a clichéd way to begin any paper or presentation pertaining to spatial real estate methods. For you geographers, it’s like setting the table with Tobler’s first law of geography: a quick fix (I’m not above that), but you’ll get some eye-rolls. But location is important!

One common method of taking location and space into account in real estate valuation models is to include distance coefficients (e.g. distance to downtown, distance to the center of the city). Geographers have the straight-line calculation of distance well covered, and R can spit out distances between points in a host of measurement systems (Euclidean, great circle, etc.). A straight-line distance coefficient is a helpful tool when you want to reduce some spatial autocorrelation in a model, but it doesn’t always tell the whole story by itself. (Please note: the purpose of this post is to focus on the tools of R and to introduce elements of spatial consideration into modeling. I’m purposefully avoiding any lengthy discussion of spatial econometrics or other spatial modeling techniques, but if you would like to learn more about the sheer awesomeness that is spatial modeling, as well as the pitfalls and pros and cons of each approach, check out Luc Anselin and Stewart Fotheringham for starters. I also have papers being published this fall and would be more than happy to forward you a copy if you email me.) They are:

Bidanset, P. & Lombard, J. (2014). The effect of kernel and bandwidth specification in geographically weighted regression models on the accuracy and uniformity of mass real estate appraisal. Journal of Property Tax Assessment & Administration. 11(3). (copy on file with editor).

and

Bidanset, P. & Lombard, J. (2014). Evaluating spatial model accuracy in mass real estate appraisal: A comparison of geographically weighted regression (GWR) and the spatial lag model (SLM). Cityscape: A Journal of Policy Development and Research. 16(3). (copy on file with editor).

Straight-line distance coefficients certainly can help account for location, as well as certain distance-based effects on price. Say you are trying to model the negative externalities of a landfill in August; assuming wind is either random or non-existent, straight-line distance from the landfill to house sales could help capture the cost of said stank. Likewise with capturing the potential spill-over effects of an airport – the sound of jets diminishes as distance increases, and the path of sound is more or less a straight line.

But again, certain distance-based elements cannot be accurately represented with this method. You might expect ‘distance to downtown’ to have an inverse relationship with price: the further out you go, the more of a cost is incurred (in time, gas, and overall inconvenience) getting to work and social activities, so demand for these further-out homes decreases, resulting in cheaper homes (pardon the hasty economics). Using straight-line distances to account for commutes in a model presents some problems. (Aside: there is nary a form of visualization capable of presenting one’s point more professionally than Paint, and as anyone who has ever had the misfortune of being included in a group email chain with me knows, I am a bit of a Paint artist.) If a trip between a person’s work and a person’s home followed a straight line, this would be less of a problem (artwork below):

[Paint sketch: a straight-line commute between home and work]

But we all know commuting is more complicated than this. There could be a host of things between you and your place of employment that would make a straight-line distance coefficient an inept method of quantifying this effect on home values … such as a lake:

[Paint sketch: the same commute, blocked by a lake]

… or a Sarlacc pit monster:

[Paint sketch: the same commute, blocked by a Sarlacc pit monster]

Some cutting-edge real estate valuation modelers are now including a ‘drive time’ variable. DRIVE TIME! How novel is that? It presents a much more accurate way to account for a home’s distance – as a purchaser would see it – from work, shopping, mini-golf, etc. Sure, it’s been available in (expensive) ESRI packages for some time, but where is the soul in that? The altruistic R community has yet again risen to the task.

To put some real-life spin on the example above, let’s run through a very basic regression model for modeling house prices.

sample = read.csv("C:/houses.csv", header=TRUE)
model1 <- lm(ln.ImpSalePrice. ~ TLA + TLA.2 + Age + Age.2 + quality + condition, data = sample)

We read in a csv file “houses” that is stored on the C:/ drive and name it “sample”. You can name it anything, even willywonkaschocolatefactory. We’ll name the first model “model1”. The dependent variable, ln.ImpSalePrice., is a log form of the sale price. TLA is ‘total living area’ in square feet. Age is, well, the age of the house, and quality and condition are dummy variables. The squared versions of TLA and Age are included to capture any diminishing marginal returns.
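(If your data doesn’t already contain the squared terms, here’s a quick sketch for creating them, assuming TLA.2 and Age.2 are simply the squares of TLA and Age:)

sample$TLA.2 <- sample$TLA^2
sample$Age.2 <- sample$Age^2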

AIC stands for ‘Akaike information criterion’. The Japanese statistician Hirotugu Akaike coined it in the 70’s, and it’s a goodness-of-fit measurement used to compare models fit to the same sample (the lower the AIC, the better). Formally, AIC = 2k - 2·ln(L), where k is the number of estimated parameters and L is the model’s maximized likelihood.

AIC(model1)
[1] 36.35485

The AIC of model1 is 36.35.
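(As a sanity check, you can reproduce that number by hand from the model’s log-likelihood; for a linear model, k counts the coefficients plus the residual variance:)

k <- length(coef(model1)) + 1              # coefficients, plus one for the residual variance
2 * k - 2 * as.numeric(logLik(model1))     # same value AIC(model1) returns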

Now we are going to create some distance variables to add to the model. First we’ll do the straight-line distances. We make a matrix  called “origin” consisting of  start-points, which in this case is the long/lat of each house in our dataset.

# build a two-column matrix of start points (longitude first); the column names here are assumed
origin <- cbind(sample$lng, sample$lat)

We next create a destination – the point to which we will be measuring the distance. For this example, I decided to measure the distance to a popular shopping mall downtown (why not?). I obtained the long/lat coordinates for the mall by right clicking on it in Google Maps and clicking “What’s here?” (I also could’ve geocoded it in R).

# the mall's longitude and latitude (the coordinates shown in the drive-time output below)
destination <- c(-76.288018, 36.848950)

Now we use the spDistsN1 function from the sp package to calculate the distances. We set longlat=TRUE so we get the values from origin to destination in kilometers. The last line simply adds this newly created column of distances to our dataset and names it dist.

library(sp)

km <- spDistsN1(origin, destination, longlat = TRUE)
sample$dist <- km

This next command I learned from a script on GitHub – initially committed by Peter Schmiedeskamp – which alerted me to the fact that R is capable of grabbing drive times from the Google Maps API (via ggmap’s mapdist() function). You can learn a great deal from his/their work, so give ’em a follow!

library(ggmap)
library(plyr)

# query the Google Maps API for each house-to-mall pair; this call is a plausible
# sketch, since mapdist()'s output matches the columns printed below
google_results <- mapdist(from = as.character(sample$location),
                          to = as.character(sample$locMall),
                          mode = "driving")

location is the column containing each house’s lat/long coordinates, in the following format: (36.841287,-76.218922). locMall is a column in my data set with the lat/long coords of the mall in each row. Just to clarify: every cell in that column has the exact same value, while each cell of location is different. Also, something amazing: mode can be “driving,” “walking,” or “bicycling”!

Now let’s look at the results:

head(google_results,4)
from to m km miles seconds minutes
1 (36.901373,-76.219024) (36.848950, -76.288018) 10954 10.954 6.806816 986 16.433333
2 (36.868871,-76.243859) (36.848950, -76.288018) 7279 7.279 4.523171 662 11.033333
3 (36.859805,-76.296122) (36.848950, -76.288018) 2101 2.101 1.305561 301 5.016667
4 (36.938692,-76.264474) (36.848950, -76.288018) 12844 12.844 7.981262 934 15.566667
hours
1 0.27388889
2 0.18388889
3 0.08361111
4 0.25944444

Amazing, right? And we can add this to our sample and rename it “newsample”:

# bolt the Google results onto the original sample (one row per house in both)
newsample <- cbind(sample, google_results)

Now let’s add these variables to the model and see what happens.

# model1 plus the straight-line distance variable
model2 <- lm(ln.ImpSalePrice. ~ TLA + TLA.2 + Age + Age.2 + quality + condition + dist, data = newsample)
AIC(model2)
[1] 36.44782

Gah, well, no significant change. Hmm…let’s try the drive-time variable…

# model1 plus the drive-time variable (minutes is used here; any of the time columns would do)
model3 <- lm(ln.ImpSalePrice. ~ TLA + TLA.2 + Age + Age.2 + quality + condition + minutes, data = newsample)
AIC(model3)
[1] 36.10303

Hmm…still no dice. Let’s try them together.

# both distance variables together
model4 <- lm(ln.ImpSalePrice. ~ TLA + TLA.2 + Age + Age.2 + quality + condition + dist + minutes, data = newsample)
AIC(model4)
[1] 32.97605

Alright! AIC has been reduced by more than 2, so the two variables together meaningfully improve the model.

Of course this is a grossly reduced model, and it would never be used for actual valuation/appraisal purposes, but it does lay the elementary groundwork for creating distance-based variables, integrating them, and demonstrating their ability to marginally improve models.

Thanks for reading. So, to bring back Cap’n Kirk: I think a frontier more ultimate than space, in the modeling sense, is space-time – not Einstein’s, rather the ‘spatiotemporal’ kind. That will be for another post!

Toodles,

Paul

Throw some, throw some STATS on that map…(Part 1)


R is a very powerful and free (and fun) software package that allows you to do pretty much anything you could ever want. Someone told me there’s even code that allows you to order pizza (spoiler alert: you actually cannot order pizza using R :( ). But if you’re not hungry, the statistical capabilities are astounding. Some people hate code; their brains shut down and they get sick when they look at it, subsequently falling to the floor, restricted to the fetal position. I used to be that guy, but I have since stood up, gained composure, sat back down, and developed a passion for statistical programming. I hope to teach the R language with some more intuition in order to keep the faint-of-heart vertical and well.

Alright so for the start in this series, I’m going to lay the foundation for a Baltimore, MD real estate analysis and demonstrate some extremely valuable spatial and statistical functions of R. So without too much blabbing, let’s jump in…

For those of you completely new to R: the software is extended through packages, each of which performs different functions. People use R for so many different data-related reasons that including all or most of the packages by default would make it HUGE, so each one, housed on various servers located around the world, can be downloaded simply. The first time you use a package, you’ll need to install it. It will then be on your machine, and you will simply load it for each future use.

For the initial map creation, we need to install the following (click Packages -> Install package(s); holding Ctrl allows you to select multiple packages at a time, or run the one-liner after this list):

foreign
RgoogleMaps
ggmap
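Equivalently, you can install all three straight from the console:

install.packages(c("foreign", "RgoogleMaps", "ggmap"))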

Since these are now installed on our machine, we simply load these packages in each session we use them. Loading just ggmap and RgoogleMaps will automatically load the others we just downloaded. In each session, open a script, and once you've written out your code, highlight it and right-click "Run Line or Selection," or just press Ctrl+R. A quick note: unlike some other languages, such as SAS and SQL, R is case sensitive.

To load them, run:

library(ggmap)
library(RgoogleMaps)

We will name the object holding the map center CenterOfMap. Anything to the left of "<-" in R is the name, and anything to the right is the specified contents of the object. Now, for the map we're using, the shape of Baltimore behaves pretty well, so we can just type "Baltimore, MD" within the geocode() command (R is smart, and that's all it takes).

CenterOfMap <- geocode("Baltimore, MD")

Not all areas are as symmetrically well behaved as Baltimore. For other cases, my preferred method of displaying an area's entirety begins with entering the lat/long coordinates of your preferred center. For this, I go to Google Maps, find the area I wish to map, right click on my desired center, click "What's here?", and take the lat/long coordinates that are then populated in the search bar above. For Baltimore, I'm going to click just north of the harbor.

The code would then look like this:

CenterOfMap <- geocode(" 39.299768,-76.614929")

Now that we've told R where the center of our map will be, let's make a map! So remember, left of the "<-" will be our name. I'd say naming the map 'BaltimoreMap' will do.

Baltimore <- get_map(c(lon=CenterOfMap$lon, lat=CenterOfMap$lat),zoom = 12, maptype = "terrain", source = "google")
BaltimoreMap <- ggmap(Baltimore)
BaltimoreMap

Alright, to explain what just happened: get_map() is the command that constructs the map perimeter and lays down its foundation. I'm going to retype the code with annotations that will hopefully explain it more intuitively.

get_map(
  c(lon = CenterOfMap$lon,   # the longitude coordinate of the CenterOfMap object we created; the dollar
                             # sign extracts a piece of the object before it, e.g. ExcelSpreadsheet$ColumnA
    lat = CenterOfMap$lat),  # the latitude coordinate of the CenterOfMap object we created
  zoom = 12,                 # the zoom level of the map display; play around and see how it changes from, say, 5 to 20
  maptype = "terrain",       # we assigned "terrain," but there are others to suit your tastes and preferences (more below)
  source = "google"          # we assigned "google," but other agents provide map tiles too
)

And the grand unveiling of the first map...

Now that is one good lookin' map. Just a few lines of code, too.

I'll show you some other ways to manipulate it. I often set the map to black & white so that the contrast (or lack thereof) of the values plotted later is more defined. I prefer the Easter bunny/night club/glow-in-the-dark type spectrums, and so I usually plot on the following:

Baltimore <- get_map(c(lon=CenterOfMap$lon, lat=CenterOfMap$lat),zoom = 12, maptype = "toner", source = "stamen")
BaltimoreMap <- ggmap(Baltimore)
BaltimoreMap

We just set the night sky for the meteor shower. Notice that all we did was change maptype from "terrain" to "toner," and source from "google" to "stamen."

A few other examples:

Baltimore <- get_map(c(lon=CenterOfMap$lon, lat=CenterOfMap$lat),zoom = 12,source = "osm")
BaltimoreMap <- ggmap(Baltimore)
BaltimoreMap

This map looks great, but it's pretty busy - probably not the best to use if you will be plotting a colorful array of values later.

Here's a fairly standard-looking one, similar to the Google terrain map we covered above.

Baltimore <- get_map(c(lon=CenterOfMap$lon, lat=CenterOfMap$lat),zoom=12)
BaltimoreMap <- ggmap(Baltimore, extent="normal")
BaltimoreMap

And one for the hipsters...

Baltimore <- get_map(c(lon=CenterOfMap$lon, lat=CenterOfMap$lat),zoom = 12, maptype = "watercolor",source = "stamen")
BaltimoreMap <- ggmap(Baltimore)
BaltimoreMap

George Washington and the cartographers of yesteryear would be doing cartwheels if they could see this now. The upcoming installments in this series will cover:

1) Implementing Shapefiles and GIS Data
2) Plotting Statistics and other Relationship Variables on the Maps
3) Analyzing Real Estate Data and Patterns of Residential Crime and Housing Prices

Thanks for reading this! If you have any problems with coding or questions whatsoever, please shoot me an email (pbidanset[@]gmail.com) or leave a comment below and I'll get back to you as my schedule permits (should be quickly). Cheers.

All works on this site (spatioanalytics.com) are subject to copyright (all rights reserved) by Paul Bidanset, 2013-2015. All that is published is my own and does not represent my employers or affiliated institutions.