I came across an interesting dataset the other day while reading the Boston Globe (online, of course; does anyone buy actual newspapers anymore?). The story, originally from Ward 5 Online, discussed the number of properties in my current hometown, Somerville, Massachusetts, that have tax liens over $10,000! The article provided a table from the City that included the address, owner name, and amount in back taxes for each property. The story reported that this data was made available by the City’s aldermen through a request regarding derelict buildings that were shedding bricks (yikes!).
After I read the story I started searching for some of the properties to see if they were in my neighborhood. I quickly realized that this data would be better utilized on a map. Thanks to Google Fusion Tables I was able to quickly modify the table and map the addresses and back-tax balances. In the map I created I included only the amount owed, the location, the status (many of the properties have been taken over by the City), and the number of years each property is delinquent on its taxes.
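Trimming the table down to just those fields is a simple column-selection step. A minimal sketch of that kind of cleanup in Python, using made-up column names and sample rows (the City's actual table headers and values will differ):

```python
import csv
import io

# Hypothetical sample of the City's table -- column names and rows are
# invented for illustration, not the real data.
RAW = """address,owner,amount_owed,status,years_delinquent
10 Example St,Jane Doe,12500,private,3
22 Sample Ave,City of Somerville,27800,city-owned,6
"""

# Keep only the fields shown on the map: amount owed, location, status,
# and years delinquent. Owner names are dropped before publishing.
KEEP = ["address", "amount_owed", "status", "years_delinquent"]

rows = list(csv.DictReader(io.StringIO(RAW)))
trimmed = [{k: r[k] for k in KEEP} for r in rows]

# Write the trimmed table back out as CSV, ready to upload to a
# mapping tool that geocodes the address column.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=KEEP)
writer.writeheader()
writer.writerows(trimmed)
print(out.getvalue())
```

The same trim can of course be done by deleting columns in a spreadsheet; the script just makes the step repeatable if the City releases an updated table.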
Back Taxes Legend: • $10k to $15k • $15k to $20k • $20k to $25k • >$25k
This data could be further analyzed alongside a number of other free datasets, including census, income, or home sales data. One could then perform any number of analyses to see whether there are any spatial patterns in these large tax liens throughout the City (aka, future blog idea).
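The simplest version of that kind of analysis is aggregating liens by some census geography and pairing the totals with a demographic variable. A toy sketch, with entirely made-up tract IDs, lien amounts, and incomes (none of these numbers come from the actual data):

```python
# Hypothetical lien records and census tract incomes, for illustration only.
liens = [
    {"tract": "3501", "amount": 12000},
    {"tract": "3501", "amount": 26000},
    {"tract": "3502", "amount": 18000},
]
median_income = {"3501": 58000, "3502": 72000}

# Total back taxes owed per census tract.
totals = {}
for lien in liens:
    totals[lien["tract"]] = totals.get(lien["tract"], 0) + lien["amount"]

# Pair each tract's lien total with its median income -- the table one
# would feed into a correlation or choropleth comparison.
by_tract = [(t, totals[t], median_income[t]) for t in sorted(totals)]
for tract, owed, income in by_tract:
    print(tract, owed, income)
```

With the real data, the join key would come from geocoding each address to its census tract, and the income figures from the Census Bureau's published tables.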
Also, this map is not perfect. We all know there are limitations to mapping with Google, especially when it comes to thematic mapping, but overall one cannot complain about the speed and efficiency of creating a simple but effective map with Google Fusion Tables. One item I would like to see from Google is an embeddable legend. Esri’s JavaScript API can do it, so why can’t Google’s? Now, we could have a discussion about how Google is collecting all of this data, having us create it for them (for free), and then doing who knows what with it, but that is for another day (aka, another future blog post).
Over the next couple of days I will take my dataset over to GeoCommons and work with their tools to create a better map and perhaps do some analysis (aka, another future blog post).
Until next time…