Data Analysis and the Promise of Speedy Government
Combining data with new analytics techniques can help governments react nimbly and purposefully. It's hard work, but the potential payoff is worth it.
In times of emergency, good government means fast government, able to react nimbly and purposefully to new conditions as they arise. Having speedy government tomorrow, though, depends on preparing today: prepositioning critical resources and recognizing that seemingly random events often fit into actionable patterns. By understanding these patterns now through clever combinations of data and new modeling techniques, governments can improve their responses and become more effective.
In large cities, for example, must police officers simply patrol and hope to come across a crime, or might they use data on offenders, past crimes, neighborhood conditions and time of day to focus surveillance? Hot-spot crime analysis has grown in popularity for just this reason, giving patrols a way to concentrate their energies on specific areas with a higher propensity for crime. Recent advances continue to refine these statistical methods: a new system developed for, and now in use by, Seattle, Los Angeles and Santa Cruz feeds past and current crime data into a model originally built to predict earthquake aftershocks, adapting it to forecast crime as well.
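The aftershock analogy rests on a simple intuition: each incident raises the near-term risk of further incidents nearby, with that influence fading over time and distance. Below is a minimal, hypothetical sketch of that idea -- the incident data, grid size and decay parameters are all invented for illustration, and real systems use far more sophisticated statistical models:

```python
import math
from collections import defaultdict

# Hypothetical past incidents on a 10x10 city grid: (x, y, days_ago).
incidents = [(2, 3, 1), (2, 4, 2), (3, 3, 0), (8, 8, 30), (2, 3, 5)]

def hotspot_scores(incidents, decay_days=7.0, radius=1.5):
    """Score each grid cell by nearby, recent incidents.

    Mimics the aftershock intuition: every past crime raises the
    predicted risk of surrounding cells, with its influence decaying
    exponentially in both elapsed time and distance.
    """
    scores = defaultdict(float)
    for gx in range(10):
        for gy in range(10):
            for (x, y, days_ago) in incidents:
                dist = math.hypot(gx - x, gy - y)
                if dist <= radius * 3:  # ignore far-away incidents
                    scores[(gx, gy)] += (
                        math.exp(-days_ago / decay_days)
                        * math.exp(-dist / radius)
                    )
    return scores

scores = hotspot_scores(incidents)
top = max(scores, key=scores.get)  # the cell a patrol would prioritize
```

With these toy numbers, the cluster of recent incidents around (2, 3) outscores the single month-old incident at (8, 8), so the highest-risk cell is the one a patrol would be directed toward.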
Building a fully integrated system -- making different datasets speak to each other and instilling a cooperative strategy across departments -- is a lot of hard work. But being able to look at different phenomena at once can pay off both during emergencies and in the day-to-day functioning of a city. Chicago, for example, has been developing WindyGrid, a predictive-analytics platform that has begun to reveal relationships such as spikes in stolen trash bins when a block's streetlights go out. Those little extra costs add up, and now the city knows what kind of actions need to be taken while it works on a streetlight repair.
WindyGrid's planned capacity includes being able to preemptively react to a range of emergency situations -- knowing when a water main is likely to break, for instance, or being able to respond more quickly during a massive snowstorm. The goal is a preemptive platform comprehensive enough to solve issues in areas ranging from infrastructure to public safety while accessible enough that a city employee could simply query the database when he or she has a hunch that might result in better service and big savings.
For decades, cities have worked to optimize their ambulance response times by having drivers park in locations with a high incidence of emergencies rather than wait in firehouses. In New York City, after an initial brainstorming process in which various theories about placement were introduced, the data team was able to granularly measure 911 responses from dial to arrival, which enabled systemic improvements across the entire response. More important, the effort demonstrates the iterative nature of this work: common-sense hypotheses refined into data-driven enhancements that ultimately yield a deep, comprehensive understanding of the entire 911 transaction.
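Measuring a response "from dial to arrival" means timestamping each stage of a call and averaging the stage durations to see where the time actually goes. The sketch below is a hypothetical illustration of that breakdown -- the call records and stage names are invented, not New York City's actual schema:

```python
from datetime import datetime

# Hypothetical 911 call records: a timestamp for each stage of the call.
calls = [
    {"dial": "2012-06-01 10:00:00",
     "dispatch": "2012-06-01 10:01:30",
     "arrival": "2012-06-01 10:08:10"},
    {"dial": "2012-06-01 11:15:00",
     "dispatch": "2012-06-01 11:18:05",
     "arrival": "2012-06-01 11:24:00"},
]

FMT = "%Y-%m-%d %H:%M:%S"

def segment_seconds(call):
    """Break one call into per-stage durations, in seconds."""
    dial = datetime.strptime(call["dial"], FMT)
    dispatch = datetime.strptime(call["dispatch"], FMT)
    arrival = datetime.strptime(call["arrival"], FMT)
    return {
        "call_handling": (dispatch - dial).total_seconds(),
        "travel": (arrival - dispatch).total_seconds(),
        "total": (arrival - dial).total_seconds(),
    }

# Average each segment across calls to see which stage to improve.
segments = [segment_seconds(c) for c in calls]
avg = {k: sum(s[k] for s in segments) / len(segments)
       for k in segments[0]}
```

Even this toy breakdown shows the point of the exercise: if travel time dominates the average, repositioning ambulances helps; if call handling dominates, the fix lies in dispatch.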
The timely presentation of data also can help citizens when disaster hits. When Hurricane Sandy hit the Northeast in 2012, governments and agencies at all levels used Web-based GIS platforms to provide specific, timely information. Citizens could act before the storm hit, making use of such tools as New York City's online map of evacuation zones and shelters. Once the storm hit, they could respond more intelligently, thanks to online tools such as the Federal Emergency Management Agency's Check Your Home map, which let evacuated residents view satellite photography and know whether they needed to prepare for damage to or destruction of their homes. In disasters the focus remains on getting information out as quickly as possible, but we can expect to see more analytical tools being developed to make this data go even further, aiding in the allocation of resources and the concentration of efforts.
Whether governments are responding to routine daily service needs or to catastrophe, data and its sophisticated analysis hold the keys to efficiency and resiliency. And as budgets become tighter, we need that efficiency and focus more than ever.
This column has been revised to clarify New York City's use of data analysis to improve its 911 response process.