Government and the Predictive Power of Language
From preventing terrorism to spotting restaurant health violations, a form of artificial intelligence called natural language processing can help connect the dots.
As of October 2017, according to Britain's domestic intelligence service, about 850 Britons had traveled to Syria and Iraq to fight for the Islamic State. For the United Kingdom, that meant there was more to fear than the spread of extremism to other countries. The possibility that hardened Islamic State fighters would return to British soil intensified the threat of domestic terrorism.
In its efforts to predict and prevent such attacks, the U.K. has been employing not only conventional intelligence-gathering techniques but also one that previously wouldn't have been possible at scale: Using artificial intelligence, agencies charged with preventing terrorism can follow the growth of extremism through attitudes expressed online. While previously such an effort would have required an enormous commitment of manpower, computer programs can now scrape the internet and bring only high-risk cases to the attention of humans.
Computers can discern signs of danger in text because of the rapidly expanding field of natural language processing, or NLP. NLP is the branch of artificial intelligence dedicated to teaching computers to understand and use human language. It has massive implications not only for fighting extremism but for the broader capabilities of government as well.
Governments need to make decisions based on evidence, but most of the information they've collected to date is siloed, incompatible with other data, and not obviously divisible into categories -- unstructured data, in other words. Intelligence reports and interviews don't boil down into a spreadsheet any more than do letters to Congress or reader comments appended to online articles. But properly analyzed, all of them can provide important clues for decision-makers.
When we can analyze data, we find incredible uses for it -- reading mammograms, for instance, or recognizing faces, predicting crime sprees and even detecting pregnancy from subtle changes to a shopping list. Imagine the power of that kind of data analytics parsing the intelligence stored in the written (and spoken) word. NLP can power improvements, large or small, to any public-sector organization. It can transform efforts to analyze public feedback, improve predictions, help manage regulatory compliance and enhance policy analysis.
Analyzing public feedback is a common practice among retailers and other businesses; some can trace a customer complaint back to the specific interaction that prompted it. But tools built for the private sector may not be adequate for governments, which need assessment tools that meet their specific needs. For several years, for example, the city of Washington, D.C., has been using a branch of NLP called "sentiment analysis" to more accurately assess citizens' satisfaction using online sources. A better understanding of citizens' pain points can help administrators allocate resources and adjust processes to better serve constituents.
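To make the idea concrete, here is a deliberately minimal sketch of lexicon-based sentiment analysis -- the simplest form of the technique. The word lists and sample feedback are invented for illustration; production systems (including whatever Washington, D.C., actually uses) rely on far richer models.

```python
# Toy lexicon-based sentiment scorer. The word lists below are
# illustrative placeholders, not a real government tool's vocabulary.
POSITIVE = {"helpful", "fast", "friendly", "easy", "great"}
NEGATIVE = {"slow", "rude", "broken", "confusing", "waited"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; below zero suggests dissatisfaction."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Hypothetical citizen feedback scraped from online sources:
feedback = [
    "The clerk was friendly and the permit process was easy",
    "I waited two hours and the website is broken",
]
scores = [sentiment_score(f) for f in feedback]
```

Real sentiment analysis uses machine-learned models rather than fixed word lists, but the core idea is the same: turn free-form text into a number an administrator can aggregate and act on.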
Another example comes from predictive modeling for policing, which correlates written police reports with geolocation and crime statistics to suggest where to expect -- and prevent -- the next crimes. Police reports can be hasty and miss details, but patterns drawn from multiple sources can paint a more complete picture. NLP pulls those patterns from the verbal noise. Predictive modeling has helped cities from Chicago to Santa Barbara, most recently contributing to a 39 percent drop in violent crime over seven years in Durham, N.C.
The predictive powers of NLP are vast. The federal Food and Drug Administration uses NLP to predict adverse drug interactions. Australian regulators are using it to detect dishonest sales practices. The Chicago Department of Public Health predicts which restaurants are more likely to have dangerous violations of the health code by comparing temperature and location data with nearby sanitation complaints.
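A toy sketch shows how the restaurant-inspection idea works: mine nearby complaint text for risk keywords, then blend that signal with structured data to prioritize inspections. The keywords, weights, and feature names here are invented for illustration and are not Chicago's actual model.

```python
# Hypothetical risk-scoring sketch: combine keywords mined from
# complaint text with structured data. All weights are made up.
RISK_KEYWORDS = {"rodent", "roach", "sewage", "odor", "garbage"}

def complaint_risk(complaints: list[str]) -> int:
    """Count nearby complaints that mention a risk keyword."""
    return sum(
        any(kw in c.lower() for kw in RISK_KEYWORDS) for c in complaints
    )

def inspection_priority(days_since_last: int, complaints: list[str]) -> float:
    # Placeholder weights; a real model would learn these from
    # historical inspection outcomes.
    return 0.01 * days_since_last + 0.5 * complaint_risk(complaints)

score = inspection_priority(120, ["Rodent sighting in the alley",
                                  "Loud music after midnight"])
```

Inspectors could then be dispatched to the highest-scoring restaurants first, which is the essence of the approach described above.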
NLP also can help automate some regulatory compliance tasks. The federal Securities and Exchange Commission, for example, uses it to process two terabytes of swaps disclosures a day. Agencies that are required to declassify information can automate the identification of sensitive documents. And imagine the efficiency gains when computers can fill out forms in natural human language. A Deloitte study expects the cumulative gains from that and other cognitive technologies to free up a quarter of government workers' on-the-job hours.
Cognitive technologies like NLP will fundamentally transform what public servants can achieve. In some applications, such as military intelligence, whoever moves fastest now may build a decisive lead. Other applications may simply help public servants discover relationships in their areas of expertise that they never noticed before. Either way, what used to be isolated facts are forming into patterns.