Government Technology's Complicated History
The public sector has been notoriously slow to embrace technology. Is that finally changing?
This is part of Governing's special 30th anniversary coverage.
Over the course of three decades, I’ve written extensively about government technology. While the technologies themselves have gotten faster and cheaper, leading to some remarkable innovations and disrupting entire industries, change has been a lot slower in state and local government. But that’s not necessarily a bad thing: One of government’s essential roles is to enable entrepreneurship and let the private sector develop best practices. Indeed, the three biggest evolutions in government technology over the past 30 years bear that out.
When Governing started publishing in 1987, the era of decentralized computing was about to begin. That year, state and local governments were still firmly in the mainframe computing camp, using big iron primarily to collect, sort and file data for the big government programs of the day, including Medicaid and unemployment insurance. But in the private sector, mainframes were on the way out.
Companies were replacing mainframes with personal computers, which were a lot cheaper. The lower costs enabled states and localities to build systems capable of tackling more specific programs and processes, such as document management or city permitting. But running and managing so much more technology became complicated. Governments needed someone who could take charge of the different types of computers, networks and storage devices and keep everything up to date and connected.
The rise of the chief information officer in the 1990s coincided with the rise of the internet, which brought us email, websites and more information than we knew what to do with. The era of e-government was underway, and some savvy CIOs saw an opportunity to begin delivering government services digitally. The battle cry was, “Services online, not in line.” The goal was to allow citizens to conduct business on government websites, and if there was a fee, pay it online too.
E-government would go on to save millions in labor costs, but its adoption was slower than anticipated. CIOs were able to demonstrate the functionality of digital services, but they couldn’t make agencies implement them. One big reason: What CIOs saw as labor-saving automation, agencies and their workers viewed as job-killing technology.
The advent of the smartphone, though, forced many of these agencies to get on board. Mobile devices changed citizens’ expectations: They wanted to engage with government through their phones. The rise of smartphones a decade ago coincided with the beginning of a new era of open government. The advances of the e-government era enabled states and localities to introduce loads of new services and, as a result, collect massive amounts of data. Thanks to public demand, governments started sharing that data with citizens. The trend has spawned a wave of startup companies that have taken this wealth of data and created third-party government services. Social media has also helped the open government era to grow, making the public sector more accessible and more accountable.
These three evolutions have brought government technology to an interesting but extremely challenging point in time. Step into a government IT shop today and you’ll still find mainframe computers dating back to 1987 and clunky personal computers from the 1990s. Many governments are running on old, outdated systems, making them vulnerable to cyberattacks.
But in that same IT shop today, you’re just as likely to find an experiment involving artificial intelligence, driverless cars or next-generation cloud services. In the next 30 years, those technologies and others will continue to disrupt the way governments deliver services to citizens. Since government CIOs came along, states and localities have shown they can become more nimble in adopting and responding to technological shifts. Over the three decades to come, they’ll have to do all that and more.