The Snake Within

There are ways to spot an employee with an eye on bringing down the network.
October 1, 2008
By Ellen Perlman  |  Former columnist
Ellen Perlman was a GOVERNING staff writer and technology columnist.

A few months ago, a San Francisco IT employee hijacked the city's computer network and shut its users out. He was arrested, but the network situation was so dire that, word has it, Mayor Gavin Newsom himself visited the perpetrator's jail cell to get the passwords. Without them, the city was locked out of its payroll, health care, police and court systems.

A lot of people know this guy. Well, not him personally. But he fits a type. Other IT departments have been hit by similar insider attacks. They were "lucky" in that their woes never made headline news. But there probably were early signs of the impending hack -- if only the IT department had been able to recognize them.

Chief information officers and other government IT experts long have lectured that insider threats can be as insidious and harmful as those of an unknown prowler. Still, the San Francisco case dumbfounded many IT and security officers around the country who watched as the situation unfolded in the media. Although it shocked Dan Lohrmann, Michigan's chief information security officer, "that one individual could hold an entire city as big as San Francisco hostage," he concedes it could happen anywhere. To say it couldn't, Lohrmann notes, "is like guaranteeing you're not going to get in a car accident." That said, there are a lot of things that an IT office can do to prevent a hostile system takeover. Background checks, access controls, identity management and other processes and procedures help, as does making sure no one is left alone inside the system to do harm. But there's more to it than that.

In studying the problem, researchers at Carnegie Mellon University had little trouble finding 74 cases in which current or former employees or contractors used their authorized network access to cause harm. Sometimes, the organization might have helped push someone over the edge. The study indicated that management decisions on employee performance or some aspect of the organization "sometimes yield unintended consequences that increase risk of insider attack." Some of the common danger signs are outlined in a study on insider threats by the U.S. Secret Service and Carnegie Mellon's Software Engineering Institute Computer Emergency Response Team (CERT).

CERT came up with a model based on 49 cases that can help IT managers in state and local government understand and manage the risk of insider threats. Personality could be one clue. Psychologists who worked on the study say the kinds of people predisposed toward doing harm show certain signs. "There are one or two people who don't get along with people -- can't take criticism," says Dawn Cappelli, a senior member of the CERT staff. "Everyone feels they have to walk on eggshells around them." She has not seen a single insider case where someone said about the attacker, "Oh, but he's such a nice guy."

Any little or big thing -- a job change, a smaller raise than expected -- can set this person off. The person may start acting out or coming to work late, which could lead to sanctions, which could further feed the person's anger and provoke an attack. Often, Cappelli says, when management ignores bad behavior or doesn't handle it with a demotion or firing, the chances of stopping an attack diminish.

In every case that CERT studied, the attackers were database or system administrators who understood the powers of the system. They would create "back door" accounts -- privileged accounts that sit on the system but that no one else knows about. Then, if management became wary of the individual and disabled his regular account, the back-door account would still be available. Often, attackers give these accounts obvious monikers -- "John Smith," "Batman" or "James Bond." Others are harder to detect. In San Francisco, the 43-year-old suspect used "Maggot617."
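One way an IT shop might hunt for accounts like these is to compare the system's privileged accounts against an approved roster. The short sketch below illustrates the idea only; the account names, the roster, and the script itself are invented for illustration and are not drawn from the CERT study or any specific tool.

```python
# Hypothetical audit sketch: flag privileged accounts that are absent
# from an approved roster -- one way a "back door" account could surface.
# All account names here are invented for illustration.

APPROVED_ADMINS = {"jdoe", "asmith", "svc_backup"}

def find_unapproved_admins(admin_accounts, approved=APPROVED_ADMINS):
    """Return privileged accounts that are not on the approved roster."""
    return sorted(set(admin_accounts) - set(approved))

if __name__ == "__main__":
    # In practice this list would come from the directory service
    # or the server's local group membership.
    current_admins = ["jdoe", "asmith", "svc_backup", "Maggot617"]
    for account in find_unapproved_admins(current_admins):
        print("Unapproved privileged account:", account)
```

Run regularly, a comparison like this turns a hidden account from something "no one else knows about" into a line on a daily report.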

Sometimes the insider plants a logic bomb or a time bomb -- malicious code entered into a system and set to go off at a later time. Most employee-attackers plan for their assault to take place days or weeks after they are fired or quit. In one case Cappelli studied, the employee set a logic bomb to detonate six months after he left the organization, which made it particularly difficult to figure out what had happened.
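A defensive counterpart to that pattern is to cross-check scheduled jobs against the roster of active employees: a job owned by someone who has departed, timed to fire well in the future, matches the shape of a planted bomb. This is a minimal sketch of that audit idea, not anything described in the article; the names, dates, and job records are all invented for illustration.

```python
# Hypothetical audit sketch: find scheduled jobs owned by departed
# staff and set to run in the future. Names and dates are invented.
from datetime import date

ACTIVE_EMPLOYEES = {"jdoe", "asmith"}

def flag_orphaned_jobs(jobs, active=ACTIVE_EMPLOYEES, today=date(2008, 10, 1)):
    """Return jobs owned by departed staff and scheduled after `today`."""
    return [job for job in jobs
            if job["owner"] not in active and job["run_on"] > today]

if __name__ == "__main__":
    scheduled = [
        {"owner": "jdoe",   "run_on": date(2008, 10, 5)},  # active employee
        {"owner": "rsmith", "run_on": date(2009, 4, 1)},   # departed, future-dated
    ]
    for job in flag_orphaned_jobs(scheduled):
        print("Review scheduled job owned by:", job["owner"])
```

A review like this would not stop a bomb hidden inside legitimate code, but it does catch the simpler case of a departed employee's account still holding a future-dated trigger.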

Based on its research, CERT has come up with models, training, workshops and an insider-threat diagnostic tool. The idea is to root out areas that make a city or state vulnerable to insider threats.
