It's not the loud pronouncements by hacking groups or the highly visible denial-of-service attacks that scare cybersecurity experts. It's silence.
In the escalating battle against cyber attackers, the focus has been on new security software and cyber hygiene, but one of the greatest tools against "the adversary," as cyber attackers are called in industry parlance, is the relatively low-tech approach of sharing information about attacks.
Yet contractors continue to remain mum on many intrusions - citing liability concerns - creating a vacuum that reduces their ability to fight attacks. The U.S. Defense Department continues to hunt for a way to increase reporting when both classified and unclassified sensitive data are compromised.
"The bad guys are fast; they have no intellectual property boundaries, no rules, they just execute and with all this funding they could kill us if we don't match that with good information sharing," said Phyllis Schneck, vice president and chief technology officer for the public sector at McAfee Security. "It's like a weather forecast; the more data you have, the more lives you can save if you can forecast the tornado or the hurricane."
McAfee highlighted the issue of information sharing when it released a report Aug. 3 about an effort to track a group of intruders. The project, Operation Shady RAT, found that the intruders had grabbed data from 72 different entities, including 13 defense contractors and 22 government agencies, in 14 different countries, with more than two-thirds of those attacks targeting the U.S.
The project's name refers to the attackers' use of remote access tools (RAT) to infiltrate networks. To gain entry, the attackers employed spear phishing, sending emails that appeared to come from a recognized contact and encouraged a download concealing malicious software.
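Purely as an illustration, the sketch below shows the kind of simple heuristic screen such lures invite: it flags a message whose display name matches a known contact but whose sending domain does not, or that carries an executable attachment. The contact list, field names and file suffixes are hypothetical assumptions, not details from the McAfee report.

```python
# Hypothetical heuristic screen for spear-phishing lures of the kind described
# above: a familiar display name paired with an unfamiliar sending domain, plus
# an attachment that could conceal malicious software. Illustrative only.

EXECUTABLE_SUFFIXES = (".exe", ".scr", ".js", ".vbs", ".jar")

def looks_like_spear_phish(display_name, sender_address, attachments, known_contacts):
    """Return a list of reasons the message looks suspicious (empty if none).

    known_contacts maps a display name to the domain that contact normally
    sends from, e.g. {"Jane Smith": "example-contractor.com"} (invented data).
    """
    reasons = []
    domain = sender_address.rsplit("@", 1)[-1].lower()

    expected_domain = known_contacts.get(display_name)
    if expected_domain and domain != expected_domain:
        reasons.append(f"'{display_name}' normally sends from {expected_domain}, not {domain}")

    for name in attachments:
        if name.lower().endswith(EXECUTABLE_SUFFIXES):
            reasons.append(f"executable attachment: {name}")

    return reasons

if __name__ == "__main__":
    contacts = {"Jane Smith": "example-contractor.com"}  # hypothetical allow-list
    flags = looks_like_spear_phish(
        "Jane Smith", "jane.smith@mail-example.net",
        ["quarterly_report.pdf.exe"], contacts)
    for reason in flags:
        print("FLAG:", reason)
```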
The group used the same set of tools for five years, suggesting that later victims might have been able to respond more effectively if they had learned of the pattern in earlier attacks.
To push for greater disclosure, the DoD has been exploring two avenues: a new Defense Federal Acquisition Regulation Supplement (DFARS) rule that would make reporting mandatory for intrusions that compromise certain types of sensitive information, and the Defense Industrial Base (DIB) Cyber Pilot, a voluntary program in which roughly two dozen companies report intrusions involving classified and sensitive data and the DoD, in turn, discloses threats it has detected.
But reporting attacks, even to government agencies that promise anonymity, is not without risks, said Alan Chvotkin, executive vice president of the Professional Services Council. "It's reputation liability, legal liability and business liability," he said.
DFARS Proposed Rule
Dipping its toe into mandatory compliance, the Pentagon is circulating for comment until Aug. 29 a proposed new DFARS rule that would compel contractors to disclose intrusions. The rule would require contractors to provide "adequate security," report cyber incidents within 72 hours and review their networks for information about the attacks.
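For a concrete sense of what that 72-hour window implies, here is a minimal sketch of a deadline check, assuming a hypothetical incident timestamp; the proposed rule does not prescribe any particular tooling or record format.

```python
# Hypothetical tracker for the proposed 72-hour reporting window. The deadline
# logic and timestamps are illustrative assumptions, not language from the
# proposed DFARS rule.
from datetime import datetime, timedelta, timezone

REPORTING_WINDOW = timedelta(hours=72)

def reporting_deadline(discovered_at: datetime) -> datetime:
    """Deadline for reporting an incident discovered at the given time."""
    return discovered_at + REPORTING_WINDOW

def hours_remaining(discovered_at: datetime, now: datetime) -> float:
    """Hours left before the window closes (negative if already missed)."""
    return (reporting_deadline(discovered_at) - now).total_seconds() / 3600

if __name__ == "__main__":
    discovered = datetime(2011, 8, 1, 9, 0, tzinfo=timezone.utc)  # hypothetical incident
    now = datetime(2011, 8, 3, 9, 0, tzinfo=timezone.utc)
    print("Report due by:", reporting_deadline(discovered).isoformat())
    print("Hours remaining:", hours_remaining(discovered, now))  # 24.0
```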
"One of the underlying concerns in the DFARS proposed rule is that it makes security a contract compliance issue, so does a breach incur not only some liability and exposure but also a contract breach because you haven't met the standards? Even if you've met the regulations, errors still occur."
He also pointed to the unknown extent of liability and to concerns about whether companies can trust that they will remain anonymous during the reporting process.
"Trust develops over time," he said. "As companies have participated, that trust factor goes up. Just like voluntary disclosure and others, you come to the first one reluctantly."
The issue of trust is very real, said Bill Marshall, managing director of The Chertoff Group and former deputy chief of staff for cyber at the National Security Agency.
"There's a significant lack of trust between the government and the private sector," he said. "There's also a lack of understanding as far as concerns and needs on both sides of the fence, and that's an impediment."
He pointed to the potential repercussions of information leaks. "What if a penetration shows up in The Washington Post? What if you have to explain that to your shareholders?"
Jeff Moulton, a researcher at the Georgia Tech Research Institute, said there would need to be a means of enforcement for the rule to be effective.
"There has got to be an ironclad way to make sure that there are serious repercussions for a person who discloses information," he said. "If somebody wants to torpedo the stock price of a company, all they have to do is release that information."
DIB Cyber Pilot
The Pentagon has also looked for a voluntary approach to the reporting problem. The DIB Cyber Pilot, lasting 90 days and including a limited number of companies, has been successful, said Alan Paller, who directs research at the SANS Institute.

"It worked wonderfully," he said. "It found specific evidence of attacks taking place in one company that was occurring in three other companies that those other companies didn't know about."
He noted that even when companies volunteer, reporting is still an issue.
"There are at least two to three times the number of attacks than are presented to the community, and that's among people that are agreeing to share the data," he said.
Experts said voluntary reporting would be most effective if smaller companies were included in the process; most of the companies in the DIB Cyber Pilot are large. Larger companies typically have sizable cybersecurity staffs and conduct extensive research on intrusions, while smaller companies may not have the resources for that kind of work.

Sharing data between larger and smaller companies would likely leave the contracting community as a whole better protected, since sensitive data moves across companies of every size.
While there has been discussion of implementing a program similar to the DIB Cyber Pilot on a larger scale, the problem of cost looms. Speaking about the DIB Cyber Pilot, Deputy Defense Secretary William Lynn talked about the cost issue at a press conference in July.
"One of the reasons this is a short pilot is that for 90 days, people are willing to hold their breath and not worry about the 'who pays' part," he said. "But when you get beyond that, when we get more permanent, there is a question of who pays, and that's one of the central questions that we're tackling."
Cost and Oversight
Regardless of the technique employed to promote communication, the issue of cost remains.

"Quite frankly, this is a cost that they're trying to drive as close to zero as they can, and the costs keep going up," Marshall said.
Those costs are hard to justify for many companies: there is no simple risk/reward calculation to run, and potential gains in security are difficult to weigh against the expense.
"The view that the regulations need to change is a recognition that there is not a financial incentive for them to do that," Marshall said. "That's one of the things that is kind of an arrow in the quiver that has to be used judiciously."
And companies are not the only ones facing costs. The government resources needed to analyze the reported data, and potentially to enforce any mandate, raise their own questions, Moulton said.
"The government doesn't have enough people to police themselves, so how are they going to go out and verify that companies are doing this?" he said.
Chvotkin voiced the same concern.
"It calls on the resources available to the government. How much are they willing to spend?" Chvotkin asked.
The DFARS proposed rule would also include a mandate to provide "adequate security," meaning the cost would be twofold: creating an appropriate security system and providing the manpower to produce the report for the Pentagon in the event of an intrusion.
But the concerns about cost are insignificant compared to what is being lost, Paller said.
"They're losing America's greatest treasures. Their fears are irrelevant," he said. "They've lost some of the stuff that our entire economic infrastructure is based upon."