Politico | How US agencies’ trust in untested software opened the door to hackers
- The federal government conducts only cursory security inspections of the software it buys from private companies
- As investigators race to assess the damage from the hacks, experts and lawmakers are calling for increased scrutiny of the third-party code
This story is published in a content partnership with POLITICO. It was originally reported by Eric Geller on politico.com on December 19, 2020.
The massive months-long hack of agencies across the US government succeeded, in part, because no one was looking in the right place.
The federal government conducts only cursory security inspections of the software it buys from private companies for a wide range of activities, from managing databases to operating internal chat applications. That created the blind spot that suspected Russian hackers exploited to breach the Treasury Department, the Department of Homeland Security, the National Institutes of Health and other agencies. After embedding code in widely used network management software made by a Texas company called SolarWinds, all they had to do was wait for the agencies to download routine software updates from the trusted supplier.
As investigators race to assess the damage from the hacks, experts and lawmakers are calling for increased scrutiny of the third-party code that government agencies allow on their networks and demanding a fix for a long-known weakness.
“The government desperately needs to set minimum security requirements for software and services, and refuse to buy anything that doesn't meet those standards,” said Senator Ron Wyden. “It is incredibly self-defeating for federal agencies to spend billions on security and then give government contracts to companies with insecure products.”
Over the past week, agencies rushed to scrub the malicious code from their networks while senior officials huddled in emergency meetings – all amid reports of more victims in the federal government, state governments and private industry. As the extent of the attack became clearer, cyber experts warned that cleaning up the mess could take months or years.
The US government relies on private vendors of all sizes to supply its agencies with software. Some have expert security teams, such as Amazon, which provides cloud hosting services, and SAP, whose software helps agencies process large quantities of data. But others, both large and small, have less rigorous security testing procedures and are more vulnerable to this kind of compromise, cyber analysts say.
And now that the hackers have had months to establish a foothold in the federal networks, the Cybersecurity and Infrastructure Security Agency warned, removing them “will be highly complex and challenging.”
Building better software
The SolarWinds hack – which officials have linked to Russia’s foreign intelligence service, the SVR, and which Secretary of State Mike Pompeo on Friday publicly pinned on Russia – reflects a level of sophistication that may be impossible to block completely, but technical professionals and policymakers say new approaches to software development and procurement could at least give defenders a fighting chance.
Attacks on vendors in the software supply chain represent a known issue that needs to be prioritised, said Democratic Representative Jim Langevin, the co-founder of the Congressional Cybersecurity Caucus.
“The SolarWinds incident … underscores that supply chain security is a topic that needs to be front and centre,” Langevin said.
He said Congress needs to “incentivise” the companies to make their software more secure, which could require expensive changes.
Some others are calling for regulation.
“Absolutely there needs to be more oversight of these kinds of companies,” said Emile Monette, the former chief of CISA’s supply chain risk management programme. He said the government should require contractors to certify their software is free of even “moderate-impact bugs.” Typically, vendors assure only that their software is free of particularly dangerous vulnerabilities, labelled as “critical” or “high impact.”
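The distinction Monette draws maps onto the standard CVSS v3 severity bands, in which “critical” covers base scores of 9.0 and above, “high” 7.0 to 8.9, and “moderate” (medium) 4.0 to 6.9. A minimal sketch of how a procurement gate might apply the two bars – the gate functions and sample findings here are invented for illustration:

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3 base score (0.0-10.0) to its standard severity band."""
    if score >= 9.0:
        return "critical"
    if score >= 7.0:
        return "high"
    if score >= 4.0:
        return "moderate"
    if score > 0.0:
        return "low"
    return "none"

def passes_gate(finding_scores: list[float], blocked_from: str) -> bool:
    """Return True if no finding reaches the severity band that blocks a sale."""
    order = ["none", "low", "moderate", "high", "critical"]
    threshold = order.index(blocked_from)
    return all(order.index(cvss_severity(s)) < threshold for s in finding_scores)

findings = [3.1, 5.4]  # one low-severity and one moderate-severity bug

# Typical vendor assurance: only "high" and "critical" bugs must be absent.
print(passes_gate(findings, "high"))      # True - the product ships
# Monette's stricter bar: even moderate-impact bugs block the purchase.
print(passes_gate(findings, "moderate"))  # False
```

The same inventory of findings passes the conventional bar and fails the stricter one, which is why Monette’s proposal would be costly for vendors to meet.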
Private companies regularly deploy software with undiscovered bugs because developers lack the time, skill or incentive to fully inspect their code.
Monette said agencies must “be prepared to pay for increased security” in their purchases and encouraged the government to “double down on investments” in areas such as software security.
It can be hard, however, for federal agencies and Fortune 500 companies to identify weaknesses when they do not understand the complexity of what they are buying or the ways in which it could be defective.
“Security is not a significant consideration or even well understood,” said Bryan Ware, CISA’s former assistant director for cybersecurity. “Plenty of sophisticated [chief information officers] bought and deployed [SolarWinds’ software], so it’s not just the vendor I’m questioning.”
There is no central inventory of which government agencies use which software in which offices, which is part of why it has taken agencies so long to determine if they have been hacked.
“The first-order problem is still trying to get our arms around all of the applications and software that reside on the 101 civilian executive branch networks,” said former CISA Deputy Director Matthew Travis.
Travis bemoaned the decentralised approach and encouraged Congress to authorise CISA and OMB “to re-architect the archaic federal enterprise” and push more applications to the cloud.
The automated gatekeepers that do exist – two CISA-run network security programmes – also were not equipped to identify the SolarWinds intrusion, much less stop it.
One of them, the intrusion detection system known as EINSTEIN, scans traffic crossing the boundaries of agency networks, but it looks only for the signatures of known threats – which the novel SolarWinds code did not match.
The other, Continuous Diagnostics and Mitigation, brings together scanning and monitoring services that are supposed to spot and block suspicious behaviour on those networks. But CDM’s understanding of what should generate a red flag is limited to clearly suspicious activity, such as off-site transfers of massive encrypted files – which did not occur with the infected SolarWinds updates.
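The limitation described above can be sketched as a coarse egress rule: flag only large outbound transfers to unrecognised hosts. The threshold, domain names and sample events below are invented for illustration, but they show why small, periodic beacons to a benign-looking domain slip through:

```python
LARGE_TRANSFER_BYTES = 500 * 1024 * 1024  # naive 500 MB red-flag threshold
TRUSTED_DOMAINS = {"update.example-vendor.com"}  # allow-listed suppliers

def is_suspicious(event: dict) -> bool:
    """Flag an outbound transfer only if it is both large and unrecognised."""
    if event["dest"] in TRUSTED_DOMAINS:
        return False  # traffic to a trusted supplier is never inspected
    return event["bytes_out"] >= LARGE_TRANSFER_BYTES

events = [
    # Bulk exfiltration of a massive encrypted archive: caught.
    {"dest": "files.unknown-host.net", "bytes_out": 2_000_000_000},
    # Small, periodic command-and-control beacons: missed entirely.
    {"dest": "innocuous-cloud.example.com", "bytes_out": 4_096},
]

print([is_suspicious(e) for e in events])  # [True, False]
```

A monitor built on rules like this sees nothing wrong with modest traffic to a plausible destination, which is the pattern the SolarWinds intruders reportedly used.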
Calls for action on the Hill
Some in Congress are ready to act. In a statement, California Representative Ted Lieu said he was “working on legislation to ensure that vendors doing business with the United States government maintain a vulnerability disclosure policy.”
But new regulations might not solve the problem, technical specialists said.
“Government-mandated security requirements are probably more likely to HARM security than to HELP it,” Andy Keiser, a former top House Intelligence Committee aide and Trump transition national security adviser, wrote in an email. “The standards would be slow, outdated, cumbersome [and] pick incorrect winners and losers.”
Congress should “carefully explore penalties for negligence” in software design, Ware said, but only in a limited way, “because it could lead to negative unintended consequences.”
Regardless of who controls the Senate in the 117th Congress, the Democratic-led House is likely to be more open to new federal mandates. A House Homeland Security Committee aide, who requested anonymity to discuss internal planning, said that it was too early to discuss regulation but added, “I’m sure we will have hearings on [SolarWinds] in the new year.”
Getting under the bonnet
Rather than imposing new security requirements on vendors, some experts say agencies should pay more attention to the software they buy and routinely test it for flaws.
James Lewis, a cyber expert at the Centre for Strategic and International Studies, floated the idea of an executive order instructing agencies “to monitor and better manage their use of these kinds of platforms,” and requiring sector-specific regulators to demand the same of companies in critical industries, such as electricity and health care.
“Require something similar to what Apple does on the App Store,” Lewis said, noting that the tech giant reviews every submitted app and only approves those it certifies to be secure.
Some private companies do monitor third-party software in this way, but routine software auditing would likely be a massive burden on federal agencies, few of which have enough security personnel to handle this work on top of their existing tasks.
Ware said centralising that review work could prevent a situation where one agency discovered a problem in software used across the government but failed to report it to those other customers.
Not everyone is convinced that this centralisation would work.
“Talent is in short supply everywhere, no one [is] going to volunteer people for transfer, and DHS doesn't have the clout to steal from the agencies with talent – the [intelligence community], DoD and FBI,” Lewis said. He suggested instead that the software security oversight start at OMB, whose authority to issue edicts to other agencies is more established.
Some parts of the government are already working to encourage better coding practices, though it’s slow going.
For more than two years, one federal agency has been convening meetings of outside experts to discuss the creation of a kind of ingredient label for software, a “bill of materials” that would provide transparency about the code used in each programme. While this software bill of materials would not completely solve the problem that led to the SolarWinds crisis, Ware and other cyber experts say it would encourage more careful coding by making an application’s digital contents more transparent.
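In practice, a software bill of materials is a machine-readable list of the components shipped in each release, so a buyer can answer “does this product contain component X?” without the vendor’s help. A minimal sketch – the product, component entries and field names here are invented for illustration, and real SBOMs use standard formats such as SPDX or CycloneDX:

```python
import json

# Hypothetical per-release "ingredient label" for a network monitoring product.
sbom = {
    "product": "ExampleNetMonitor",
    "version": "2020.2.1",
    "components": [
        {"name": "openssl", "version": "1.1.1g", "supplier": "OpenSSL Project"},
        {"name": "zlib", "version": "1.2.11", "supplier": "zlib Project"},
        {"name": "internal-agent", "version": "4.3.0", "supplier": "Example Corp"},
    ],
}

def uses_component(sbom: dict, name: str) -> bool:
    """Answer the buyer's question: does this release contain component X?"""
    return any(c["name"] == name for c in sbom["components"])

print(uses_component(sbom, "openssl"))  # True
print(json.dumps(sbom, indent=2))       # shareable, machine-readable label
```

When a vulnerability surfaces in a widely used component, an agency holding SBOMs for its purchases can search them directly instead of waiting for each vendor to self-report – the transparency the experts quoted above are asking for.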
Martin Matishak contributed to this report.