Regulatory mandates for software security like those in the Biden Administration’s National Cybersecurity Strategy could cause more problems than they solve.
Like the recurring claim that “SBOMs will solve everything,” there are regular calls to reform software liability, specifically for products with security flaws and vulnerabilities. US Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly’s comments this week brought the topic back into focus, but it’s still a thorny issue. (There’s a reason certain things are called “wicked problems.”) The proposed remedy, which takes up a full page of the Biden Administration’s National Cybersecurity Strategy, will cause more problems than it solves.
Software liability reform isn’t a new issue
The Hardening the Internet report of the National Infrastructure Advisory Council, written in 2004 (and, ironically, now hosted by CISA), explicitly chose not to recommend revisiting the “regulatory and proper role of government intervention,” based in large part on conversations about the challenges of addressing liability. (Disclosure: I was a member of the supporting study group and one of the two principal writers of the report.) Why not? For reasons that basically haven’t changed.
Who decides when software security is good enough?
Whenever enhanced liability comes up, proponents generally suggest that a company could receive liability protection if it “does all the right things.” But who decides what that standard is? I suspect much of it will be a grab-bag of every possible desired security feature and practice, even ones that aren’t specifically relevant to a given product.
I’m reminded of early arguments with the FedRAMP program management office, which insisted that Akamai, with a network deployed as a tenant in thousands of other companies’ data centers, should monitor the humidity in every single one of those data centers. (We assumed that those data centers had a much higher vested interest in their humidity, since to us, data centers were disposable. Why waste energy on monitoring that had negligible value at best?) Take that requirement and expand it. Every possible “best practice” will become a requirement, without which a company will face some form of enhanced liability.
Security regulations favor big companies
Big companies generally thrive on new regulations – the more onerous, the better. After all, a fixed cost of compliance becomes a moat that protects their markets against smaller competition. Consider 2007, when Mattel shipped Barbie dolls coated with lead paint. The incident resulted in a multi-million-dollar fine for Mattel, but it was also the impetus for the Consumer Product Safety Improvement Act, which increased regulations (although Mattel had already violated existing ones) and created enhanced enforcement authority.
How badly did this harm Mattel? Over the next four years, its stock price quadrupled. Smaller players had a harder time, and second-hand stores often pulled their entire inventories of children’s toys to avoid non-compliance. (With two small children at the time, this hit us right in the pocketbook.)
Whose line is that, anyway?
Some of the most notable “celebrity vulnerabilities” over the past decade haven’t been the fault of one company. Heartbleed. Log4j. Shellshock. Meltdown. What these all have in common is that the vulnerable code was in open-source software. If small companies have a hard time dealing with government regulation, what about volunteer projects? With the risk of being made the poster child for a new enforcement regime, I suspect many of those projects would have a hard time continuing to operate.
Open-source software is critical to the internet as it exists today and will continue to be for some time. If those developers abandon their projects – or worse, delete them on the advice of counsel – how much worse off would we all be?
The White House appears to want to solve this by holding liable not the original developers but only the final-goods assembler. How to write that desire into law without creating loopholes big enough for well-funded legal teams to drive their companies through will, of course, be left as an exercise for the lobbyists those companies fund when they help Congress write any such law.
Is security really a product requirement?
Let me start this section by saying that I strongly believe companies ought to invest in product security. I’m just not convinced it should be a basic requirement for being in the market. Other markets don’t impose one. Virtually every home sold in America is fundamentally insecure, and even core safety requirements are negotiable with your building inspector. For vehicles, it’s arguable whether industry (via the IIHS) or government (via the NHTSA) has driven safety standards higher, but vehicle security seems to be driven more by insurance discounts on one end and enforcement against vehicle thieves on the other – and car theft is substantially down over the past few decades.
Director Easterly makes another argument: that security features should be included by default and that all systems should ship with secure configurations. That’s not actually a requirement on manufacturers; it’s a requirement on buyers. The government would no longer allow you to purchase anything but the most secure implementation. Functionally, that’s a tax you pay as a buyer, both in price (because everyone must be charged for those features) and in usability (because many security features, at their strictest settings, make the underlying product difficult, if not outright impossible, to use).
The software liability list goes ever on…
This isn’t a comprehensive list of all the issues. Think about complex cloud services that span multiple companies. Who is at fault when each system works as advertised, but combined they result in uncomfortable outcomes? What happens when a consumer uses a technology in a way it wasn’t intended? What happens when new types of vulnerabilities are discovered, rendering an entire industry “vulnerable”?
Every objection can, of course, be met with detailed rules that will cover most cases. But the list of issues is long enough that it isn’t clear a new overarching change in liability law is required. Director Easterly is, however, following one principle laid out in the 2004 NIAC report: using the government’s bully pulpit to inspire and drive change in industry. With luck, we won’t see a substantial and dangerous change in liability law, and people can stop advocating for one because software and products keep getting more and more secure.