Two different reports came out in the last 24 hours about the costs and investments required for cybersecurity. The first, a paper from the RAND Corporation's Sasha Romanosky, claims that, on average, breaches have only a modest financial impact on organizations, but it also notes that the real costs are mostly not borne directly by the breached corporation:
while the potential for greater harm and losses appears to be increasing in time, evidence suggests that the actual financial impact to firms is considerably lower than expected. And so, if consumers are indeed mostly satisfied with firm responses from data breaches, and the costs from these events are relatively small, then firms may indeed lack a strong incentive to increase their investment in data security and privacy protection. If so, then voluntary adoption of the NIST Cybersecurity framework may prove very difficult and require additional motivation.
Bruce Schneier reads this as evidence of a market failure that requires government intervention. That's certainly one way to view it.
Another perspective: it's also worth lowering the cost of defending against breaches. That's the approach suggested by the second report, “Bug Bounty Programs Are Not a Quick-Fix,” a study funded by my employer, Veracode, and conducted by Wakefield Research. The research found that 83% of respondents had released software without testing for or fixing vulnerabilities; 36% use bug bounty programs; and 93% believe that most flaws found through bug bounties could have been found and fixed through developer training or testing during development, an approach 59% believe would be more cost-effective.