[Infowarrior] - Schneier: Full Disclosure of Security Vulnerabilities a 'Damned Good Idea'
Richard Forno
rforno at infowarrior.org
Thu Jan 11 22:12:19 EST 2007
Schneier: Full Disclosure of Security Vulnerabilities a 'Damned Good Idea'
http://www2.csoonline.com/exclusives/column.html?CID=28073
Security guru Bruce Schneier sounds off on why full disclosure forces
vendors to patch flaws.
By Bruce Schneier
Full disclosure, the practice of making the details of security
vulnerabilities public, is a damned good idea. Public scrutiny is the only
reliable way to improve security, while secrecy only makes us less secure.
Unfortunately, secrecy sounds like a good idea. Keeping software
vulnerabilities secret, the argument goes, keeps them out of the hands of
the hackers (See "The Vulnerability Disclosure Game: Are We More Secure?").
The problem, according to this position, is less the vulnerability itself
and more the information about the vulnerability.
But that assumes that hackers can't discover vulnerabilities on their own,
and that software companies will spend time and money fixing secret
vulnerabilities. Both of those assumptions are false. Hackers have proven to
be quite adept at discovering secret vulnerabilities, and full disclosure is
the only reason vendors routinely patch their systems.
To understand why the second assumption isn't true, you need to understand
the underlying economics. To a software company, vulnerabilities are largely
an externality. That is, they affect you, the user, much more than they affect
it. A smart vendor treats vulnerabilities less as a software problem, and
more as a PR problem. So if we, the user community, want software vendors to
patch vulnerabilities, we need to make the PR problem more acute.
Full disclosure does this. Before full disclosure was the norm, researchers
would discover vulnerabilities in software and send details to the software
companies, who would ignore them, trusting in the security of secrecy. Some
would go so far as to threaten the researchers with legal action if they
disclosed the vulnerabilities.
Later on, researchers announced that particular vulnerabilities existed, but
did not publish details. Software companies would then call the
vulnerabilities "theoretical" and deny that they actually existed. Of
course, they would still ignore the problems, and occasionally threaten the
researcher with legal action. Then, of course, some hacker would create an
exploit using the vulnerability, and the company would release a really quick
patch, apologize profusely, and then go on to explain that the whole thing
was entirely the fault of the evil, vile hackers.
It wasn't until researchers published complete details of the
vulnerabilities that the software companies started fixing them.
Of course, the software companies hated this. They received bad PR every
time a vulnerability was made public, and the only way to get some good PR
was to quickly release a patch. For a large company like Microsoft, this was
very expensive.
So a bunch of software companies, and some security researchers, banded
together and invented "responsible disclosure" (See "The Chilling Effect").
The basic idea was that the threat of publishing the vulnerability is almost
as good as actually publishing it. A responsible researcher would quietly
give the software vendor a head start on patching its software, before
releasing the vulnerability to the public.
This was a good idea, and these days it's normal procedure, but one that was
possible only because full disclosure was the norm. And it remains a good
idea only as long as full disclosure is the threat.
The moral here doesn't just apply to software; it's very general. Public
scrutiny is how security improves, whether we're talking about software or
airport security or government counterterrorism measures. Yes, there are
trade-offs. Full disclosure means that the bad guys learn about the
vulnerability at the same time as the rest of us (unless, of course, they
knew about it beforehand), but most of the time the benefits far outweigh the
disadvantages.
Secrecy prevents people from accurately assessing their own risk. Secrecy
precludes public debate about security, and inhibits security education that
leads to improvements. Secrecy doesn't improve security; it stifles it.
I'd rather have as much information as I can to make an informed decision
about security, whether it¹s a buying decision about a software product or
an election decision about two political parties. I'd rather have the
information I need to pressure vendors to improve security.
I don't want to live in a world where companies can sell me software they
know is full of holes or where the government can implement security
measures without accountability. I much prefer a world where I have all the
information I need to assess and protect my own security.
Bruce Schneier is a noted security expert and founder and CTO of BT
Counterpane.