[Infowarrior] - Meddling With 'Full Disclosure' Is Unwelcome
Richard Forno
rforno at infowarrior.org
Tue Aug 26 12:59:22 UTC 2008
Boston Court's Meddling With 'Full Disclosure' Is Unwelcome
Bruce Schneier, Wired, 08.21.08
http://www.wired.com/politics/security/commentary/securitymatters/2008/08/securitymatters_0821
In eerily similar cases in the Netherlands and the United States,
courts have recently grappled with the computer-security norm of
"full disclosure," asking whether researchers should be permitted to
disclose details of a fare-card vulnerability that allows people to
ride the subway for free.
The "Oyster card" used on the London Tube was at issue in the Dutch
case, and a similar fare card used on the Boston "T" was the center
of the U.S. case. The Dutch court got it right, and the American
court, in Boston, got it wrong from the start -- despite facing an
open-and-shut case of First Amendment prior restraint.
The U.S. court has since seen the error of its ways -- but the damage
is done. The MIT security researchers who were prepared to discuss
their Boston findings at the DefCon security conference were
prevented from giving their talk.
The ethics of full disclosure are intimately familiar to those of us
in the computer-security field. Before full disclosure became the
norm, researchers would quietly disclose vulnerabilities to the
vendors -- who would routinely ignore them. Sometimes vendors would
even threaten researchers with legal action if they disclosed the
vulnerabilities.
Later on, researchers started disclosing the existence of a
vulnerability but not the details. Vendors responded by denying the
security holes' existence, or calling them just theoretical. It
wasn't until full disclosure became the norm that vendors began
consistently fixing vulnerabilities quickly. Now that vendors
routinely patch vulnerabilities, researchers generally give them
advance notice to allow them to patch their systems before the
vulnerability is published. But even with this "responsible
disclosure" protocol, it's the threat of disclosure that motivates
vendors to act. Full disclosure is the mechanism by which computer
security improves.
Outside of computer security, secrecy is much more the norm. Some
security communities, like locksmiths, behave much like medieval
guilds, divulging the secrets of their profession only to those
within it. These communities hate open research, and have responded
with surprising vitriol to researchers who have found serious
vulnerabilities in bicycle locks, combination safes, master-key
systems and many other security devices.
Researchers have received a similar reaction from other communities
more used to secrecy than openness. Researchers -- sometimes young
students -- who discovered and published flaws in copyright-
protection schemes, voting-machine security and now wireless access
cards have all suffered recriminations and sometimes lawsuits for not
keeping the vulnerabilities secret. When Christopher Soghoian created
a website allowing people to print fake airline boarding passes, he
got several unpleasant visits from the FBI.
This preference for secrecy comes from confusing a vulnerability with
information about that vulnerability. Using secrecy as a security
measure is fundamentally fragile. It assumes that the bad guys don't
do their own security research. It assumes that no one else will find
the same vulnerability. It assumes that information won't leak out
even if the research results are suppressed. These assumptions are
all incorrect.
The problem isn't the researchers; it's the products themselves.
Companies design security only as good as their customers know to ask
for. Full disclosure helps customers evaluate the
security of the products they buy, and educates them in how to ask
for better security. The Dutch court got it exactly right when it
wrote: "Damage to NXP is not the result of the publication of the
article but of the production and sale of a chip that appears to have
shortcomings."
In a world of forced secrecy, vendors make inflated claims about
their products, vulnerabilities don't get fixed, and customers are no
wiser. Security research is stifled, and security technology doesn't
improve. The only beneficiaries are the bad guys.
If you'll forgive the analogy, the ethics of full disclosure parallel
the ethics of not paying kidnapping ransoms. We all know why we don't
pay kidnappers: It encourages more kidnappings. Yet in every
kidnapping case, there's someone -- a spouse, a parent, an employer
-- with a good reason why, in this one case, we should make an
exception.
We want researchers to publish vulnerabilities because that's how
security improves. But in every case there's someone --
the Massachusetts Bay Transportation Authority, the locksmiths, an election
machine manufacturer -- who argues that, in this one case, we should
make an exception.
We shouldn't. The benefits of responsibly publishing attacks greatly
outweigh the potential harm. Disclosure encourages companies to build
security properly rather than relying on shoddy design and secrecy,
and discourages them from promising security based on their ability
to threaten researchers. It's how we learn about security, and how we
improve future security.
---
Bruce Schneier is Chief Security Technology Officer of BT Global
Services and author of Beyond Fear: Thinking Sensibly About Security
in an Uncertain World. You can read more of his writings on his website.