COMMENT: All Secure
Need to Know
By Mary Ann Davidson
Getting better information is important, but some secrets are not for sharing.
A well-known term in many security communities is need to know: the idea that unless there is a compelling reason for someone to have access to information, that person shouldn’t have it. These days, need to know is almost an archaic concept. Having decided post-9/11 that the risk of not sharing data outweighs the risk of sharing it, even the U.S. government is embracing need to share. Web 2.0 is all about massive information sharing: the premise is that if we give up private enclaves of data, we will be collectively smarter and more productive.
At the risk of being called reactionary, I think need to know is still important, because highly sensitive information often has an element of time during which secrecy is vital. You don’t want your enemies to know your war plans until after the attack, or your competitors to know your product plans until after the launch. All things being equal, the more people who know something that is time-sensitive and not merely sensitive, the greater the chances that the data will leak, to the detriment of all concerned. “Loose lips sink ships,” to quote old World War II posters.
It’s hard to explain need to know to people who aren’t familiar with it, because almost everyone wants to be in the know or an A-lister. Generally, need to know can be determined only by the person who holds the information, not by the people who don’t hold it but think they should. When a need-to-know test is done properly, the people who fail it never know that they were tested at all.
In the larger information security community, almost all attempts at selective information sharing about nonpublic security vulnerabilities have failed. Somebody, somewhere, leaks the information before a mitigation or fix is available, and we all suffer. (A relative success was the recent multivendor cooperation to fix a domain name system vulnerability: multiple affected vendors implemented correct, mutually compatible fixes and released them at the same time.)
Technology and Need to Know
Oracle, as an organization, enforces need to know by using our own technology. Within Oracle we use Oracle Virtual Private Database functionality to limit access to security bugs in the bug database (the database we use to record and manage vulnerabilities in Oracle products). Oracle Virtual Private Database enforces the rule that people working on security bug fixes can see bug details for the issues they are working on, but the larger developer community cannot. We allow parts of the management chain to see these bugs (in open bug reports for particular products, for example) to ensure that they receive the attention they deserve. And although we allow customers to access the bug database, security bugs are never published there, so that someone without a need to know cannot read details of unfixed security bugs.
I personally have opted out of needing to know detailed information about security vulnerabilities. Although my security vulnerability handling team needs bug details, such as a bug’s severity or the correctness of a fix, to do its analysis, I don’t. Therefore, I don’t have access to security bug details, even though I am in the management chain of the vulnerability handlers. (I think enforcing need to know on myself sets a positive leadership example.)
Oracle Virtual Private Database helps us protect customers, by allowing the people who need access to bug details to get them so they can fix the bugs, while minimizing the chances that information about unpublished security bugs will leak to the larger community.
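Conceptually, Virtual Private Database works by attaching a row-filtering predicate to queries against a protected table, so different users see different subsets of the same data. A minimal sketch of that idea follows; the schema, names, and data are hypothetical illustrations, not Oracle’s actual bug database.

```python
# Sketch of VPD-style row filtering: security-bug rows are returned
# only to the users assigned to fix them. All names and data here
# are invented for illustration.
from dataclasses import dataclass


@dataclass(frozen=True)
class Bug:
    bug_id: int
    is_security: bool
    assignees: frozenset  # users working on the fix


def visible_bugs(all_bugs, user):
    """Return only the rows this user is allowed to see.

    Ordinary bugs are visible to everyone; security bugs are
    returned only to their assignees. Users who fail the check
    simply see fewer rows; no error is raised, so they learn
    nothing about the hidden bugs.
    """
    return [
        b for b in all_bugs
        if not b.is_security or user in b.assignees
    ]


bugs = [
    Bug(1, False, frozenset()),
    Bug(2, True, frozenset({"alice"})),
]

print([b.bug_id for b in visible_bugs(bugs, "alice")])  # [1, 2]
print([b.bug_id for b in visible_bugs(bugs, "bob")])    # [1]
```

Note that the filter is applied by the data holder at query time, not by the querying user, which is what makes the need-to-know decision enforceable.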
Oracle Label Security also transparently enforces need to know, mediating users’ access to data based on a composite label. Transparently means not only that users don’t necessarily know the data-labeling scheme but also that they may not be aware they failed the need-to-know test. Telling users, “You need a secret clearance to access this data” foolishly reveals that there is information they have asked for but cannot see, encouraging fishing expeditions. Merely failing to return the data enforces true need to know: users who can’t see the data never learn that it exists.
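The behavior described above can be sketched as a dominance check: a user’s clearance must be at least the row’s sensitivity level and must include every compartment the row’s label requires, and rows that fail the check are silently omitted rather than refused. The levels, compartments, and rows below are invented for illustration and simplify the composite labels a real label-security product uses.

```python
# Sketch of label-based need-to-know filtering. A label is a
# (level, compartments) pair; both the scheme and the data are
# hypothetical examples.
LEVELS = {"public": 0, "confidential": 1, "secret": 2}


def dominates(clearance, label):
    """True if the clearance's level is at least the label's level
    and the clearance holds every compartment the label requires."""
    c_level, c_comps = clearance
    l_level, l_comps = label
    return LEVELS[c_level] >= LEVELS[l_level] and l_comps <= c_comps


def query(rows, clearance):
    """Silently drop rows the clearance does not dominate.

    Returning fewer rows, rather than an error, avoids revealing
    that restricted data exists at all.
    """
    return [data for (label, data) in rows if dominates(clearance, label)]


rows = [
    (("public", frozenset()), "press release"),
    (("secret", frozenset({"finance"})), "unannounced acquisition"),
]

print(query(rows, ("secret", frozenset({"finance"}))))
# ['press release', 'unannounced acquisition']
print(query(rows, ("confidential", frozenset())))
# ['press release']
```

The second query returns only the public row; nothing in the result hints that a secret row was withheld, which is exactly the point of failing silently.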
Data sharing may be the wave of the Web 2.0 future, but need to know is nonetheless a critical weapon in the information security arsenal. Loose lips still do sink ships.
Mary Ann Davidson is the chief security officer of Oracle, responsible for secure development practices, security evaluations, and assessments. She represents Oracle on the board of directors of the Information Technology Information Sharing and Analysis Center (IT-ISAC), has served on the U.S. Defense Science Board, and is on the editorial review board of SC Magazine.