
Wednesday, February 28, 2007

Disclosure policy = dead horse?

Over at the nCircle blog, Ryan Poppa concludes that debating disclosure policy is beating a dead horse, because after many years of debate there is still no industry standard. The only positive thing, in his opinion, is that the continuing debate introduces the subject to people who might not have heard all the arguments. I would like to add a further benefit:

If the industry manages to create a standard on this subject, it would enable the use of legal methods to prosecute those who don't follow it. And before you all jump at me and say that I'm a corporate fanboy, let me say that this would help researchers too, because they would have a policy which, if followed, would greatly reduce the risk of any legal retribution (unless the industry manages to screw it up and decides that six months is the timeframe they should be allowed).

Finally, to all of the full disclosure fans: full disclosure as a method does not have any inherent benefits. The motivation for any responsible security researcher should be consumer protection and personal gain, in that order! You cannot argue that disclosing a complete description of the flaw (possibly with exploit code) helps the users of those products / services / etc. if you are not making the disclosure in a place where that message is likely to reach a large portion of the customers. On the flip side, most official venues, like a company's forums, are heavily moderated, and any such post will most probably be deleted very quickly.

I don't have a silver bullet for this problem either, but I would like to encourage anyone thinking about disclosing flaws to consider going first to the makers of the product, since they have the best means to distribute any mitigating information / patch / etc. to the users of their products. Any different approach is immoral.


  1. Let's for the moment forget that ethics don't particularly concern me in my work.

    I really don't think legislating disclosure policy is a good idea.

    First of all: it's legislating censorship of information. Sure, it's (theoretically) for a limited amount of time, but once we start, there's no way the government would relinquish control. (Sure, the US government loosened crypto laws, but only after it was pointless, because everyone else could clearly do what people in the US could, and the laws were therefore potentially hurting industry.)

    Exploits don't attack computers - people attack computers.

    Secondly: there is no benefit to the industry, because there should not be any legal avenues for prosecuting researchers.

    When crypto researchers find issues in government or other algorithms, they aren't forced to disclose their findings to anyone before publishing; it's academic work. (I could be completely wrong, but nothing I've seen anywhere contradicts this.)

    I think that security research should be granted the same status of not having to comply with disclosure rules.

    Furthermore, any such legislation could simply drive the research offshore, because other countries will most likely not implement any such laws.

  2. Furthermore, I think six months is a completely unacceptable time-frame for patches. A standard patch should not take more than a month to develop, fully test, and release, preferably faster; anything slower is negligence on the part of the vendor.

    Of course, this isn't the case if the fix requires a complete architectural change of the software; but that is negligence on the part of the company anyway.

    Because let's not forget: security researchers aren't the only ones looking for bugs. And six months is an enormous time-frame in which to find a vulnerability.

    And I really do not understand why you would even consider holding a researcher responsible for bugs that the vendor created.

  3. I completely agree with you that six months is waaaaaaay too long. That's why I said it would be a screwup for the industry if they decided to go with six months. An acceptable timeframe, in my humble opinion, would be 30 days.

    As for the researchers not creating the bugs: this is entirely, 100% correct. What they do (if they go the full disclosure route without prior notification of the vendor) is give tools to a large, mostly unethical crowd, without giving those affected by the bugs the tools to protect themselves.

    And finally: I didn't advocate for the government to step in. I advocate for an industry standard where some big companies like MS (because we know that Oracle never will) stand up and say, "we won't prosecute anyone who gave us X days of advance warning". Then, if you find yourself in legal trouble, you can point to these "standards" and say that this is an industry standard which you follow, at which point you will hopefully be off the hook (while currently this area is all a little blurry, and many people back off when they get legal threats because they have no clue what their chances are of winning or losing in court).

    And it would also be good for the vendor, since they could plan ahead and set some goals for their security response team (something like: within 3 days we must be able to determine whether this is exploitable, within 5 days come up with a mitigation and inform the affected clients, within 15 days come up with a patch, within 20 days have it tested by QA, and so on).

  4. About the six months: sorry, that was only an afterthought and I didn't check; my bad.

    And the reason I was talking about the government was this line:
    "If the industry manages to create a standard on this subject, it would enable the use of legal methods to prosecute those who don't follow it."

    To me, "legal" sounds like legislation.

    And personally, I think there is no need for an industry standard, because saying that if you do X we won't prosecute you is tantamount to implying that anything less is illegal, which it is not.

    On a completely unrelated note: I'm curious whether you have any figures on how many companies/researchers give/receive legal threats?