Update: at the request of the original author, I have removed the embedded version. Please follow the link provided above to view the presentation (it is available free of charge and without registration).
Some of my favourites:
- The confidence that people have in security is inversely proportional to how much they know about it.
- The more excited people are about a given security technology, the less they understand (1) that technology and (2) their own security problems.
- The problem with common sense is that it is not all that common.
- No serious security vulnerability, even a blatantly obvious one, will be dealt with until there is overwhelming evidence and widespread recognition that adversaries have already catastrophically exploited it. In other words, “significant psychological (or literal) damage is required before any significant security changes will be made”.
- Engineers don’t understand security. They treat nature as the adversary, not people. They tend to work in solution space, not problem space. They assume systems fail stochastically, not through deliberate, intelligent, malicious action.
- The effectiveness of a security device, system, or program is inversely proportional to how angry or upset people get about the idea that there might be vulnerabilities.
- Within a few months of its availability, new technology helps the bad guys at least as much as it helps the good guys.
- Most of the time when security appears to be working, it’s because no adversary is currently prepared to attack.