Lecture 15 Notes; May 2, 1997; Notetaker: Eric Rosenthal

Regarding the penetration exercise: do not try to cause damage to the system.

Today's lecture is on designing secure systems that have fewer vulnerabilities. We will skip over mathematical verification of systems, since it is too slow and complicated.

Preconditions and Postconditions for a system (a short sketch appears at the end of these notes).
Preconditions: describe the environment the program will run in; basically, the assumptions you make before your code is executed.
Postconditions: what the program guarantees given the preconditions; basically, the results. The question to ask is whether those results violate the security policy.

Saltzer and Schroeder's 8 Design Principles
These principles apply to the design, implementation, and configuration of systems. They serve to prevent vulnerabilities and to limit damage in case of a break-in.

1: Least Privilege
A process or user gets only the minimum privileges needed to perform its job. The idea of an all-powerful "root" account violates this principle. One way to implement the principle is with "roles": different accounts get just the privileges they need for their roles. The principle also applies to programming: a program can be made more secure by not trusting its preconditions and instead checking that the results of running it match the policy. Robust programming can be implemented by hiding the internal data structures and not giving the caller direct access to them (see the opaque-handle sketch at the end of these notes).

2: Fail-Safe Defaults
In any questionable state, the default should be the least privilege; if a check cannot be completed, deny access (see the default-deny sketch at the end of these notes). An example of a violation: an su command that automatically grants root privileges when it cannot access the password file. When a problem happens, transactions should be rolled back to a safe state.

3: Economy of Mechanism
Keep the design and implementation as simple as possible (KISS: "Keep it simple, stupid!"). Sendmail violates this because it is too large and complicated to verify.

4: Complete Mediation
Check for permission at each access. There is a tradeoff between complete mediation and efficiency. Caching can violate this principle because the state of the cache may differ from the state of the system. File access in UNIX can violate it as well: if the permissions change while you are editing a file, you keep your access, because UNIX checks permissions only when the file is first opened (a sketch demonstrating this appears at the end of these notes).

5: Open Design
Make information about the design of the system available; do not rely on security through obscurity. A violation: the design criteria for the DES S-boxes were kept secret. Violating this principle can give you a false sense of security. Some information, such as keys, should of course remain secret.

6: Separation of Privilege
Base the granting of permission on more than one condition. For example, to grant root privilege, several conditions must be satisfied: you must know the root password and you must log in from one of a set of trusted terminals (a sketch appears at the end of these notes). Other examples: ask for confirmation before deleting files; require two people to certify a transaction, or have one person write and another person validate.

7: Least Common Mechanism
Minimize the paths through which information is shared. For example, if you have a choice between putting a routine in a library or in the kernel, choose the library, because it will not be shared with other users, so your operations will be more secure. Dynamic loading violates this principle. A shared mechanism can also be exploited as a covert channel of communication.

8: Psychological Acceptability
The security model must be accepted by the users who will work with it. If a security mechanism is too hard to work with, people will work around it, making security worse.
A password is acceptable; a urine sample is not. It must be easy to understand, and it should not interfere with the users' work. It must fit the users' idea of security.
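
Sketch: Preconditions and Postconditions. A rough illustration in C of checking the assumptions on entry and the promised results on exit; the copy_bytes routine and its conditions are made up for this sketch, not taken from the lecture.

    #include <assert.h>
    #include <stddef.h>

    /* Hypothetical routine: copy n bytes from src to dst.
     * Precondition:  dst and src are non-NULL (the assumption the code relies on).
     * Postcondition: the first n bytes of dst equal the first n bytes of src. */
    static void copy_bytes(char *dst, const char *src, size_t n)
    {
        assert(dst != NULL && src != NULL);      /* check the precondition */

        for (size_t i = 0; i < n; i++)
            dst[i] = src[i];

        for (size_t i = 0; i < n; i++)
            assert(dst[i] == src[i]);            /* check the postcondition */
    }

    int main(void)
    {
        char out[6];
        copy_bytes(out, "hello", 6);             /* copies the trailing '\0' too */
        return 0;
    }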
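
Sketch: hiding internal data structures (robust programming, principle 1). The counter type here is invented for illustration. Callers see only an incomplete type and must go through the functions, so they cannot reach into the structure and corrupt it.

    #include <stdlib.h>

    /* What the caller sees (normally in a header): an opaque handle only. */
    typedef struct counter counter;              /* incomplete type; fields are hidden */
    counter *counter_new(void);
    int      counter_increment(counter *c);      /* returns -1 on a bad handle */
    long     counter_value(const counter *c);
    void     counter_free(counter *c);

    /* The implementation (normally a separate source file) owns the real layout. */
    struct counter { long value; };

    counter *counter_new(void) { return calloc(1, sizeof(counter)); }

    int counter_increment(counter *c)
    {
        if (c == NULL)
            return -1;                           /* reject a bad handle instead of crashing */
        c->value++;
        return 0;
    }

    long counter_value(const counter *c) { return c ? c->value : -1; }

    void counter_free(counter *c) { free(c); }

    int main(void)
    {
        counter *c = counter_new();
        counter_increment(c);
        long v = counter_value(c);               /* 1 */
        counter_free(c);
        return (int) v;
    }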
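
Sketch: fail-safe defaults (principle 2). A toy access check in C; the users and objects in the table are invented. The point is the shape of the code: unless an entry explicitly grants access, every path, including bad arguments and a missing entry, falls through to deny.

    #include <stdio.h>
    #include <string.h>

    /* Toy access table; a real system would read this from protected storage. */
    struct entry { const char *user; const char *object; };
    static const struct entry table[] = {
        { "alice", "payroll" },
        { "bob",   "inventory" },
    };

    /* Return 1 only when an entry explicitly grants access. */
    int allowed(const char *user, const char *object)
    {
        if (user == NULL || object == NULL)
            return 0;                            /* fail-safe default: deny */
        for (unsigned i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].user, user) == 0 &&
                strcmp(table[i].object, object) == 0)
                return 1;
        return 0;                                /* no explicit grant: deny */
    }

    int main(void)
    {
        printf("%d %d\n", allowed("alice", "payroll"),
                          allowed("mallory", "payroll"));   /* prints: 1 0 */
        return 0;
    }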
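
Sketch: complete mediation (principle 4). This small program demonstrates the UNIX example: the permission check happens at open(), so a descriptor obtained earlier keeps working even after the file's permissions are taken away. The scratch file name is arbitrary; run it in a directory where you can create and delete a file.

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/stat.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "scratch.txt";        /* arbitrary scratch file */
        char buf[16];

        /* Create a readable file and open it while it is still readable. */
        int fd = open(path, O_RDWR | O_CREAT, 0600);
        if (fd < 0) { perror("open"); return 1; }
        write(fd, "secret", 6);
        lseek(fd, 0, SEEK_SET);

        /* Revoke all permissions on the file... */
        chmod(path, 0000);

        /* ...yet the existing descriptor still works: the access check
           happened only at open() time, not at each read(). */
        ssize_t n = read(fd, buf, sizeof buf);
        printf("read %ld bytes after chmod 0000\n", (long) n);

        close(fd);
        unlink(path);
        return 0;
    }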
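
Sketch: separation of privilege (principle 6). A toy version of the root-access example; both checks are stand-ins (a real system would consult the password database and a list of trusted terminals). The point is that no single condition is sufficient on its own.

    #include <stdio.h>
    #include <string.h>

    /* Stand-in checks with toy values, for illustration only. */
    static int knows_root_password(const char *pw)
    {
        return pw != NULL && strcmp(pw, "example-secret") == 0;
    }

    static int on_trusted_terminal(const char *tty)
    {
        return tty != NULL && strcmp(tty, "/dev/console") == 0;
    }

    /* Grant root only when BOTH independent conditions hold. */
    int may_become_root(const char *pw, const char *tty)
    {
        return knows_root_password(pw) && on_trusted_terminal(tty);
    }

    int main(void)
    {
        /* Correct password but untrusted terminal: still denied. */
        printf("%d\n", may_become_root("example-secret", "/dev/tty3"));   /* prints: 0 */
        return 0;
    }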