Notes for October 5, 1998

  1. Greetings and Felicitations!
    1. Web site is up; some errors in General Information (John Hughes' mailing address) have been fixed there
    2. John's hours: M 1-3, Tu 3-4; mine, MWF 11-12. Ask if you can't make any of these.
    3. No class Wednesday, but the discussion section meets then. (My office hours that day are cancelled!)
    4. No office hours today; interviewing prospective admin assistant for the lab
  2. The telnetd binary is the same size as the Bourne shell, so it may have been replaced by a copy of the Bourne shell ...
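     A quick way to check this kind of suspicion is to compare the suspect binary
     against a known-good copy: first sizes, then checksums against trusted
     distribution media. Below is a minimal sketch using stat(2); the file paths
     are placeholders, not the actual paths from this incident.

        /* Sketch: compare the sizes of a suspect telnetd and the Bourne
         * shell.  Paths are illustrative only; identical sizes alone prove
         * nothing -- follow up with checksums against trusted media. */
        #include <stdio.h>
        #include <sys/stat.h>

        int main(void)
        {
            struct stat a, b;

            if (stat("/usr/sbin/telnetd", &a) != 0 || stat("/bin/sh", &b) != 0) {
                perror("stat");
                return 1;
            }
            if (a.st_size == b.st_size)
                printf("same size (%ld bytes) -- suspicious, compare checksums\n",
                       (long)a.st_size);
            else
                printf("sizes differ: telnetd=%ld, sh=%ld\n",
                       (long)a.st_size, (long)b.st_size);
            return 0;
        }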
  3. How do you design a security policy?
    1. Risk analysis
    2. Analysis of other factors
    3. Procedures
  4. Risk analysis
    1. What are the threats?
    2. How likely are they to arise?
    3. How can they best be dealt with? (see the sketch after this list)
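     One common way to make the second and third questions concrete is to rank
     threats by expected loss (estimated likelihood times estimated cost per
     incident) and weigh that against the cost of countermeasures. The sketch
     below does this for a few invented threats; none of the figures come from
     the course.

        /* Sketch: rank threats by expected annual loss =
         * likelihood (incidents/year) * impact (cost/incident).
         * Threats and numbers are made up for illustration. */
        #include <stdio.h>

        struct threat {
            const char *name;
            double likelihood;      /* expected incidents per year */
            double impact;          /* estimated cost per incident, dollars */
        };

        int main(void)
        {
            struct threat threats[] = {
                { "password guessing", 4.0,   500.0 },
                { "trojaned binary",   0.5, 20000.0 },
                { "disk failure",      1.0,  2000.0 },
            };
            int i, n = sizeof threats / sizeof threats[0];

            for (i = 0; i < n; i++)
                printf("%-20s expected loss/year: $%8.2f\n",
                       threats[i].name,
                       threats[i].likelihood * threats[i].impact);
            return 0;
        }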
  5. Analysis of other factors
    1. What else affects the policy (federal or state law, needs, etc.)?
    2. Law: as above; discuss jurisdiction (federal or local), problems (authorities' lack of knowledge about computers, etc.); chain of evidence
    3. Discuss cryptographic software controls (here, France, etc.)
  6. Procedures
    1. What procedures need to be put in place, and how will they affect security?
  7. Human Factors
    1. Principle of Psychological Acceptability (note: requiring users to do something illegal violates this)
    2. Principle of common sense (it's not common; more when we discuss robust programming)
  8. Design Principles
    1. Principle of Psychological Acceptability
    2. Principle of Least Privilege
    3. Principle of Fail-Safe Defaults (see the sketch after this list)
    4. Principle of Economy of Mechanism (KISS principle, redone)
    5. Principle of Complete Mediation
    6. Principle of Separation of Privilege
    7. Principle of Least Common Mechanism
    8. Principle of Open Design
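     Two of these principles are easy to show in a few lines of code: fail-safe
     defaults (deny anything not explicitly allowed) and least privilege (give up
     privileges as soon as they are no longer needed). The sketch below is only
     an illustration; the rule table, user names, and uid are made up.

        /* Sketch: fail-safe defaults and least privilege.
         * check_access() denies anything not explicitly allowed;
         * setuid() drops root privileges once they are no longer needed.
         * All names, rules, and the uid are illustrative. */
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>

        /* Fail-safe default: the final return is "deny". */
        static int check_access(const char *user, const char *file)
        {
            static const struct { const char *user, *file; } allowed[] = {
                { "alice", "/var/log/httpd" },
                { "bob",   "/etc/motd"      },
            };
            int i, n = sizeof allowed / sizeof allowed[0];

            for (i = 0; i < n; i++)
                if (strcmp(user, allowed[i].user) == 0 &&
                    strcmp(file, allowed[i].file) == 0)
                    return 1;           /* explicitly granted */
            return 0;                   /* default: deny */
        }

        int main(void)
        {
            /* Least privilege: if started as root, drop to an unprivileged
             * uid as soon as any privileged setup is done. */
            if (getuid() == 0 && setuid(1000) != 0) {
                perror("setuid");
                return 1;
            }

            printf("alice -> /etc/passwd:    %s\n",
                   check_access("alice", "/etc/passwd") ? "allowed" : "denied");
            printf("alice -> /var/log/httpd: %s\n",
                   check_access("alice", "/var/log/httpd") ? "allowed" : "denied");
            return 0;
        }

     Complete mediation would mean making a check like this on every access, not
     just the first one.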

