Notes for November 23, 1998

  1. Greetings and Felicitations!
    1. Homework is due at 11:59 PM on Monday, November 30 (not just before class); the late-day deadline is extended as well.
  2. Puzzle of the Day
  3. ORCON (Originator Controlled; Graubart)
    1. A document or piece of information may be passed on only with the originator's approval; the real-world justification is that the originator trusts recipients not to release documents they should not.
    2. Untrusted subject x marks object O ORCON on behalf of organization X and indicates it is releasable to subjects acting on behalf of organization Y.
      not releasable to subjects acting on behalf of other organizations without X's permission
      any copies made have the same restriction
    3. DAC: cannot enforce this, because the restriction does not propagate to copies (y reads O into a new object C and puts its own ACL on C)
    4. MAC: put O, x, and y in a separate category. If y wants to read O and copy it to C, MAC forces C into the same category as O, x, and y, so y cannot give z access to C.
      Say a new organization W wants to provide the data in B to y but not have it shared with x or z. W cannot reuse O's category, so a new category is needed; hence an explosion of categories.
      Real-world parallel: individuals are "briefed" into a category, and categories represent a formal "need to know" policy that is standard across the entity; ORCON has no central clearinghouse to categorize data, since each originator makes its own rules.
  4. Solution?
    1. the owner of an object cannot change the access controls on that object (MAC characteristic)
    2. on copy, the access control restrictions are copied as well and bound to the copy (MAC characteristic)
    3. access control restrictions can be tailored on a per-subject, per-object basis (DAC characteristic); a sketch combining the three follows
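
A minimal sketch, in Python, of how these three properties might fit together; the names (OrconObject, can_read, copy, add_org) are illustrative, not any real system's interface:

# Minimal sketch of ORCON-style access control (illustrative names, not a real API).
# The originator's restrictions travel with every copy, the owner of a copy cannot
# relax them, and access is granted on a per-subject (here, per-organization) basis.

class OrconObject:
    def __init__(self, data, originator, allowed_orgs):
        self.data = data
        self.originator = originator                 # organization X that created the data
        self.allowed_orgs = frozenset(allowed_orgs)  # immutable: the owner cannot change it

    def can_read(self, subject_org):
        # DAC-like: access tailored per subject
        return subject_org == self.originator or subject_org in self.allowed_orgs

    def copy(self):
        # MAC-like: any copy carries the originator's restrictions unchanged
        return OrconObject(self.data, self.originator, self.allowed_orgs)

    def add_org(self, requesting_org, new_org):
        # Only the originator may extend the release list
        if requesting_org != self.originator:
            raise PermissionError("only the originator can change ORCON restrictions")
        return OrconObject(self.data, self.originator, self.allowed_orgs | {new_org})


# Example: X releases O to Y; Y copies it to C; Z still cannot read C,
# and Y cannot grant Z access on its own.
O = OrconObject("design notes", originator="X", allowed_orgs={"Y"})
C = O.copy()
print(C.can_read("Y"))   # True
print(C.can_read("Z"))   # False
# C.add_org("Y", "Z")    # would raise PermissionError; only X may do this
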
  5. Malicious logic
    1. Quickly review Trojan horses, viruses, and bacteria; include ANIMAL and Thompson's compiler trick
    2. Logic bombs, worms (Shoch and Hupp)
  6. Review trust and TCB
    1. Notion is informal
    2. Assume trusted components called by untrusted programs
  7. Ideal: program to detect malicious logic
    1. Can be shown: not possible to be precise in most general case
    2. Can detect all such programs if willing to accept false positives
    3. Can constrain case enough to locate specific malicious logic
    4. Can use: writing, structural detection (patterns in code), common code analyzers, coding style analyzers, instruction analysis (duplicating the OS), dynamic analysis (run it in a controlled environment and watch); a toy pattern scanner follows this list
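
A toy sketch, in Python, of the structural/pattern-matching idea; the signature byte strings are invented placeholders, and a match is only evidence of malicious logic, which is why this approach must accept false positives:

# Toy structural detector: look for byte patterns associated with known malicious
# code. The signatures here are invented placeholders; a real scanner would carry
# a large, curated database. A match is evidence, not proof, so this approach
# trades precision for coverage (false positives are possible).

SIGNATURES = {
    "example-virus-a": b"\xde\xad\xbe\xef",   # placeholder byte sequence
    "example-trojan-b": b"rm -rf /",          # suspicious embedded command string
}

def scan_file(path):
    matches = []
    with open(path, "rb") as f:
        contents = f.read()
    for name, pattern in SIGNATURES.items():
        if pattern in contents:
            matches.append(name)
    return matches

if __name__ == "__main__":
    import sys
    for path in sys.argv[1:]:
        hits = scan_file(path)
        if hits:
            print(f"{path}: possible malicious logic: {', '.join(hits)}")
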
  8. Best approach: data, instruction typing
    1. On creation, it's type "data"
    2. Trusted certifier must move it to type "executable"
    3. Duff's idea: the executable bit means "certified as executable" and must be set by a trusted user; sketch below
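
A minimal sketch, in Python, of the data/instruction typing idea; TypedObject, certify, and TRUSTED_CERTIFIERS are illustrative names, not an existing mechanism:

# Sketch of data/instruction typing (all names illustrative).
# Every new object starts with type "data"; only a trusted certifier may retype it
# to "executable"; the system refuses to run anything still typed "data".

TRUSTED_CERTIFIERS = {"certifier"}    # e.g., a designated trusted user or subsystem

class TypedObject:
    def __init__(self, name, contents):
        self.name = name
        self.contents = contents
        self.obj_type = "data"        # rule 1: created as data

    def certify(self, certifying_user):
        # rule 2: only a trusted certifier may mark it executable
        if certifying_user not in TRUSTED_CERTIFIERS:
            raise PermissionError(f"{certifying_user} is not trusted to certify executables")
        self.obj_type = "executable"

def execute(obj):
    # rule 3: the system runs only objects typed "executable"
    if obj.obj_type != "executable":
        raise PermissionError(f"{obj.name} is not certified as executable")
    print(f"running {obj.name}")

prog = TypedObject("payroll", contents=b"...")
# execute(prog)            # refused: still typed "data"
prog.certify("certifier")  # trusted certifier retypes it
execute(prog)              # now allowed
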
  9. Practice: blocking writing to communicate information or do damage
    1. Limit writing (use MAC if available; show how to arrange system executables; a permission-checking sketch follows this list)
    2. Isolation
    3. Quarantine
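
One way to check how system executables are arranged, sketched in Python under UNIX-like assumptions; the directory list is illustrative. The point is that a trusted program writable by group or others can be replaced by an untrusted subject:

# Walk the usual system binary directories and flag anything writable by group or
# others, which would let untrusted subjects modify trusted programs.

import os
import stat

SYSTEM_DIRS = ["/bin", "/usr/bin", "/sbin", "/usr/sbin"]   # typical locations; adjust as needed

def writable_by_others(path):
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IWGRP | stat.S_IWOTH))

for directory in SYSTEM_DIRS:
    if not os.path.isdir(directory):
        continue
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and writable_by_others(path):
            print(f"warning: {path} is writable by group or others")
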
  10. Practice: Trust
    1. Untrusted software: what is it, example (USENET)
    2. Check source, programs (what to look for); C examples
    3. Limit who has access to what
    4. Your environment (how do you know what you're executing?); UNIX examples, with a PATH-resolution sketch after this list
    5. Least privilege; revisit the above when running as root
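
A small Python sketch of the environment question under UNIX-like assumptions: resolve a command name by searching PATH in order, as a shell would, and warn about entries such as "." that let a Trojan horse in the current directory shadow the real program:

# Show which file would actually run for a given command name, and warn about
# PATH entries that cause the current directory to be searched.

import os

def resolve(command):
    for entry in os.environ.get("PATH", "").split(os.pathsep):
        if entry in ("", "."):
            print(f"warning: PATH contains {entry!r}; current directory is searched")
        candidate = os.path.join(entry or ".", command)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

print(resolve("ls"))   # shows which 'ls' would actually run with this PATH
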
  11. Practice: detecting writing
    1. Integrity check files à la binaudit and tripwire; go through the signature block (a checksum sketch follows this list)
    2. LOCUS approach: encipher program, decipher as you execute.
    3. Co-processors: checksum each sequence of instructions, compute checksum as you go; on difference, complain
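
A minimal sketch, in Python, of integrity checking in the style of binaudit and tripwire; the monitored file list, database name, and choice of SHA-256 are illustrative: record a checksum for each watched file, recompute later, and complain on any difference:

# Record a baseline checksum for each monitored file, then recompute later and
# report any file that has changed or gone missing.

import hashlib
import json
import os

MONITORED = ["/bin/ls", "/bin/sh"]       # files to watch; adjust as needed
DATABASE = "integrity.db"                # where the baseline checksums are kept

def checksum(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def record_baseline():
    baseline = {p: checksum(p) for p in MONITORED if os.path.isfile(p)}
    with open(DATABASE, "w") as f:
        json.dump(baseline, f)

def verify():
    with open(DATABASE) as f:
        baseline = json.load(f)
    for path, old in baseline.items():
        new = checksum(path) if os.path.isfile(path) else None
        if new != old:
            print(f"INTEGRITY FAILURE: {path} has changed or is missing")

# record_baseline()   # run once on a known-good system
# verify()            # run later to detect modifications
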

