- Number of slides: 22
Point-Based Trust: Define How Much Privacy is Worth
Danfeng Yao (Brown University), Keith B. Frikken (Miami University), Mikhail J. Atallah (Purdue University), Roberto Tamassia (Brown University)
Funded by NSF IIS-0325345, IIS-0219560, IIS-0312357, and IIS-0242421, ONR N00014-02-1-0364, CERIAS, and Purdue Discovery Park
ICICS, December 2006, Raleigh, NC
Outline of the talk
1. Introduction to privacy protection in authorization
2. Point-based authorization and optimal credential selection
   2.1 New York State Division of Motor Vehicles 6-point authentication system
   2.2 Knapsack problem
3. Secure 2-party protocol for the knapsack problem
4. Applications
Protecting private information
(Figure: Alice requests a discount from a provider. The provider's discount policy requires her UID credential, a student ID; Alice's policy says that releasing her UID requires the provider's BBB credential, issued by the Better Business Bureau. Once both credentials are exchanged, the discount is granted.)
Trust negotiation protocols [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03]
Our goals
- Prevent premature information leaking by both parties: credentials should be exchanged only if the service can actually be established
- Support cumulative privacy quantitatively: disclosing more credentials should incur higher privacy loss
- Support a flexible service model: allow customized (or personalized) access policies and adjustable services based on qualifications
Our ultimate goal is to encourage users to participate in e-commerce
What can we learn from the New York State DMV?
6-point proof of identity for getting a NY driver's license:
Credential             Points
Passport               5
Utility bill           1
Birth certificate      4
Social security card   3
Another motivation: adjustable services
Membership / Credential    Discount
Mastercard                 2%
Airline frequent flier     1%
AAA                        0.5%
Veteran                    0.5%
Adjustable services based on the private information revealed
Point-based authorization model
- Credential types C_1, C_2, ..., C_n
- The service provider defines point values p_1, p_2, ..., p_n for the credentials (private) and a threshold T for accessing a resource (private)
- The user defines sensitivity scores a_1, a_2, ..., a_n for the credentials (private)
Credential selection problem: the user (or client) wants to satisfy threshold T with the minimum disclosure of privacy:
  Minimize sum_{i=1}^{n} a_i x_i
  subject to sum_{i=1}^{n} p_i x_i >= T,
  where x_i = 1 means "disclose C_i" and x_i = 0 means "do not disclose C_i".
This can be converted to a knapsack problem.
Example
Threshold for accessing the resource: T = 10

Credential         Point value   Sensitivity score
College ID         3             10
Driver's license   6             30
Credit card        8             50
SSN                10            100

Alice's options:
Credentials disclosed           Points   Sensitivity
College ID, Credit card         11       60
Driver's license, Credit card   14       80
SSN                             10       100
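For concreteness, here is a minimal brute-force sketch of the credential selection problem on the numbers above (names and values taken from the example table; the protocol itself never enumerates subsets in the clear, this only reproduces Alice's best option):

```python
from itertools import combinations

# Illustrative data from the example table: (name, point value, sensitivity score)
credentials = [
    ("College ID", 3, 10),
    ("Driver's license", 6, 30),
    ("Credit card", 8, 50),
    ("SSN", 10, 100),
]
T = 10  # access threshold set by the provider

best = None
for r in range(len(credentials) + 1):
    for subset in combinations(credentials, r):
        points = sum(p for _, p, _ in subset)
        sensitivity = sum(a for _, _, a in subset)
        if points >= T and (best is None or sensitivity < best[0]):
            best = (sensitivity, [name for name, _, _ in subset])

print(best)  # (60, ['College ID', 'Credit card'])
```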
Where do points come from?
- Reputation systems [Beth Borcherding Klein 94, Tran Hitchens Varadharajan Watters 05, Zouridaki Mark Hejmo Thomas 05]
- This is future work, but here is an idea
(Figure: providers and communities evaluate one another through membership relations)
Converting the credential selection problem (CSP) into a knapsack problem
- Define a binary vector y_1, y_2, ..., y_n, where y_i = 1 - x_i
- {a_i}: private to the user; {p_i}: private to the provider
Maximize sum_{i=1}^{n} a_i y_i
subject to sum_{i=1}^{n} p_i y_i <= T', where T' = sum_{i=1}^{n} p_i - T
(Figure: a knapsack of capacity T' with n = 6 items; what to pick and "steal"?)
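A small sanity check of the complement transformation on the running example; the variable names and the chosen disclosure are illustrative:

```python
credentials = [("College ID", 3, 10), ("Driver's license", 6, 30),
               ("Credit card", 8, 50), ("SSN", 10, 100)]
T = 10
T_prime = sum(p for _, p, _ in credentials) - T   # marginal threshold: 27 - 10 = 17

# x_i = 1 means "disclose C_i"; y_i = 1 - x_i means "keep C_i private"
x = [1, 0, 1, 0]                                  # Alice discloses College ID and Credit card
y = [1 - xi for xi in x]

disclosed_points = sum(p * xi for (_, p, _), xi in zip(credentials, x))
kept_points = sum(p * yi for (_, p, _), yi in zip(credentials, y))

# sum p_i x_i >= T holds exactly when sum p_i y_i <= T'
assert disclosed_points >= T and kept_points <= T_prime
```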
Dynamic programming for the 0/1 knapsack problem
- Construct an n-by-T' table M, where T' = sum_{i=1}^{n} p_i - T and
  M_{i,j} = M_{i-1,j}                                  if j < p_i
  M_{i,j} = max { M_{i-1,j}, M_{i-1,j-p_i} + a_i }     if j >= p_i
- {a_i}: private to the user; {p_i}: private to the provider
(Figure: entry M_{i,j} is computed from M_{i-1,j} and M_{i-1,j-p_i})
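A plain, non-private sketch of this recurrence on the running example (in the actual protocol each entry is computed jointly by the two parties and stored only in encrypted form):

```python
def fill_table(p, a, T_prime):
    """Fill the 0/1 knapsack table following the recurrence above.
    Indices are shifted by one: row 0 / column 0 are the empty base cases."""
    n = len(p)
    M = [[0] * (T_prime + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(T_prime + 1):
            if j < p[i - 1]:
                M[i][j] = M[i - 1][j]
            else:
                M[i][j] = max(M[i - 1][j], M[i - 1][j - p[i - 1]] + a[i - 1])
    return M

# Running example: point values, sensitivity scores, T' = (3 + 6 + 8 + 10) - 10 = 17
p, a, T_prime = [3, 6, 8, 10], [10, 30, 50, 100], 17
M = fill_table(p, a, T_prime)
print(M[4][17])  # 130: keep Driver's license and SSN private, disclose College ID and Credit card
```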
Overview of the privacy-preserving knapsack computation
- Uses the 2-party maximization protocol of [Frikken Atallah 04]
- Uses a homomorphic encryption scheme: E(x)E(y) = E(x + y) and E(x)^c = E(xc)
- Preserves privacy for both parties
- Two phases: table filling and traceback
- To make the two branches of the recurrence indistinguishable, both are written as a maximization followed by an addition of a_i:
  M_{i,j} = max { M_{i-1,j}, -infinity + a_i }          if j < p_i
  M_{i,j} = max { M_{i-1,j}, M_{i-1,j-p_i} + a_i }      if j >= p_i
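The slide only requires the two homomorphic properties above; a common scheme with these properties is Paillier. The following toy sketch (tiny hard-coded primes, Python 3.9+ for math.lcm and modular-inverse pow) is purely illustrative and is an assumption, not the scheme prescribed by the paper; real deployments would use a full-size, vetted implementation:

```python
import math, random

def keygen(p=1009, q=1013):
    """Toy Paillier key pair with tiny primes, for illustration only."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam % n, -1, n)          # valid because we use the simplification g = n + 1
    return (n, n + 1), (lam, mu)      # public key (n, g), secret key (lam, mu)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

pk, sk = keygen()
cx, cy = encrypt(pk, 20), encrypt(pk, 22)
assert decrypt(pk, sk, cx * cy % (pk[0] ** 2)) == 42   # E(x)E(y) = E(x + y)
assert decrypt(pk, sk, pow(cx, 3, pk[0] ** 2)) == 60   # E(x)^c = E(xc)
```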
Preliminary: 2-party maximization protocol in a split format
Player    Input                Output                   Privacy
Alice     Alice_1, Alice_2     Alice's share of max*    Does not learn which input is the max
Amazon    Amazon_1, Amazon_2   Amazon's share of max*   Does not learn which input is the max
* Alice's share + Amazon's share = max(Alice_1 + Amazon_1, Alice_2 + Amazon_2)
Comparison can be done similarly [Frikken Atallah 04]
Our protocol for the dynamic programming of the 0/1 knapsack problem
Recall M_{i,j} = max { M_{i-1,j}, -infinity + a_i } if j < p_i, and max { M_{i-1,j}, M_{i-1,j-p_i} + a_i } if j >= p_i.
- Computed entries are encrypted and stored by the provider
- To compute each entry M_{i,j}:
  1. The provider splits the two candidates for M_{i,j} into shares
  2. The client and the provider run the 2-party private maximization protocol to compute shares of the maximum
  3. The client encrypts her share of the maximum and sends it to the provider
  4. The provider computes and stores the encrypted M_{i,j}
(Figure: E(M_{i-1,j-p_i}) and E(M_{i-1,j}) are split between the client (Alice) and the provider (Amazon), who then compute shares of the maximum)
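A rough sketch of the splitting step, assuming the provider blinds each candidate with a fresh random value r; in the real protocol the provider holds only E(v) and produces the client's share homomorphically as E(v - r) = E(v) * E(-r), while here the arithmetic is shown in the clear. The modulus, values, and the plaintext recombination below are illustrative stand-ins, not the Frikken-Atallah maximization itself:

```python
import random

MOD = 2 ** 40   # illustrative modulus; the real protocol works on values encrypted under Paillier

def provider_split(v, rng=random):
    """Split a candidate value into additive shares (client_share + r = v mod MOD)."""
    r = rng.randrange(MOD)            # provider keeps r as his share
    client_share = (v - r) % MOD      # the client learns only this blinded value
    return client_share, r

# The two candidates for M[i][j] when j >= p_i (illustrative numbers):
M_prev_j, M_prev_j_pi, a_i = 70, 50, 30
candidates = [M_prev_j, M_prev_j_pi + a_i]

shares = [provider_split(v) for v in candidates]
# The split maximization protocol of [Frikken Atallah 04] returns shares of the maximum
# without revealing which candidate won; as an insecure stand-in we recombine and compare:
recombined = [(c + r) % MOD for c, r in shares]
assert max(recombined) == max(candidates) == 80
# The client then encrypts her output share, and the provider homomorphically adds his own
# share, obtaining E(M[i][j]) without either party learning M[i][j] in the clear.
```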
Our protocol for the knapsack problem (cont'd)
At the end of the 2-party dynamic programming, the provider holds an n-by-T' table of encrypted entries, where T' = sum_{i=1}^{n} p_i - T (here the number of credentials is n = 4):

E(M_1,1)  E(M_1,2)  E(M_1,3)  E(M_1,4)  E(M_1,5)
E(M_2,1)  E(M_2,2)  E(M_2,3)  E(M_2,4)  E(M_2,5)
E(M_3,1)  E(M_3,2)  E(M_3,3)  E(M_3,4)  E(M_3,5)
E(M_4,1)  E(M_4,2)  E(M_4,3)  E(M_4,4)  E(M_4,5)

How does the client find out the optimal selection of credentials?
Traceback protocol: get the optimal credential selection
Each entry also stores an encrypted flag E(F_{i,j}), where F_{i,j} = 0 or 1:

Item 0:  0                    0                    0                    0                    0
Item 1:  E(M_1,1), E(F_1,1)   E(M_1,2), E(F_1,2)   E(M_1,3), E(F_1,3)   E(M_1,4), E(F_1,4)   E(M_1,5), E(F_1,5)
Item 2:  E(M_2,1), E(F_2,1)   E(M_2,2), E(F_2,2)   E(M_2,3), E(F_2,3)   E(M_2,4), E(F_2,4)   E(M_2,5), E(F_2,5)
Item 3:  E(M_3,1), E(F_3,1)   E(M_3,2), E(F_3,2)   E(M_3,3), E(F_3,3)   E(M_3,4), E(F_3,4)   E(M_3,5), E(F_3,5)
Item 4:  E(M_4,1), E(F_4,1)   E(M_4,2), E(F_4,2)   E(M_4,3), E(F_4,3)   E(M_4,4), E(F_4,4)   E(M_4,5), E(F_4,5)

Security in a semi-honest (honest-but-curious) model
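A plain, non-private sketch of the table-with-flags structure, assuming the standard knapsack convention that F[i][j] = 1 when item i is taken at entry (i, j); in the actual protocol these flags exist only as encryptions E(F_{i,j}), so this sketch shows only the table walk, not the cryptography. The data reuses the running example:

```python
def fill_with_flags(p, a, T_prime):
    """Fill M and, alongside it, a flag table F marking whether item i is taken at (i, j)."""
    n = len(p)
    M = [[0] * (T_prime + 1) for _ in range(n + 1)]
    F = [[0] * (T_prime + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(T_prime + 1):
            M[i][j] = M[i - 1][j]
            if j >= p[i - 1] and M[i - 1][j - p[i - 1]] + a[i - 1] > M[i][j]:
                M[i][j] = M[i - 1][j - p[i - 1]] + a[i - 1]
                F[i][j] = 1
    return M, F

def traceback(F, p):
    """Walk back through the flags to recover which items ended up in the knapsack."""
    kept, j = [], len(F[0]) - 1
    for i in range(len(p), 0, -1):
        if F[i][j]:
            kept.append(i)           # item i is in the knapsack (kept private)
            j -= p[i - 1]
    return sorted(kept)

p, a, T_prime = [3, 6, 8, 10], [10, 30, 50, 100], 17
M, F = fill_with_flags(p, a, T_prime)
print(traceback(F, p))   # [2, 4]: keep Driver's license and SSN private, disclose the rest
```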
Security and efficiency of our privacy-preserving knapsack computation
- Informally, security means that private information is not leaked
- Security definitions: semi-honest adversarial model; a protocol securely implements a function f if the views of the participants are simulatable given an ideal implementation of f
Theorem. The basic protocol of the private two-party dynamic programming computation in the point-based trust management model is secure in the semi-honest adversarial model.
Theorem. The communication complexity between the provider and the client of our basic secure dynamic programming protocol is O(nT'), where n is the total number of credentials and T' is the marginal threshold.
Fingerprint protocol: an improved traceback protocol
We want to exclude the provider from the traceback, both to prevent tampering and to reduce costs:
1. Fill the knapsack table
2. The provider sends the (encrypted) last entry to the client
3. The client decrypts it and identifies the optimal credential selection
The fingerprint protocol is a general solution for traceback in dynamic programming
Fingerprint protocol (cont'd)
Item No.   Privacy score (decimal)   Privacy score (binary)   Transformed score
1          2                         010                      0001
2          3                         011                      0010
3          5                         101                      0100
4          8                         1000                     1000

Knapsack result (decimal)   Knapsack result (binary)   Item numbers in the knapsack
3                           ... 0010                   2
20                          ... 1111                   1, 2, 3, 4
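A sketch of one plausible reading of the table above: each item contributes a distinct indicator bit (the "transformed score" column) below its privacy score, so the low-order bits of the optimal knapsack value name exactly the items in the knapsack. The exact encoding below (score shifted left by n bits, indicator bit OR-ed in) is an assumption for illustration, not a quote of the paper's construction:

```python
def add_fingerprint(scores):
    """Append one indicator bit per item below the privacy score:
    item i contributes 2**(i-1) in the low-order n bits."""
    n = len(scores)
    return [(s << n) | (1 << i) for i, s in enumerate(scores)]

def decode_items(result, n):
    """The low n bits of the optimal value identify the items in the knapsack."""
    bits = result & ((1 << n) - 1)
    return [i + 1 for i in range(n) if bits & (1 << i)]

scores = [2, 3, 5, 8]                      # privacy scores from the table
transformed = add_fingerprint(scores)      # low bits: 0001, 0010, 0100, 1000

# If the optimal selection keeps items 2 and 3, the value carries the fingerprint 0110:
value = transformed[1] + transformed[2]
print(decode_items(value, len(scores)))    # [2, 3]
```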
Application of point-based authorization: fuzzy location queries in presence systems
(Figure: Alice's mom, Alice's boss, and Alice's ex each ask "Where is Alice?"; the precision of the answer can be adjusted to who is asking.)
Related work
- Hidden credentials [Bradshaw Holt Seamons 04, Frikken Li Atallah 06]
- Private policy negotiation [Kursawe Neven Tuyls 06]
- Optimizing trust negotiation [Chen Clarke Kurose Towsley 05]
- Trust negotiation protocols/frameworks [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03, Li Li Winsborough 05]
- Anonymous credential approaches [Chaum 85, Camenisch Lysyanskaya 01]
- Secure multiparty computation [Atallah Li 04, Atallah Du 01]
- OCBE [Li Li 06]
- MANETs [Zouridaki Mark Hejmo Thomas 05]
- Platform for Privacy Preferences (P3P) [W3C]
Conclusions and future work
- Our point-based model allows a client to choose the optimal selection of credentials
- We presented a private 2-party protocol for the knapsack problem
- Our fingerprint protocol is a general solution for traceback
Future work:
- Add typing to credentials
- Reputation systems and points