
Topics in Computer Security: Introduction: PETs and TETs
Simone Fischer-Hübner

Overview
I. Introduction to Privacy
II. Introduction to PETs
III. Transparency Enhancing Tools
IV. Anonymous Communication Technologies & TOR
V. Private Information Retrieval

I. Introduction to Privacy: Privacy Dimensions
• Informational self-determination
• Spatial privacy

Basic Privacy Principles (implemented in EU Directive 95/46/EC)
• Legitimisation by law, informed consent (Art. 7 EU Directive)
• Data minimisation and avoidance (Art. 6 I c, Art. 7)
• Purpose specification and purpose binding (Art. 6 I b)
• ”Non-sensitive” data do not exist!

Example of Purpose Misuse
• Lidl Video Monitoring Scandal

Basic Privacy Principles (II)
• Transparency, rights of data subjects
• Supervision (Art. 28) and sanctions (Art. 24)
• Requirement of security mechanisms (Art. 17)

EU Directive 2002/58/EC on Privacy and Electronic Communications
Location data other than traffic data (Art. 9):
• May only be processed when made anonymous, or with the informed consent of the user/subscriber
• Where consent has been obtained, the user/subscriber must still have the possibility of temporarily refusing the processing of location data

Privacy Challenges of Emerging Technologies ...
• Global networks, cookies, web bugs, spyware, ...
• Location-based services (LBS)
• Ambient Intelligence, RFID
• Biometrics ...

Privacy Risks of Social Networks - Facebook
• Intimate personal details about social contacts, personal life, etc.
• Not only accessible by ”friends”
• The Internet never forgets completely ...

Privacy Risks of Social Networks – Facebook (II)

Privacy Risks of Social Networks – Facebook Beacons

II. Introduction to PETs: Need for Privacy-Enhancing Technologies
• Law alone is not sufficient for protecting privacy in our Network Society
• PETs are needed for implementing the law
• PETs empower users to exercise their rights

Classifications of PETs
1. PETs for minimizing/avoiding personal data (-> Art. 6 I c, e EU Directive 95/46/EC), providing anonymity, pseudonymity, unobservability, unlinkability
   • At communication level: Mix nets, Onion Routing, TOR, DC nets, Crowds
   • At application level: Anonymous Ecash, Private Information Retrieval, Anonymous Credentials
2. PETs for the safeguarding of lawful processing (-> Art. 17 EU Directive 95/46/EC)
   • P3P
   • Privacy policy languages
   • Transparency Enhancing Tools (TETs)
3. Combination of 1 & 2
   • Privacy-enhanced Identity Management

III. Transparency Enhancing Tools
Directive 95/46/EC - Transparency
• Art. 6: personal data must be processed fairly and lawfully
• Recital No. 38: the data subject must be given accurate and full information
• Art. 10/11: the controller must provide information on:
  • the identity of the controller / representative
  • the purposes of the processing
  • any further information (recipients, whether replies are obligatory or voluntary, consequences of failure to reply, existence of the right of access / to rectify the data)
• Art. 12(a): right of access (obtain information from the controller about e.g. data processing, purposes, recipients, etc.)
• Art. 12(b): right to rectification, blocking, deletion
• Art. 14: ensure that data subjects are aware of the existence of the right to object to e.g. data processing for direct marketing

Flash Eurobarometer 2003 Survey
• 37% of companies said they systematically provide data subjects with the identity of the data controller
• 46% said they always informed data subjects of the purpose for which the data would be used
• 42% of EU citizens are aware that those collecting personal information are obliged to provide individuals with certain information (such as at least their identity and the purpose of the data collection)

Transparency Enhancing Tools - Example: “Data Track” in PRIME
The ”Data Track” provides:
• A user-side database with a user-friendly search function for transaction records (incl. data, pseudonyms, credentials, timestamp, policy)
• Online functions for exercising rights
• Advanced search

Online Functions for Exercising Rights
• Problem: users do not know their privacy rights and do not exercise them
• Can online functions help to overcome this threshold and raise trust?

Issues to be addressed at the user side
• Authentication for a digital identity - not straightforward if pseudonyms were used
• An access request should not reveal more than the service provider already knows

Issues to be addressed at the service side
• Service-side automated response support is needed
• Laws might need to be updated to allow online requests (e.g. in Sweden, the PUL only provides the right to access data once a year)
• Service-side transparency and accountability tools need to be privacy-enhanced

Example: E-Government Transparency Service MyPage/MinSide
• Provides full transparency, but could also be used as a perfect profiling tool

Accountability vs. Privacy
• For transparency of data use / accountability, ”policy-aware” transaction logs are needed, which however contain personal data about users and data subjects
• Appropriate protection schemes for logs are needed (access control, pseudonymisation, ...)

IV. Anonymous Communication Technologies
Definitions - Anonymity
• Anonymity: the state of being not identifiable within a set of subjects (e.g. the set of senders or recipients), the anonymity set
Source: Pfitzmann/Hansen

Definitions - Unobservability
• Unobservability ensures that a user may use a resource or service without others being able to observe that the resource or service is being used
Source: Pfitzmann/Hansen

Definitions - Unlinkability
• Unlinkability of two or more items (e.g. subjects, messages, events): within the system, from the attacker’s perspective, these items are no more and no less related after the attacker’s observation than they were before
• Unlinkability of sender and recipient (relationship anonymity): it is untraceable who is communicating with whom

Definitions - Pseudonymity
• Pseudonymity is the use of pseudonyms as IDs
• Pseudonymity allows providing both privacy protection and accountability
• Pseudonym types, ordered by decreasing linkability: person pseudonym, role pseudonym, relationship pseudonym, role-relationship pseudonym, transaction pseudonym
Source: Pfitzmann/Hansen

Definitions - Pseudonymity (cont.)
Source: Pfitzmann/Hansen

Mix-nets (Chaum, 1981)
[Figure: Alice sends msg to Bob via Mix 1, Mix 2 and Mix 3. She prepares the nested encryption EK1(A2, r1, EK2(A3, r2, EK3(Bob, r3, msg))); each mix peels off one layer and forwards the remainder to the next hop.]
Ki: public key of Mix i, ri: random number, Ai: address of Mix i

Functionality of a Mix Server (Mix i)
• Input message Mi
• Discard repeated messages (checked against the message DB)
• Collect messages in a batch or pool (sufficient messages from many senders?)
• Change the outward appearance *)
• Reorder
• Output message Mi+1 to Mix i+1
*) Mix i decrypts Mi = EKi[Ai+1, ri, Mi+1] with its private key, ignores the random number ri, and obtains the address Ai+1 and the encrypted Mi+1
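
As a rough illustration of the batching logic above, here is a minimal, non-cryptographic Python sketch. The Mix class, the peel() helper and the batch size of 3 are illustrative assumptions; a real mix would perform actual public-key decryption and re-encoding instead of unpacking the structural tuples used here.

```python
import random

# A "ciphertext" is modelled structurally as ("enc", key_id, payload), where
# payload = (next_address, random_number, inner_message); peel() stands in for
# decryption with the mix's private key.

BATCH_SIZE = 3  # flush only once enough messages from many senders have arrived


def peel(ciphertext, key_id):
    """Stand-in for decrypting one layer of Mi = EKi[Ai+1, ri, Mi+1]."""
    tag, kid, payload = ciphertext
    assert tag == "enc" and kid == key_id, "wrong key / malformed message"
    return payload


class Mix:
    def __init__(self, key_id):
        self.key_id = key_id
        self.seen = set()   # message DB used to discard repeated messages
        self.pool = []      # batch / pool of pending messages

    def receive(self, ciphertext):
        if ciphertext in self.seen:        # discard replays
            return []
        self.seen.add(ciphertext)
        self.pool.append(ciphertext)
        if len(self.pool) < BATCH_SIZE:    # keep collecting
            return []
        return self.flush()

    def flush(self):
        batch, self.pool = self.pool, []
        out = []
        for c in batch:
            next_addr, r, inner = peel(c, self.key_id)  # change appearance, drop r
            out.append((next_addr, inner))
        random.shuffle(out)                # reorder before forwarding
        return out                         # each inner message goes to next_addr
```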

Why are random numbers needed?
If no random number ri is used, an observer who sees EKi(M, Ai+1) entering Mix i and M leaving towards address Ai+1 can re-encrypt a guessed message under the mix’s public key and test whether EKi(M, Ai+1) =? the observed input, thereby linking the mix’s input and output.
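
A short sketch of this guessing attack, assuming deterministic public-key encryption; the hash function below is only a stand-in for EKi, and the message and address values are made up for illustration.

```python
import hashlib

def E_deterministic(public_key_id, message, next_address):
    """Stand-in for a deterministic EKi(M, Ai+1) with no random number."""
    data = f"{public_key_id}|{message}|{next_address}".encode()
    return hashlib.sha256(data).hexdigest()

# Ciphertext observed entering Mix_i, plus the plaintext observed leaving it.
observed_input = E_deterministic("K_i", "meet at noon", "A_i+1")

guess = "meet at noon"
if E_deterministic("K_i", guess, "A_i+1") == observed_input:
    print("guess confirmed: the mix's input and output are linked")

# If a fresh random r_i were included under EKi (and discarded by the mix),
# the attacker could not reproduce the ciphertext and the comparison would fail.
```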

Sender Anonymity with Mix-nets
The sender (Alice) chooses a mix sequence Mix 1, …, Mix n+1, where Mix n+1 = recipient.
• Ai (i = 1..n+1): address of Mix i
• Ki (i = 1..n+1): public key of Mix i
• zi: random bit strings
• M: message for the recipient
• Mi: message that Mix i will receive
The sender prepares her message:
Mn+1 = EKn+1(M)
Mi = EKi(zi, Ai+1, Mi+1) for i = 1…n
and sends M1 to Mix 1.

Sender Anonymity with Mix-nets (cont.)
Sender (Alice) -> Mix 1 -> Mix 2 -> Mix 3 -> Recipient (Bob): Alice sends M1 = EK1(z1, A2, M2) to Mix 1; the recipient finally receives EKn+1(M).
Each Mix i decrypts EKi(zi, Ai+1, Mi+1) to obtain
• Ai+1: address of the next mix
• Mi+1 = EKi+1(zi+1, Ai+2, Mi+2): the encoded message for Mix i+1
• zi: random string, to be discarded
and forwards Mi+1 to Mix i+1.
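
A structural Python sketch of this layering, assuming a three-mix route; E()/D() are stand-ins for public-key encryption and decryption (tagged tuples rather than real crypto), and the addresses, key identifiers and message are illustrative.

```python
import os

def E(key_id, payload):            # "encrypt" payload for the holder of key_id
    return ("enc", key_id, payload)

def D(key_id, ciphertext):         # "decrypt" with the matching private key
    tag, kid, payload = ciphertext
    assert tag == "enc" and kid == key_id
    return payload

route = [("Mix1", "K1"), ("Mix2", "K2"), ("Mix3", "K3")]
recipient_addr, recipient_key = "Bob", "K_Bob"
message = "hello Bob"

# M_{n+1} = E_{K_{n+1}}(M);  M_i = E_{K_i}(z_i, A_{i+1}, M_{i+1})
wrapped, next_addr = E(recipient_key, message), recipient_addr
for addr, key in reversed(route):
    z = os.urandom(8)              # random bit string z_i
    wrapped, next_addr = E(key, (z, next_addr, wrapped)), addr

# The sender sends M_1 to Mix_1; each mix peels one layer, discards z_i,
# learns only the next hop and forwards M_{i+1}.
current_addr, current = next_addr, wrapped
for addr, key in route:
    assert current_addr == addr
    z, current_addr, current = D(key, current)

print("recipient", current_addr, "reads:", D(recipient_key, current))
```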

Recipient Anonymity with Mix-nets
Recipient Bob chooses a mix sequence Mix 1, …, Mix m and creates an anonymous return address RA:
Rm+1 = e
Rj = Ekj(cj, Aj+1, Rj+1) for j = 1..m
RA = (c0, A1, R1)
• e: label of the return address
• cj: symmetric key, used by Mix j to encode the message on the return path
• Aj (j = 1..m): address of Mix j
• kj (j = 1..m): public key of Mix j
• zj: random bit strings
Recipient Bob sends RA anonymously to sender Alice (Bob -> Mix m -> … -> Mix 1 -> Alice), wrapped as Ekm(zm, Am-1, Ekm-1(… EK1(z1, A0, RA) …)).

Recipient Anonymity with Mix-nets (cont.)
Sender Alice replies by sending (c0(M), R1) to Mix 1; Bob finally receives (cm(cm-1(… c0(M) …)), e).
Each Mix j receives (cj-1(… c0(M) …), Rj), decrypts Rj = Ekj(cj, Aj+1, Rj+1) to obtain (cj, Aj+1, Rj+1), and forwards (cj(cj-1(… c0(M) …)), Rj+1) to Mix j+1.
The label e indicates to Bob which keys c0, …, cm he has to use to decrypt M.
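
A structural sketch of the return-address scheme for a two-mix route; the tuple-based E()/D() and sym() stand in for real public-key and symmetric encryption, and all key names, addresses and the label are illustrative assumptions.

```python
def E(key_id, payload):            # public-key layer for Mix_j
    return ("enc", key_id, payload)

def D(key_id, ct):
    tag, kid, payload = ct
    assert tag == "enc" and kid == key_id
    return payload

def sym(c, data):                  # symmetric encoding with key c
    return ("sym", c, data)

# --- Bob builds RA for the return route Mix1 -> Mix2 -> Bob ---
addresses = ["Mix1", "Mix2", "Bob"]          # A_1, A_2, A_3 (= Bob)
pub_keys  = ["k1", "k2"]                     # public keys of Mix1, Mix2
sym_keys  = ["c1", "c2"]                     # symmetric keys chosen by Bob
c0, e_label = "c0", "reply-42"               # label e tells Bob which keys to use

R = e_label                                  # R_{m+1} = e
for j in range(len(pub_keys), 0, -1):        # j = m .. 1
    R = E(pub_keys[j - 1], (sym_keys[j - 1], addresses[j], R))
RA = (c0, addresses[0], R)                   # RA = (c_0, A_1, R_1)

# --- Alice replies anonymously with message M ---
c_first, first_addr, R_cur = RA
packet = (sym(c_first, "reply from Alice"), R_cur)   # (c_0(M), R_1) sent to A_1

# --- each Mix_j re-encodes the body and forwards to A_{j+1} ---
for j in range(1, len(pub_keys) + 1):
    body, R_cur = packet
    c_j, next_addr, R_next = D(pub_keys[j - 1], R_cur)
    packet = (sym(c_j, body), R_next)        # forwarded to next_addr

# --- Bob receives (c_m(...c_0(M)...), e) and strips c_m, ..., c_0 ---
body, label = packet
assert label == e_label
for c in reversed([c0] + sym_keys):
    tag, key, body = body
    assert tag == "sym" and key == c
print("Bob reads:", body)
```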

Two-Way Anonymous Conversation

Existing Mix-based Systems for HTTP (real-time)
• Simple proxies:
  • Anonymizer.com
  • ProxyMate.com
• Mix-based systems considering traffic analysis:
  • Onion Routing (Naval Research Laboratory)
  • TOR (Free Haven project)
  • JAP (TU Dresden, ”Mix Cascade”)

Onion Routing
• Onion = object with layers of public-key encryption, used to produce an anonymous bi-directional virtual circuit between communication partners and to distribute symmetric keys
• The initiator’s proxy constructs a “forward onion” which encapsulates a route to the responder
• (Faster) symmetric encryption is used for data communication via the circuit
[Figure: nested onion layers, each layer naming the next hop on the route]

Forward onion for the route W-X-Y-Z:
{exp-time_x, Y, Ffx, Kfx, Fbx, Kbx, {exp-time_y, Z, Ffy, Kfy, Fby, Kby, {exp-time_z, NULL, Ffz, Kfz, Fbz, Kbz, PADDING} PKZ } PKY } PKX
Each node N receives (PKN = public key of node N): {exp-time, next-hop, Ff, Kf, Fb, Kb, payload} PKN
• exp-time: expiration time
• next-hop: next routing node
• (Ff, Kf): function / key pair for symmetric encryption of data moving forward in the virtual circuit
• (Fb, Kb): function / key pair for symmetric encryption of data moving backwards in the virtual circuit
• payload: another onion (or NULL for the responder’s proxy)
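
A structural sketch of how such a forward onion could be built and peeled for the route above; the tagged-tuple E()/D(), the node names, key identifiers and the one-hour expiration time are illustrative assumptions (and unlike this sketch, real onions also pad each layer to a constant size).

```python
import os
import time

def E(pub_key_id, payload):        # stand-in for encryption under PK_N
    return ("enc", pub_key_id, payload)

def D(pub_key_id, ct):             # stand-in for decryption with node N's key
    tag, kid, payload = ct
    assert tag == "enc" and kid == pub_key_id
    return payload

route = [("X", "PK_X"), ("Y", "PK_Y"), ("Z", "PK_Z")]   # built by W's proxy
exp_time = time.time() + 3600

# Build the onion from the innermost layer (responder's proxy Z) outwards.
onion, next_hop = None, None                 # payload and next hop are NULL at Z
for node, pk in reversed(route):
    layer = {
        "exp_time": exp_time,
        "next_hop": next_hop,
        "forward_key": os.urandom(16),       # (Ff, Kf) for forward data
        "backward_key": os.urandom(16),      # (Fb, Kb) for backward data
        "payload": onion,                    # another onion, or NULL
    }
    onion, next_hop = E(pk, layer), node

# Each node peels one layer, keeps its key pairs and learns only the next hop.
current = onion
for node, pk in route:
    layer = D(pk, current)
    print(node, "-> next hop:", layer["next_hop"])
    current = layer["payload"]
```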

Onion Routing - Building up the Virtual Circuit
The create command is accompanied by the onion:
• When a node receives an onion, it peels off one layer, keeps the forward/backward encryption keys, chooses a virtual circuit (vc) identifier and sends the create command + vc identifier + (the rest of) the onion to the next hop.
• It stores the vc identifier it received and the one it sent out as a pair.
• Until the circuit is destroyed: whenever it receives data on one connection, it sends it off on the other.
• Forward encryption is applied to data moving in the forward direction, backward encryption in the backward direction.

Example: Virtual Circuit with Onion Routing
Data is sent by use of the send command: data sent by the initiator is ”pre-encrypted” repeatedly by its proxy. When W receives data sent back by the last node Z, it applies the inverse of the backward cryptographic operations (outermost first).

Onion Routing - Review
• Functionality:
  • Hiding of routing information in connection-oriented communication relations
  • Nested public-key encryption for building up the virtual circuit
  • The expiration_time field reduces the cost of replay detection
  • Dummy traffic between mixes (onion routers)
• Limitations:
  • First/last-hop attacks based on
    • timing correlations
    • message length (number of cells sent over a circuit)

TOR (2nd Generation Onion Router)

First Step
• The TOR client obtains a list of TOR nodes from a directory server
• Directory servers maintain a list of which onion routers are up, their locations, current keys, exit policies, etc.

TOR Circuit Setup
• The client proxy establishes a key + circuit with Onion Router 1
• The proxy tunnels through that circuit to extend to Onion Router 2
• Etc.
• Client applications connect and communicate over the TOR circuit


TOR: Building up a two-hop circuit and fetching a web page
Alice <-> OR 1 and OR 1 <-> OR 2: links are TLS-encrypted; OR 2 <-> website: unencrypted
• Alice -> OR 1: Create c1, E(g^x1)
• OR 1 -> Alice: Created c1, g^y1, H(K1)
• Alice -> OR 1: Relay c1 {Extend, OR 2, E(g^x2)}
• OR 1 -> Alice: Relay c1 {Extended, g^y2, H(K2)}
• Alice -> OR 1: Relay c1 {{Begin …
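
A simplified Python sketch of this circuit build: one Diffie-Hellman handshake per hop, with the second handshake tunnelled through OR 1 in a Relay Extend cell. The tiny group (p = 23, g = 5) and all names are toy assumptions; real TOR uses large groups, encrypts g^x under the router's onion key, and layers the relay cells with symmetric keys derived from K1 and K2.

```python
import hashlib
import secrets

p, g = 23, 5                       # toy Diffie-Hellman parameters (NOT secure)

def dh_handshake():
    """One Create/Created-style exchange; returns both parties' view of K and g^y."""
    x = secrets.randbelow(p - 2) + 1          # client's secret
    y = secrets.randbelow(p - 2) + 1          # onion router's secret
    gx, gy = pow(g, x, p), pow(g, y, p)       # values carried in the cells
    return pow(gy, x, p), pow(gx, y, p), gy   # K as seen by client and router

# Create c1, E(g^x1) -> OR1;  Created c1, g^y1, H(K1) <- OR1
K1_alice, K1_or1, gy1 = dh_handshake()
assert hashlib.sha256(str(K1_alice).encode()).digest() == \
       hashlib.sha256(str(K1_or1).encode()).digest()     # Alice verifies H(K1)

# Relay c1 {Extend, OR2, E(g^x2)} -> OR1, which sends Create c2 to OR2;
# Relay c1 {Extended, g^y2, H(K2)} <- OR1 after OR2 answers with Created c2.
K2_alice, K2_or2, gy2 = dh_handshake()

print("shared circuit keys established:", K1_alice == K1_or1, K2_alice == K2_or2)
# Alice can now send Relay c1 {{Begin ...}} cells, onion-encrypted first
# under K2 (for OR2) and then under K1 (for OR1).
```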

TOR - Review
• Some improvements in comparison with Onion Routing:
  • Perfect forward secrecy
  • Resistant to replay attacks
  • Many TCP streams can share one circuit
  • Separation of ”protocol cleaning” from anonymity:
    • standard SOCKS proxy interface (instead of a separate application proxy for each application)
    • content filtering via Privoxy
  • Directory servers
  • Variable exit policies
  • End-to-end integrity checking
  • Hidden services
• Still vulnerable to end-to-end timing and size correlations

Private Information Retrieval (PIR)
• Privacy for the item of interest: allows a user to retrieve an item from a database / news server without revealing which item he is interested in
• Application example: patent database
• Simple (but expensive) solution: download all database entries and make the selection locally

Enhanced PIR Solution - Cooper/Birman 1995
• t+1 servers with identical databases (composed of m cells each) are queried
• To each database an m-bit vector is sent, where each bit represents a cell in the database: if the bit is 1, the corresponding cell is selected, otherwise not
• If the p-th cell should be read, the t+1 query vectors are created according to the following scheme:
  • t query vectors are random bit vectors of length m
  • the (t+1)-st vector is created by exclusive-oring the t random bit vectors and then flipping the p-th bit (in order to read cell p)
• This creates a set of t+1 bit vectors that, when exclusive-ored together, yield the bit vector Ip with Ip[j] = 1 if j = p and Ip[j] = 0 if j ≠ p

PIR – Example of Sample Bit Vectors for t = 3, p = 1
[Table: four example bit vectors V1, V2, V3, V4 of length m; V4 is obtained by exclusive-oring V1, V2 and V3 and flipping bit p = 1, so that V1 ⊕ V2 ⊕ V3 ⊕ V4 = Ip, the vector with a single 1 at position p.]

PIR - Bit-Vector Protocol
Parties: client, Server 1, Server 2, …, Server t+1
• Step 1: the client chooses V1, V2, …, Vt+1 such that Ip = V1 ⊕ V2 ⊕ … ⊕ Vt+1
• Step 2: the client sends one bit vector to each server
• Step 3: each server answers with the exclusive-or of the cells selected by its vector; the client exclusive-ors the t+1 answers to obtain cell p
Communication of bit vectors and responses must be encrypted
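
A minimal sketch of this bit-vector protocol for t = 3 (four replicated servers) and a small database of single-byte cells; the database contents, m and the queried index p are made-up values, and in practice both vectors and responses would also travel over encrypted channels.

```python
import secrets

def random_bitvector(m):
    return [secrets.randbelow(2) for _ in range(m)]

def xor_vectors(a, b):
    return [x ^ y for x, y in zip(a, b)]

def server_answer(database, bitvector):
    """Each server XORs together the cells selected by its bit vector."""
    result = 0
    for cell, bit in zip(database, bitvector):
        if bit:
            result ^= cell
    return result

database = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88]   # m = 8 cells
m, t, p = len(database), 3, 5                                  # read cell p = 5

# Step 1: t random vectors; the (t+1)-st is their XOR with the p-th bit flipped,
# so that V1 xor V2 xor ... xor V_{t+1} = I_p (a 1 only at position p).
vectors = [random_bitvector(m) for _ in range(t)]
last = [0] * m
for v in vectors:
    last = xor_vectors(last, v)
last[p] ^= 1
vectors.append(last)

# Step 2: send one vector to each server. Step 3: XOR the t+1 responses.
answers = [server_answer(database, v) for v in vectors]
recovered = 0
for a in answers:
    recovered ^= a

assert recovered == database[p]
print("client recovered cell", p, "=", hex(recovered))
```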

PIR - Review
• Protection goal: unlinkability of a user and an item of interest
• Security: if each of the bits in the t random bit vectors is set to 1 with probability ½, then an attacker who has access to at most t of the requests/responses associated with the bit vectors gains no information about which cell the client is reading
• Updates are complex: changes / adding new messages must take place simultaneously

Repetition

Repetition: Diffie-Hellman Key Exchange
Global public elements:
• q: prime number
• α: α < q and α is a primitive root of q
[If α is a primitive root of the prime number p, then the numbers α mod p, α² mod p, …, α^(p-1) mod p are distinct and are a permutation of {1..p-1}. For any integer b and primitive root α of p, there is then a unique exponent i with b ≡ α^i mod p, the discrete logarithm of b.]

Diffie-Hellman Key Exchange
q: prime number, α: primitive root of q
User A chooses a private XA and publishes YA = α^XA mod q; user B chooses a private XB and publishes YB = α^XB mod q. Both compute the shared key
K = (YB)^XA mod q = (YA)^XB mod q = α^(XA·XB) mod q
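
A worked toy example of the exchange in the slide's notation; the parameters q = 353 and α = 3 (alpha in the code) are small illustrative values, and real deployments would use much larger groups.

```python
import secrets

q, alpha = 353, 3                      # public elements: prime q, primitive root alpha

XA = secrets.randbelow(q - 2) + 1      # A's private value
XB = secrets.randbelow(q - 2) + 1      # B's private value

YA = pow(alpha, XA, q)                 # A sends YA = alpha^XA mod q
YB = pow(alpha, XB, q)                 # B sends YB = alpha^XB mod q

K_A = pow(YB, XA, q)                   # = alpha^(XA*XB) mod q
K_B = pow(YA, XB, q)                   # = alpha^(XB*XA) mod q

assert K_A == K_B
print("shared key K =", K_A)
```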

Some Literature
• Andreas Pfitzmann et al., ”Communication Privacy”, in: Acquisti et al. (Eds.), Digital Privacy - Theory, Technologies, and Practices, Auerbach Publications, 2008
• TOR: Anonymity Online, http://www.torproject.org/
• Roger Dingledine, Nick Mathewson, Paul Syverson, ”Tor: The Second-Generation Onion Router”, Proceedings of the 13th Usenix Security Symposium, August 2004, http://www.torproject.org/svn/trunk/doc/design-paper/tor-design.pdf
• David Cooper, Kenneth Birman, ”Preserving Privacy in a Network of Mobile Computers”, Proceedings of the 1995 IEEE Symposium on Security and Privacy, Oakland, May 1995