
Obfuscation Project of the UMDES Group at the University of Michigan

We call this page the "Obfuscation Project". A better, though less catchy, name would be "Opacity Enforcement and Its Application to Location Privacy".

Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information needs to be kept "secret", opacity roughly means that the user can always maintain plausible deniability about their secret information. Let's say that someone is tracking your movements and that your secret is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].
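
To make this concrete, here is a minimal Python sketch of the idea (the locations and observation labels below are made up for illustration; they are not taken from [1] or [2]). The eavesdropper's state estimate is the set of locations consistent with what was observed, and the secret stays opaque as long as no estimate contains only secret locations.

# Toy current-state opacity check; all names here are hypothetical.
STATES = {"home", "coffee_shop", "bank"}
SECRET = {"bank"}

# What the eavesdropper observes when the user is at each location; the bank
# and the coffee shop next to it both look like "downtown".
OBSERVATION = {"home": "residential", "coffee_shop": "downtown", "bank": "downtown"}

def estimate(label):
    """The eavesdropper's state estimate: all locations consistent with the label."""
    return {s for s in STATES if OBSERVATION[s] == label}

def is_opaque():
    """Opaque iff no secret location produces an estimate made only of secrets."""
    return all(not estimate(OBSERVATION[s]) <= SECRET for s in SECRET)

print(is_opaque())  # True: "downtown" could be the bank or the coffee shop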


Our group at Michigan has been working on opacity and its enforcement for many years; more on this below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move, because the obfuscated trajectory is required to be a valid trajectory in the environment where you are moving. Inside a building, the obfuscated trajectory should not go through walls; around campus or town, it should not go through buildings and should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.
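
To give a flavor of the discrete model, here is a minimal Python sketch of one-step obfuscation on a grid. The grid, walls, secret tile, and distance bound are all hypothetical, and the greedy choice is only illustrative; the insertion functions synthesized by our algorithms plan over the entire model.

# Hypothetical grid world; the reported walk must use adjacent free tiles,
# stay within MAX_DIST of the true position, and avoid the secret tile.
GRID_W, GRID_H = 8, 8
WALLS = {(3, y) for y in range(1, 7)}   # hypothetical wall segment
SECRET = {(6, 6)}                        # hypothetical secret tile
MAX_DIST = 2                             # max true-vs-reported distance

def moves(tile):
    """Staying put or stepping to an adjacent free tile keeps the walk valid."""
    x, y = tile
    cand = [(x, y), (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(cx, cy) for cx, cy in cand
            if 0 <= cx < GRID_W and 0 <= cy < GRID_H and (cx, cy) not in WALLS]

def dist(a, b):
    """Manhattan distance between two tiles."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def obfuscate_step(true_pos, prev_report):
    """Choose the next reported tile: a valid move from the previous report,
    never a secret tile, and within MAX_DIST of the true position."""
    options = [t for t in moves(prev_report)
               if t not in SECRET and dist(t, true_pos) <= MAX_DIST]
    if not options:
        raise RuntimeError("greedy choice got stuck")
    # Greedy heuristic: prefer the report farthest from any secret tile.
    return max(options, key=lambda t: min(dist(t, s) for s in SECRET))

# Example: the true walk visits the secret tile (6, 6); the reported walk never does.
report = (5, 5)
for true_pos in [(5, 5), (5, 6), (6, 6), (6, 5)]:
    report = obfuscate_step(true_pos, report)
    print(f"true={true_pos} reported={report}")

Note that the greedy rule above can paint itself into a corner; guaranteeing that a valid report always exists is exactly what the synthesized insertion functions provide.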

The paper [3] illustrates our methodology for location privacy with a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is secret.


The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from a positioning system and a real-time obfuscator running in the cloud.
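
At a high level, the real-time architecture is a loop that reads a position fix, obfuscates it in the cloud, and releases only the obfuscated position. The Python skeleton below stubs out all three components; the function names are ours, not from [4].

# Skeleton of the real-time pipeline; all function names are hypothetical
# stand-ins for the components described in [4].
import random
import time

def read_position():
    """Stub for the indoor positioning system: returns the user's true tile."""
    return (random.randint(0, 7), random.randint(0, 7))

def cloud_obfuscate(true_pos):
    """Stub for the cloud obfuscator; the real system runs the
    insertion-function logic over the building map."""
    x, y = true_pos
    return (min(x + 1, 7), y)  # placeholder edit of the reported tile

def publish(report):
    """Stub for the channel that observers (and eavesdroppers) see."""
    print("observed position:", report)

for _ in range(5):                      # the real loop runs for the whole session
    true_pos = read_position()          # 1. real-time fix from the positioning system
    publish(cloud_obfuscate(true_pos))  # 2. only the obfuscated position is released
    time.sleep(0.1)                     # pacing; the actual system is event-driven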


To illustrate location privacy in a more amusing way, we developed the Obfuscation Game. Here, the user plays the role of the obfuscator and must obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.
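
For the curious, the core loop of such a game can be sketched in a few lines of Python; the grid size, secret tile, and distance bound are again hypothetical, and a randomly chosen legal move stands in for the human player.

# Sketch of the Obfuscation Game loop: the computer moves the agent randomly;
# the player must answer with a report that keeps the walk valid, stays within
# MAX_DIST of the agent, and never equals the secret tile.
import random

GRID_W, GRID_H = 5, 5
SECRET = (4, 4)     # hypothetical secret tile
MAX_DIST = 2        # maximum allowed true-vs-reported distance

def neighbors(tile):
    x, y = tile
    cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(cx, cy) for cx, cy in cand if 0 <= cx < GRID_W and 0 <= cy < GRID_H]

def valid_report(report, prev_report, true_pos):
    """The three rules the player must satisfy on every turn."""
    return (report in neighbors(prev_report) + [prev_report]      # valid walk
            and abs(report[0] - true_pos[0])
                + abs(report[1] - true_pos[1]) <= MAX_DIST         # distance bound
            and report != SECRET)                                  # never the secret

agent = report = (0, 0)
for turn in range(10):
    agent = random.choice(neighbors(agent))    # the computer's random move
    options = [t for t in neighbors(report) + [report]
               if valid_report(t, report, agent)]
    if not options:
        print(f"turn {turn}: no legal report left, the obfuscator loses")
        break
    report = random.choice(options)            # stand-in for the human's move
    print(f"turn {turn}: agent={agent} reported={report}")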

Try the game; we hope you enjoy it!


References:

[1] Overview of opacity: Jacob et al.

[2] History of opacity

[3] Location privacy: WODES 2014

[4] Demonstration of opacity: IFAC 2017

[3] [4] [5] Yi-Chin Wu: opacity enforcement (TAC; optimal; JAR)

[6] [7] Yiding Ji: opacity enforcement (public-private insertion; public-private edit)

[8] Xiang
