
## Obfuscation Project of the DES Group at the University of Michigan

We call this page the Obfuscation Project. A better, though less catchy, name would be: Opacity Enforcement and Its Application to Location Privacy.

Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information needs to be kept secret, opacity roughly means that the user can always maintain plausible deniability about their secret information. Say someone is tracking your movements and your secret is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks on the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].
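The plausible-deniability idea can be sketched on a toy static example. The location names and the observation map below are invented for illustration; this is not our verification algorithm, just the core condition: every observation a secret location can produce must also be producible by some non-secret location.

```python
# Toy sketch of the opacity condition (invented locations and observations).
# The eavesdropper sees an observation label for each true location; the
# secret stays opaque if every observation of a secret location is also
# produced by some non-secret location.

SECRET = {"bank"}

# Hypothetical map: what the tracker observes for each true location.
observed = {
    "bank": "Main St block",
    "coffee_shop": "Main St block",  # same observation as the bank
    "library": "campus quad",
}

def is_opaque(observed, secret):
    """True if every secret observation has a non-secret explanation."""
    secret_obs = {observed[s] for s in secret}
    nonsecret_obs = {observed[x] for x in observed if x not in secret}
    return secret_obs <= nonsecret_obs

print(is_opaque(observed, SECRET))  # True: you might just be at the coffee shop
```

If the coffee shop produced a different observation than the bank, the check would fail: seeing "Main St block" would then reveal the visit to the bank with certainty.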

[1] Overview of opacity: Jacob et al.

[2] S. Lafortune, F. Lin, and C. Hadjicostis, On the History of Diagnosability and Opacity in Discrete Event Systems, *Annual Reviews in Control*, Vol. 45, pp. 257-266, 2018.

Our group at Michigan has been working on opacity and its enforcement for many years; more on this below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by an eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving: inside a building, it should not go through walls; around campus or town, it should follow sidewalks or streets rather than cut through buildings. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.
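The grid model can be made concrete in a few lines. The grid, secret tile, and distance bound below are invented, and the greedy choice shown is only a stand-in for our insertion-function synthesis, which plans ahead rather than picking the first legal move.

```python
# Toy grid-obfuscation sketch (invented grid and parameters; a greedy
# stand-in for the insertion-function synthesis described in the text).
# The reported position must move tile-to-tile through free cells, stay
# within MAX_DIST of the true position, and never enter a secret tile.

GRID = [  # 0 = free tile, 1 = wall
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
SECRET = {(2, 0)}   # hypothetical secret location
MAX_DIST = 2        # max allowed (Chebyshev) distance from the true position

def free(p):
    r, c = p
    return 0 <= r < len(GRID) and 0 <= c < len(GRID[0]) and GRID[r][c] == 0

def steps(p):
    """Legal single moves: stay put or move to an adjacent free tile."""
    r, c = p
    return [q for q in [(r, c), (r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if free(q)]

def obfuscate_step(reported, true_next):
    """Greedily pick the next reported tile for the new true position.
    A real insertion function plans ahead; this greedy pick can dead-end."""
    for q in steps(reported):
        if q not in SECRET and \
           max(abs(q[0] - true_next[0]), abs(q[1] - true_next[1])) <= MAX_DIST:
            return q
    return None  # no legal report: the greedy obfuscator is stuck
```

For example, if the reported position is (0, 0) and the user truly moves to (1, 0), `obfuscate_step((0, 0), (1, 0))` may legally keep reporting (0, 0), since that tile is free, non-secret, and within the distance bound.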

The paper [3] gives an example of our methodology for location privacy, for a user moving around the Central Campus of the University of Michigan. For simplicity, we consider only 8 possible locations for the user, one of which is the secret.

[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune, Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement, in *Proceedings of the 12th International Workshop on Discrete Event Systems*, May 2014.

The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger: over 30 by 40 tiles.

[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune, Demonstration of Indoor Location Privacy Enforcement using Obfuscation, in *Proceedings of the 14th International Workshop on Discrete Event Systems*, June 2018, pp. 157-163.

To further illustrate location privacy enforcement in an amusing way, we developed the Obfuscation Game. Here, the user plays the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the [Obfuscation Game](http://obfuscationgame.eecs.umich.edu), our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.
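Why lookahead is needed can be seen with a one-step safety check, sketched below on an invented grid with an invented distance bound (this is not the game's actual solver): a candidate reported tile is safe only if, for every move the agent might make next, the obfuscator still has at least one legal reply.

```python
# Toy one-step lookahead (invented grid and bound, not the game's solver):
# a reported tile is "safe" only if, whatever the agent does next, the
# obfuscator still has at least one legal reply.

ROWS, COLS = 3, 4
WALLS = {(1, 1)}
MAX_DIST = 1  # tight bound, to make unsafe situations easy to produce

def free(p):
    return 0 <= p[0] < ROWS and 0 <= p[1] < COLS and p not in WALLS

def steps(p):
    r, c = p
    return [q for q in [(r, c), (r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if free(q)]

def within(a, b):
    return max(abs(a[0] - b[0]), abs(a[1] - b[1])) <= MAX_DIST

def safe_one_step(reported, true):
    """For EVERY possible next true move there must be SOME legal reply."""
    return all(any(within(r2, t2) for r2 in steps(reported))
               for t2 in steps(true))

print(safe_one_step((0, 0), (0, 1)))  # True
print(safe_one_step((0, 0), (2, 3)))  # False: the agent can slip out of reach
```

A move that looks fine now can thus be a trap one step later; extending this check over several steps is essentially the planning the game asks the human obfuscator to do.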

The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].

[5] JAR

Try the [Obfuscation Game](http://obfuscationgame.eecs.umich.edu). We hope you enjoy it!

### Our research on opacity enforcement

Our results on opacity and its enforcement have been published in the papers below. Opacity enforcement by insertion of fictitious events, or by deletion of events under some constraints, was the main topic of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by supervisory control in his dissertation. Current students Rômulo Meira-Góes and Andrew Wintenberg are also investigating opacity enforcement in their doctoral research. Other contributors to our efforts have been Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani.

Y.-C. Wu and S. Lafortune, Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures, *Discrete Event Dynamic Systems: Theory and Applications*, Vol. 23, No. 3, September 2013, pp. 307-339.

Y.-C. Wu and S. Lafortune, Synthesis of Insertion Functions for Enforcement of Opacity Security Properties, *Automatica*, Vol. 50, No. 5, May 2014, pp. 1336-1348.

X. Yin and S. Lafortune, A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers, *Automatica*, Vol. 80, pp. 162-171, June 2017.

Y.-C. Wu and S. Lafortune, Synthesis of Optimal Insertion Functions for Opacity Enforcement, *IEEE Transactions on Automatic Control*, Vol. 61, No. 3, March 2016, pp. 571-584.

Y.-C. Wu, G. Lederman, and S. Lafortune, Enhancing opacity of stochastic discrete event systems using insertion functions, in *Proceedings of the 2016 American Control Conference*, July 2016, pp. 2053-2060.

X. Yin and S. Lafortune, A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems, *IEEE Transactions on Automatic Control*, Vol. 61, No. 8, August 2016, pp. 2140-2154.

C. Keroglou and S. Lafortune, Verification and Synthesis of Embedded Insertion Functions for Opacity Enforcement, in *Proceedings of the 56th IEEE Conference on Decision and Control*, December 2017, pp. 4217-4223.

C. Keroglou, S. Lafortune, and L. Ricker, Insertion Functions with Memory for Opacity Enforcement, in *Proceedings of the 14th International Workshop on Discrete Event Systems*, June 2018, pp. 405-410.

Y. Ji, Y.-C. Wu, and S. Lafortune, Enforcement of Opacity by Public and Private Insertion Functions, *Automatica*, Vol. 93, July 2018, pp. 369-378.


Y. Ji, X. Yin, and S. Lafortune, Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions, *IEEE Transactions on Automatic Control*, Vol. 64, No. 10, October 2019, pp. 4369-4376.

Y. Ji, X. Yin, and S. Lafortune, Enforcing Opacity by Insertion Functions under Multiple Energy Constraints, *Automatica*, Vol. 108, October 2019, pp. 108476 (1-14).

S. Mohajerani and S. Lafortune, Transforming opacity verification to nonblocking verification in modular systems, *IEEE Transactions on Automatic Control*, Vol. 65, No. 4, April 2020, pp. 1739-1746.