Obfuscation Project, Main Page (last edited by Stephane, 2021-10-15).<br />
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A more accurate, if less catchy, name would be ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information must be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about the secret information. Suppose, for instance, that someone is tracking your movements and that your secret is that you are at the bank; the tracking should then never reveal with certainty that you are at the bank right now, since, for all the eavesdropper knows, you could also be at the coffee shop next to the bank. Opacity is defined with respect to a dynamical model that captures how observations are emitted to the outside world; in the tracking example, this would be a mobility model. Opacity can also capture inferences about the past: for instance, the secret information could be that you visited the bank in some prior time interval.<br />
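Concretely, current-state opacity of a finite automaton with a set of observable events can be checked by an observer (subset) construction: the system is opaque if no observation sequence pins the state estimate down to secret states only. The following is a minimal sketch for intuition; the tiny automaton, event names, and secret set are illustrative assumptions, not taken from the papers below.<br />

```python
def observer_check_opacity(transitions, init, secret, observable):
    """Current-state opacity check: every state estimate reachable by
    observed events must contain at least one non-secret state."""

    # Unobservable reach: close a set of states under unobservable events.
    def ureach(states):
        stack, seen = list(states), set(states)
        while stack:
            s = stack.pop()
            for (q, e), targets in transitions.items():
                if q == s and e not in observable:
                    for t in targets:
                        if t not in seen:
                            seen.add(t)
                            stack.append(t)
        return frozenset(seen)

    start = ureach({init})
    frontier, visited = [start], {start}
    while frontier:
        est = frontier.pop()
        if est <= secret:      # estimate contains only secret states:
            return False       # the eavesdropper is certain -> not opaque
        for e in observable:
            nxt = set()
            for s in est:
                nxt |= set(transitions.get((s, e), ()))
            if nxt:
                nxt = ureach(nxt)
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(nxt)
    return True

# Tiny illustrative model: after 'a' the system may be in state 1 or the
# secret state 2, so the eavesdropper can never be certain of the secret.
T = {(0, 'a'): [1, 2], (1, 'b'): [3], (2, 'b'): [3]}
print(observer_check_opacity(T, 0, secret={2}, observable={'a', 'b'}))
```

Dropping the transition into state 1 makes the estimate after 'a' equal to the secret set alone, and the check fails, matching the intuition that the bank must always be confusable with the coffee shop.<br />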
<br />
For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal dynamical model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, ''Estimation and Inference in Discrete Event Systems''.<br />
Springer, 2020.<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has worked on opacity and its enforcement for many years. Our results to date have been published in the papers listed below, organized chronologically. (Conference papers are omitted when a subsequent journal paper subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, was the main result of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. An edit function of this kind produces a new output stream of events from the stream of events produced by the system; hence, it acts as an output interface. We call this output interface an ''obfuscator'' and refer to opacity enforcement by edit functions as ''obfuscation''. Note that edit functions must satisfy strict constraints on the altered (or obfuscated) output stream of events, so this is not the same as security by obscurity. <br />
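The output-interface view of an insertion function can be sketched as a thin wrapper around the system's event stream. The insertion policy below is a hand-written stand-in for one synthesized by the algorithms in the papers that follow; the event names are illustrative assumptions.<br />

```python
def make_obfuscator(policy):
    """Wrap a system's output event stream with an insertion function.

    `policy` maps (events observed so far, next genuine event) to a tuple
    of fictitious events to insert *before* that event. Pairs not listed
    insert nothing, so the default behavior is pass-through.
    """
    def obfuscate(stream):
        emitted = ()
        for event in stream:
            for fake in policy.get((emitted, event), ()):
                emitted += (fake,)
                yield fake       # fictitious event, indistinguishable
            emitted += (event,)  # to the eavesdropper from a real one
            yield event          # genuine event passes through
    return obfuscate

# Illustrative policy: suppose the string 'b' alone reveals the secret,
# but 'a b' is also produced by a non-secret run. Inserting 'a' before
# the first 'b' makes the output consistent with non-secret behavior.
obf = make_obfuscator({((), 'b'): ('a',)})
print(list(obf(['b'])))
```

Note that the wrapper only alters what is emitted downstream; the system itself is untouched, which is what "output interface" means above.<br />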
<br />
Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. In this case, the output stream of events is restricted so that secret-revealing strings never occur.<br />
<br />
Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement by obfuscation during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts on this topic. <br />
<br />
Current student Andrew Wintenberg is investigating novel opacity verification and enforcement problems in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for opacity enforcement by edit functions (or obfuscation). Also, the synthesis problems that we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
<br />
'''Financial acknowledgement:''' We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira-Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, Article 108476, October 2019.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Preprint on arXiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
Submitted to the IEEE Conference on Decision and Control, December 2021.<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, or ''obfuscation'', we have used ''location privacy'' as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move: the obfuscated trajectory is required to be a valid trajectory in the environment. Inside a building, it should not go through walls; around campus or town, it should avoid buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
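The grid formulation above can be sketched in a few lines. The greedy step-by-step choice below only illustrates the constraints (valid move, maximum distance, no obstacles or secret cells); it is an assumption-laden toy, since a greedy choice can dead-end, whereas the obfuscators synthesized in our papers plan ahead and are correct by construction. The grid size, distance bound, and example path are made up for illustration.<br />

```python
GRID_W, GRID_H = 6, 4  # illustrative grid dimensions

def neighbors(cell):
    """Cells reachable in one reported move: stay put or step to a
    4-adjacent in-bounds cell."""
    x, y = cell
    return [(x + dx, y + dy)
            for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < GRID_W and 0 <= y + dy < GRID_H]

def cheb(a, b):
    """Chebyshev (grid) distance between two cells."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def obfuscate_path(true_path, obstacles, secrets, max_dist=2):
    """Greedy sketch: at each step, report a cell reachable from the
    previous report, within max_dist of the true cell, and neither an
    obstacle nor a secret. Returns None if the greedy choice dead-ends."""
    reported = []
    prev = true_path[0]
    for true in true_path:
        candidates = [c for c in (neighbors(prev) if reported else [prev])
                      if c not in obstacles and c not in secrets
                      and cheb(c, true) <= max_dist]
        if not candidates:
            return None
        prev = min(candidates, key=lambda c: cheb(c, true))
        reported.append(prev)
    return reported

# The true path crosses the secret cell (2, 0); the reported path lags
# behind on valid cells and never visits the secret.
path = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(obfuscate_path(path, obstacles={(2, 1)}, secrets={(2, 0)}))
```

The synthesis problem solved in our papers is exactly to replace this greedy choice with a strategy that provably never reaches a `None` (dead-end) situation.<br />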
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
The paper [10] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator running in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
<br />
==== Obfuscation Game: Can You Keep a Secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to the obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [8] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is valid, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
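The two numeric constraints just described can be stated as a simple check on a pair of grid trajectories. This is a sketch for intuition only; the use of cell coordinates and Chebyshev distance is our modeling assumption, not a detail taken from the app itself.<br />

```python
def cheb(a, b):
    """Chebyshev (grid) distance between two cells."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def valid_obfuscation(actual, reported, obstacles, secrets,
                      max_dist=2, max_step=3):
    """Check the Grove-demo constraints on paired trajectories: each
    reported cell stays within max_dist of the actual one, consecutive
    reports move at most max_step cells, and no report lands on an
    obstacle or a secret cell. (The bounds 2 and 3 are those quoted above.)"""
    ok_dist = all(cheb(a, r) <= max_dist for a, r in zip(actual, reported))
    ok_step = all(cheb(r0, r1) <= max_step
                  for r0, r1 in zip(reported, reported[1:]))
    ok_cells = all(r not in obstacles and r not in secrets for r in reported)
    return ok_dist and ok_step and ok_cells
```

By construction, any trajectory produced by the synthesized obfuscator passes this check; the check itself is only a validator, not the synthesis.<br />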
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank right now; perhaps you could also be at the coffee shop next to the bank. Opacity is defined for a dynamical model that captures how observations are emitted to the outside world. This would be a mobility model in the above tracking example. Opacity can also capture inferencing about the past; for instance, the secret information could be that you visited the bank in some prior time interval.<br />
<br />
For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal dynamical model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Opacity enforcement by such edit functions produces a new output stream of events from the stream of events produced by the system. Hence, it acts as an output interface. We call this output interface an ''obfuscator'' and we refer to the method of opacity enforcement by edit functions as ''obfuscation''. Note that edit functions must satisfy strict constraints on the altered (or obfuscated) output stream of events. So this is not the same as security by obscurity. <br />
<br />
Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. In this case, the output stream of events is reduced so that secret-revealing strings never happen.<br />
<br />
Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement by obfuscation during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts on this topic. <br />
<br />
Current student Andrew Wintenberg is considering novel opacity verification and enforcement problems in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for opacity enforcement by edit functions (or obfuscation). Also, the synthesis problems that we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
'''Financial acknowledgement:''' We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, or ''obfuscation'', we have used ''location privacy'' as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is secret. <br />
<br />
The paper [10] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can You Keep a Secret? ====<br />
<br />
To illustrate, in an amusing way, the challenge of enforcing location privacy using obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must obfuscate, in real time, the moves of an agent on a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to the obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation when a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it avoids the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [8] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
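<br />
As a rough illustration of the per-step choice the app faces, the sketch below enumerates the reported cells allowed at one step under the two constraints just described (the function and parameter names are hypothetical, and the use of the Chebyshev distance is an assumption for illustration; the actual obfuscator in [8] is synthesized offline, which is what guarantees that a valid choice exists at every step):<br />
<br />
```python
# Illustrative per-step filter (function and parameter names are
# hypothetical).  It enumerates the reported cells allowed at one step
# under the two constraints described above: stay within max_offset
# cells of the actual position, and move at most max_move cells per
# user step (Chebyshev distances, chosen here for illustration).

def candidate_reports(actual, reported_prev, obstacles, secrets,
                      grid_rows, grid_cols, max_offset=2, max_move=3):
    out = []
    for r in range(grid_rows):
        for c in range(grid_cols):
            cell = (r, c)
            if cell in obstacles or cell in secrets:
                continue  # never report an obstacle or a secret cell
            if max(abs(r - actual[0]), abs(c - actual[1])) > max_offset:
                continue  # too far from the actual position
            if max(abs(r - reported_prev[0]),
                   abs(c - reported_prev[1])) > max_move:
                continue  # obfuscator would move too many cells at once
            out.append(cell)
    return out
```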
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank right now; perhaps you could also be at the coffee shop next to the bank. Opacity is defined for a dynamical model that captures how observations are emitted to the outside world. This would be a mobility model in the above tracking example. Opacity can also capture inferencing about the past; for instance, the secret information could be that you visited the bank in some prior time interval.<br />
<br />
For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal dynamical model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Opacity enforcement by such edit functions produces a new output stream of events from the stream of events produced by the system. Hence, it acts as an output interface. We call this output interface an ''obfuscator'' and we refer to the method of opacity enforcement by edit functions as ''obfuscation''. Note that edit functions must satisfy strict constraints on the altered (or obfuscated) output stream of events. So this is not the same as security by obscurity. <br />
<br />
Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. In this case, the output stream of events is reduced so that secret-revealing strings never happen.<br />
<br />
Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement by obfuscation during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts on this topic. <br />
<br />
Current student Andrew Wintenberg is considering novel opacity verification and enforcement problems in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for opacity enforcement by edit functions (or obfuscation). Also, the synthesis problems that we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, or ''obfuscation'', we have used ''location privacy'' as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
The paper [10] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can You Keep a Secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [8] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-25T19:38:37Z<p>Stephane: /* What is Opacity? */</p>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank right now; perhaps you could also be at the coffee shop next to the bank. Opacity is defined for a dynamical model that captures how observations are emitted to the outside world. This would be a mobility model in the above tracking example. Opacity can also capture inferencing about the past; for instance, the secret information could be that you visited the bank in some prior time interval.<br />
<br />
For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal dynamical model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, Article 108476, pp. 1-14, October 2019.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune,<br />
"Transforming opacity verification to nonblocking verification in modular systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by an eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the environment where you are moving. Inside a building, it should not go through walls; if you are moving around campus or town, it should not go through buildings and should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
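The constraints above can be sketched in a few lines of code. This is illustrative only: the grid layout, the choice of the Chebyshev (king-move) distance, and the bound are assumptions made for the example, not taken from our papers.<br />

```python
# Hypothetical grid layout and distance bound, for illustration only.
OBSTACLES = {(1, 1), (1, 2)}   # wall cells
MAX_DIST = 2                   # maximum allowed distance from the true position

def chebyshev(a, b):
    """Distance in grid cells (king-move metric, an assumption here)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def valid_obfuscated_step(prev_reported, reported, true_pos):
    """Check one step of a reported trajectory against the three constraints."""
    return (reported not in OBSTACLES                       # never go through walls
            and chebyshev(prev_reported, reported) <= 1     # a valid single move
            and chebyshev(reported, true_pos) <= MAX_DIST)  # stay close to the truth
```

For example, with the true position at (2, 1), reporting (0, 1) after (0, 0) is admissible, while reporting the wall cell (1, 1) is not.<br />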
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is the secret.<br />
<br />
The paper [10] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
<br />
==== Obfuscation Game: Can You Keep a Secret? ====<br />
<br />
To illustrate, in an amusing way, the challenge of enforcing location privacy using obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [8] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. The obfuscated trajectory can be seen to be valid, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
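A greedy one-step version of this selection can be sketched as follows. This is illustrative only: it is not the synthesized obfuscator from [8], which plans ahead and comes with guarantees, and the grid size, secret and obstacle cells, and bounds are assumptions chosen to mirror the video.<br />

```python
from itertools import product

# Hypothetical 10x10 grid with one secret cell and one obstacle cell; the
# two bounds mirror the video (reported cell within 2 of the true one,
# obfuscator moves at most 3 cells per user step).
GRID = list(product(range(10), range(10)))
SECRETS = {(3, 3)}
OBSTACLES = {(2, 2)}
USER_BOUND = 2   # reported cell must stay within 2 of the true cell
MOVE_BOUND = 3   # obfuscator may move at most 3 cells per user step

def dist(a, b):
    """Distance in grid cells (king-move metric, an assumption here)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def report(prev_reported, true_cell):
    """Greedily pick an admissible reported cell for one user step."""
    candidates = [c for c in GRID
                  if c not in SECRETS and c not in OBSTACLES
                  and dist(c, true_cell) <= USER_BOUND
                  and dist(c, prev_reported) <= MOVE_BOUND]
    # Closest admissible cell; None means the greedy choice is stuck.
    return min(candidates, key=lambda c: dist(c, true_cell)) if candidates else None
```

Because it never looks ahead, this greedy choice can trap itself near obstacles; avoiding such dead ends by construction is precisely what the synthesis in [8] provides.<br />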
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited on arXiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
Submitted to the IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory must itself be a valid trajectory in the area where you are moving: inside a building, it should not go through walls; around campus or town, it should follow sidewalks or streets rather than cut through buildings. Moreover, to make the problem more challenging (and more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the user's trajectory by discrete moves, such as from tile to tile in a grid.<br />
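The grid constraints just described can be made concrete with a small sketch. The function below is a hypothetical illustration only (the actual synthesis in our papers operates on DES models, not directly on the grid; the Chebyshev metric and the function names are assumptions made here): it enumerates the cells an obfuscator could report next while avoiding obstacles and secret cells, staying within a maximum distance of the true position, and making only a legal grid move.<br />

```python
from itertools import product

def chebyshev(a, b):
    """Grid distance between cells a and b (Chebyshev metric, assumed
    here purely for illustration)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def valid_reported_cells(true_pos, prev_reported, obstacles, secrets,
                         max_dist=2, max_step=1, grid=(8, 8)):
    """Cells the obfuscator may report next: not an obstacle or a secret
    cell, within max_dist of the true position, and reachable in one
    legal move (at most max_step cells) from the previous reported cell."""
    rows, cols = grid
    candidates = []
    for cell in product(range(rows), range(cols)):
        if cell in obstacles or cell in secrets:
            continue  # the reported trajectory must avoid walls and secrets
        if chebyshev(cell, true_pos) > max_dist:
            continue  # stay close to the true position
        if chebyshev(cell, prev_reported) > max_step:
            continue  # the reported move must itself be a legal grid move
        candidates.append(cell)
    return candidates
```

When this candidate set is empty, the obfuscator is stuck, which is exactly why the synthesized obfuscators in our papers must reason about future moves rather than step by step.<br />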
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
The paper [10] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
<br />
==== Obfuscation Game: Can You Keep a Secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy by obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background and can provide a hint to the user if need be.<br />
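The need to plan ahead can be phrased as a simple game-tree check. The sketch below is illustrative only (the game's hint engine actually relies on the symbolic synthesis of [8] and [D]); the callbacks `moves` and `valid_reports` are hypothetical names standing in for the agent's mobility model and the obfuscation constraints. The obfuscator "survives" k steps if, whatever move the agent makes next, some reported move stays valid and survives k-1 more steps.<br />

```python
def survives(true_pos, reported, k, moves, valid_reports):
    """moves(p): the agent's possible next true positions from p.
    valid_reports(true, prev_reported): legal reported positions.
    Both callbacks are hypothetical placeholders for this sketch."""
    if k == 0:
        return True
    for nxt in moves(true_pos):  # adversarial: the agent may move anywhere
        if not any(survives(nxt, rep, k - 1, moves, valid_reports)
                   for rep in valid_reports(nxt, reported)):
            return False  # this agent move would trap the obfuscator
    return True
```

A reported move that looks safe now may still fail this test a few steps later, which is exactly what makes the game hard for a human player.<br />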
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad, for real-time obfuscation as a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should remain within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it avoids the obstacles). The app monitors the user's location using GPS, and the obfuscator synthesized by the methodology in [8] selects a reported position in real time each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator was allowed to move at most 3 cells when the user moved one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
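As a rough sketch of the per-step loop, with the bounds from this video (reported cell within 2 of the true one; reported position moving at most 3 cells per user step), a greedy reporter might look as follows. This is a hypothetical illustration: unlike the obfuscator synthesized via [8], which is guaranteed never to get stuck by construction, this greedy version can dead-end, and grid boundaries are omitted for brevity.<br />

```python
def report_trajectory(true_cells, start_report, obstacles, secrets,
                      max_dist=2, max_step=3):
    """Greedy sketch (not the synthesized obfuscator): pick, at each user
    step, a reported cell near the true one that avoids secrets/obstacles
    and moves at most max_step cells from the previous reported cell."""
    dist = lambda a, b: max(abs(a[0] - b[0]), abs(a[1] - b[1]))
    reported = [start_report]
    for true in true_cells[1:]:
        options = [(r, c)
                   for r in range(true[0] - max_dist, true[0] + max_dist + 1)
                   for c in range(true[1] - max_dist, true[1] + max_dist + 1)
                   if (r, c) not in obstacles and (r, c) not in secrets
                   and dist((r, c), reported[-1]) <= max_step]
        if not options:
            raise RuntimeError("greedy obfuscator is stuck")
        # greedily prefer the option closest to the previously reported cell
        reported.append(min(options, key=lambda o: dist(o, reported[-1])))
    return reported
```

Avoiding the dead-end case is precisely what the synthesis algorithms in [8] handle, by only ever reporting cells from which valid continuations are guaranteed to exist.<br />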
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Applications of Opacity ====<br />
Opacity is defined in the context of a formal model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
<br />
==== Our Publications on Opacity and its Enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, was the main subject of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research.<br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the authors' reply to a comment on this article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira-Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, Article 108476 (14 pages), October 2019.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune,<br />
"Transforming opacity verification to nonblocking verification in modular systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Preprint on arXiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
Submitted to the IEEE Conference on Decision and Control, December 2021.<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by an eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, it should not cut through buildings and should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, provided we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
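<br />
The constraints involved can be illustrated on the offline version of the problem, where the entire true trajectory is known in advance. The sketch below is our own illustration with hypothetical names and a Chebyshev distance metric; it is not the insertion-function synthesis from our papers, which must commit to reported positions online, without knowing future moves. It filters feasible reported cells forward, step by step, and then extracts one valid obfuscated trajectory by following back-pointers.<br />

```python
from itertools import product

def obfuscate(true_path, obstacles, secret, size, dmax=2):
    """Return one obfuscated trajectory for a known true trajectory, or None.

    Constraints on the reported trajectory: each reported cell avoids
    obstacle and secret cells, stays within Chebyshev distance `dmax`
    of the corresponding true cell, and consecutive reported cells are
    8-adjacent (a valid move on the grid).
    """
    rows, cols = size
    bad = obstacles | secret

    def cheb(a, b):
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

    def candidates(true_cell):
        return [c for c in product(range(rows), range(cols))
                if cheb(c, true_cell) <= dmax and c not in bad]

    # Forward pass: feasible reported cells per step, each with a
    # back-pointer to a feasible predecessor at the previous step.
    layers = [{c: None for c in candidates(true_path[0])}]
    for true_cell in true_path[1:]:
        layer = {}
        for c in candidates(true_cell):
            prev = next((p for p in layers[-1] if cheb(p, c) <= 1), None)
            if prev is not None:
                layer[c] = prev
        layers.append(layer)
    if not layers[-1]:
        return None  # no valid obfuscated trajectory exists

    # Backward pass: follow back-pointers from any cell in the last layer.
    cell = next(iter(layers[-1]))
    path = [cell]
    for i in range(len(layers) - 1, 0, -1):
        cell = layers[i][cell]
        path.append(cell)
    return path[::-1]
```

Because feasibility is propagated along whole chains, a reported cell that would become a dead end later never survives to the last layer; this is the offline analogue of the obfuscator's need to plan several steps ahead.<br />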
<br />
The paper [3] gives an example of our methodology for location privacy, for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is secret.<br />
<br />
The paper [10] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate, in an amusing way, the challenge of enforcing location privacy using obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and users must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation App: Real-time Obfuscation for Location Privacy ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad, for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [8] is used in real time to select a reported position each time the user enters a new cell.<br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
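<br />
The guarantee described above can be spelled out as a simple check on the pair of trajectories. The function below is an illustrative sketch of our own (hypothetical names, distances measured in cells as Chebyshev distance), verifying the three properties the obfuscator provides by construction in this demonstration.<br />

```python
def valid_obfuscation(true_path, reported_path, obstacles, secret,
                      dmax=2, step_max=3):
    """Check the properties guaranteed by construction of the obfuscator:
    the reported cell avoids obstacles and the secret cell, stays within
    `dmax` cells of the true cell, and moves at most `step_max` cells
    each time the user moves one cell."""
    def cheb(a, b):
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

    if len(reported_path) != len(true_path):
        return False
    for t, r in zip(true_path, reported_path):
        if r in obstacles or r in secret or cheb(t, r) > dmax:
            return False
    return all(cheb(a, b) <= step_max
               for a, b in zip(reported_path, reported_path[1:]))
```
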
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
The paper [10] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [8] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [8] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-23T19:33:20Z<p>Stephane: /* Obfuscation Game: Can you keep a secret? */</p>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
====Applications of Opacity====<br />
Opacity is defined in the context of a formal model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
The paper [10] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [8] and [D] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [5] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In the video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move by at most 3 cells each time the user moves by one cell. The obfuscated trajectory can be seen to be valid, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
<br />
Video of the app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [A]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [B].<br />
<br />
[A] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[B] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==Applications of Opacity==<br />
Opacity is defined in the context of a formal model, typically a nondeterministic transition system such as an NFA or a DFA with unobservable events; Petri net models have also been considered in the literature. Since it is a formal property related to information flow, it can be used to model many scenarios of privacy or security in cyber and cyber-physical systems. Our group has used location privacy to illustrate our theoretical contributions to opacity enforcement, but this is by no means the only application domain of opacity. In our current work, we have been looking at privacy issues in the context of contact tracing. Other groups have looked at other application domains, such as smart homes, mobile agents in sensor networks, encryption guarantees in pseudo-random generators, and so forth; see [A] above as well as Chapter 8 in the recent book:<br />
<br />
[C] C.H. Hadjicostis, "Estimation and Inference in Discrete Event Systems.''<br />
Springer, 2020.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
[1] Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
[2] Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
[4] Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
[5] Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
[6] X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
[7] X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
[8] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
[9] C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
[10] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
[11] Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
[12] Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
[13] Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
[14] S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
[15] S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
[16] C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
[17] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
[18] A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used ''location privacy'' as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
The paper [10] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [D] below.<br />
<br />
[D] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [10]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy through obfuscation in an entertaining way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background and can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [8] and on the SynthSMV tool presented in the paper [6].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [8] and [D] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation while a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [8] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In the video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is valid, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
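For illustration, the two numeric constraints used in this demonstration can be written as a simple predicate; measuring grid distances with the Chebyshev (king-move) metric is our assumption in this sketch.<br />

```python
def move_allowed(true_cell, old_report, new_report, max_dist=2, max_step=3):
    """Check one obfuscator update against the demo's two constraints:
    the reported cell stays within max_dist cells of the true cell, and
    the report moves at most max_step cells per user step. Distances are
    Chebyshev (an assumption of this sketch)."""
    def cheb(a, b):
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))
    return cheb(new_report, true_cell) <= max_dist and \
           cheb(new_report, old_report) <= max_step

print(move_allowed((0, 0), (0, 0), (2, 2)))   # within both bounds
print(move_allowed((0, 0), (0, 0), (3, 0)))   # too far from the true cell
```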
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-22T21:05:16Z<p>Stephane: /* Obfuscation Project of the DES Group at the University of Michigan */</p>
<hr />
<div><br />
== '''Obfuscation Project of the UMDES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about their secret information. Suppose someone is tracking your movements and your secret information is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Vol. 41, 2016, pp. 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
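As a toy illustration of these definitions (the model and names below are ours, and we assume for simplicity that all events are observable), current-state opacity can be checked by exploring the eavesdropper's state estimates:<br />

```python
def is_current_state_opaque(trans, init, secrets, events):
    """Explore all reachable eavesdropper state estimates; the system is
    opaque iff no nonempty estimate is contained in the secret set.
    trans maps (state, event) -> set of successor states."""
    start = frozenset(init)
    frontier, seen = [start], {start}
    while frontier:
        est = frontier.pop()
        if est <= secrets:                 # all-secret estimate: revealed
            return False
        for e in events:
            nxt = frozenset(t for s in est for t in trans.get((s, e), ()))
            if nxt and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True

# Bank example: after "walk" the user may be at the bank (secret) or at
# the cafe next door, so the eavesdropper is never certain.
trans = {("home", "walk"): {"bank", "cafe"}}
print(is_current_state_opaque(trans, {"home"}, {"bank"}, ["walk"]))
```

Roughly speaking, insertion and edit functions modify the observed stream so that such all-secret estimates are never reached.<br />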
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
The UMDES group at Michigan has worked on opacity and its enforcement for many years; more on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, it should not go through buildings and should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem if we model the trajectory of the user as discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy, for a user moving around the Central Campus of the University of Michigan. For simplicity, we consider only 8 possible locations for the user, one of which is the secret location. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator running in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
[6] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology'',<br />
Vol. 28, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy through obfuscation in an entertaining way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background and can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5] and [6] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation while a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In the video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is valid, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when a later journal paper subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, was the main topic of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is investigating opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016, pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, Article 108476 (pp. 1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-22T21:04:45Z<p>Stephane: /* Location Privacy: An Approach based on Opacity */</p>
<hr />
<div><br />
== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
The UMDES group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
[6] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5]-[6] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [5] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
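The insertion-function idea common to the papers above can be illustrated with a toy example. The automaton, the secret state, and the hand-built insertion rule below are our own illustration; the papers synthesize such functions algorithmically, which we do not attempt here:<br />

```python
# Toy plant: 0 --a--> 1 (secret), 0 --b--> 2, 2 --a--> 3.
delta = {0: [('a', 1), ('b', 2)], 2: [('a', 3)]}
secret = {1}

def estimate(obs):
    """Eavesdropper's state estimate after an observed event sequence."""
    cur = {0}
    for e in obs:
        cur = {t for s in cur for (ev, t) in delta.get(s, []) if ev == e}
    return cur

def insert(true_obs):
    """Hand-built insertion function: when the plant emits 'a' first,
    insert the fictitious event 'b' before it, so the eavesdropper
    observes 'ba' and estimates state 3 instead of the secret state 1."""
    out = []
    for e in true_obs:
        if not out and e == 'a':
            out.append('b')           # fictitious inserted event
        out.append(e)
    return out

true_run = ['a']                      # plant actually reached secret state 1
reported = insert(true_run)           # eavesdropper sees ['b', 'a']
print(estimate(true_run))             # {1}: secret revealed without insertion
print(estimate(reported))             # {3}: plausible deniability preserved
```

Note that the modified output 'ba' is itself a valid behavior of the plant, which is what makes the insertion undetectable to the eavesdropper.<br />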
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup for obfuscation where opacity itself is implicit and the edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
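The constraints in this indoor setting — the reported trajectory must avoid obstacles and secret cells, remain a valid step-by-step trajectory, and stay within a maximum distance of the true position — can be captured by a simple checker. The grid, the choice of Chebyshev distance, and the trajectories below are our own illustrative assumptions, not the configuration used in the demonstration:<br />

```python
def valid_obfuscation(true_traj, reported_traj, obstacles, secrets, max_dist):
    """Check the three constraints informally stated above for a grid world."""
    if len(reported_traj) != len(true_traj):
        return False
    for (tx, ty), (rx, ry) in zip(true_traj, reported_traj):
        # Reported cell must avoid obstacles and never visit a secret cell.
        if (rx, ry) in obstacles or (rx, ry) in secrets:
            return False
        # Reported cell stays within max_dist of the true one (Chebyshev).
        if max(abs(tx - rx), abs(ty - ry)) > max_dist:
            return False
    for (x0, y0), (x1, y1) in zip(reported_traj, reported_traj[1:]):
        # Valid trajectory: stay put or move to a 4-neighbour each step.
        if abs(x0 - x1) + abs(y0 - y1) > 1:
            return False
    return True

true_traj     = [(0, 0), (0, 1), (0, 2)]      # true walk visits secret cell (0, 2)
reported_traj = [(0, 0), (1, 0), (1, 1)]      # reported walk detours around it
print(valid_obfuscation(true_traj, reported_traj,
                        obstacles={(2, 0)}, secrets={(0, 2)}, max_dist=2))  # True
```

The synthesis problem solved in the papers is the harder converse: constructing, offline, an obfuscator guaranteed to produce a valid reported trajectory for every possible true trajectory.<br />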
<br />
<br />
* A more general setup for opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to acting as an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited on arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div><big> [[Stéphane Lafortune]] | [[Contact Information]] | [[Career]] | '''Research''' | [[Publications]] | [[Related Links]] </big><br />
<br />
<br />
== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
[6] B.C. Rawlings, S. Lafortune, and B.E. Ydstie,<br />
"Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking,"<br />
''IEEE Transactions on Control Systems Technology''.<br />
Vol. 8, No. 2, March 2020, pp. 644-652.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5] and on the SynthSMV tool presented in the paper [6].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5]-[6] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [5] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is investigating opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on the above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general obfuscation setup where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-17T18:48:50Z<p>Stephane: /* Obfuscation Project of the DES Group at the University of Michigan */</p>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about their secret information. Suppose, for example, that someone is tracking your movements and that your secret information is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
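<br />
The plausible-deniability requirement above can be made concrete with a toy check of ''current-state opacity'': an eavesdropper who sees only the observable events tracks a set estimate of the possible states, and the system is opaque if no reachable estimate consists solely of secret states. The automaton, event labels, and secret set in the sketch below are invented for illustration; see [1] for the formal definitions.<br />
<br />
```python
# Toy verification of current-state opacity via the observer (subset
# construction). The automaton below is a made-up example for illustration.
from collections import deque

# state -> list of (event, next_state); events starting with "u" are
# unobservable to the eavesdropper, all others are observable.
TRANS = {
    0: [("a", 1), ("u", 2)],
    1: [("b", 3)],
    2: [("a", 5)],
    3: [],
    4: [],
    5: [("b", 4)],
}
SECRET = {3}  # the eavesdropper must never be certain the system is here

def unobservable_reach(states):
    """Close a set of states under unobservable transitions."""
    stack, seen = list(states), set(states)
    while stack:
        s = stack.pop()
        for e, t in TRANS[s]:
            if e.startswith("u") and t not in seen:
                seen.add(t)
                stack.append(t)
    return frozenset(seen)

def is_current_state_opaque(initial=0):
    """True iff no observation sequence pins the state estimate inside SECRET."""
    start = unobservable_reach({initial})
    queue, visited = deque([start]), {start}
    while queue:
        est = queue.popleft()
        if est <= SECRET:  # estimate contains only secret states: leak
            return False
        succ = {}  # group observable successors by event label
        for s in est:
            for e, t in TRANS[s]:
                if not e.startswith("u"):
                    succ.setdefault(e, set()).add(t)
        for nxt in succ.values():
            closed = unobservable_reach(nxt)
            if closed not in visited:
                visited.add(closed)
                queue.append(closed)
    return True

if __name__ == "__main__":
    # The secret run 0-a->1-b->3 is covered by 0-u->2-a->5-b->4, which looks
    # identical to the eavesdropper ("a" then "b"), so opacity holds.
    print(is_current_state_opaque())  # True
```
<br />
Here the unobservable transition gives the secret run an observationally equivalent cover, so the check succeeds; removing state 5, or adding state 4 to the secret set, would make it fail.<br />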
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been working on opacity and its enforcement for many years; more on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the environment where you are moving. Inside a building, it should not go through walls; if you are moving around campus or town, it should follow sidewalks or streets rather than pass through buildings. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy, for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[Media:Obfuscation-video.ogv]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to the obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation as a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it avoids the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] selects a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
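The per-step constraints used in this demonstration can be sketched as follows. This is an illustrative approximation, not the app's actual code: the 4-neighbour grid adjacency and the helper names are assumptions, and the synthesized obfuscator of [5] additionally plans ahead so that it never reaches a state with no admissible report.

```python
# Illustrative sketch (not the app's code) of the per-step constraints from
# the demo: after each one-cell true move, the reported position may advance
# along up to 3 valid (obstacle-free) grid cells, and the new report must
# stay within 2 cells of the true position and avoid secret cells.
from collections import deque

def reachable_reports(prev_report, obstacles, max_steps=3):
    """All cells the reported position can reach in <= max_steps
    obstacle-free 4-neighbour grid steps (breadth-first search)."""
    frontier = deque([(prev_report, 0)])
    seen = {prev_report}
    while frontier:
        (x, y), d = frontier.popleft()
        if d == max_steps:
            continue
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt not in obstacles and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return seen

def admissible_reports(true_cell, prev_report, obstacles, secrets,
                       max_dist=2, max_steps=3):
    """Candidate reports: reachable from the previous report, non-secret,
    and within max_dist (Chebyshev distance) of the true position."""
    near = lambda c: max(abs(c[0] - true_cell[0]),
                         abs(c[1] - true_cell[1])) <= max_dist
    return {c for c in reachable_reports(prev_report, obstacles, max_steps)
            if c not in secrets and near(c)}
```

At each step the obfuscator must pick from this admissible set; when the set it would face several moves later could become empty, a purely greedy choice fails, which is exactly the lookahead problem the synthesis in [5] solves offline.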
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when a subsequent journal paper subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on the above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup for obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [5] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app; alphabetically: Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-17T18:47:33Z<p>Stephane: /* Obfuscation Project of the DES Group at the University of Michigan */</p>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game; alphabetically: Andrew Bourgeois, Isaac Dubuque, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration of Obfuscation app ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an ''Obfuscation app'' running on an iPad for real-time obfuscation as a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it avoids the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] selects, in real time, a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells each time the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
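The per-step choice faced by the obfuscator in the app can be sketched as follows. This is an illustrative sketch under assumptions (4-connected grid, Chebyshev distance, hypothetical helper names), not the symbolic synthesis from [5], which additionally guarantees that an admissible report will continue to exist in the future:<br />

```python
# Sketch of per-step report selection: after the user enters a new cell,
# the obfuscator may move its reported position up to `max_move` cells
# along a valid (obstacle-free) path, must avoid secret cells, and must
# stay within `max_dist` of the true position. Values are illustrative.
from collections import deque

def reachable_within(start, k, obstacles, w, h):
    """Cells reachable from `start` in at most k grid moves,
    avoiding obstacles (breadth-first search)."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        (x, y), d = frontier.popleft()
        if d == k:
            continue
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < w and 0 <= ny < h
                    and (nx, ny) not in obstacles and (nx, ny) not in seen):
                seen.add((nx, ny))
                frontier.append(((nx, ny), d + 1))
    return seen

def admissible_reports(true_cell, prev_report, obstacles, secrets, w, h,
                       max_dist=2, max_move=3):
    """Candidate reported cells after the user enters `true_cell`:
    reachable from the previous report in at most `max_move` valid
    moves, not secret, and within `max_dist` of the true position
    (Chebyshev distance, an assumption)."""
    return {c for c in reachable_within(prev_report, max_move,
                                        obstacles, w, h)
            if c not in secrets
            and max(abs(c[0] - true_cell[0]),
                    abs(c[1] - true_cell[1])) <= max_dist}
```

A greedy choice from `admissible_reports` can paint itself into a corner, which is why the synthesized obfuscator of [5] reasons over the whole model rather than one step at a time.<br />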
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
Several people contributed to the development of the Obfuscation app (listed alphabetically): Andrew Bourgeois, Isaac Dubuque, Dylan Lawton, Nicholas Recker, Jack Weitze, and Gregory Willett. They worked under the mentorship of Rômulo Meira-Góes and Blake Rawlings.<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications'',<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica'',<br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing Opacity of Stochastic Discrete Event Systems using Insertion Functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016, pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica'',<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on the above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and the edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning'',<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira-Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (the public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica'',<br />
Vol. 108, October 2019, Article 108476, pp. 1-14.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune,<br />
"Transforming Opacity Verification to Nonblocking Verification in Modular Systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to the IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement by Edit Functions and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
To illustrate the challenge of enforcing location privacy through obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
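The need to plan ahead can be illustrated with a small lookahead check (a hypothetical sketch, not the algorithm we deploy): a reported cell is ''safe to depth d'' if, whatever step the agent takes next, some valid report exists that is itself safe to depth d-1. The grid, obstacles, and parameters below are invented for illustration:<br />
<br />
```python
# Hypothetical sketch of the lookahead a human obfuscator must perform; the
# grid, obstacles, secret cell, and distance bound are invented for illustration.

GRID_W, GRID_H = 5, 5
OBSTACLES = {(2, 1), (2, 2), (2, 3)}   # a wall with gaps at (2, 0) and (2, 4)
SECRET = (4, 2)
MAX_DIST = 2

def neighbors(c):
    """All in-grid cells one king-move away from c."""
    x, y = c
    return [(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
            and 0 <= x + dx < GRID_W and 0 <= y + dy < GRID_H]

def cheb(a, b):
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def ok(true_cell, report):
    """One-step validity: no obstacle, no secret, within the distance bound."""
    return (report not in OBSTACLES and report != SECRET
            and cheb(true_cell, report) <= MAX_DIST)

def safe(true_cell, report, depth):
    """Can the obfuscator keep reporting validly for `depth` more agent moves,
    no matter how the agent moves?"""
    if not ok(true_cell, report):
        return False
    if depth == 0:
        return True
    for nxt_true in neighbors(true_cell):
        if nxt_true in OBSTACLES:
            continue   # the real agent cannot enter an obstacle either
        if not any(safe(nxt_true, nxt_rep, depth - 1)
                   for nxt_rep in [report] + neighbors(report)):
            return False   # some agent move would leave the obfuscator stuck
    return True

def hint(report, nxt_true, depth=3):
    """Suggest a next report after the agent moved to `nxt_true`, or None."""
    for nxt_rep in [report] + neighbors(report):
        if safe(nxt_true, nxt_rep, depth):
            return nxt_rep
    return None
```
<br />
The bounded depth keeps the sketch simple; it does not give the guarantees that full synthesis of an obfuscator provides.<br />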
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad for real-time obfuscation as a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it avoids the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] is used in real time to select a reported position each time the user enters a new cell.<br />
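Since the obfuscator is synthesized offline, the real-time part of such an app reduces to quantizing each position fix into a grid cell and looking up the next reported cell. The sketch below is a hypothetical illustration of that loop only; the cell size, coordinates, and tiny policy table are invented stand-ins:<br />
<br />
```python
# Hypothetical sketch of the real-time loop only; the offline-synthesized
# obfuscator is stood in for by a tiny made-up policy table, and the cell
# size and coordinates are invented.

CELL_SIZE = 10.0  # metres per grid cell (assumed)

def to_cell(x, y):
    """Quantize a continuous position fix into a grid cell."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

# Policy: (previous reported cell, new true cell) -> next reported cell.
# A real policy, produced by synthesis, would cover every reachable pair.
POLICY = {
    ((0, 0), (0, 1)): (0, 0),   # user moved; keep reporting the old cell
    ((0, 0), (1, 1)): (1, 0),   # shadow the user one cell to the side
}

state = {"true_cell": (0, 0), "reported": (0, 0)}

def on_position_fix(x, y, state):
    """Called on each GPS fix; updates the report only on entering a new cell."""
    cell = to_cell(x, y)
    if cell == state["true_cell"]:
        return state["reported"]          # same cell: nothing to do
    state["reported"] = POLICY[(state["reported"], cell)]   # constant-time lookup
    state["true_cell"] = cell
    return state["reported"]
```
<br />
Keeping the per-fix work to a table lookup is what makes cloud- or device-side real-time obfuscation practical.<br />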
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.)<br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, formed the main results of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research.<br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Location privacy is not the only application of interest for obfuscation. The synthesis problems we solve in our papers for different types of obfuscators are generic and therefore applicable to a wide range of domains. Recently, we have also developed case studies based on contact tracing. <br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications'',<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica'',<br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016, pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica'',<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on the above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning'',<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica'',<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune,<br />
"Transforming opacity verification to nonblocking verification in modular systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley. [[ Media:Obfuscation-video.ogv ]]<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
To illustrate, in an entertaining way, the challenge of enforcing location privacy by obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background, and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to the obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
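The need to plan ahead can be made precise by viewing obfuscation as a safety game between the moving agent and the obfuscator. The toy Python sketch below (our own construction, not the synthesis algorithm used by the game) computes, by a standard greatest-fixpoint iteration, the set of joint (true, reported) positions from which the obfuscator can answer every possible agent move forever; joint states outside this set are traps that a greedy obfuscator can enter even though each individual step looks legal.<br />

```python
from itertools import product

# Toy 4x4 grid with a short wall and one secret cell; distance bound of 1.
# All data here is hypothetical, chosen only to exhibit a trap state.
W, H = 4, 4
SECRET = {(3, 3)}
OBSTACLES = {(1, 1), (2, 1)}
MAX_DIST = 1

CELLS = {(x, y) for x, y in product(range(W), range(H))
         if (x, y) not in OBSTACLES}

def moves(c):
    """Stay put or take one 4-neighbour step, staying on free cells."""
    x, y = c
    return [m for m in [(x, y), (x+1, y), (x-1, y), (x, y+1), (x, y-1)]
            if m in CELLS]

def ok(true_pos, rep):
    """The reported cell never reveals the secret and stays close."""
    return (rep not in SECRET
            and max(abs(rep[0] - true_pos[0]),
                    abs(rep[1] - true_pos[1])) <= MAX_DIST)

# Greatest fixpoint: repeatedly discard joint states from which some agent
# move cannot be answered by any reported move that stays in the safe set.
safe = {(t, r) for t in CELLS for r in CELLS if ok(t, r)}
changed = True
while changed:
    changed = False
    for (t, r) in list(safe):
        if any(all((t2, r2) not in safe for r2 in moves(r))
               for t2 in moves(t)):
            safe.discard((t, r))
            changed = True
```

In this example, the joint state ((2, 0), (3, 1)) satisfies the distance bound, yet it is discarded: if the agent then moves to (1, 0), every reported move available from (3, 1) ends up more than one cell away. The safe set tells the obfuscator which such states to avoid in advance, which is exactly the lookahead the game asks the human player to perform.<br />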
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad for real-time obfuscation as a user moves in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should remain within a given bound of the actual one, and the resulting obfuscated trajectory should be valid (i.e., it avoids the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] selects a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator was allowed to move at most 3 cells each time the user moved one cell. It can be seen that the obfuscated trajectory is valid, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
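A hypothetical sketch of the per-update selection step just described: when the user enters a new cell, the obfuscator picks a reported cell at most 2 cells from the true one, reachable in at most 3 cells from the previously reported one, avoiding obstacles and the secret cell. The names, map data, and the use of Chebyshev distance for "cells away" are our assumptions; and a greedy filter like this is only part of the story, since the synthesized obfuscator of [5] also guarantees that an admissible choice will continue to exist in the future.<br />

```python
# Hypothetical sketch of per-update candidate selection; map data and the
# Chebyshev-distance interpretation of "cells away" are our assumptions.
W, H = 10, 10
MAX_DIST, MAX_STEP = 2, 3        # the two constraints quoted in the text
OBSTACLES = {(4, 2)}             # hypothetical obstacle cell
SECRET = (5, 5)                  # hypothetical secret cell

def chebyshev(a, b):
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def candidates(prev_reported, true_cell):
    """All admissible reported cells for the current update."""
    for dx in range(-MAX_DIST, MAX_DIST + 1):
        for dy in range(-MAX_DIST, MAX_DIST + 1):
            c = (true_cell[0] + dx, true_cell[1] + dy)
            if (0 <= c[0] < W and 0 <= c[1] < H
                    and c != SECRET and c not in OBSTACLES
                    and chebyshev(prev_reported, c) <= MAX_STEP):
                yield c
```

The synthesized obfuscator would then choose, among these candidates, one from which the opacity guarantee can be maintained forever.<br />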
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when they were subsumed by a later journal paper.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes contributed to several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: the papers below credit the sponsors that made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup where opacity itself is implicit and the edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public vs. private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., insertion functions that are ''embedded'' in the system, as opposed to acting as an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Available on arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]. The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley.<br />
: [[ Media:Obfuscation-video.ogv ]]<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background and can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] selects a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In the video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the reported position is allowed to move at most 3 cells each time the user moves one cell. The obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
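The per-step selection described above can be sketched as a loop over candidate reported cells. This is only a simplified, greedy illustration under assumed grid parameters: the actual obfuscator of [5] is synthesized offline so that a safe choice is guaranteed to exist at every step, whereas a greedy choice like the one below can paint itself into a corner, which is exactly the pitfall the synthesis avoids.<br />

```python
# Hedged sketch of the per-step selection loop: each time the user enters a
# new cell, pick a reported cell that (i) is reachable from the previous
# reported cell in at most 3 moves, (ii) stays within 2 cells of the true
# position, and (iii) avoids obstacle and secret cells. Grid size, obstacles,
# and the secret cell are hypothetical.
from collections import deque

OBSTACLES = {(2, 2), (2, 3)}
SECRET = (4, 2)
W, H = 6, 5

def moves(cell):
    x, y = cell
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < W and 0 <= y + dy < H
            and (x + dx, y + dy) not in OBSTACLES]

def reachable(start, budget):
    """All cells reachable from `start` in at most `budget` obstacle-free moves."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        cell, d = frontier.popleft()
        if d == budget:
            continue
        for nxt in moves(cell):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return seen

def pick_report(true_cell, prev_report, bound=2, budget=3):
    """Greedy choice of the next reported cell; returns None when stuck."""
    for cand in sorted(reachable(prev_report, budget)):
        dx = abs(cand[0] - true_cell[0])
        dy = abs(cand[1] - true_cell[1])
        if cand != SECRET and max(dx, dy) <= bound:
            return cand
    return None
```

Even when the true position enters the secret cell, `pick_report` reports a nearby non-secret cell; but because it looks only one step ahead, it offers none of the safety guarantees of the synthesized obfuscator.<br />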
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40. The real-time obfuscator is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
The following video shows a simulation of the implementation described in [4]: [[ Media:Obfuscation-video.ogv ]]<br />
The grid shown in the video represents the first floor of the Clark Kerr building on the campus of the University of California at Berkeley.<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an ipad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound from the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., avoid the obstacles). The app monitors the location of the user in real-time using GPS and the obfuscator synthesized by the methodology in [5] is used in real-time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an ipad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the reported position by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers listed below and organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-17T17:23:36Z<p>Stephane: /* Location Privacy: An Approach based on Opacity */</p>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
[[ Media:Obfuscation-video.ogv ]]<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
Here is a video showing a user playing the Obfuscation Game: [[Media:Obfgame.mp4]]<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad, for real-time obfuscation of a user moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should stay within a given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology of [5] selects, in real time, a reported position each time the user enters a new cell.<br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app moved through the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the reported position is allowed to move at most 3 cells each time the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (when one exists).<br />
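<br />
As an illustration of the per-step decision the app faces, here is a greedy sketch (not the provably correct obfuscator synthesized by [5]): enumerate the cells reachable from the previous reported position in at most 3 obstacle-free moves, discard the secret cell and any cell farther than 2 cells from the true position, and pick the closest remaining one. The grid data below are hypothetical, and Chebyshev distance is an assumed choice of metric:<br />

```python
from collections import deque

# Hypothetical 6x6 grid; the deployed app used a grid over Gerstacker Grove.
ROWS, COLS = 6, 6
OBSTACLES = {(1, 2), (2, 2), (3, 2)}   # assumed obstacle cells
SECRET = (0, 5)                        # assumed secret cell
MAX_DIST = 2    # reported cell at most 2 cells from the true one
MAX_STEP = 3    # reported cell moves at most 3 cells per user move

def free_neighbors(cell):
    r, c = cell
    steps = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return [p for p in steps
            if 0 <= p[0] < ROWS and 0 <= p[1] < COLS and p not in OBSTACLES]

def reachable(src, max_steps):
    """All cells reachable from src in at most max_steps obstacle-free moves,
    so the reported trajectory stays valid (no jumping through walls)."""
    seen = {src}
    frontier = deque([(src, 0)])
    while frontier:
        cell, d = frontier.popleft()
        if d < max_steps:
            for p in free_neighbors(cell):
                if p not in seen:
                    seen.add(p)
                    frontier.append((p, d + 1))
    return seen

def choose_report(prev_report, true_cell):
    """Greedy single-step choice of a reported cell.  The obfuscator of [5]
    instead plans ahead and provably never reaches a state with no valid
    report; this greedy rule has no such guarantee and returns None when stuck."""
    cheb = lambda a, b: max(abs(a[0] - b[0]), abs(a[1] - b[1]))
    options = [p for p in reachable(prev_report, MAX_STEP)
               if p != SECRET and cheb(p, true_cell) <= MAX_DIST]
    return min(options, key=lambda p: cheb(p, true_cell)) if options else None
```

The gap between this greedy rule and the synthesized obfuscator is exactly the look-ahead issue highlighted by the Obfuscation Game: a locally valid report can still lead into a dead end a few steps later.<br />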
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when they were subsumed by a subsequent journal paper.)<br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, was the central topic of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is investigating opacity verification and enforcement in his doctoral research.<br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup for obfuscation in which opacity itself is implicit and the edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, this time allowing insertion decisions to be based on strings of length K rather than on one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup for opacity enforcement by insertion or edit functions that may be known to the eavesdropper (the public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, together with its application to the enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
==== What is Opacity? ====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
==== Location Privacy: An Approach based on Opacity ====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
==== Obfuscation Game: Can you keep a secret? ====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
<br />
==== Obfuscation for Location Privacy: Real-time Demonstration ====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] is used in real time to select a reported position each time the user enters a new cell. <br />
<br />
We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one; moreover, the obfuscator is allowed to move at most 3 cells when the user moves one cell. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
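The constraints just described can be written down directly as a one-step validity check. The sketch below only illustrates those constraints; it is not the app's actual code, the obstacle set is hypothetical, and Chebyshev (king-move) distance is assumed for "cells away": the reported cell must avoid obstacles, stay within 2 cells of the true cell, and move at most 3 cells per user step.<br />

```python
# Illustrative encoding of the demo's stated constraints (not the app's code).
OBSTACLES = {(3, 3)}   # hypothetical obstacle cells
MAX_GAP = 2            # reported cell stays within 2 cells of the true one
MAX_STEP = 3           # reported cell moves at most 3 cells per user step

def dist(a, b):
    # Chebyshev distance between grid cells (assumed metric)
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def step_is_valid(prev_reported, next_reported, true_pos):
    """Check one obfuscator step against the demo's stated constraints."""
    return (next_reported not in OBSTACLES
            and dist(next_reported, true_pos) <= MAX_GAP
            and dist(next_reported, prev_reported) <= MAX_STEP)
```

The obfuscator synthesized by the method of [5] guarantees more than this local check: whenever an obfuscator exists, every reachable situation leaves at least one valid step, so the reported trajectory never gets stuck.<br />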
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
<br />
<br />
==== Our publications on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is investigating opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup for obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup for opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank. Perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years; more on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as a running example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings, and it should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40 cells.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
To illustrate, in an amusing way, the challenge of enforcing location privacy using obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm still runs in the background and can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
<br />
===== Obfuscation for Location Privacy: Real-time Demonstration =====<br />
<br />
The methodology from [5] used for the background obfuscator in the Obfuscation Game was also deployed in an app running on an iPad for real-time obfuscation when a user is moving in an area with secret locations and obstacles. Given a bounded geographical area modeled by a grid with obstacle cells and secret cells, an obfuscator is constructed so that the reported (or obfuscated) location never visits the secret grid cells. At the same time, the reported location should be within some given bound of the actual one, and the resulting obfuscated trajectory should be a valid one (i.e., it should avoid the obstacles). The app monitors the location of the user in real time using GPS, and the obfuscator synthesized by the methodology in [5] is used in real time to select a reported position each time the user enters a new cell. We produced a video of the actual and obfuscated trajectories as a user holding an iPad running the app was moving in the Gerstacker Grove on the North Campus of the University of Michigan in Ann Arbor. In that video, the blue circle is the actual user location from GPS, the blue diamond is the actual cell location used as input to the obfuscator, the purple diamond is the position reported by the obfuscator, the black grid cells are obstacles, and the orange grid cell is the secret location. Here, the constraint was that the obfuscated position should never be more than 2 cells away from the actual one. It can be seen that the obfuscated trajectory is a valid one, which is provably guaranteed by construction of the obfuscator (if one exists).<br />
<br />
Video of app implementing real-time obfuscation for location privacy: [[Media:Walk_grove.mp4]]<br />
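To make the constraints above concrete, here is a minimal sketch of a single obfuscation step on such a grid. It is our own greedy simplification for illustration only, not the symbolic edit-function synthesis of [5]; all names (`obfuscate_step`, `neighbors`, `MAX_DIST`) and the 4-connected move model are assumptions of this sketch.

```python
# Minimal greedy sketch (illustration only, not the method of [5]):
# pick a reported cell that extends a valid trajectory, avoids obstacle
# and secret cells, and stays within MAX_DIST of the true cell.

MAX_DIST = 2  # bound used in the Gerstacker Grove demonstration


def neighbors(cell):
    """Cells reachable from `cell` in one move: stay put, or a 4-connected step."""
    r, c = cell
    return [(r, c), (r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]


def chebyshev(a, b):
    """Distance in cells, counting a diagonal step as 1."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))


def obfuscate_step(actual, prev_reported, obstacles, secrets, rows, cols):
    """Greedy choice of the next reported cell. Returns None on a dead end,
    which is exactly where the synthesized obfuscator of [5] differs: it
    plans ahead so that a valid report always exists (when one does)."""
    for cand in neighbors(prev_reported):
        r, c = cand
        if not (0 <= r < rows and 0 <= c < cols):
            continue  # off the grid
        if cand in obstacles or cand in secrets:
            continue  # reported trajectory must avoid these cells
        if chebyshev(cand, actual) <= MAX_DIST:
            return cand
    return None
```

For instance, if the user steps onto the secret cell (2, 2) while the last reported cell was (2, 3), the sketch keeps the report at (2, 3): still within 2 cells of the truth, but never on the secret. The greedy choice can paint itself into a corner, which is why the actual obfuscator is synthesized offline over the whole grid rather than chosen step by step.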
<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when a subsequent journal paper subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on the implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is pursuing opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the authors' reply to a comment on this article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and the edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, this time deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (the public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune,<br />
"Transforming opacity verification to nonblocking verification in modular systems,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system rather than acting as an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Preprint on arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, Article 108476, pp. 1-14.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 arXiv preprint]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information must be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Say someone is tracking your movements and your secret is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank. Perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has worked on opacity and its enforcement for many years; more on this in the section below. We have used location privacy as a running example of our theoretical work on opacity enforcement by insertion and edit functions. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory must be a valid trajectory in the area where you are moving: inside a building, it should not go through walls; around campus or town, it should avoid buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger: over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
Video of App implementing real-time obfuscation for location privacy:<br />
[[Media:??]]<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
To illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the obfuscator and must obfuscate, in real time, the moves of an agent in a grid. Our algorithms for opacity enforcement are relegated to the background, and the user must do the obfuscation on their own, playing against the computer, which simply moves the agent randomly. The goal of the game is to show that, due to the obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), which can often be quite difficult. Our algorithm does run in the background and can provide a hint to the user if needed.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers listed below, organized chronologically. (Conference papers are omitted when they were subsumed by a subsequent journal paper.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events and by ''deletion'' of events under constraints were the central results of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin contributed to opacity enforcement by ''supervisory control'' in his doctoral dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is investigating opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our efforts; they are listed below as co-authors in the relevant papers. We also acknowledge our external collaborators.<br />
<br />
Financial acknowledgement: We acknowledge in the papers below the sponsors that have made this work possible, principally the US National Science Foundation.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
<br />
* A different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica'',<br />
Vol. 108, October 2019, Article 108476, pp. 1-14.<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system rather than acting as an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'',<br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-17T14:58:25Z<p>Stephane: /* Obfuscation Project of the DES Group at the University of Michigan */</p>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information must be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about the secret information. Say that someone is tracking your movements and that your secret is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
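<br />
The notion above can be made concrete with a small sketch of current-state opacity verification by observer (subset) construction. This is an illustrative toy, not code from the papers cited here: the automaton, its events, and the secret set are all invented for the example. The system is opaque if no state estimate reachable by the eavesdropper lies entirely inside the secret set.<br />

```python
# Hypothetical toy automaton: delta[(state, event)] -> next state.
# Event "u" is unobservable to the eavesdropper; state 3 is secret.
delta = {
    (0, "u"): 1,
    (0, "a"): 2,
    (1, "a"): 3,
    (2, "b"): 4,
    (3, "b"): 4,
}
observable = {"a", "b"}
secret = {3}
initial = 0

def unobservable_reach(states):
    """Close a set of states under unobservable transitions."""
    frontier, reach = list(states), set(states)
    while frontier:
        s = frontier.pop()
        for (q, e), t in delta.items():
            if q == s and e not in observable and t not in reach:
                reach.add(t)
                frontier.append(t)
    return frozenset(reach)

def is_current_state_opaque():
    """Opaque iff no reachable eavesdropper estimate lies entirely in the secret set."""
    start = unobservable_reach({initial})
    stack, seen = [start], {start}
    while stack:
        est = stack.pop()
        if est <= secret:  # eavesdropper is certain the system is in a secret state
            return False
        for e in observable:
            nxt = {delta[(q, e)] for q in est if (q, e) in delta}
            if nxt:
                nxt = unobservable_reach(nxt)
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
    return True

print(is_current_state_opaque())  # True
```

Here the estimate after observing "a" is {2, 3}; since the non-secret state 2 (the "coffee shop") always accompanies the secret state 3 (the "bank") in the estimate, the secret is never revealed with certainty.<br />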
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has worked on opacity and its enforcement for many years; more on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we use location privacy as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving: inside a building, it should not go through walls; around campus or town, it should not go through buildings and should follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
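<br />
A minimal sketch of the two constraints described above, on a hypothetical grid with invented names (none of this comes from the cited papers): an obfuscated report must land on a free tile, move one tile at a time like a real trajectory, and stay within a fixed maximum distance of the true position.<br />

```python
# Hypothetical map: '#' tiles are walls, '.' tiles are free.
WALL, FREE = "#", "."
grid = [
    "....#....",
    "....#....",
    ".........",
    "....#....",
]
MAX_DIST = 2  # assumed max allowed true-vs-reported offset per coordinate

def free(cell):
    """Is the cell on the map and not a wall?"""
    r, c = cell
    return 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == FREE

def adjacent(a, b):
    """One step up/down/left/right, or staying put."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1]) <= 1

def valid_report(prev_report, report, true_pos):
    """An obfuscated report must be a legal move AND stay near the true position."""
    within = max(abs(report[0] - true_pos[0]),
                 abs(report[1] - true_pos[1])) <= MAX_DIST
    return free(report) and adjacent(prev_report, report) and within

print(valid_report((2, 3), (2, 4), (0, 5)))  # legal one-tile move, within range
print(valid_report((2, 3), (0, 3), (0, 5)))  # invalid: jumps two rows in one step
```

An actual obfuscator must also plan ahead so that a currently valid report does not trap the reported trajectory later; that look-ahead is what the synthesis algorithms in our papers provide.<br />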
<br />
The paper [3] illustrates our methodology for location privacy with a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is secret.<br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator running in the cloud. The grid there is much larger, over 30 by 40 tiles.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
To illustrate, in an entertaining way, the challenge of enforcing location privacy using obfuscation, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must try to obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers below. (Conference papers are omitted if they were followed by a journal paper that subsumed them.)<br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by ''supervisory control'' in his dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is studying opacity verification and enforcement in his doctoral research.<br />
<br />
Several undergraduate students interned in our group and participated in our demonstration projects; they are listed below as co-authors in the relevant papers.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied this approach to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general setup of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing insertion decisions to be based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* A more general setup of opacity enforcement by insertion or edit functions that may be known to the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the system (i.e., these insertion functions are ''embedded'' in the system, as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers below. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by ''supervisory control'' in his dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate student interned in our group and participated in our demonstration projects; they are listed below as co-authors in the relevant papers.<br />
<br />
<br />
* Our first paper on the verification of different notions of opacity:<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
<br />
* Our papers where we first introduced opacity enforcement by ''insertion functions'', applied that to location privacy, and considered a class of optimal insertion functions according to different cost criteria, first for a logical DES model and then for a stochastic DES model:<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
<br />
* Our first paper on opacity enforcement by ''supervisory control'':<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
<br />
* Our paper on a new method to verify K-step and infinite-step opacity:<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for Authors' Reply to Comment on above article.<br />
<br />
<br />
* Our paper on ''obfuscation'' by ''edit functions'', in a general set up of obfuscation where opacity itself is implicit and edit functions must satisfy a utility constraint:<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
<br />
* Back to opacity enforcement by insertion functions, but this time allowing for deciding on insertion actions based on strings of length K, as opposed to one event at a time:<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
<br />
* Application of opacity enforcement by edit functions with a utility constraint to location privacy in an indoor environment:<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
<br />
* More general set up of opacity enforcement by insertion or edit functions that may be known by the eavesdropper (public-private case):<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
<br />
* Different approach to optimal insertion functions for opacity enforcement, under ''energy'' constraints:<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,''<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
<br />
* Application of ''modular'' methods to opacity verification and enforcement (case of edit functions):<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement."<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
<br />
* Opacity enforcement by insertion functions that can see the unobservable events generated by the systems (i.e., these insertion functions are ''embedded'' in the system as opposed to being an output interface):<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
<br />
* Our most recent work on a general approach to opacity verification that covers not only current-state opacity but also K-step opacity in a ''unified'' framework, and application of that approach to enforcement of K-step opacity by edit functions:<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arxiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>Stephanehttps://wiki.eecs.umich.edu/obfuscation/index.php/Main_PageMain Page2021-06-14T18:14:28Z<p>Stephane: /* Our research on opacity and its enforcement */</p>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers below. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by ''supervisory control'' in his dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate student interned in our group and participated in our demonstration projects; they are listed below as co-authors in the relevant papers.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on this article.<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64,<br />
No. 10,<br />
October 2019,<br />
pp. 4369-4376.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, Article 108476, October 2019.<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65,<br />
No. 4,<br />
April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, <br />
No. 8, <br />
August 2020, <br />
pp. 3349-3364.<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In press. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity,"<br />
Under review.<br />
[https://arxiv.org/abs/2103.10501 Deposited in arXiv]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of ''K''-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and assuming that a portion of that information needs to be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about its secret information. Let's say that someone is tracking your movements and that your secret information is that you are at the bank, then the tracking should not reveal with certainty that you are at the bank; perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has been doing work on opacity and its enforcement for many years. More on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as an illustrative example. Let's imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around in a certain geographical area. Then how should your position information be slightly altered, as observed by the eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory where you are moving. Inside a building, the obfuscated trajectory should not go through walls. If you are moving around campus or town, then it should not go through buildings and follow sidewalks or streets. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position should never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem, if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example on our methodology for location privacy for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, with one of them being the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
As a way to illustrate the challenge of enforcing location privacy using obfuscation in an amusing way, we developed the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user is the obfuscator and it must try to obfuscate, in real-time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that due to obstacles and the maximum distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background and it can provide a hint to the user, if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Several people contributed to the development of the Obfuscation Game, principally Isaac Dubuque, Andrew Bourgeois, and Jack Weize, who worked under the mentorship of Blake Rawlings and Rômulo Meira-Góes.<br />
<br />
Try the [https://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to-date on opacity and its enforcement have been published in the papers below. (Conference papers are omitted if they were followed by a journal paper that subsumed them.) <br />
<br />
Opacity enforcement by ''insertion'' of fictitious events, or by ''deletion'' of events under some constraints, were the main results in the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by ''supervisory control'' in his dissertation. Rômulo Meira-Góes participated in several papers on implementation of opacity enforcement during his graduate studies. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts. Current student Andrew Wintenberg is considering opacity verification and enforcement in his doctoral research. <br />
<br />
Several undergraduate students interned in our group and participated in our demonstration projects; they are listed as co-authors in the relevant papers.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053--2060.<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
See also Vol. 124, Article 109273, February 2021, for the Authors' Reply to a Comment on the above article.<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
R. Meira-Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control'', <br />
Vol. 65, No. 8, August 2020, pp. 3349-3364.<br />
<br />
C. Keroglou and S. Lafortune, <br />
"Embedded Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''. <br />
In print. [DOI: 10.1109/TAC.2020.3037891]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"A General Language-Based Framework for Specifying and Verifying Notions of Opacity."<br />
[https://arxiv.org/abs/2103.10501]<br />
<br />
A. Wintenberg, M. Blischke, S. Lafortune, and N. Ozay,<br />
"Enforcement of K-Step Opacity with Edit Functions,"<br />
submitted to IEEE Conference on Decision and Control, December 2021.</div>
<hr />
<div>== '''Obfuscation Project of the DES Group at the University of Michigan''' ==<br />
<br />
We call this page the ''Obfuscation Project''. A better but less catchy name should probably be: ''Opacity Enforcement and Its Application to Location Privacy''.<br />
<br />
===== What is Opacity? =====<br />
Opacity is a general property that has been defined and studied in the context of computer security and privacy. Assuming that some information about a user is revealed to an eavesdropper with potentially malicious intentions, and that a portion of that information must be kept ''secret'', opacity roughly means that the user can always maintain plausible deniability about the secret information. Suppose someone is tracking your movements and your secret information is that you are at the bank; then the tracking should not reveal with certainty that you are at the bank, since perhaps you could also be at the coffee shop next to the bank. For an overview of the study of opacity, please refer to [1]. For some historical remarks regarding the study of opacity in a branch of control engineering known as Discrete Event Systems (DES), see [2].<br />
<br />
[1] Romain Jacob, Jean-Jacques Lesage, Jean-Marc Faure,<br />
"Overview of discrete event systems opacity: Models, validation, and quantification,"<br />
''Annual Reviews in Control'',<br />
Volume 41,<br />
2016,<br />
Pages 135-146.<br />
<br />
[2] S. Lafortune, F. Lin, and C. Hadjicostis,<br />
"On the History of Diagnosability and Opacity in Discrete Event Systems,"<br />
''Annual Reviews in Control'',<br />
Vol. 45, pp. 257-266, 2018.<br />
<br />
===== Location Privacy: An Approach based on Opacity =====<br />
<br />
Our group at Michigan has been working on opacity and its enforcement for many years; more on this in the section below. To illustrate our theoretical work on opacity enforcement by insertion and edit functions, we have used location privacy as a running example. Imagine that you can send slightly altered (i.e., obfuscated) information about your location as you move around a certain geographical area. How should your position information be altered, as observed by an eavesdropper or other parties, so that your visits to secret locations are never revealed? This is more complicated than adding random noise to your location as you move. The obfuscated trajectory is required to be a valid trajectory in the area where you are moving: inside a building, it should not go through walls; around campus or town, it should follow sidewalks or streets rather than cut through buildings. Moreover, to make the problem more challenging (and also more realistic), we require that the obfuscated position never be more than a certain maximum distance from the true one. Our general algorithms for opacity enforcement by insertion functions can be used to solve this problem if we model the trajectory of the user by discrete moves, such as from tile to tile in a grid.<br />
<br />
The paper [3] gives an example of our methodology for location privacy, for a user moving around the Central Campus of the University of Michigan. For the sake of simplicity, we consider only 8 possible locations for the user, one of which is the secret. <br />
<br />
[3] Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
The paper [4] shows how to actually implement the same methodology in an indoor setting, using real-time data from an acoustic positioning system and a real-time obfuscator implemented in the cloud. The grid there is much larger, over 30 by 40.<br />
<br />
[4] R. Meira Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
===== Obfuscation Game: Can you keep a secret? =====<br />
<br />
To illustrate, in an entertaining way, the challenge of enforcing location privacy using obfuscation, we developed the [http://obfuscationgame.eecs.umich.edu Obfuscation Game]. Here, the user plays the role of the obfuscator and must obfuscate, in real time, the moves of an agent in a grid. In the Obfuscation Game, our algorithms for opacity enforcement are relegated to the background and the user must do the obfuscation on their own. They play against the computer, which simply moves the agent randomly. The goal of this game is to show that, due to obstacles and the maximum-distance constraint, the obfuscator needs to plan several steps ahead (as in most board games), and this can often be quite difficult. Our algorithm does run in the background, and it can provide a hint to the user if need be.<br />
<br />
The implementation of the background obfuscator in the Obfuscation Game is based on the symbolic implementation of edit functions in the paper [5].<br />
<br />
[5] Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
Try the [http://obfuscationgame.eecs.umich.edu Obfuscation Game]. We hope you enjoy it!<br />
<br />
<br />
==== Our research on opacity and its enforcement ====<br />
<br />
Our results to date on opacity and its enforcement have been published in the papers below. Opacity enforcement by insertion of fictitious events, or by deletion of events under some constraints, was the main topic of the doctoral dissertations of Yi-Chin Wu and Yiding Ji. Xiang Yin also studied opacity enforcement by supervisory control in his dissertation. Current students Rômulo Meira-Góes and Andrew Wintenberg are considering opacity enforcement in their doctoral research. Research fellows Blake Rawlings, Christoforos Keroglou, and Sahar Mohajerani were also important contributors to our efforts.<br />
Several undergraduate students interned in our group and participated in our demonstration projects; they are listed as co-authors in the relevant papers.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Comparative Analysis of Related Notions of Opacity in Centralized and Coordinated Architectures,"<br />
''Discrete Event Dynamic Systems: Theory and Applications''.<br />
Vol. 23, No. 3, September 2013, pp. 307-339.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Insertion Functions for Enforcement of Opacity Security Properties,"<br />
''Automatica''. <br />
Vol. 50, No. 5, May 2014, pp. 1336-1348.<br />
<br />
Y.-C. Wu, K.A. Sankararaman, and S. Lafortune,<br />
"Ensuring Privacy in Location-Based Services: An Approach Based on Opacity Enforcement,"<br />
''Proceedings of the 12th International Workshop on Discrete Event Systems'',<br />
May 2014.<br />
<br />
Y.-C. Wu and S. Lafortune,<br />
"Synthesis of Optimal Insertion Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 3, March 2016, pp. 571-584.<br />
<br />
Y.-C. Wu, G. Lederman, and S. Lafortune,<br />
"Enhancing opacity of stochastic discrete event systems using insertion functions,"<br />
''Proceedings of the 2016 American Control Conference'',<br />
July 2016,<br />
pp. 2053-2060.<br />
<br />
X. Yin and S. Lafortune,<br />
"A Uniform Approach for Synthesizing Property-Enforcing Supervisors for Partially-Observed Discrete-Event Systems,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 61, No. 8, August 2016, pp. 2140-2154.<br />
<br />
X. Yin and S. Lafortune,<br />
"A New Approach for the Verification of Infinite-Step and K-Step Opacity using Two-Way Observers,"<br />
''Automatica''.<br />
Vol. 80, pp. 162-171, June 2017.<br />
<br />
C. Keroglou and S. Lafortune,<br />
"Verification and Synthesis of Embedded Insertion Functions for Opacity Enforcement,"<br />
''Proceedings of the 56th IEEE Conference on Decision and Control'',<br />
December 2017, pp. 4217-4223.<br />
<br />
Y.-C. Wu, V. Raman, B.C. Rawlings, S. Lafortune, and S. Seshia,<br />
"Synthesis of Obfuscation Policies to Ensure Privacy and Utility,"<br />
''Journal of Automated Reasoning''.<br />
Vol. 60, No. 1, pp. 107-131, January 2018.<br />
<br />
C. Keroglou, S. Lafortune, and L. Ricker,<br />
"Insertion Functions with Memory for Opacity Enforcement,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 405-410.<br />
<br />
R. Meira-Góes, B.C. Rawlings, N. Recker, G. Willett, and S. Lafortune,<br />
"Demonstration of Indoor Location Privacy Enforcement using Obfuscation,"<br />
''Proceedings of the 14th International Workshop on Discrete Event Systems'',<br />
June 2018, pp. 157-163.<br />
<br />
Y. Ji, Y.-C. Wu, and S. Lafortune,<br />
"Enforcement of Opacity by Public and Private Insertion Functions,"<br />
''Automatica'',<br />
Vol. 93, July 2018, pp. 369-378.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Opacity Enforcement using Nondeterministic Publicly-Known Edit Functions,"<br />
''IEEE Transactions on Automatic Control''.<br />
Vol. 64, No. 10, October 2019, pp. 4369-4376.<br />
<br />
Y. Ji, X. Yin, and S. Lafortune,<br />
"Enforcing Opacity by Insertion Functions under Multiple Energy Constraints,"<br />
''Automatica''.<br />
Vol. 108, October 2019, pp. 108476 (1-14).<br />
<br />
S. Mohajerani and S. Lafortune, "Transforming opacity verification to nonblocking verification in modular systems," <br />
''IEEE Transactions on Automatic Control'',<br />
Vol. 65, No. 4, April 2020, pp. 1739-1746.<br />
<br />
S. Mohajerani, Y. Ji, and S. Lafortune,<br />
"Compositional and Abstraction-Based Approach for Synthesis of Edit Functions for Opacity Enforcement,"<br />
''IEEE Transactions on Automatic Control''.<br />
In print.</div>Stephane