License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following DOI: 10.4230/LIPIcs.CONCUR.2022.31
URN: urn:nbn:de:0030-drops-170947
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2022/17094/


Dongol, Brijesh; Schellhorn, Gerhard; Wehrheim, Heike

Weak Progressive Forward Simulation Is Necessary and Sufficient for Strong Observational Refinement

PDF: LIPIcs-CONCUR-2022-31.pdf (1.0 MB)


Abstract

Hyperproperties are correctness conditions for labelled transition systems that are more expressive than traditional trace properties, with particular relevance to security. Recently, Attiya and Enea studied a notion of strong observational refinement that preserves all hyperproperties. They analysed the correspondence between forward simulation and strong observational refinement in a setting with only finite traces. We study this correspondence in a setting with both finite and infinite traces. In particular, we show that forward simulation does not preserve hyperliveness properties in this setting. We extend the forward simulation proof obligation with a (weak) progress condition, and prove that this weak progressive forward simulation is equivalent to strong observational refinement.
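To make the terminology concrete, the following is a minimal sketch of a standard forward simulation between labelled transition systems, written in ad-hoc notation rather than the paper's own: the systems C (concrete) and A (abstract), the relation R, and the step relations below are illustrative assumptions, not the authors' exact definitions.

  \forall c_0 \in \mathit{init}(C).\; \exists a_0 \in \mathit{init}(A).\; (c_0, a_0) \in R

  \forall (c, a) \in R.\; \forall e, c'.\; c \xrightarrow{\,e\,}_C c' \;\Rightarrow\; \exists a'.\; a \xRightarrow{\,e\,}_A a' \,\wedge\, (c', a') \in R

Here \xRightarrow{\,e\,}_A abbreviates an abstract step with observable label e, possibly surrounded by internal moves. Over infinite traces such a simulation can let the concrete system diverge on internal steps while the abstract side stutters forever, which is why plain forward simulation fails to preserve hyperliveness; the paper's weak progress condition, roughly, rules out matching infinitely many concrete steps by stuttering-only abstract moves, so that infinite concrete behaviour is reflected in the abstraction.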

BibTeX Entry

@InProceedings{dongol_et_al:LIPIcs.CONCUR.2022.31,
  author =	{Dongol, Brijesh and Schellhorn, Gerhard and Wehrheim, Heike},
  title =	{{Weak Progressive Forward Simulation Is Necessary and Sufficient for Strong Observational Refinement}},
  booktitle =	{33rd International Conference on Concurrency Theory (CONCUR 2022)},
  pages =	{31:1--31:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-246-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{243},
  editor =	{Klin, Bartek and Lasota, S{\l}awomir and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2022/17094},
  URN =		{urn:nbn:de:0030-drops-170947},
  doi =		{10.4230/LIPIcs.CONCUR.2022.31},
  annote =	{Keywords: Strong Observational Refinement, Hyperproperties, Forward Simulation, Weak Progressiveness}
}

Keywords: Strong Observational Refinement, Hyperproperties, Forward Simulation, Weak Progressiveness
Collection: 33rd International Conference on Concurrency Theory (CONCUR 2022)
Issue Date: 2022
Date of publication: 06.09.2022

