License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.CONCUR.2022.32
URN: urn:nbn:de:0030-drops-170955
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2022/17095/


Kiefer, Stefan; Tang, Qiyi

Strategies for MDP Bisimilarity Equivalence and Inequivalence



Abstract

A labelled Markov decision process (MDP) is a labelled Markov chain with nondeterminism; that is, together with a strategy, a labelled MDP induces a labelled Markov chain. Motivated by applications to the verification of probabilistic noninterference in security, we study the problems of whether there exist strategies such that two given labelled MDPs become bisimilarity equivalent/inequivalent. We show that the equivalence problem is decidable; in fact, it is EXPTIME-complete and becomes NP-complete if one of the MDPs is a Markov chain. Concerning the inequivalence problem, we show that (1) it is decidable in polynomial time; (2) if there are strategies for inequivalence, then there are memoryless strategies for inequivalence; and (3) such memoryless strategies can be computed in polynomial time.
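
The induced-chain construction mentioned in the abstract can be made concrete with a small sketch in Python (not taken from the paper): applying a memoryless strategy to a labelled MDP averages the action distributions in every state and thereby collapses the nondeterminism into a labelled Markov chain. The dictionary representation and all identifiers below are illustrative assumptions, not the authors' formalism.

# Labelled MDP: each state carries a label; each state offers actions, and
# each action gives a probability distribution over successor states.
label = {"s0": "a", "s1": "b", "s2": "a"}

mdp = {
    "s0": {"left":  {"s1": 0.5, "s2": 0.5},
           "right": {"s2": 1.0}},
    "s1": {"stay":  {"s1": 1.0}},
    "s2": {"stay":  {"s2": 1.0}},
}

# Memoryless strategy: for each state, a distribution over its actions.
strategy = {
    "s0": {"left": 0.25, "right": 0.75},
    "s1": {"stay": 1.0},
    "s2": {"stay": 1.0},
}

def induced_markov_chain(mdp, strategy):
    # The chain's transition probability from s to t is the strategy-weighted
    # average of the successor distributions of the actions available in s.
    chain = {}
    for s, actions in mdp.items():
        dist = {}
        for a, succ in actions.items():
            weight = strategy[s].get(a, 0.0)
            for t, p in succ.items():
                dist[t] = dist.get(t, 0.0) + weight * p
        chain[s] = dist
    return chain

print(induced_markov_chain(mdp, strategy))
# {'s0': {'s1': 0.125, 's2': 0.875}, 's1': {'s1': 1.0}, 's2': {'s2': 1.0}}

The state labels are untouched by the strategy; only the transition probabilities change, which is why the resulting object is again a labelled Markov chain and bisimilarity between two induced chains is well defined.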

BibTeX - Entry

@InProceedings{kiefer_et_al:LIPIcs.CONCUR.2022.32,
  author =	{Kiefer, Stefan and Tang, Qiyi},
  title =	{{Strategies for MDP Bisimilarity Equivalence and Inequivalence}},
  booktitle =	{33rd International Conference on Concurrency Theory (CONCUR 2022)},
  pages =	{32:1--32:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-246-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{243},
  editor =	{Klin, Bartek and Lasota, S{\l}awomir and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2022/17095},
  URN =		{urn:nbn:de:0030-drops-170955},
  doi =		{10.4230/LIPIcs.CONCUR.2022.32},
  annote =	{Keywords: Markov decision processes, Markov chains}
}

Keywords: Markov decision processes, Markov chains
Collection: 33rd International Conference on Concurrency Theory (CONCUR 2022)
Issue Date: 2022
Date of publication: 06.09.2022

