License: Creative Commons Attribution-NoDerivs 3.0 Unported license (CC BY-ND 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.STACS.2013.598
URN: urn:nbn:de:0030-drops-39681
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2013/3968/


Clément, Julien; Nguyen Thi, Thu Hien; Vallée, Brigitte

A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms

pdf-format:
56.pdf (0.6 MB)


Abstract

We describe a general framework for realistic analysis of sorting and searching algorithms, and we apply it to the average-case analysis of five basic algorithms: three sorting algorithms (QuickSort, InsertionSort, BubbleSort) and two selection algorithms (QuickMin and SelectionMin). Usually, the analysis deals with the mean number of key comparisons, but here we view keys as words produced by the same source, which are compared via their symbols in lexicographic order. The "realistic" cost of the algorithm is now the total number of symbol comparisons it performs, and, in this context, the average-case analysis aims to provide estimates for the mean number of symbol comparisons used by the algorithm. For sorting algorithms, and with respect to key comparisons, the average-case complexity of QuickSort is asymptotic to 2n log n, that of InsertionSort to n^2/4, and that of BubbleSort to n^2/2. With respect to symbol comparisons, we prove that their average-case complexities become Theta(n log^2 n), Theta(n^2), and Theta(n^2 log n), respectively. For selection algorithms, and with respect to key comparisons, the average-case complexity of QuickMin is asymptotic to 2n and that of SelectionMin to n - 1. With respect to symbol comparisons, we prove that their average-case complexity remains Theta(n). In these five cases, we describe the dominant constants, which exhibit the probabilistic behaviour of the source (namely, entropy and various notions of coincidence) with respect to the algorithm.
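
To make the cost model concrete, the following is a minimal Python sketch (illustrative only; the names SymbolCounter and quicksort are not from the paper). Keys are random binary words drawn from an unbiased memoryless source and are compared symbol by symbol in lexicographic order; a plain QuickSort is instrumented to count both key comparisons and symbol comparisons, the two quantities contrasted in the abstract.

import random

class SymbolCounter:
    """Counts key comparisons and individual symbol comparisons (hypothetical helper)."""
    def __init__(self):
        self.key_cmps = 0
        self.symbol_cmps = 0

    def less(self, u, v):
        """Lexicographic comparison of words u and v, counting each symbol comparison."""
        self.key_cmps += 1
        for a, b in zip(u, v):
            self.symbol_cmps += 1
            if a != b:
                return a < b
        return len(u) < len(v)  # shorter word is the smaller prefix

def quicksort(keys, cmp):
    """Plain QuickSort: one key comparison per non-pivot element at each partition step."""
    if len(keys) <= 1:
        return keys
    pivot = keys[0]
    smaller, larger = [], []
    for k in keys[1:]:
        (smaller if cmp.less(k, pivot) else larger).append(k)
    return quicksort(smaller, cmp) + [pivot] + quicksort(larger, cmp)

# Example: n random binary words produced by an unbiased memoryless source.
random.seed(1)
n = 1000
words = ["".join(random.choice("01") for _ in range(60)) for _ in range(n)]
cmp = SymbolCounter()
quicksort(words, cmp)
print("key comparisons:   ", cmp.key_cmps)      # classical 2n log n order
print("symbol comparisons:", cmp.symbol_cmps)   # Theta(n log^2 n) order, as stated above

Under this model, the key-comparison count follows the classical order, while the symbol-comparison count carries the extra logarithmic factor described in the abstract, with dominant constants depending on the source.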

BibTeX - Entry

@InProceedings{clment_et_al:LIPIcs:2013:3968,
  author =	{Julien Cl{\'e}ment and Thu Hien Nguyen Thi and Brigitte Vall{\'e}e},
  title =	{{A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{598--609},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Natacha Portier and Thomas Wilke},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2013/3968},
  URN =		{urn:nbn:de:0030-drops-39681},
  doi =		{10.4230/LIPIcs.STACS.2013.598},
  annote =	{Keywords: Probabilistic analysis of algorithms, Sorting and searching algorithms, Pattern matching, Permutations, Information theory, Rice formula, Asymptotic e}
}

Keywords: Probabilistic analysis of algorithms, Sorting and searching algorithms, Pattern matching, Permutations, Information theory, Rice formula, Asymptotic e
Collection: 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)
Issue Date: 2013
Date of publication: 26.02.2013

