License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagRep.10.3.58
URN: urn:nbn:de:0030-drops-134303
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2020/13430/


Bientinesi, Paolo; Ham, David; Huang, Furong; Kelly, Paul H. J.; Lengauer, Christian; Sadayappan, Saday
Other contributors (eds. etc.): Paolo Bientinesi, David Ham, Furong Huang, Paul H. J. Kelly, Christian Lengauer, and Saday Sadayappan

Tensor Computations: Applications and Optimization (Dagstuhl Seminar 20111)

PDF format: dagrep_v010_i003_p058_20111.pdf (4 MB)


Abstract

Tensors are higher-dimensional analogs of matrices and represent a key data abstraction for many applications in computational science and data science. In contrast to the wide availability of high-performance numerical libraries for matrix computations on diverse hardware platforms, only limited software infrastructure exists today for high-performance tensor computations.
Recent research developments have resulted in the formulation of many machine learning algorithms in terms of tensor computations. Tensor computations have also emerged as fundamental building blocks for many algorithms in data science and computational science. Therefore, several concurrent efforts have targeted the development of libraries, frameworks, and domain-specific compilers to support the rising demand for high-performance tensor computations. However, there is currently very little coordination among the various groups of developers. Further, the groups developing high-performance libraries/frameworks for tensor computations are still rather disconnected from the research community that develops applications using tensors as a key data abstraction.
The main goal of this Dagstuhl Seminar has been to bring together two communities: first, researchers from disciplines developing applications centered around tensor computations, and second, researchers developing the software infrastructure for efficient tensor computation primitives. Invitees from the former group included experts in machine learning and data analytics, as well as computational scientists developing tensor-based applications. Invitees from the latter group included experts in compiler optimization and in numerical methods.
A very fruitful exchange of ideas across these four research communities took place, with discussions on the variety of needs and use-cases for tensor computations and the challenges/opportunities in the development of high-performance software to satisfy those needs.
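To illustrate the kind of tensor computation primitive the seminar was concerned with, the following minimal sketch (not taken from the report; it assumes NumPy and its einsum routine) contracts a third-order tensor with two matrices, a building block of Tucker-style decompositions:

  import numpy as np

  # A third-order tensor: a higher-dimensional analog of a matrix.
  T = np.random.rand(4, 5, 6)

  # Matrices to contract against the second and third modes of T.
  A = np.random.rand(5, 7)
  B = np.random.rand(6, 8)

  # Tucker-style mode products expressed as a single tensor contraction:
  # sum over j and k of T[i, j, k] * A[j, m] * B[k, n].
  result = np.einsum('ijk,jm,kn->imn', T, A, B)

  print(result.shape)  # (4, 7, 8)

Contractions of this kind are representative of the primitives that the libraries, frameworks, and domain-specific compilers mentioned in the abstract aim to support with high performance.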

BibTeX Entry

@Article{bientinesi_et_al:DR:2020:13430,
  author =	{Paolo Bientinesi and David Ham and Furong Huang and Paul H. J. Kelly and Christian Lengauer and Saday Sadayappan},
  title =	{{Tensor Computations: Applications and Optimization (Dagstuhl Seminar 20111)}},
  pages =	{58--70},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2020},
  volume =	{10},
  number =	{3},
  editor =	{Paolo Bientinesi and David Ham and Furong Huang and Paul H. J. Kelly and Christian Lengauer and Saday Sadayappan},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2020/13430},
  URN =		{urn:nbn:de:0030-drops-134303},
  doi =		{10.4230/DagRep.10.3.58},
  annote =	{Keywords: compilers, computational science, linear algebra, machine learning, numerical methods}
}

Keywords: compilers, computational science, linear algebra, machine learning, numerical methods
Collection: Dagstuhl Reports, Volume 10, Issue 3
Issue Date: 2020
Date of publication: 21.12.2020

