From: gfursin@gmail.com
Newsgroups: comp.compilers
Date: Sun, 2 Feb 2014 16:47:46 -0500 (EST)
Organization: Compilers Central
Keywords: conference, CFP
Posted-Date: 02 Feb 2014 16:47:46 EST
=================================================================
CALL FOR PAPERS
TRUST: 1st ACM SIGPLAN Workshop
on Reproducible Research Methodologies
and New Publication Models in Computer Engineering
June 12, 2014, Edinburgh, UK
(co-located with PLDI 2014)
http://c-mind.org/events/trust2014
=================================================================
It is becoming excessively challenging, or even impossible, to
capture, share and accurately reproduce experimental results in
computer engineering for fair and trustable evaluation and future
improvement. This is often due to the ever-rising complexity of
the design, analysis and optimization of computer systems, the
increasing number of ad-hoc tools, interfaces and techniques, the
lack of a common experimental methodology, and the lack of simple
and unified mechanisms, tools and repositories to preserve and
exchange knowledge apart from numerous publications where
reproducibility is often not even considered. This SIGPLAN
workshop is intended to become an interdisciplinary forum for
academic and industrial researchers, practitioners and developers
in computer engineering to discuss challenges, ideas, experience,
and trustable and reproducible research methodologies, practical
techniques, tools and repositories to:
* capture, preserve, formalize, systematize, exchange and improve
knowledge and experimental results, including negative ones
* describe and catalog whole experimental setups with all related
material including algorithms, benchmarks, codelets, datasets,
tools, models and any other artifact
* validate and verify experimental results by the community
* develop common research interfaces for existing or new tools
* develop common experimental frameworks and repositories
* share rare hardware and computational resources for
experimental validation
* deal with the variability and rising amount of experimental data
using statistical analysis, data mining, predictive modeling and
other techniques
* implement previously published experimental scenarios
(auto-tuning, run-time adaptation) using common infrastructure
* implement open access to publications and data (particularly
discussing intellectual property (IP) and legal issues)
* improve reviewing process
* enable interactive articles
==== Important Dates ====
* Abstract submission deadline: March 7, 2014 (AoE)
* Paper submission deadline: March 14, 2014 (AoE)
* Author notification: April 14, 2014
* Final paper version: May 2, 2014
* Workshop: June 12, 2014
==== Workshop organizers ====
* Grigori Fursin (INRIA, France)
* Bruce Childers, Alex K. Jones and Daniel Mosse
(University of Pittsburgh, USA)
==== Program Committee ====
* Jose Nelson Amaral (University of Alberta, Canada)
* Calin Cascaval (Qualcomm, USA)
* Jack Davidson (University of Virginia, USA)
* Evelyn Duesterwald (IBM, USA)
* Lieven Eeckhout (Ghent University, Belgium)
* Eric Eide (University of Utah, USA)
* Sebastian Fischmeister (University of Waterloo, Canada)
* Michael Gerndt (TU Munich, Germany)
* Christophe Guillon (STMicroelectronics, France)
* Shriram Krishnamurthi (Brown University, USA)
* Hugh Leather (University of Edinburgh, UK)
* Anton Lokhmotov (ARM, UK)
* Mikel Lujan (University of Manchester, UK)
* David Padua (University of Illinois at Urbana-Champaign, USA)
* Christoph Reichenbach
(Johann-Wolfgang Goethe Universität Frankfurt, Germany)
* Arun Rodrigues (Sandia National Laboratories, USA)
* Reiji Suda (University of Tokyo, Japan)
* Sid Touati (INRIA, France)
* Jesper Larsson Traff (Vienna University of Technology, Austria)
* Petr Tuma (Charles University, Czech Republic)
* Jan Vitek (Purdue University, USA)
* Vladimir Voevodin (Moscow State University, Russia)
* Vittorio Zaccaria (Politecnico di Milano, Italy)
* Xiaoyun Zhu (VMware, USA)
==== Paper Submission Guidelines ====
We invite papers in three categories (please use these prefixes
for your submission title):
* T1: Extended abstracts should be at most 3 pages long
(excluding bibliography). We welcome preliminary and exploratory
work, presentation of related tools and repositories
in development, experience reports, and wild & crazy ideas.
* T2: Full papers should be at most 6 pages long (excluding
bibliography). Papers in this category are expected to have
relatively mature content.
* T3: Papers validating and sharing past research on design and
optimization of computer systems published in relevant
conferences. These papers should be at most 6 pages long
(excluding bibliography).
Submissions should be in PDF, formatted in two columns with
single spacing and 10pt fonts, and printable on US letter or A4
sized paper. All papers will be peer-reviewed. Accepted papers
may be published online on the conference website; this will not
prevent later publication of extended versions. We are currently
arranging for the proceedings to be published in the ACM Digital
Library.
Easychair submission website:
https://www.easychair.org/conferences/?conf=trust20140
==== Some related projects and initiatives ====
* Conference Artifact Evaluation
http://cs.brown.edu/~sk/Memos/Conference-Artifact-Evaluation/
* HiPEAC thematic session on making computer engineering a science:
http://www.hipeac.net/thematic-session/making-computer-engineering-science
* ADAPT panel on reproducible research methodologies and new
publication models (January 2014):
http://adapt-workshop.org/program.htm
* Collective Mind technology for collaborative, systematic and
reproducible computer engineering:
http://c-mind.org
* cTuning technology to crowdsource auto-tuning
and combine with machine learning (2006-2011):
http://cTuning.org
* OCCAM project for reproducible computer architecture simulation:
http://www.occamportal.org
* Evaluate project:
http://evaluate.inf.usi.ch
* Artifact evaluation at OOPSLA'13:
http://splashcon.org/2013/cfp/665
* Artifact evaluation at PLDI'14:
http://pldi14-aec.cs.brown.edu
* CARE tool from STMicroelectronics
(Comprehensive Archiver for Reproducible Execution)
http://reproducible.io
==== Sponsors ====
* ACM SIGPLAN, http://sigplan.org