MLSP 2014 SoundSoftware.ac.uk Prizes for Reproducibility in Signal Processing

The SoundSoftware Project, in collaboration with the IEEE Signal Processing Society, will be sponsoring a Prize for Reproducibility in Signal Processing for work published at the IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2014).

This edition of the Reproducible Research prizes follows our earlier prize series in conjunction with the Audio Engineering Society's 53rd Conference and our SoundSoftware.ac.uk Prizes for Reproducibility in Audio and Music Research.

Eligibility

Authors of papers submitted to the IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2014) may present their submitted papers for this edition of the SoundSoftware.ac.uk Prizes for Reproducibility. Please read below for details on how to submit your work.

All researchers will be considered for this award, whether from UK institutions eligible for EPSRC support or from other UK and international institutions.

Important dates

New — Entry has been re-opened with a new deadline of Thursday, 31 July!

  • Fri 16 May - Paper submission deadline for MLSP; Announcement of Prizes for Reproducibility
  • Sun 8 June - Submission deadline — Now re-opened! New deadline: Thurs 31 July

Categories

Category A - Fully reproducible work
Awarded for a paper whose results can be reproduced using the datasets, software and methodologies described in the paper;
Category B - Reproducibility-enabling work
Awarded for a paper presenting infrastructure, datasets, or standards intended to enable future reproducible work from other authors.

Within both categories we will separately consider regular and student-led submissions. Participants will be expected to choose a category on submission.

Prizes

While the value of this prize is in its formal recognition of the quality and potential impact of your work, winning entrants will also receive a £100 Amazon voucher.

How to enter

To submit your application to this edition of the SoundSoftware.ac.uk Prizes for Reproducibility in Signal Processing, please complete the application form.

Please note that the deadline for submissions is now Thursday, 31 July 2014.

Evaluation

The entries will be evaluated by the SoundSoftware project. The evaluation panel will consider the following factors:

  • Ease of reproducibility of the results: a straightforward baseline replicability test in which we will evaluate whether each submission's published results can be regenerated using the software and data associated with the paper (where applicable).
  • Quality of sustainability planning: each submission's sustainability (of associated software and data) will be evaluated based on the Software Sustainability Institute's sustainability evaluation criteria (see http://www.software.ac.uk/online-sustainability-evaluation).
  • Potential to enable high-quality research in the UK signal processing research community: all submissions will be sent to at least two external reviewers to assess their potential to enable further high-quality research in their field.

Some advice on how to prepare your submission

To give you the best chance of winning the prize, we suggest that you test both the sustainability and the reproducibility of your entry before submitting.

To test for sustainability, you can refer to the Software Sustainability Institute's Online Sustainability evaluation tool: http://www.software.ac.uk/online-sustainability-evaluation. Of particular note are:

  • whether the code and data are available in a suitable public repository;
  • whether version control is used;
  • which tools for community involvement are available;
  • whether the work includes a useful README file, copyright and licence information.

To test for reproducibility, we suggest that you simply test your software on a different machine. You might, for instance, ask a colleague to test it for you, preferably without standing over them and explaining what to do as they do it! Usually this is enough to detect many significant oversights, such as missing files, references to nonexistent paths (e.g. /home/myusername/data/), or incomplete (or completely missing) instructions.
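Hard-coded absolute paths are a particularly common culprit. As a rough illustration (not a prize requirement, and the file names here are purely hypothetical), a short Python sketch of resolving data paths relative to the script itself and failing with a helpful message when a file is missing:

    import os

    # Resolve the data directory relative to this script, instead of
    # hard-coding an absolute path such as /home/myusername/data/ that
    # will not exist on a tester's machine.
    SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
    DATA_DIR = os.path.join(SCRIPT_DIR, "data")

    # "input.wav" is a hypothetical example file name.
    input_path = os.path.join(DATA_DIR, "input.wav")
    if not os.path.exists(input_path):
        raise IOError("Expected data file not found: " + input_path +
                      " - see the README for how to obtain the dataset.")

The same idea applies in Matlab or any other environment: locate data relative to the code, and tell the user clearly what is missing and where to get it.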

If for any reason you can't ask a colleague to test your software for you, we'd advise you to test it on a virtual machine. For example (if you are not using Matlab) try running your code within this Linux image, which was originally provided for use by entrants to an IEEE AASP acoustic scene challenge.

Ideally, a submission would contain individual scripts that regenerate all results presented in your paper. We are aware that many results (whether tables or plots) are generated only a few hours before the submission deadline; writing the scripts that generate the results in your paper at the same time as you write the paper itself is usually the best option! Don't forget that the person most likely to want to reproduce your results is you, at a later date - so why not prepare for that from the start?
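As a loose sketch of what such a script might look like (the script name, data file, column layout and choice of Python with numpy/matplotlib are all assumptions for illustration, not a required format):

    """reproduce_figure1.py - hypothetical script that regenerates one
    figure of the paper from the data shipped with the submission."""
    import os
    import numpy as np
    import matplotlib
    matplotlib.use("Agg")  # render straight to file, so it also runs headless
    import matplotlib.pyplot as plt

    SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))

    def main():
        # "results.csv" and its column layout stand in for whatever
        # intermediate results your experiments actually produce.
        data = np.loadtxt(os.path.join(SCRIPT_DIR, "data", "results.csv"),
                          delimiter=",")
        plt.figure()
        plt.plot(data[:, 0], data[:, 1])
        plt.xlabel("parameter value")
        plt.ylabel("accuracy")
        out_dir = os.path.join(SCRIPT_DIR, "figures")
        if not os.path.isdir(out_dir):
            os.makedirs(out_dir)
        plt.savefig(os.path.join(out_dir, "figure1.pdf"))

    if __name__ == "__main__":
        main()

One small script per figure or table, plus a top-level script (or Makefile) that runs them all, makes it easy for a reviewer to check everything in one go.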

If you wish to know more about the previous edition's evaluation process, please see the following page: http://soundsoftware.ac.uk/rr-prize-how.

Notes

While we encourage fully open access and open source submissions, research that can reasonably be reproduced by other researchers using common non-open tools or data is also acceptable (e.g. work that uses Matlab). However, do not assume that the testers will have access to specialist datasets from your particular field (image processing, bioinformatics, speech processing, etc.).

If no submissions of suitable quality are received for a particular category, that prize may not be awarded.

For further advice or assistance contact the SoundSoftware project at info@soundsoftware.ac.uk.