From August 3rd to 7th, the Dark-matter And Neutrino Computation Explored (DANCE) Machine Learning Workshop 2020 took place, with 233 registered participants and around 60 attending at any given time.
The DANCE workshop series aims to unify the large, dispersed community of neutrino and dark-matter experiments; DANCE-ML is a new workshop within the series dedicated to unifying efforts related to machine learning. Experimentally, most medium-sized modern dark-matter and neutrino experiments emerged from the success of underground experimentation related to the discovery that neutrinos have mass (Nobel Prize in Physics 2015). These successes gave rise to a large new field at the intersection of particle physics, astrophysics, and nuclear physics while advancing all three. Taken together, this community is larger than, for instance, the DUNE community, but its individual collaborations are smaller, which makes these experiments heavily R&D-driven. The DIDACTS project is an HDR-funded project that is forming a community of practice for computational efforts across our field, especially those related to machine learning.
The DANCE-ML workshop focuses on understanding and communicating developments in machine learning in this field, which is quickly changing how measurements are performed. The workshop had four parts:
- a training portion covering the basics of machine learning in dark-matter experiments, plus graph signal processing taught by CLARIPHY collaborator Waheed Bajwa
- talks on new developments in the field
- breakout rooms on specific topics, like uncertainty quantification in machine learning
- a career panel featuring those now working in industry
The workshop was a massive success (timetable), thanks to its ability to draw an audience of mostly students while featuring updates from a wide range of experiments, from neutrinoless double-beta decay to various experimental approaches to dark-matter detection. A theme of the workshop was moving beyond convolutional neural networks, which are less applicable to our data than to, say, astronomical images, due to the irregular nature of our detector readouts. A range of newer techniques was therefore discussed, from GANs to graph neural networks.
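To illustrate why graph neural networks suit irregular detector data, here is a minimal sketch of a single graph-convolution layer in the common normalized-adjacency style: each node (e.g. a detector hit) aggregates features from its neighbors before a learned linear map is applied. This is a generic textbook-style example, not code from any workshop talk; the graph, features, and weights are all made up for illustration.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer: normalize the adjacency matrix,
    aggregate each node's neighborhood features, then apply a learned
    linear map followed by a ReLU nonlinearity."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    deg = A_hat.sum(axis=1)                     # node degrees
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)      # aggregate, transform, ReLU

# Toy example: 4 "hits" as graph nodes, 3 input features, 2 output features
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)       # chain-shaped hit graph
X = rng.normal(size=(4, 3))                     # per-hit input features
W = rng.normal(size=(3, 2))                     # learned weights (random here)
H = gcn_layer(A, X, W)
print(H.shape)  # (4, 2)
```

The key point is that the layer operates on an arbitrary adjacency matrix rather than a fixed pixel grid, so the same weights apply to events with any number or arrangement of hits.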