Inspired by previous work in the field of data sonification, we built a data-driven composition platform that enables users to map collision-event information from experiments in high-energy physics to audio properties, and thus make music from real-time data. The tool is used for outreach, allowing physicists and composers to interact with collision data through novel interfaces. Three real-time compositions were streamed from May to July 2016, and two additional compositions will be streamed in fall 2018. This project may inspire the development of strategic mappings that facilitate the auditory perception of hidden regularities in high-dimensional datasets, and could one day evolve into a useful analysis tool for physicists as well, for example for monitoring slow-control data in experiment control rooms. The project is accessible at Quantizer.media.mit.edu.
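To make the idea of mapping collision data to audio properties concrete, the following sketch illustrates one plausible scheme; it is not the Quantizer implementation, and the parameter names, ranges, and scales are illustrative assumptions only. Here, a particle's energy is mapped logarithmically onto a MIDI pitch range and its transverse momentum linearly onto note velocity:

```python
import math

def energy_to_pitch(energy_gev, e_min=0.5, e_max=500.0, low_note=36, high_note=96):
    """Map energy (GeV) logarithmically onto a MIDI pitch range.

    Energies in particle physics span orders of magnitude, so a log
    scale spreads typical values more evenly across the pitch range.
    """
    e = min(max(energy_gev, e_min), e_max)  # clamp to the mapped range
    frac = math.log(e / e_min) / math.log(e_max / e_min)
    return round(low_note + frac * (high_note - low_note))

def pt_to_velocity(pt_gev, pt_max=100.0):
    """Map transverse momentum linearly onto MIDI velocity 1..127."""
    return max(1, min(127, round(127 * pt_gev / pt_max)))

def sonify_event(particles):
    """Turn a list of (energy, pt) tuples into (pitch, velocity) note events."""
    return [(energy_to_pitch(e), pt_to_velocity(pt)) for e, pt in particles]

# A toy "event": three particles with (energy, transverse momentum) in GeV.
event = [(1.2, 3.4), (45.0, 20.0), (300.0, 80.0)]
notes = sonify_event(event)
```

The resulting (pitch, velocity) pairs could then be sent to any synthesizer; the key design choice is which physical observables drive which perceptual dimensions.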