Some innovations in physics come from entirely new technologies, others from fresh theoretical insights. Still others take shape by combining existing tools in new ways, working out how to make them outperform established solutions. The branch of particle physics that studies weakly interacting particles, such as neutrinos and some dark-matter candidates, stands to benefit from innovative detection approaches: technological challenges in this field quickly become practical and economic ones, since increases in detector volume and spatial resolution improve the sensitivity to the processes that produce the particles of interest. Similarly demanding targets on instrument capability apply to the calorimeters used in collider experiments.
Three-dimensional (3D) tracking of elementary particles in large-volume, dense materials is required in most particle-physics experiments. In a scintillator, such tracking is commonly achieved by finely segmenting the material into many small active units, each of which emits visible light when a charged particle passes through it. Typically, the photons produced in each active unit are collected by optical fibers and carried out of the scintillator to the photomultiplier tubes or silicon photomultipliers used for photon counting.
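To make the segmentation idea concrete, here is a minimal sketch of how a straight particle track maps onto a set of active cubes. The cube size, track parameters, and naive stepping are illustrative assumptions for this sketch, not the geometry or reconstruction method of any real detector.

```python
import numpy as np

def cubes_crossed(start, direction, length, cube_size=1.0, step=0.01):
    """Indices of the cubes touched by a straight track (naive stepping).

    Illustrative only: cube_size and step are arbitrary toy parameters.
    """
    direction = np.asarray(direction, float)
    direction /= np.linalg.norm(direction)  # unit vector along the track
    hits, seen = [], set()
    for s in np.arange(0.0, length, step):
        # Which cube contains the point a distance s along the track?
        idx = tuple(np.floor((start + s * direction) / cube_size).astype(int))
        if idx not in seen:
            seen.add(idx)
            hits.append(idx)
    return hits

# A toy track entering near a corner of the segmented volume.
track = cubes_crossed(np.array([0.2, 0.2, 0.2]), (1, 1, 0.3), length=5.0)
print(track)
```

Each entry in the returned list corresponds to one active unit lighting up, which is exactly the information the fibers carry out of the scintillator.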
In the T2K neutrino-oscillation experiment in Japan, for example, one detector boasts a sensitive mass of about two tons, assembled from approximately two million cubes and 60,000 fibers. At CERN and the Paul Scherrer Institute, the LHCb and Mu3e experiments achieve sub-millimeter spatial resolution thanks to millions of thin scintillating optical fibers. Given such figures, it is clear that this kind of scintillator segmentation could become a bottleneck when larger volumes are needed.
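The bottleneck can be illustrated with a back-of-the-envelope count. Assume an idealized cubic detector of side n cubes, read out with one fiber per row of cubes along each of the three axes; this is a simplifying assumption for the sketch, not the actual layout of the T2K detector. The number of cubes then grows as n cubed, while the number of fibers grows only as 3 times n squared:

```python
# Idealized cubic geometry: n x n x n cubes, fibers along all three axes.
def channel_counts(n: int) -> tuple[int, int]:
    """Return (number of cubes, number of readout fibers) for side length n."""
    cubes = n ** 3        # active units grow with the cube of the side
    fibers = 3 * n ** 2   # one fiber per row along each of the 3 axes
    return cubes, fibers

for n in (50, 100, 125, 200):
    cubes, fibers = channel_counts(n)
    print(f"n={n:4d}: {cubes:>10,} cubes, {fibers:>8,} fibers")
```

At n = 125 this toy model already gives roughly two million cubes and some 47,000 fibers, the same order as the figures quoted above. Doubling the linear size multiplies the number of cubes by eight but the number of fibers only by four, so the count of individual elements to fabricate and assemble grows fastest as the volume increases.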