“Proper data management, interoperability and algorithms will be instrumental to maximizing the utilization of the information generated by our observations of space”
Particle Physics has been a data driver since the 1980s, culminating with the LHC3 producing more than 300 petabytes currently stored on tape: a data volume that was only surpassed in the mid-2010s with the advent of social media and streaming. Indeed, the needs of Particle Physics led to the creation of the World Wide Web and to computing architectures such as the LHC grid, which paved the way towards cloud computing. In the coming decades, with the advent of the HL-LHC4, the data volume will again be multiplied by a factor of 100, and other data-intensive projects such as the Vera Rubin Observatory, the Euclid satellite and the Square Kilometre Array will follow closely or even surpass the LHC. At the same time, sophisticated simulations retracing the evolution of the Universe or the complex interactions of elementary particles require the most advanced computing infrastructures.
In the coming years, these science projects will make use of diversified computing and storage technologies, most likely provided through centralized computing infrastructures offering a broad range of technologies.
The main opportunity for these infrastructures lies in providing transparent access to the most appropriate architecture for a particular type of process, and thereby making the most efficient use of the available resources.
New approaches to data storage should make it possible to reduce the number of data copies and to provide data for broad use according to the FAIR5 principles.
However, without the development of procedures and algorithms adapted to interoperable infrastructures, or if barriers to accessing different resources persist, it may become difficult to provide all the computing the scientific collaborations need, with an evident impact on their scientific output.
The private sector will certainly not be interested, on a short timescale, in the results of research into the fundamental laws of the Universe. Yet these disciplines have paved the way for, or interacted strongly with, the private sector on several aspects of computing, be it the World Wide Web, cloud computing or even real-time applications of machine-learning algorithms. It is this cross-fertilization of methodologies that forms the most interesting part of such collaborations, and the algorithms developed have found applications in medicine, imaging, cultural heritage, and beyond. From the James Webb Space Telescope's deep look into the Universe, methods may emerge that are also of interest for space-based surveys of the Earth, with immediate impact on meteorological forecasts or the prediction of loads on electricity grids, and that lead to the development of a new space economy.
The development of algorithms to make optimal use of resources for various science-driven topics is ongoing in all of the scientific collaborations dealing with fundamental scientific questions. Actively seeking cooperation and collaboration will make it possible to benefit directly from these developments and to use them in a broader context.