The Particle Physics Community Planning Exercise ("Snowmass") in the United States aims to identify a strategy for the future of particle physics in the US and elsewhere. The study covers both the physics cases for experiments and facilities and the technologies, including computing, needed to support those ambitions. I will discuss the areas under review in the Snowmass computing working...
HEP experiments are among the top users at HPC centers worldwide, where they have run in production for years. Significant effort has been invested in adapting HEP workflows to these unique platforms. Yet, we have only scratched the surface. The next generation of exascale HPC systems has the potential to revolutionize HEP computing, but only if we can re-engineer our applications to run in...
We present the status of quantum computing, in particular its applications to HEP. We also summarize the use of supercomputers in the ATLAS experiment and our experience in Japan.
The Belle II computing system is expected to manage the processing of massive raw data, the production of copious simulation samples, and many concurrent user analysis jobs. To cope with these demands, we established a distributed computing model with DIRAC as the workload management system and began its operation.
It has been roughly 10 years since we started the Belle II distributed computing activity. In this...