8–11 Jul 2024
The University of Tokyo, Japan
Asia/Tokyo timezone

Particle-flow reconstruction with Transformer

8 Jul 2024, 17:30
2h
Foyer (Ito International Research Center)
Poster (in person) Software, Reconstruction, Computing Posters

Speaker

Paul Wahlen (ICEPP, The University of Tokyo)

Description

Transformers are one of the major recent achievements of machine learning: they enable realistic natural-language communication in systems such as ChatGPT, and have been applied to many other fields, such as image processing. The basic concept of a Transformer is to learn relations between pairs of objects through an attention mechanism. This structure is especially efficient with large input samples and a large number of learnable parameters.
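As a minimal sketch of the attention mechanism mentioned above, the snippet below implements scaled dot-product attention, the core operation of a Transformer. All tensor shapes and names are illustrative assumptions, not details of the presented work.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """q, k, v: (batch, seq_len, d_model). Returns attended values."""
    d_k = q.size(-1)
    # Similarity of every query with every key, scaled to stabilise the softmax.
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)  # attention weights, each row sums to 1
    return weights @ v                   # weighted sum of the values

q = k = v = torch.randn(2, 16, 64)       # e.g. 16 input objects, 64 features each
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 16, 64])
```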
We are studying the application of this architecture to particle flow, which reconstructs particles by clustering hits in highly segmented calorimeters and assigning charged tracks to the clusters.
We adopt a structure inspired by translation tasks, which uses the Transformer as both an encoder and a decoder. The original sentence is provided as input to the encoder, and the translated sentence is produced as output of the decoder. The decoder is initially given a start token and then recursively uses its own output as input until the full translated sentence is obtained.
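The recursive decoding loop described above can be sketched with PyTorch's built-in `nn.Transformer`. The model dimensions, the fixed `max_clusters` limit, and the zero-vector start token are assumptions made for illustration; the actual network of this work may differ.

```python
import torch
import torch.nn as nn

d_model, max_clusters = 64, 8
model = nn.Transformer(d_model=d_model, nhead=4, batch_first=True)

encoder_input = torch.randn(1, 100, d_model)  # e.g. 100 embedded input objects
start_token = torch.zeros(1, 1, d_model)      # a learned embedding in practice

decoded = start_token
with torch.no_grad():
    for _ in range(max_clusters):
        # Re-encodes the input each step for simplicity; the encoder output
        # could be cached instead.
        out = model(encoder_input, decoded)   # (1, len(decoded), d_model)
        next_item = out[:, -1:, :]            # keep only the newest prediction
        decoded = torch.cat([decoded, next_item], dim=1)

sequence = decoded[:, 1:, :]                  # drop the start token
print(sequence.shape)  # torch.Size([1, 8, 64])
```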
In our case, we supply hits and tracks to the encoder as input, and a start token to the decoder to obtain the first cluster. At the training stage, truth-cluster information is provided for comparison with the decoder output.
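The training setup described above amounts to teacher forcing: the decoder receives the truth clusters, shifted right behind a start token, and its output is compared with them. The regression loss and all shapes in this sketch are illustrative assumptions, not details from the abstract.

```python
import torch
import torch.nn as nn

d_model = 64
model = nn.Transformer(d_model=d_model, nhead=4, batch_first=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

hits_tracks = torch.randn(1, 100, d_model)  # embedded hits and tracks (encoder input)
truth = torch.randn(1, 8, d_model)          # embedded truth clusters
start = torch.zeros(1, 1, d_model)          # start token

# Decoder sees the start token plus the truth clusters shifted right by one,
# and is trained to predict each truth cluster from the preceding ones.
decoder_in = torch.cat([start, truth[:, :-1, :]], dim=1)
mask = model.generate_square_subsequent_mask(decoder_in.size(1))

pred = model(hits_tracks, decoder_in, tgt_mask=mask)
loss = nn.functional.mse_loss(pred, truth)  # compare decoder output with truth

optimizer.zero_grad()
loss.backward()
optimizer.step()
```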
The detailed implementation of the network, as well as initial results of particle-flow reconstruction using this method, will be shown in the presentation.

Apply for poster award: Yes

Primary authors

Paul Wahlen (ICEPP, The University of Tokyo)
Taikan Suehara (ICEPP, The University of Tokyo)
