The use of convolutional neural networks for processing stereoscopic IACT images in the TAIGA experiment

5 Jul 2021, 16:30
15m
407 or Online - https://jinr.webex.com/jinr/j.php?MTID=m573f9b30a298aa1fc397fb1a64a0fb4b

Sectional reports: 9. Big data Analytics and Machine learning

Speaker

Stanislav Polyakov (SINP MSU)

Description

Machine learning methods, including convolutional neural networks
(CNNs), have been successfully applied to the analysis of extensive air
shower images from imaging atmospheric Cherenkov telescopes (IACTs).
In the case of the TAIGA experiment, we previously demonstrated that
both the quality of gamma-ray event selection and the accuracy of
gamma-ray energy estimates obtained with CNNs compare well with the
conventional Hillas approach. Those CNNs used images from a single
telescope as input. In the present work we demonstrate that adding the
data from a second telescope improves both the accuracy of the energy
estimates and the quality of event selection. The same approach can be
applied to an arbitrary number of IACTs. All results were obtained with
simulated images generated by the TAIGA Monte Carlo software.
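
The abstract does not specify the network architecture, so the following is only a minimal sketch of the general idea under assumptions of ours: PyTorch, camera images remapped to single-channel square grids (the 31x31 size is purely illustrative), one convolutional branch per telescope, and concatenated branch features feeding a shared head that outputs, for example, an energy estimate. The names StereoCNN and n_outputs are hypothetical and not taken from the TAIGA analysis code.

    import torch
    import torch.nn as nn

    class StereoCNN(nn.Module):
        """Two-branch CNN: one convolutional branch per telescope image;
        branch features are concatenated and passed to a common head."""

        def __init__(self, n_outputs: int = 1):
            super().__init__()
            self.branch1 = self._make_branch()
            self.branch2 = self._make_branch()
            # 32 channels * 4 * 4 spatial cells per branch, two branches
            self.head = nn.Sequential(
                nn.Linear(2 * 32 * 4 * 4, 128),
                nn.ReLU(),
                nn.Linear(128, n_outputs),  # e.g. 1 output for energy regression
            )

        @staticmethod
        def _make_branch() -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d(4),  # fixed 4x4 feature map regardless of input size
                nn.Flatten(),
            )

        def forward(self, img1: torch.Tensor, img2: torch.Tensor) -> torch.Tensor:
            features = torch.cat([self.branch1(img1), self.branch2(img2)], dim=1)
            return self.head(features)

    # Hypothetical usage: a batch of single-channel camera images per telescope.
    model = StereoCNN(n_outputs=1)
    x1 = torch.randn(8, 1, 31, 31)  # telescope 1 images (size is illustrative)
    x2 = torch.randn(8, 1, 31, 31)  # telescope 2 images
    energy = model(x1, x2)          # shape: (8, 1)

Extending such a sketch to an arbitrary number of IACTs, as the abstract suggests, could be done for instance by sharing the branch weights across telescopes and concatenating or pooling the per-telescope features before the head.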

Keywords
deep learning; convolutional neural networks; gamma astronomy;
extensive air shower; TAIGA; stereoscopic mode
