Maximum Likelihood Estimation on GPUs: Leveraging Dynamic Parallelism
D. Bastieri; S. Amerio; D. Lucchesi
2016
Abstract
Maximum Likelihood Estimation (MLE) is the most robust technique used in gamma-ray astronomy but, particularly when used in conjunction with an unbinned analysis, it demands a huge amount of computing resources. Typically, the estimation of the maximum is left to a single-thread minimizer, such as MINUIT, running on a CPU, while a call-back function may evaluate the likelihood on the GPU. We propose an alternative to the MINUIT package that leverages the Levenberg-Marquardt algorithm and Dynamic Parallelism and runs entirely on GPUs.
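The sketch below is not taken from the paper; it is a minimal illustration, under stated assumptions, of the fully on-GPU approach the abstract describes. A parent kernel runs a damped, Levenberg-Marquardt-style iteration for a one-parameter unbinned fit of a toy exponential model and uses CUDA Dynamic Parallelism to launch the per-event negative log-likelihood reduction as child kernels, so no CPU round-trip is needed at each likelihood evaluation. The file name, the kernel names (nll_kernel, fit_kernel), the toy model, and the fit loop are all assumptions for illustration. It relies on the legacy Dynamic Parallelism model (device-side cudaDeviceSynchronize, available up to CUDA 11) and an sm_60 or newer GPU for double-precision atomicAdd.

// Minimal sketch, not the authors' implementation: one-parameter unbinned fit
// of an exponential model f(x; lambda) = lambda * exp(-lambda * x).
// A parent kernel runs a damped, Levenberg-Marquardt-style iteration and uses
// CUDA Dynamic Parallelism to launch the per-event negative log-likelihood
// (NLL) reduction as child kernels, keeping the whole minimization on the GPU.
// Assumed build (legacy CDP, sm_60+ for double atomicAdd):
//   nvcc -arch=sm_60 -rdc=true lm_dp_sketch.cu -o lm_dp_sketch -lcudadevrt
#include <cstdio>
#include <cmath>
#include <vector>
#include <random>
#include <cuda_runtime.h>

// Child kernel: accumulate the unbinned NLL, sum_i -log f(x_i; lambda),
// via a block reduction followed by one atomic add per block.
__global__ void nll_kernel(const double* x, int n, double lambda, double* nll)
{
    __shared__ double partial[256];
    int tid = threadIdx.x;
    double local = 0.0;
    for (int i = blockIdx.x * blockDim.x + tid; i < n; i += gridDim.x * blockDim.x)
        local += lambda * x[i] - log(lambda);        // -log f(x_i; lambda)
    partial[tid] = local;
    __syncthreads();
    for (int s = blockDim.x / 2; s > 0; s >>= 1) {
        if (tid < s) partial[tid] += partial[tid + s];
        __syncthreads();
    }
    if (tid == 0) atomicAdd(nll, partial[0]);
}

// Parent kernel: one thread owns the fit loop. Each iteration launches three
// child grids to get the NLL and finite-difference first/second derivatives,
// then takes a damped Newton (Levenberg-Marquardt-style) step in lambda.
__global__ void fit_kernel(const double* x, int n, double* lambda_out, double* buf)
{
    double lambda = 1.0, mu = 1.0, best = 1e300;
    const double eps = 1e-4;
    for (int iter = 0; iter < 50; ++iter) {
        buf[0] = buf[1] = buf[2] = 0.0;              // visible to children launched below
        nll_kernel<<<64, 256>>>(x, n, lambda,       &buf[0]);
        nll_kernel<<<64, 256>>>(x, n, lambda + eps, &buf[1]);
        nll_kernel<<<64, 256>>>(x, n, lambda - eps, &buf[2]);
        cudaDeviceSynchronize();                     // device-side wait (legacy CDP only)
        double f = buf[0];
        double g = (buf[1] - buf[2]) / (2.0 * eps);            // dNLL/dlambda
        double h = (buf[1] - 2.0 * f + buf[2]) / (eps * eps);  // d2NLL/dlambda2
        double trial = lambda - g / (fabs(h) + mu);  // damped step
        if (trial <= 0.0) trial = 0.5 * lambda;      // keep the rate positive
        if (f < best) { best = f; mu *= 0.5; }       // simple damping schedule
        else          { mu *= 2.0; }
        lambda = trial;
    }
    *lambda_out = lambda;
}

int main()
{
    // Synthetic events with true lambda = 2.0 (toy data, for illustration only).
    const int n = 1 << 20;
    std::vector<double> h_x(n);
    std::mt19937 rng(42);
    std::exponential_distribution<double> expo(2.0);
    for (auto& v : h_x) v = expo(rng);

    double *d_x, *d_lambda, *d_buf;
    cudaMalloc(&d_x, n * sizeof(double));
    cudaMalloc(&d_lambda, sizeof(double));
    cudaMalloc(&d_buf, 3 * sizeof(double));
    cudaMemcpy(d_x, h_x.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    fit_kernel<<<1, 1>>>(d_x, n, d_lambda, d_buf);   // the whole fit in one launch
    cudaDeviceSynchronize();

    double lambda = 0.0;
    cudaMemcpy(&lambda, d_lambda, sizeof(double), cudaMemcpyDeviceToHost);
    printf("fitted lambda = %.4f (true value 2.0)\n", lambda);

    cudaFree(d_x); cudaFree(d_lambda); cudaFree(d_buf);
    return 0;
}

In the conventional workflow sketched in the abstract, the loop inside fit_kernel would instead live on the CPU inside the minimizer (e.g. MINUIT), and only a kernel like nll_kernel would run on the GPU through the call-back function; keeping the whole iteration on the device removes the host-device launch and synchronization traffic at every likelihood evaluation.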