
Machine Learning for Likelihood-Free Inference

Luc Le Pottier

University of Michigan, CERN ATLAS Group

Image credit: J. Brehmer et al.; adapted by APS/Alan Stonebraker

Outline

  • Project Intro
  • Project Work
  • Future Work
  • Project Lessons
  • Cultural Experiences

Project Intro

  • The likelihood function allows for EFT parameter estimation
     – Describes the compatibility of data with model parameters
     – Must be approximated: difficult to compute precisely in high-dimensional parameter/observable spaces
  • Current methods: Matrix Element Method, optimal observables, naïve parameter scans, neural-network classification
     – All trained on event/parameter sample pairs; no extra use of particle-physics structure
  • New method: train deep neural networks to approximate the likelihood function, using the joint likelihood ratio and joint score in the loss functions
     – ALICE method: a cross-entropy, estimator-based loss function (sketched below)
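The core idea can be illustrated with a minimal, self-contained PyTorch sketch of a cross-entropy loss whose targets are derived from the joint likelihood ratio, in the spirit of the ALICE estimator. This is not MadMiner's internal implementation, and the label convention s* = 1/(1 + r) is an assumption made for the example:

```python
import torch

def alice_style_loss(s_hat, joint_ratio):
    """Cross-entropy with 'improved' targets, in the spirit of ALICE.

    Rather than hard 0/1 class labels, the target is the optimal classifier
    output implied by the simulator's joint likelihood ratio
    r(x, z | theta0, theta1):

        s*(x, z) = 1 / (1 + r(x, z))

    s_hat:       network output in (0, 1), shape (n,)
    joint_ratio: joint likelihood ratio from the simulator, shape (n,)
    """
    s_star = 1.0 / (1.0 + joint_ratio)  # optimal decision function (assumed convention)
    return torch.nn.functional.binary_cross_entropy(s_hat, s_star)

# A perfectly calibrated network, s_hat = 1/(1+r), attains the minimal loss.
r = torch.tensor([0.5, 1.0, 2.0])
print(alice_style_loss(1.0 / (1.0 + r), r))
```

Because the target is a soft label computed per event from simulator information, each sample carries more information than a hard class label, which is the sense in which the estimator is sample-efficient.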

Finished Work

  • Using the MadMiner machine-learning toolkit (1, 2, 3), with an implementation of the ALICE algorithm, claimed to be one of the more 'sample-efficient' algorithms (good for testing and debugging) (2)
  • Problem introduction (January)
     – Wrote a Python module for testing ML algorithms on arbitrary-dimensional 'toy' processes
     – Validated the precision of various algorithms against the tractable likelihoods of these toy processes (a validation sketch follows this list)
     – Made clear that ALICES is a good algorithm for prototyping new processes
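As an illustration of that validation idea, a one-dimensional Gaussian toy process has a closed-form likelihood ratio against which any learned estimator can be scored. The toy, names, and metric below are hypothetical illustrations, not the actual module:

```python
import numpy as np

# Hypothetical 1-D Gaussian 'toy process': p(x | theta) = N(x; theta, 1).
# Its likelihood ratio is tractable, so learned estimators can be scored exactly.
def exact_log_ratio(x, theta0, theta1):
    # log[ p(x | theta0) / p(x | theta1) ] for unit-variance Gaussians
    return -0.5 * ((x - theta0) ** 2 - (x - theta1) ** 2)

def precision(estimator_log_ratio, theta0=0.5, theta1=0.0, n=10_000, seed=0):
    """Mean squared error of an estimated log ratio against the exact one."""
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=theta1, scale=1.0, size=n)  # reference-hypothesis samples
    truth = exact_log_ratio(x, theta0, theta1)
    return np.mean((estimator_log_ratio(x, theta0, theta1) - truth) ** 2)

# Sanity check: the exact ratio itself scores zero error.
print(precision(exact_log_ratio))  # -> 0.0
```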

Slight Problems

  • The ttH CP-parity parameterization is limited to the range [0, 1], with simulation benchmarks at the boundary points
  • This results in strange predicted likelihood functions
  • Assuming a Gaussian likelihood function, the extrapolated variances of the log-likelihood are terrible (illustrated below)
  • Determined, after hundreds of different deep-neural-net configurations and sample augmentations, that a new statistical approach is needed
  • Will likely receive attention from a statistics PhD in the near future
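To make the extrapolation problem concrete: under a Gaussian assumption the log-likelihood is quadratic in the parameter, log L(θ) = -(θ - μ)² / (2σ²) + c, so the variance comes entirely from the fitted curvature. A small NumPy illustration with made-up numbers shows how weakly that curvature is constrained when all benchmark points lie inside [0, 1]:

```python
import numpy as np

# Hypothetical log-likelihood estimates at benchmark points inside [0, 1].
theta = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
log_l = np.array([-0.10, -0.04, -0.02, -0.05, -0.12])

# Under the Gaussian assumption log L = a*theta^2 + b*theta + c with a < 0,
# so the MLE and variance follow from the parabola coefficients.
a, b, c = np.polyfit(theta, log_l, deg=2)
mu = -b / (2.0 * a)        # maximum-likelihood estimate
sigma2 = -1.0 / (2.0 * a)  # extrapolated variance; blows up as |a| -> 0

# Here sigma is roughly 1.2, far wider than the physical range [0, 1]:
# the extrapolated variance is dominated by noise in the fitted curvature.
print(f"mu = {mu:.2f}, sigma^2 = {sigma2:.2f}")
```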

Finished Work

  • Authored a Python module and Linux CLI utility
  • Provides a wrapper for
     – Complex environment setup for MadMiner/MadGraph
     – Process backend setup
     – Generation and storage of a wide array of samples, models, and evaluations
  • Allows for near-instant prototyping of the MadMiner framework with arbitrarily complex processes (the wrapped setup steps are sketched below)
     – Helps a lot with finding a suitable set of ML parameters for a given process
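For a sense of what the wrapper scripts away, a minimal MadMiner parameter/benchmark setup and MadGraph run looks roughly like the following, patterned on the public MadMiner tutorials. All paths, blocks, and parameter names here are hypothetical placeholders, and the exact API varies between MadMiner versions:

```python
from madminer.core import MadMiner

# Process/parameter setup that the wrapper automates (hypothetical values).
miner = MadMiner()
miner.add_parameter(
    lha_block="dim6",           # param-card block/ID (assumed example)
    lha_id=2,
    parameter_name="CWL2",
    parameter_range=(-20.0, 20.0),
)
miner.add_benchmark({"CWL2": 0.0}, "sm")     # Standard Model point
miner.add_benchmark({"CWL2": 10.0}, "bsm")   # a shifted benchmark
miner.save("data/setup.h5")

# Event generation via the MadGraph backend (paths are placeholders).
miner.run(
    sample_benchmark="sm",
    mg_directory="/path/to/MG5_aMC",
    proc_card_file="cards/proc_card.dat",
    param_card_template_file="cards/param_card_template.dat",
    run_card_file="cards/run_card.dat",
)
```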

Progress thus far

[Two result plots: good sample augmentation vs. poor sample augmentation]

Future Work

  • Update the module to work with the recently released MadMiner version
  • Finish the majority of the documentation
  • Add 4-5 more 'sample processes' to demonstrate the applicability of the program to arbitrary processes with minimal effort
  • Perform and write up a short study on how to quickly find optimal ML parameters for a new problem (there is a lot of research on this, but not much of it is written down)

Luckily, I have until May 10th to do these things!

Culture!

[Photos: Vallée Blanche in Chamonix; jazz in Cully; hiking in the Jura]

Questions?