Machine Learning and Multivariate Techniques in HEP Data Analyses
Prof. dr hab. Elżbieta Richter-Wąs
Extracted from slides by:
G. Cowan’s lectures at Royal Holloway, Univ. of London; H. Voss at SOS 2016; K. Reygers’ lectures at Heidelberg Univ.
Boosted Decision Trees
Artificial Neural Networks
Decision Trees
Boosted Decision Trees
Decision Trees
Finding Optimal Cuts
Separation Measures
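The separation measure most commonly used at this point in BDT lectures is the Gini index p·(1−p), where p is the signal purity of a node; the best cut is the one that maximises the decrease in Gini from parent to children. A minimal sketch (the function names `gini_separation` and `split_gain` are illustrative, not from the slides):

```python
def gini_separation(n_sig, n_bkg):
    """Gini index p*(1-p), where p is the signal purity of a node."""
    p = n_sig / (n_sig + n_bkg)
    return p * (1.0 - p)

def split_gain(parent, left, right):
    """Decrease in Gini when a parent node (n_sig, n_bkg) is split into
    two child nodes; finding the optimal cut means maximising this gain."""
    n_parent = sum(parent)
    gain = gini_separation(*parent)
    for child in (left, right):
        gain -= (sum(child) / n_parent) * gini_separation(*child)
    return gain

# A cut that sends most signal left and most background right is rewarded:
# split_gain((50, 50), (40, 10), (10, 40)) ≈ 0.09
```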
Decision Tree Pruning
Boosted Decision Trees: Idea
AdaBoost (short for Adaptive Boosting)
Assigning the Classifier Score
Updating Events Weights
Boosting
Adaptive Boosting (AdaBoost)
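Since the preceding entries cover AdaBoost's classifier score and event-weight update, one boosting round can be sketched as follows. This is a hedged illustration: the name `adaboost_round` is my own, and the convention α = ½·ln((1−err)/err) is one of several in the literature (some texts drop the ½ or multiply by a learning rate).

```python
import numpy as np

def adaboost_round(weights, y_true, y_pred):
    """One AdaBoost round for labels in {-1, +1}: compute the weighted
    error of the current weak learner, derive its classifier weight
    alpha, then boost the weights of misclassified events and
    renormalise so the weights sum to one."""
    miss = y_true != y_pred
    err = weights[miss].sum() / weights.sum()
    alpha = 0.5 * np.log((1.0 - err) / err)
    new_w = weights * np.exp(np.where(miss, alpha, -alpha))
    return new_w / new_w.sum(), alpha
```

With four equally weighted events and one misclassified (err = 0.25), the misclassified event's weight grows from 0.25 to 0.5, so the next tree concentrates on it.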
Boosted Decision Trees
General Remarks on Multi-Variate Analyses
Machine Learning - Basic terminology
Where are the Neural Networks?
Neural Networks
Perceptron
The Biological Inspiration: the Neuron
Feedforward Neural Network with One Hidden Layer
Network Training
Backpropagation
Neural Network Output and Decision Boundaries
Example of Overtraining
Monitoring Overtraining
Deep Neural Networks
How do NNs work?
How do NNs learn?
Typical Applications
Input Preprocessing
Training
Training: (Stochastic) Gradient Descent
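The (stochastic) gradient-descent step named here can be sketched for a simple least-squares loss; "stochastic" means each step uses a random mini-batch rather than the full training set. The name `sgd_step` and the learning rate `lr` are illustrative assumptions:

```python
import numpy as np

def sgd_step(w, x_batch, y_batch, lr=0.1):
    """One gradient-descent step on a (mini-)batch for the loss
    L(w) = mean((x @ w - y)^2); the gradient is 2/n * x^T (x @ w - y)."""
    n = len(y_batch)
    grad = 2.0 / n * x_batch.T @ (x_batch @ w - y_batch)
    return w - lr * grad

# Fitting y = 2x: repeated steps drive the single weight towards 2.
w = np.zeros(1)
x, y = np.array([[1.0], [2.0]]), np.array([2.0, 4.0])
for _ in range(50):
    w = sgd_step(w, x, y)
```

The optimisers on the next slides (momentum, Adam, ...) modify how `grad` is turned into an update, not how it is computed.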
Training: more optimisers
Underfitting and overtraining
Overtraining solutions
Convolutional NN
Convolution layer
Average and Max pooling layers
Convolutional NN architecture
Recursive NN
Recursive NN: possible HEP applications
Adversarial NN
Generative Adversarial NN
Lorentz boost network: motivation
Lorentz boost network: network architecture
Lorentz boost network: feature extraction
LBN for ttH(bb) vs tt+bb: performance
Conclusions