by Alessia Saggio

Hi everybody,

I'm still at CERN for the last lecture of the Machine Learning course that started on Monday and is almost over. The classes are taught by Dr. Ilya Narsky, a world-renowned expert on machine learning techniques for High Energy Physics.

First of all, let me say that I'm very happy to have had the chance to meet, for the first time, the two other girls of the network: Cecilia, who's just starting in Oxford, and Anna, based here at CERN. I'm very happy I'll be collaborating with them during my PhD: working together is always a great opportunity to grow as scientists.

My intention for this post was to give you an overview of some of the statistical learning techniques I picked up in this course, but technical issues with my computer this morning prevented me from doing so, and now I have to attend the last lecture. So please take this post as a sort of rapid "introduction": I'll give much more detail in my next post.

The course basically covered the most important techniques for statistical learning: from Principal Component Analysis (PCA) and kernel PCA to classification methods such as Boosted Decision Trees (BDTs) and Support Vector Machines (SVMs).
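To give a flavour of the simplest of these, PCA, here is a minimal sketch in Python with NumPy (the course itself used MATLAB, and this toy dataset is purely illustrative, not course material): center the data, diagonalize the covariance matrix, and project onto the leading eigenvectors.

```python
import numpy as np

# Toy data: 200 samples, 3 features, the first two strongly correlated
# (illustrative only, not from the course)
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)),
               2 * base + 0.1 * rng.normal(size=(200, 1)),
               rng.normal(size=(200, 1))])

# PCA step 1: center each feature
Xc = X - X.mean(axis=0)

# PCA step 2: diagonalize the covariance matrix
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]           # sort descending by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# PCA step 3: project onto the two leading principal components
X_reduced = Xc @ eigvecs[:, :2]
explained = eigvals / eigvals.sum()
print(X_reduced.shape)
print(explained)
```

Because the first two features are nearly collinear, the first principal component captures most of the variance, which is exactly the dimensionality-reduction idea PCA is built on.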

As participants, we were also given temporary MATLAB licenses, since a MATLAB tutorial was held during the last hour of each lecture.

Without any doubt, this course was a good chance for people who are totally new to this field to get acquainted with these promising techniques for HEP analysis, and for those already working with them to deepen their knowledge. In any case, you'll always learn something by attending schools like this. So my advice is: if you have the chance, don't miss it!