A Marie Sklodowska-Curie ITN funded by the Horizon2020 program of the European Commission


Deep learning

Journey through Fast.AI: II – Columnar data

by Giles Strong

Welcome back to the second part of my journey through the Fast.AI deep-learning course; the series begins here. Last time I gave an example of analysing images; now I’ll move on to working with columnar data.

Columnar data is a form of structured data, meaning that the features of the data are already extracted (in this case into columns), unlike in images or audio where features must be learnt or carefully constructed by hand. Continue reading “Journey through Fast.AI: II – Columnar data”
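To make the distinction concrete, here is a minimal sketch (toy numbers, hypothetical feature names) of columnar data as it is usually handled in pandas, the library the Fast.AI columnar lessons build on: each feature is already an explicit column, rather than something a network must learn from raw inputs.

```python
import pandas as pd

# Toy table: each feature ("age", "income", "owns_home") is already
# extracted into its own column, unlike the raw pixels of an image.
df = pd.DataFrame({
    "age":       [25, 32, 47],
    "income":    [38_000, 52_000, 61_000],
    "owns_home": [0, 1, 1],
})

print(df.shape)  # (3, 3): three rows, three ready-made features
```

A model trained on this table consumes the columns directly; no feature extraction step is needed.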

Hyper-parameters revisited

by Giles Strong


Well folks, it’s been quite a while since my last post; apologies for that, it’s been a busy few months recently.

Towards the end of last year I wrote a post on optimising the hyper-parameters (depth, width, learning rate, et cetera) of neural networks. In that post I described how I was trying to use Bayesian methods to ‘quickly’ find useful sets of parameters. Continue reading “Hyper-parameters revisited”
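To illustrate the kind of search space involved, here is a minimal sketch of plain random search over the hyper-parameters named above, against a toy stand-in for validation loss. This is deliberately not the post’s Bayesian method: in the Bayesian version a surrogate model would propose each new trial instead of sampling uniformly. All names and ranges here are illustrative assumptions.

```python
import random

random.seed(0)

def toy_validation_loss(depth, width, lr):
    # Stand-in for training a network and reading off its validation loss.
    return (depth - 3) ** 2 + (width - 128) ** 2 / 1000 + (lr - 1e-3) ** 2 * 1e6

best = None
for _ in range(100):
    trial = {
        "depth": random.randint(1, 6),            # number of hidden layers
        "width": random.choice([32, 64, 128, 256]),  # neurons per layer
        "lr":    10 ** random.uniform(-4, -1),    # log-uniform learning rate
    }
    loss = toy_validation_loss(**trial)
    if best is None or loss < best[0]:
        best = (loss, trial)

print(best)
```

Even this naive loop shows why smarter search matters: each trial is expensive when the “loss” is a full network training run, which is what motivates modelling the objective rather than sampling blindly.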

Train-time/test-time data augmentation

by Giles Strong

The week before last I was presenting an update of some of my analysis work to the rest of my group. The work involved developing a neural network to classify particle collisions at the LHC. Continue reading “Train-time/test-time data augmentation”

Higgs Hacking

by Giles Strong

A few days before I returned from CERN at the beginning of the month, I attended a talk on the upcoming TrackML challenge. This is a competition, beginning this month, in which members of the public will be invited to try to solve the quite tricky problem of accurately reconstructing particle trajectories in collisions at the LHC. The various detectors simply record the hits where particles pass by; to make use of these data, the hits in successive detector layers must be combined into a single flight path, called a track. Continue reading “Higgs Hacking”
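The hits-to-tracks problem described above can be sketched in a few lines: given hits in successive layers, greedily link each hit to the nearest hit in the next layer. Real TrackML solutions are far more sophisticated (thousands of overlapping tracks, 3D geometry, magnetic-field curvature); the layer layout and positions below are toy assumptions purely to illustrate the idea.

```python
# Toy event: two particles crossing four detector layers, recorded only
# as transverse hit positions per layer (the particle labels are lost).
hits_by_layer = [
    [0.0, 5.0],
    [0.9, 4.1],
    [2.1, 3.0],
    [2.9, 2.2],
]

def build_track(seed):
    """Greedily extend a track from a seed hit in the first layer,
    linking to the closest hit in each successive layer."""
    track = [seed]
    for layer in hits_by_layer[1:]:
        track.append(min(layer, key=lambda h: abs(h - track[-1])))
    return track

for seed in hits_by_layer[0]:
    print(build_track(seed))
```

Note that the first candidate track picks up 2.2 rather than 2.9 in the last layer, illustrating how easily naive nearest-neighbour linking confuses hits from nearby particles.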
