For the past few months I’ve been following the Fast.AI Deep Learning for Coders course, an online series of lectures accompanied by Jupyter notebooks and a Python library built around PyTorch. The course itself is split into two halves: the first uses a top-down approach to teach state-of-the-art techniques and best practices for deep learning in order to achieve top results on well-established problems and datasets, with later lessons delving deeper into the code and mathematics; the second half deals more with the cutting edge of deep learning, and focuses on less-well-founded problems, such as generative modelling, and recent experimental techniques which are still being developed. Continue reading “Journey through Fast.AI: I – Introduction and image data”
This week the VII AMVA4NewPhysics workshop is under way at the premises of LIP in Lisbon. During these events the network gets together to discuss the status of the various projects, plan future events and activities, address issues that have arisen, and vote on the budget and other topics. But this is a special event in the lifetime of the network, as we are getting toward the mature stage – we are in the Continue reading “Advanced Results in Lisbon”
Summer 2018 has been a busy time for the AMVA4NewPhysics network; we’ve had workshops, outreach events, training sessions, meetings, and much more. I wanted to go through and pick out a few things I was involved in. Continue reading “Science in the sun: AMVA4NP’s summer events”
One of the advantages of belonging to a European ITN like AMVA4NewPhysics is the opportunity to participate in a range of outreach activities. Such an opportunity was given to me at the beginning of September, when I volunteered at a literature festival called Festivaletteratura, taking place in Mantova, Italy. Continue reading “Volunteering at a literature festival”
Casual reader, be warned – the contents of this article, specifically the second part of it, are highly volatile, speculative stuff. But hey, that is the stuff that dreams are made of. And I have one or two good reasons to dream on.
Machine Learning is ubiquitous today. Self-driving cars; self-shaving robots (just kidding, but I’m sure they could be constructed if the need arose); programs that teach themselves chess and become world-champion-class players overnight; Siri; Google’s search engine; Google Translate – okay, I am going too far. But you know it: machine learning has become a player in almost Continue reading “Can Neural Networks Design The Detector Of A Future Particle Collider?”
Hi everyone, my name is Giovanni Banelli and I’m the last student joining the AMVA4NewPhysics network. I will be based in Munich (Technical University) and formally I’m the only theorist among the ESRs; hence I will be working on the theory/phenomenology side of the use of advanced statistical tools in searches for New Physics. Continue reading “Science taken wide”
Do you know the work of Tim Blais, the guy behind “A Capella Science”? I sincerely hope you do, but otherwise this post is for you. Tim has a YouTube page where he publishes his amazing work.
Tim sings modified lyrics of famous songs, and mixes them with multiple tracks of his own voice imitating each of the instruments of the underlying orchestra, or other choral voices. Up to this point you could well say there’s nothing new under Continue reading “A Capella Science at CERN”
What is spectroscopy?
Well folks, it’s been quite a while since my last post; apologies for that, it’s been a busy few months.
Towards the end of last year I wrote a post on optimising the hyper-parameters (depth, width, learning rate, et cetera) of neural networks. In that post I described how I was trying to use Bayesian methods to ‘quickly’ find useful sets of parameters. Continue reading “Hyper-parameters revisited”