ATLAS is one of the large multi-purpose experiments at the Large Hadron Collider here at CERN.
We have just performed a brand new, very precise measurement of something called lepton flavour universality.
Leptons are a type of fundamental particle; the most familiar is the electron, which is one of the components of atoms, for instance. There are other leptons, known as muons and tau leptons, which are exactly the same as the electron but heavier.
 
The universality of lepton couplings is the expectation that
 these leptons are all equally likely 
to be produced by a W boson, which is the particle 
responsible for the weak force.
So this is a fundamental assumption of the Standard Model. There's nothing to say that it should or should not be true, and therefore we want to test this fundamental assumption as precisely as possible. So what we measured is the ratio of the rates of the W boson decaying to tau leptons and to muons. This is predicted to be the same in the Standard Model, and we want to test as precisely as possible whether these decays really happen with exactly the same probability.
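In the conventional notation (the symbols here are standard usage, not taken from the transcript itself), the quantity under test is the ratio of the two branching fractions, which the Standard Model predicts to be unity:

```latex
R(\tau/\mu) \;=\; \frac{\mathcal{B}(W \to \tau \nu_{\tau})}{\mathcal{B}(W \to \mu \nu_{\mu})} \;\overset{\text{SM}}{=}\; 1
```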
One of the reasons for carrying out this measurement is that at the previous collider at CERN, the Large Electron-Positron Collider (LEP), which operated in the 1990s, a discrepancy was observed in this very same quantity. They looked at WW events and measured the rates of decays to taus and to muons, and they found a discrepancy with the Standard Model whose likelihood of arising just by chance was at about the one-in-a-hundred level. This made it very important to check whether what they were seeing was a fluctuation or whether it was new physics.
So the LHC, the Large Hadron Collider, currently operating at CERN, is a hadron collider, and therefore the events are a lot messier than at the previous collider, which makes these kinds of precision measurements much harder to do at a hadron collider.
So for this particular measurement, what was extremely important for reaching the high level of precision that we needed was to obtain a very clean and unbiased sample of muons and tau leptons. The way we did this, in the new approach that the ATLAS measurement uses, is to take a huge sample of top-antitop quark pair events (about a hundred million of these were produced at the LHC in the data set that we used) and, within them, look for the W bosons that then decay to muons and tau leptons.
 
The way we go about doing this analysis is that muons produce a very distinctive signature in our detector. They interact at various points, leaving little single dots that you can join together to form a track. Tau leptons also decay to muons very often, but they do so after flying a certain distance. We can therefore look at muons from tau leptons and muons coming directly from W bosons, and measure their displacement from the interaction point, to see whether they were produced after a short flight distance or straight away.
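As a rough illustration of the idea described above (this is not the ATLAS analysis code; the function name and the threshold value are invented for the sketch), one could separate the two classes of muons by their transverse displacement from the interaction point:

```python
# Toy sketch of displacement-based muon classification.
# NOT the actual ATLAS analysis: the threshold here is purely illustrative.

def classify_muon(d0_mm: float, threshold_mm: float = 0.03) -> str:
    """Classify a muon by its transverse impact parameter |d0| (in mm).

    Muons produced directly in W -> mu nu decays point back to the
    interaction point (small |d0|), while muons from tau decays are
    slightly displaced, because the tau flies a short distance first.
    """
    return "tau-like" if abs(d0_mm) > threshold_mm else "prompt-like"

# A muon consistent with the interaction point vs. a displaced one:
print(classify_muon(0.005))  # prompt-like
print(classify_muon(0.12))   # tau-like
```

In a real analysis one would fit the full distribution of displacements rather than apply a single hard cut; the cut above is only meant to convey the geometric idea.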
So our measurement is of this ratio: the probability of a W decaying to a tau divided by the probability of the W decaying to a muon. We find a value very close to 1 within our uncertainties, and our uncertainty is only 1.3%, about half that of the LEP result, which was a combination of all four LEP experiments. Therefore, the Standard Model survives this stringent test of lepton universality.
Measuring this kind of interaction at this level of precision at the Large Hadron Collider was, ten years ago, not even really thought to be possible. Being able to perform measurements at this level of precision at the LHC is a really important proof of principle. It demonstrates that the LHC is not just a search machine looking for very high-mass new particles, but that it also helps us understand the fundamental particles of nature at a level of precision which is itself sensitive to new physics.
