Listening to the ATLAS detector


Why can’t everyone enjoy the Large Hadron Collider as much as I do?

What do particles sound like? Can we make music out of LHC collisions? Will it teach us anything? I regularly talk to non-physicists about the LHC. The general consensus among the people I speak to seems to be that it is really exciting and interesting, but that the details are incomprehensible. One of my favourite feelings in the world is getting to the end of some really difficult calculations and realising that I have gained some meaningful knowledge about the universe. But not everyone is quite so keen on the idea of spending 7 or 8 years doing maths in order to get that feeling! How to share the love without sharing the pain?

ATLAS is a music box by Toya Walker

Sonification means taking data and turning it into sounds while retaining the information in the data. A simple example of sonification is the car parking sensor that informs you of the space behind you via a beeping sound. The distance between you and the car behind you is mapped to the period of the sound, so that small distances produce a series of beeps that are very close together in time.
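
As a rough sketch of that mapping (with made-up numbers rather than any real sensor's firmware), in Python it might look something like this:

    def beep_interval(distance_m, min_dist=0.2, max_dist=2.0, max_interval=1.0):
        """Map the distance to the obstacle (metres) to the pause between beeps (seconds)."""
        # Clamp the reading into the sensor's working range, then scale linearly:
        # 2.0 m or more -> one beep per second, 0.2 m or less -> a continuous tone.
        clamped = max(min_dist, min(distance_m, max_dist))
        return max_interval * (clamped - min_dist) / (max_dist - min_dist)

    for d in (2.0, 1.0, 0.5, 0.2):
        print(f"{d:.1f} m -> beep every {beep_interval(d):.2f} s")

The information (how far away the obstacle is) survives the trip from numbers to sound, which is the whole point of sonification.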

There are many more complex examples of sonification being used for some form of analysis: helping blind people to 'see', predicting earthquakes, and identifying micrometeoroids impacting the Voyager II spacecraft.

(If you’re interested in finding out more about sonification, I’d recommend reading this sonification report and then trying some of the links from this page.)

Can we sonify the ATLAS detector data in such a way as to make them appreciable to non-physicists? It seems the ideal candidate for sonification: it ticks all the boxes. Collision data are associated with spatial position and direction, change over time, and are multi-dimensional. Because there is so much going on in the data, physicists often use artificial neural networks (computers programmed to behave a bit like very simple brains). In simple terms, if we were classifying birds we would do so based on their colour, wingspan, beak shape, diet, song, and so on. We can do this fairly easily using our eyes and ears. But what if we were to try to classify something more abstract? We turn to complex 'black-box' computer programmes because we have not found another way to deal with large amounts of multi-dimensional information.

Sound seems the perfect tool with which to represent the complexity of the data; our ears are superb at locating sounds relative to one another, we can hear a vast range of frequencies, and we can distinguish timbres (different instruments) before they have even played a full cycle. We also have an incredible ability to notice slight changes in pitch or tempo over time and to recognise patterns in sound after hearing them just once. Perhaps using our ears could allow us to make full use of the neural networks between them.

500 points to the finder of the Higgs boson

LHCsound, the project to sonify ATLAS detector data, is taking shape.

The LHCsound team continues to grow: Archer Endrich and Richard Dobson from the Composer's Desktop Project are adding new sonifications all the time, and Toya Walker has contributed some great artwork.

We have several sounds up on the website now, including a Higgs jet composition (the energy deposits in a fat jet are sonified according to their energy, their distance from the interaction point, and their angular distance from the jet axis), an event monitor (the number of charged particles in events picked out by the minimum-bias trigger determines the pitch, and the timing is the time-stretched real time difference between triggers), and various whole-event sonifications.
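
To give a flavour of the kind of mapping involved, here is a hypothetical sketch of a Higgs-jet-style sonification: each energy deposit becomes a note whose pitch, onset time and stereo pan follow the deposit's energy, distance from the interaction point and angular distance from the jet axis. The example data, scalings and musical ranges below are invented for illustration; this is not the actual LHCsound code.

    import math

    # (energy in GeV, distance from the interaction point in mm, angular distance dR from the jet axis)
    deposits = [(45.0, 1200.0, 0.05), (12.0, 1350.0, 0.20), (3.5, 1500.0, 0.38)]

    def deposit_to_note(energy, radius, delta_r):
        pitch_hz = 220.0 * 2 ** math.log10(1.0 + energy)  # more energy -> higher pitch
        onset_s  = radius / 1500.0                         # farther out -> later onset
        pan      = min(delta_r / 0.4, 1.0)                 # further off-axis -> wider pan
        return pitch_hz, onset_s, pan

    for dep in deposits:
        pitch, onset, pan = deposit_to_note(*dep)
        print(f"note at {pitch:6.1f} Hz, t = {onset:.2f} s, pan = {pan:.2f}")

Any number of mappings like this are possible; the art is in choosing one that keeps the physics audible.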

There is a lot of work to be done. Our hope is that other physicists, composers, and real-world people will get in touch with their own ideas.

Lily has finally worked up the nerve to run on the goliath of processing power known as the grid, meaning that we will shortly be able to sonify real 7 TeV collision data. There is a growing list of physics processes we would like to sonify, from event shape variables (Lily's favourite) to Feynman diagrams themselves (Richard's bold idea).

The website is still a bit rusty and amateurish, as one might expect from a physicist, but be assured we have grand designs for the future!

9 comments

  1. Pingback: LHCsound – Die seltsamen Klänge des LHCs ("The strange sounds of the LHC") at physikBlog
  2. Chris

    If you guys are curious, I read the symmetry article on this and it reminded me of the SNO experiment. There we had a speaker plugged into the trigger cable, so it made a crackling sound that tracked the trigger rate. If there was no crackle, then either the event rate was zero or it was at too high a frequency to be audible (i.e. noisy channels, crates). It was very useful when running, since you aren't always looking at the event display but can always hear the noise. You could even (if you did enough shifts) guess the event rate from the noise. Cool stuff.

  3. Pingback: Large Hadron Collider: scientists create sound of ‘God particle’ | THE TRUTH BEHIND THE SCENES
  4. sarah fahrendorf

    For a real treat on putting sounds into visual media, check out this exhibit the Hirshhorn did a few years ago. I’d so love to visit that exhibit again.

    I think we’ll find that whatever these sounds or vibrations are, there will exist already a pattern that matches it. That’s the beauty of the universe. Such as listening to recordings of the northern lights—some aspects of these recordings sound very much like the spring peepers (froggies in mating season).

    Here’s one additional site (for the northern lights reference):

    http://www-pw.physics.uiowa.edu/mcgreevy/#latest

    • thecreationist

      Thanks for these links Sarah, this is definitely an area of the project we want to develop, perhaps with some data-driven animation. We’ve heard about the idea of certain sounds being related to particular colours, but would love to find out more.

      • sarah fahrendorf

        As to which sounds relate to which colors, I would look to the talent known as “synaesthesia”— which I do not have but wish I did! I would convene a focus group of folks with this talent—I do not think they agree on exact colors’ correspondence with tones. But maybe this is an area that needs more study. Wish I could help with that!

  5. sarah fahrendorf

    Will you be slowing the sound of this down, speeding it up, etc? I think that might help with pattern recognition, such that it might sound more familiar or even “just like” some other sound. For example, if you went to a building under construction, where they are jack-hammering (several of them all over the building)—the jack hammers’ sounding on/off collectively will mimic crickets starting/stopping their songs…

  6. Pingback: Hadron Collider Reaching for “God Particle” | DarkGovernment
