CERN Hadron Collider update 1
Kicking the world’s largest machine into overdrive is turning out to be harder than expected. Researchers at the Large Hadron Collider at CERN, near Geneva in Switzerland, say that plans to run their physics experiments at higher energies are likely to be delayed until next year.
The LHC was rebooted in April, after a two-year shutdown to upgrade the machine. In the second run, it should be able to gather physics data at energies of 13 teraelectronvolts, the highest-energy collisions of particle beams ever. But researchers in charge of getting it up and running again, who this week presented the first report on the LHC’s performance at a conference in Ljubljana, Slovenia, have revealed that things haven’t quite gone as planned.
“The process is slightly slower than we would have hoped,” says Paul Collier, the LHC’s head of beams. Clouds of electrons created by ionised gas in the beam chamber and microscopic dust particles – playfully known as unidentified falling objects, or UFOs – are interrupting the beams and making it harder to get the LHC running consistently.
These effects were present in the LHC’s previous run, but the higher energies, plus efforts to produce more frequent collisions by bunching particles in the beam closer together, make them a larger problem than before.
Collier says the team had anticipated such potential issues, but they have taken some time to deal with. He compares it to driving a car at high speed: although you might be fine at 50 kilometres per hour, things start rattling when you reach 150 kph. “Everything is much closer to the limits of what the equipment can do, so the machine is less forgiving,” he says.
It’s not the first problem this year: a short circuit in March delayed the reboot. The team is only a few weeks behind schedule in preparing beams for the LHC’s various experiments. But because the quality of the beam gets better and more stable the longer it runs, there won’t be time this year to reach peak physics. “Each stage is vastly superior to everything that happens before it, so it’s only towards the end of this process that you’re really mass-producing data,” says Collier.
The researchers now expect to reach only 3 inverse femtobarns (3 fb⁻¹) – the measure of accumulated collision data, known as integrated luminosity – this year, down from a planned 10. To put this in context, the long-sought Higgs boson was discovered after the LHC reached 12 fb⁻¹.
But they are still on target to reach 30 fb-1 next year, once they understand how to handle their souped-up collider. “We’re learning an awful lot that will help us run the machine even better,” says Collier. “I have good hopes for 2016.”
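To see what those luminosity figures mean in practice, the expected number of events of a given process is the product of its cross-section and the integrated luminosity, N = σ × L. The sketch below uses an illustrative cross-section of roughly 50 picobarns (an assumed round figure for Higgs production at 13 TeV, not a number from this article) to compare the 3 fb⁻¹ now expected for 2015 with the planned 10 fb⁻¹:

```python
# Expected event count: N = sigma * L
# sigma: production cross-section in picobarns (pb)
# L: integrated luminosity in inverse femtobarns (fb^-1)
PB_INV_PER_FB_INV = 1000  # 1 fb^-1 = 1000 pb^-1

def expected_events(cross_section_pb, luminosity_fb_inv):
    """Number of events produced for a given cross-section and luminosity."""
    return cross_section_pb * luminosity_fb_inv * PB_INV_PER_FB_INV

# Assumed ~50 pb cross-section, for illustration only:
events_2015 = expected_events(50, 3)      # at the revised 3 fb^-1 target
events_planned = expected_events(50, 10)  # at the original 10 fb^-1 plan
```

With these assumptions the revised target yields 150,000 such events rather than 500,000, which is why the team describes the end of the process as the point where you are "really mass-producing data".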
CERN has been developing ways to improve data storage, cloud technologies, data analytics and data security in support of its research. Its technological advancements have resulted in a number of successful spin-offs from its primary particle-physics work, including the World Wide Web, the hypertext language for linking online documents, and grid computing.
Its invention of grid computing technology, known as the Worldwide LHC Computing Grid, has allowed it to distribute data to 170 data centers in 42 countries in order to serve more than 10,000 researchers connected to CERN.
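Distributing datasets across that many sites requires a deterministic way to decide which centres hold which data. The toy sketch below illustrates the general idea with hash-based placement and replication; it is not CERN's actual scheme, and the site names are hypothetical:

```python
import hashlib

# Hypothetical site names standing in for the grid's 170 data centres.
SITES = [f"site-{i:03d}" for i in range(170)]

def place(dataset_name, replicas=2):
    """Deterministically pick `replicas` distinct sites for a dataset.

    Hashing the dataset name gives a stable starting index, so every
    node in the grid can compute the same placement independently.
    """
    digest = int(hashlib.sha256(dataset_name.encode()).hexdigest(), 16)
    start = digest % len(SITES)
    return [SITES[(start + i) % len(SITES)] for i in range(replicas)]
```

Because placement is a pure function of the dataset name, any of the 10,000-plus connected researchers can locate a dataset without consulting a central index, which fits the grid's goal of avoiding a single point of failure.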
Storing data, sharing data

During the LHC’s development phase 15 years ago, CERN knew that the storage technology required to handle the petabytes of data it would create didn’t exist. And researchers couldn’t keep storing data within the walls of their Geneva laboratories, which already house an impressive 160PB of data.
CERN also needed to share its massive data in a distributed fashion, both for speed of access and because of the lack of onsite storage.
As it has in the past, CERN developed the storage and networking technology itself, launching the OpenLab in 2001 to do just that. OpenLab is an open-source, public-private partnership between CERN and leading educational institutions and information and communication technology companies, such as Hewlett-Packard and Nexenta, a maker of software-defined storage. OpenLab itself is a software-defined data center that started phase five of its development cycle this year. That phase will continue through 2017 and tackle the most critical needs of IT infrastructures, including data acquisition, computing platforms, data storage architectures, compute provisioning and management, networks and communication, and data analytics.
A growing grid

In all, the LHC Computing Grid has 132,992 physical CPUs, 553,611 logical CPUs, 300PB of online disk storage and 230PB of nearline (magnetic tape) storage. It’s a staggering amount of processing capacity and data storage that relies on having no single point of failure.
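A quick sanity check on those figures, using only the numbers quoted above: the ratio of logical to physical CPUs gives the average number of hardware threads per processor, and the two storage tiers sum to the grid's total capacity.

```python
# Figures quoted for the LHC Computing Grid.
physical_cpus = 132_992
logical_cpus = 553_611
online_disk_pb = 300    # online disk storage, PB
nearline_tape_pb = 230  # nearline magnetic-tape storage, PB

threads_per_cpu = logical_cpus / physical_cpus   # ~4.2 threads per physical CPU
total_storage_pb = online_disk_pb + nearline_tape_pb  # 530 PB in total
```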