It has been just over a year since CERN, the home of the Large Hadron Collider (LHC), embarked on a three-year pilot phase for the Open Quantum Institute (OQI). The initiative aims to provide universal access to quantum computing and to accelerate its application for the benefit of humanity.
In a recent discussion, Archana Sharma, senior advisor for relations with international organisations and a principal scientist at CERN, emphasized that the OQI serves as a platform for evaluating advancements in quantum computing, networks, and sensors, while taking stock of the ongoing research at CERN.
Sharma highlighted that while particle physics remains CERN's primary focus, there may be valuable synergies between quantum technology development and particle physics research. The processes involved in particle acceleration are fundamentally rooted in quantum mechanics, which also underpins the accelerator's detectors that gather experimental data.
CERN's experiments produce an astronomical amount of data, and some of the technologies developed to run them are now finding uses beyond particle physics. One notable example is White Rabbit, an open-source timing system with sub-nanosecond accuracy, originally developed for particle physics and now being adapted to enhance quantum communications.
Recently, UK-based quantum networking firm Nu Quantum has joined CERN’s White Rabbit Collaboration, benefiting from CERN’s advanced synchronization technology that is essential for scaling quantum computing networks.
As Sharma remarked, computing is one of the three core pillars of CERN's operations, alongside research and infrastructure. The institute is proactively evolving its computing capabilities to accommodate the extensive demands generated by ongoing experiments.
Indeed, high-speed data processing capabilities are vital. CERN must whittle down the roughly 40 million collisions per second to a more manageable number of events, initially around 1,000 and eventually about 100, and this selection must be made within approximately 2.5 milliseconds.
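Taken at face value, the rates quoted above imply striking reduction factors. The back-of-envelope sketch below uses only the figures mentioned in this article; it is illustrative arithmetic, not CERN's trigger software.

```python
# Illustrative arithmetic only: reduction factors implied by the rates
# quoted above (40 million collisions/s down to ~1,000, then ~100).
collision_rate_hz = 40_000_000   # collisions per second
first_stage_hz = 1_000           # events kept after the initial selection
final_stage_hz = 100             # events kept after further filtering

first_reduction = collision_rate_hz / first_stage_hz   # ~40,000x
total_reduction = collision_rate_hz / final_stage_hz   # ~400,000x

# Time budget quoted for the selection decision.
decision_budget_ms = 2.5

print(f"First-stage reduction factor: {first_reduction:,.0f}x")
print(f"Overall reduction factor:     {total_reduction:,.0f}x")
print(f"Decision budget per event:    {decision_budget_ms} ms")
```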
With an extensive network of roughly 100,000 channels per experiment, CERN employs pattern recognition and machine learning technologies to process vast datasets and create simulation models. Sharma stated, “That’s the biggest tool we have — we run simulations to produce models that tell us how each collision will be read out.”
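To give a flavour of what channel-level pattern recognition involves, the sketch below is a deliberately simplified, hypothetical illustration rather than CERN's actual software: it generates synthetic readings for 100,000 channels and keeps an event only if enough channels fire above a noise threshold. All thresholds and distributions are invented for the example.

```python
import numpy as np

# Hypothetical illustration (not CERN's trigger or simulation code):
# one "event" is a set of readings from ~100,000 channels, and a simple
# pattern test decides whether the event looks interesting enough to keep.
N_CHANNELS = 100_000
THRESHOLD = 5.0    # arbitrary units above the noise level
MIN_HITS = 50      # minimum number of above-threshold channels

rng = np.random.default_rng(0)

def simulate_event(signal: bool) -> np.ndarray:
    """Generate synthetic channel readings: noise, plus a localised
    energy deposit if the event contains a 'signal'."""
    readings = rng.normal(0.0, 1.0, N_CHANNELS)            # detector noise
    if signal:
        hit = rng.integers(0, N_CHANNELS - 200)             # random location
        readings[hit:hit + 200] += rng.exponential(8.0, 200)  # deposit
    return readings

def trigger_decision(readings: np.ndarray) -> bool:
    """Keep the event if enough channels fire above threshold."""
    return int((readings > THRESHOLD).sum()) >= MIN_HITS

events = [simulate_event(signal=(i % 4 == 0)) for i in range(20)]
kept = sum(trigger_decision(ev) for ev in events)
print(f"Kept {kept} of {len(events)} simulated events")
```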
These models streamline the collection of trigger data, enabling energy measurements to be reconstructed from the sensors during experiments. While the process resembles the concept of a digital twin in enterprise IT, Sharma noted that CERN's simulations are probabilistic, so they are not digital twins in the strict sense.
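The probabilistic point can be illustrated with a toy Monte Carlo, again a hypothetical sketch rather than anything CERN runs: repeated simulations of the same "true" energy deposit yield a spread of reconstructed values, which is why the output is a statistical model rather than a deterministic digital twin.

```python
import numpy as np

# Toy Monte Carlo (illustrative parameters, not real detector constants):
# the detector response is sampled from random distributions, so repeated
# runs with the same true energy give a distribution, not a single answer.
rng = np.random.default_rng(42)

def simulate_readout(true_energy_gev: float, n_runs: int = 10_000) -> np.ndarray:
    """Simulate many readouts of the same deposit, with random smearing
    and occasional efficiency losses."""
    resolution = 0.10 * np.sqrt(true_energy_gev)         # stochastic smearing
    efficiency = rng.binomial(1, 0.98, n_runs)            # occasional missed hits
    smeared = rng.normal(true_energy_gev, resolution, n_runs)
    return smeared * np.where(efficiency == 1, 1.0, 0.9)

reconstructed = simulate_readout(50.0)
print("True energy: 50.0 GeV")
print(f"Reconstructed: mean {reconstructed.mean():.2f} GeV, "
      f"spread (std) {reconstructed.std():.2f} GeV")
```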
As CERN prepares for an upcoming three-year technical stop to enhance the LHC’s capabilities, including a tenfold increase in luminosity for data collection, Sharma confirmed that the computing centre is also gearing up for the substantial data processing needs that will arise.