RapidIO technology to accelerate and improve data analytics at CERN’s Large Hadron Collider and data centre.
Integrated Device Technology (IDT) has initiated a three-year collaboration with the European Organisation for Nuclear Research (CERN) to use its RapidIO technology.
The partnership aims to improve data acquisition and analysis collected by the experiments on CERN’s Large Hadron Collider (LHC), the world’s largest and most powerful particle accelerator.
Sailesh Chittipeddi, IDT’s vice president of Global Operations and chief technology officer, said: "This CERN collaboration is about enabling programmable real-time mission critical data analytics.
"Since the job spans multiple processors, the interconnect between them has to be ultra-low latency, and our technology — already used across 4G wireless base station deployments worldwide — is ideally suited to CERN’s real-time interconnect needs."
The LHC is a massive data generator, creating one petabyte (PB) of new data per second from the millions of collisions occurring in each detector.
The data produced by the Geneva-based accelerator helps CERN obtain fundamental answers about the universe.
The RapidIO technology will increase the speed at which data travels by providing a low-latency connection between clusters of computer processors.
Beyond its use in 4G base stations, the technology will also provide real-time data analytics and data management for high-performance computing (HPC) and data centres.
As part of the mandate for the fifth phase of the CERN openlab partnership, several of the LHC experiments will be looking at moving from custom-built hardware and backplanes to fully programmable heterogeneous computing with low-latency interconnect between large clusters of processors.
IDT’s current RapidIO 20 Gbps interconnect products will be used in the first stage of the collaboration, with an upgrade path to 40 Gbps RapidIO 10xN technology as research at CERN progresses.
Using algorithms in custom-built ASIC hardware, the data will be sampled so that only about one per cent is extracted for further analysis, reducing the time and costs associated with the enormous volume of data coming out of CERN’s labs.
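The effect of this kind of trigger-style filtering can be illustrated with a short simulation. The sketch below is purely hypothetical — the threshold, the energy distribution, and the `passes_trigger` function are all invented for illustration; the real selection runs in custom ASIC/FPGA hardware, not software — but it shows how a simple per-event cut tuned to keep roughly one per cent of events shrinks the data stream a hundredfold.

```python
import random

def passes_trigger(event):
    """Hypothetical selection cut: keep only events whose energy
    exceeds a fixed threshold. The threshold is chosen so that,
    for this toy energy distribution, about 1% of events survive."""
    return event["energy"] > 2.3

# Simulate a stream of collision events with exponentially
# distributed energies (illustrative only, not LHC physics).
# For rate 2.0, P(energy > 2.3) = exp(-4.6) ~ 1%.
random.seed(42)
events = [{"energy": random.expovariate(2.0)} for _ in range(100_000)]

selected = [e for e in events if passes_trigger(e)]
fraction = len(selected) / len(events)
print(f"kept {len(selected)} of {len(events)} events ({100 * fraction:.2f}%)")
```

At LHC scale the same idea, applied at one PB per second, is what makes downstream storage and analysis tractable; the low-latency interconnect matters because the selection must complete before buffers overflow.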
The collaboration is based on industry-standard IT form-factor solutions suitable for deployment in HPC clusters and data centres.
The computing platform will be built on RapidIO-enabled 1U heterogeneous servers capable of supporting industry-standard servers, GPUs, FPGAs and low-power 64-bit SoCs, as well as top-of-rack RapidIO switches available from Prodrive Technologies.
These servers will be based on specifications from RapidIO.org that are targeted towards the Open Compute Project High Performance Computing initiative co-chaired by IDT.
Alberto Di Meglio, head of CERN openlab, said: "The bottleneck for better data acquisition, selection and analytics is superior real-time interconnect.
"Our collaboration with IDT to develop a RapidIO-based computing architecture should help solve CERN’s real-time data filtering problem, enabling us to select and utilise more meaningful events from the LHC and improve efficiency of analytics in our data centre monitoring and operations."
Corey Bell, CEO of the Open Compute Project, said: "We established the HPC initiative to service the unique needs of those end users with the highest compute-centric workloads in the industry.
"CERN has some of the most stringent workloads for low-latency computing, so this collaboration is a great opportunity to see the benefits of RapidIO in action."