The conference, now in its third year, is a celebration of groundbreaking GPU-accelerated work across the region. Nearly 300 developers, startups and researchers took the stage to share innovative projects. Among them were some of Europe's major science centers, spanning fields as diverse as particle physics, climate research and neuroscience.
Understanding the Universe
Técnico Lisboa, Portugal
Nuclear energy is generated through nuclear fission: the process of splitting apart an atom’s nucleus, which creates both usable energy and radioactive waste. A cleaner and more powerful alternative is nuclear fusion, the joining together of two nuclei.
But so far, scientists haven’t been able to sustain a nuclear fusion reaction long enough to harness its energy. Using deep learning algorithms and a Tesla P100 GPU, university researchers at Portugal’s Técnico Lisboa are studying the shape and behavior of the plasma inside a fusion reactor.
Gaining insight into the factors at play during nuclear fusion is essential for physicists. If researchers could predict when a reaction is about to be disrupted, they could take preventive action to prolong the reaction until enormous amounts of energy can be captured.
GPUs are essential for running these neural network inferences in real time during a fusion reaction. The deep learning models currently predict disruptions with 85 percent accuracy, matching state-of-the-art systems. By adding more probes that collect measurements within the reactor, and by using a multi-GPU system, the researchers expect to reach even higher accuracy.
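The idea of scoring disruption risk from probe measurements can be illustrated with a minimal sketch. This is not the Técnico Lisboa team's actual model: the network size, weights and probe count below are hypothetical placeholders, and a real system would load trained weights and run inference on the GPU.

```python
import numpy as np

# Hypothetical sketch: a tiny feed-forward network that maps a vector of
# reactor probe measurements to a disruption-risk score in [0, 1].
rng = np.random.default_rng(0)

# Illustrative dimensions: 32 probe readings -> 16 hidden units -> 1 score.
W1 = rng.standard_normal((32, 16)) * 0.1
b1 = np.zeros(16)
W2 = rng.standard_normal(16) * 0.1
b2 = 0.0

def disruption_risk(probes: np.ndarray) -> float:
    """Return a probability-like disruption score for one probe snapshot."""
    h = np.tanh(probes @ W1 + b1)       # hidden layer
    logit = h @ W2 + b2                 # single output logit
    return float(1.0 / (1.0 + np.exp(-logit)))  # sigmoid

# One snapshot of (synthetic) probe measurements.
probes = rng.standard_normal(32)
risk = disruption_risk(probes)
if risk > 0.5:
    print("predicted disruption: take preventive action")
else:
    print("plasma stable")
```

In a real-time setting, a snapshot like this would be scored every few milliseconds, which is why GPU-accelerated inference matters.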
European Organization for Nuclear Research, Switzerland
Physicists have long been in search of a theory of everything, a mathematical model that works in every case, even at velocities approaching the speed of light. CERN, the European Organization for Nuclear Research, is a major center for this research.
Best known in recent years for the discovery of the Higgs boson, often called the “God particle,” the organization uses a machine called the Large Hadron Collider to create collisions between subatomic particles.
The researchers use software first to simulate the interactions they expect to see in the collision, and then to compare the real collision with the original simulation. These experiments require a system that can handle five terabytes of data per second.
“We are working to speed up our software and improve its accuracy, to face at best the challenges of the next Large Hadron Collider phase,” said CERN researcher Andrea Bocci. “We are exploring the use of GPUs to accelerate our algorithms and to integrate fast inference of deep learning models in our next-generation real-time data processing system.”