The prototype of the European supercomputer is taking shape at Università di Torino

24 November 2021
Università di Torino will play a leading role in European studies on supercomputing, becoming part of the most advanced continental research network on the subject.

It all began with the development, in collaboration with Politecnico di Torino, of HPC4AI, an advanced computing system for artificial intelligence applications, currently used by researchers and companies and included among the technological assets of CTE Next, the House of Emerging Technologies. The project is now worth about €4.5 million, supported by European funding.

Marco Aldinucci, head of the Department of Computer Science at UniTo, explained: "We have created a Data Center that houses systems entirely designed by us, with the software designed and built with internal resources. The work took three years, with the pandemic in the middle; it has been a training ground for our team and has allowed us to take part in the most important European projects of this moment, starting with large research projects aimed at European digital sovereignty such as the two Advanced Pilots towards the European Exascale (EUPEX and TEP) and the European Processor Initiative (EPI), worth €140 million in total."

The challenge for the next three years in supercomputer development is exascale: building a system roughly 10 million times faster than an ordinary personal computer. "The European Union, at this moment in history, is playing a leading role, and that wasn't the case until a few years ago. Cloud technology has made it possible to put the enormous computing power of supercomputers at the disposal of AI applications easily and effectively. In Italy we have delegated a great deal to the big players in the field; now we need greater awareness to support and expand research. It's a matter of starting again from skills, in order to acquire knowledge that is important and necessary for industry."
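The "10 million times" figure can be checked with back-of-the-envelope arithmetic. The FLOP rates below are rough assumptions of ours (an exascale machine sustains about 10^18 floating-point operations per second, a desktop CPU on the order of 10^11), not numbers taken from the article:

```python
# Assumed rates, not from the article: exascale = ~1e18 FLOP/s,
# ordinary desktop = ~1e11 FLOP/s (about 100 gigaFLOP/s).
EXASCALE_FLOPS = 1e18
PC_FLOPS = 1e11

speedup = EXASCALE_FLOPS / PC_FLOPS
print(f"{speedup:.0e}")  # → 1e+07, i.e. 10 million
```

Under these assumptions the ratio comes out to 10^7, matching the article's "10 million times faster" claim.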

The system created by Università degli Studi di Torino is an experimental cloud and HPC platform for which new AI applications are being studied, such as Federated Learning: an analysis and data-processing approach capable of handling information from different sources without the data having to be moved and stored together in one place. It is a completely different model from the one currently used in the US and China, one based on preserving the value of data ownership.
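The Federated Learning idea described above can be sketched in a few lines: each data holder trains locally and only shares model parameters, never its raw records. The function names, the toy least-squares model, and the FedAvg-style weighted averaging are illustrative assumptions on our part, not details of UniTo's actual platform:

```python
# Minimal Federated Learning sketch (assumed FedAvg-style averaging,
# toy one-parameter least-squares model y ~ w*x; illustrative only).

def local_update(weights, local_data, lr=0.1):
    """One gradient-descent step on the squared error, computed
    entirely at the data holder's site; raw data never leaves."""
    grad = sum(2 * (weights * x - y) * x for x, y in local_data) / len(local_data)
    return weights - lr * grad

def federated_round(global_weights, sites):
    """Each site refines the global model on its private data; only the
    updated weights travel back, averaged by local dataset size."""
    updates = [(local_update(global_weights, data), len(data)) for data in sites]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two hypothetical data holders with private samples drawn from y = 2x.
site_a = [(1.0, 2.0), (2.0, 4.0)]
site_b = [(3.0, 6.0)]

w = 0.0
for _ in range(50):
    w = federated_round(w, [site_a, site_b])
print(round(w, 2))  # → 2.0: the shared model converges without pooling the data
```

The point of the design is the one Aldinucci makes: the value of the data is extracted collaboratively, but the records themselves are never copied to a central location.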

Aldinucci commented: "It's important to understand that data, once copied, is indistinguishable from the original, and giving it away means giving up a key asset for good. We are working on a platform that allows organisations to collaborate on data processing in order to extract value from data without selling or transferring it. We need to shift the focus from buying services on the market to developing the skills needed to help companies adopt HPC, cloud and AI technologies, which are fundamental elements of the value stream of many supply chains."

Author

Luca Coppolella
Head of Content
