Projects
The most common interpretation of Moore's Law is that the number of components on a chip, and accordingly computer performance, doubles every two years. This empirical law has held from its first statement in 1965 until today. In the early 2000s, when clock frequencies stagnated at ~3 GHz and instruction level parallelism reached the phase of...
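As a rough illustration of that growth rate (a sketch assuming a strict two-year doubling period, not a figure from the project description), the component count N at year t would follow N(t) = N(1965) · 2^((t − 1965)/2); the roughly 35 years between 1965 and 2000 would then correspond to about 17 doublings, i.e. a growth factor on the order of 10^5.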
IS-ENES2 is the second-phase project of the distributed e-infrastructure of models, model data and metadata of the European Network for Earth System Modelling (ENES). This network brings together the European modelling community working on understanding and predicting climate variability and change. ENES organizes and supports European contributions to international...
The evolution of protein interactions has produced interaction networks and much biological complexity. Although molecular phylogenetics reveals the end points of evolutionary searches, little is known about the trajectories of interacting proteins through sequence space over evolutionary time. A major bottleneck is the inability to extensively map how binding affinity...
This project aims to apply new exascale high-performance computing (HPC) techniques to simulations for the energy industry, going beyond the state of the art in the simulations required for different energy sources: wind energy production and...
The main objective of the LIGHTNESS project was the design, implementation and experimental evaluation of high performance data centre interconnects through the introduction of innovative photonic switching and transmission inside data centres. Harnessing the power of optics enabled data centres to effectively cope with the unprecedented demand growth to be faced in the near...
The COMPOSE project aimed at enabling new services that can seamlessly integrate real and virtual worlds through the convergence of the Internet of Services with the Internet of Things. COMPOSE set out to achieve this through the provisioning of an open and scalable marketplace infrastructure, in which smart objects are associated with services that can be combined, managed, and...
AXLE focused on automatic scaling of complex analytics while addressing the full requirements of real data sets. Real data sources have many difficult characteristics. Sources often start small and can grow extremely large as businesses and initiatives succeed, so the ability to grow seamlessly and automatically is at least as important as managing large data volumes once you know...
The increasing power and energy consumption of modern computing devices is perhaps the largest threat to technology miniaturization and the associated gains in performance and productivity. On the one hand, we expect technology scaling to face the problem of "dark silicon" (only segments of a chip can function concurrently due to power restrictions) in the near future...
Over the 10 years prior to the initiation of this project, significant investments had been made in both the European Union and the USA in developing scientific data infrastructures to support the work of research communities and in improving shared access to data. On both sides of the Atlantic, there is a shared understanding that solutions must be global and that the...
The use of High Performance Computing (HPC) is commonly recognized, in both research and industry, as a key strategic element for improving the understanding of complex phenomena. The constant growth of generated data (Big Data) and of the computing capabilities of extreme-scale systems is leading to a new generation of computers composed of millions of heterogeneous cores which will provide...