Projects


EUrope-BRAzil Collaboration on "BIG Data Scientific REsearch through Cloud-Centric Applications" aims to provide services in the cloud for the processing of massive data coming from highly connected societies, which impose multiple challenges on resource provisioning, performance, Quality of Service and privacy. Processing those data requires rapidly provisioned...

In addition to the need to run large numerical simulations, the exponential growth in the amount of available data brings a new paradigm to the scientific method, based on massive data analysis; "Big Data" refers to the set of technological challenges associated with the acquisition, management, analysis and...

The overall objective of the Next Generation I/O Project (NEXTGenIO) is to design and prototype a new, scalable, high-performance, energy-efficient computing platform that addresses the challenge of delivering scalable I/O performance to applications at the Exascale. It will achieve this using highly innovative, non-volatile, dual in-line memory modules (NV-DIMMs). These...

EUDAT2020 brings together a unique consortium of e-infrastructure providers, research infrastructure operators, and researchers from a wide range of scientific disciplines under several of the ESFRI themes, working together to address the new data challenge. In most research communities, there is a growing awareness that the rising tide of data will require new approaches to...

The main objective is to create IOStack: a Software Defined Storage toolkit for Big Data on top of the OpenStack platform. IOStack will enable efficient execution of virtualized analytics applications over virtualized storage resources thanks to flexible, automated, and low-cost data management models based on software defined storage (SDS). Major challenges...

The consortium of this European Training Network (ETN), "BigStorage: Storage-based Convergence between HPC and Cloud to handle Big Data", will train future data scientists, enabling them to apply holistic and interdisciplinary approaches to take advantage of a data-overwhelmed world, which requires HPC and Cloud infrastructures with a redefinition of storage...

The objective of the RETHINK big project was to bring together the key European system architects and consumers of Big Data systems, to describe a coherent vision highlighting the various business and technical challenges within the EU, and to address the EU's needs for the processing and analysis of Big Data over the next 10 years. More...

The use of High Performance Computing (HPC) is commonly recognized as a key strategic element, in both research and industry, for improving the understanding of complex phenomena. The constant growth of generated data - Big Data - and the computing capabilities of extreme systems lead to a new generation of computers composed of millions of heterogeneous cores which will provide...

The use of High Performance Computing (HPC) is commonly recognized as a key strategic element in improving understanding of complex phenomena, in both research and industry. The constant growth of generated data - Big Data - and the computing capabilities of extreme systems are leading to a new generation of computers composed of millions of heterogeneous cores which will...
