Projects


Interoperability, which brings major benefits for enterprise and science, is key to the pervasive adoption of grids and clouds. Interoperability between existing grids and clouds was of primary importance for the European Union at the time of this project. Many of the policy issues required to achieve interoperability with Distributed Computing Infrastructures (DCIs) across the world are already being explored...

Taking the challenges of service and infrastructure providers as its point of departure, OPTIMIS focused on open, scalable and dependable service platforms and architectures that allowed flexible and dynamic provisioning of advanced services. The OPTIMIS innovations can be summarized as a combination of technologies to create a dependable ecosystem of providers and consumers that...

The use of High Performance Computing (HPC) is widely recognized as a key strategic element in improving the understanding of complex phenomena, in both research and industry. The constant growth of generated data - Big Data - and the computing capabilities of extreme-scale systems are leading to a new generation of computers composed of millions of heterogeneous cores, which will...

Let’s imagine that we have developed a drug with very good potency for a particular cancer therapy. Over time, however, many patients start developing resistance to the drug due to a particular point mutation, a consequence of the high metabolism of the cancer cells. Or let’s imagine that a specific treatment for a virus has lost its efficacy due to some viral...

Design complexity and power-density constraints have halted the trend towards faster single-core processors. The current trend is instead to double the core count every 18 months, leading to chips with 100+ cores in 10-15 years. Developing parallel applications that can harness such multicore chips is the key challenge for scalable computing systems.
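The 18-month doubling claim can be checked with a back-of-the-envelope projection. This is only an illustrative sketch: the single-core baseline and the function name are assumptions, and only the doubling cadence comes from the text.

```python
# Illustrative projection of the core-count doubling claim.
# Assumption: a 1-core baseline; the 18-month doubling period
# comes from the text, everything else is hypothetical.
DOUBLING_MONTHS = 18

def projected_cores(years: float, start_cores: int = 1) -> int:
    """Core count after `years`, doubling every DOUBLING_MONTHS."""
    doublings = years * 12 / DOUBLING_MONTHS
    return int(start_cores * 2 ** doublings)

print(projected_cores(10))  # roughly 100 cores after 10 years
print(projected_cores(15))  # 1024 cores after 15 years
```

At a 10-year horizon the projection crosses the 100-core mark quoted above; at 15 years it reaches 2^10 = 1024 cores, consistent with the "100+ cores in 10-15 years" estimate.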

The ENCORE project...

There is ever-increasing demand both for new functionality and for reduced development and production costs in all kinds of Critical Real-Time Embedded (CRTE) systems (safety-, mission- or business-critical). Moreover, new functionality demands can only be met by more complex software and aggressive hardware acceleration features such as memory hierarchies and multi-core...

Dataflow parallelism is key to achieving power efficiency, reliability, efficient parallel programmability, scalability and high data bandwidth. In this project we proposed dataflow both at the task level and inside threads: to offload and manage accelerated codes, to localize computation, to manage fault information with appropriate protocols, and to easily migrate code to the...
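The task-level dataflow idea above can be sketched with standard Python futures: a task fires as soon as the results it depends on are available. `dataflow_task` is a hypothetical helper for illustration only, not the project's runtime; real task-dataflow runtimes track dependencies without tying up worker threads.

```python
from concurrent.futures import ThreadPoolExecutor

def dataflow_task(pool, fn, *deps):
    """Submit fn so that it consumes the results of its dependency
    futures; the task body runs once all its inputs are available."""
    def wait_and_run():
        # Block until every upstream task has produced its value.
        return fn(*(d.result() for d in deps))
    return pool.submit(wait_and_run)

with ThreadPoolExecutor(max_workers=4) as pool:
    a = dataflow_task(pool, lambda: 2)                 # no dependencies
    b = dataflow_task(pool, lambda: 3)                 # no dependencies
    c = dataflow_task(pool, lambda x, y: x * y, a, b)  # fires after a and b
    result = c.result()

print(result)  # 6
```

The dependency graph, not the submission order, determines when `c` runs: it is submitted immediately but only computes once `a` and `b` have delivered their values.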

Data storage technology today faces many challenges, including performance inefficiencies, inadequate dependability and integrity guarantees, limited scalability, loss of confidentiality, poor resource sharing, and increased ownership and management costs. Given the importance of both direct-attached and networked storage systems for modern applications, it becomes imperative...

Coastal-zone oceanographic predictions seldom include land discharge as a boundary condition. River fluxes are sometimes considered, but their 3D character is neglected, while the "distributed" continental run-off is not taken into account at all. Moreover, many coastal-scale processes, particularly those relevant in geographically restricted domains (coasts with harbours or...

Storage research is gaining importance as a result of the tremendous need for storage capacity and I/O performance. Over the past years, several trends have considerably changed the design of storage systems, ranging from new storage media, through the widespread use of storage area networks, to grid and cloud storage concepts. Furthermore, to achieve cost...
