Extreme computing


The Extreme Computing group's approach is based on the need for novel mathematics and algorithms that are "essential and indispensable for addressing the major challenges in science, technology and society". The research effort is therefore focused on developing unifying mathematical approaches to address key challenges in HPC, Data Analytics and Digital Science overall.

New mathematical methods and algorithms are key ingredients in ensuring the efficient usability of current petascale and emerging exascale architectures and technologies, as well as scalability starting from the mathematical methods level, through algorithms, down to the system level. The Extreme Computing group's strategic research contributes to the top of the stack, i.e. the mathematical and algorithmic levels, and to their vertical integration. Combined with the research focus of the other groups, this enables the BSC Computer Science Department to address research challenges at all levels of the stack.

Objectives

  • Enabling Vertical Integration. One of the strategic research topics the group focuses on is vertical integration. Advanced research is required to ensure that any of the developed mathematical methods and algorithms can be implemented efficiently on different types of HPC architectures and used easily by a broad user community. The Extreme Computing group works on advanced stochastic and hybrid methods and algorithms for Linear Algebra, Optimization and related areas, applied to a variety of practical problems that are representative of such methods and algorithms (see the Monte Carlo linear algebra sketch after this list).
  • Developing Efficient Methods for Solving Problems with Uncertainty. Uncertainty is unavoidable in almost all systems analysis: in risk analysis and decision making, in economic and financial modelling, in weather and pollution modelling, and in disaster modelling and simulation. How uncertainty is handled and quantified shapes the integrity of the analysis and the correctness and credibility of the solution and results. The focus here is on stochastic and hybrid methods and algorithms for solving problems with uncertainties, and on methods and algorithms for quantifying uncertainties, such as those arising from data input, sensitivity analysis and model inadequacy (a simple uncertainty propagation sketch follows this list).
  • Efficient Large Crowd Simulation. Animating varied crowds using a diversity of models and animations (assets) is complex and costly: such models are expensive to buy, take a long time to create, and consume large amounts of memory and computing resources. We are developing methods for simulating, generating, animating and rendering crowds of varied appearance and diverse behaviours. The focus is on efficient simulations of large crowds that can run on low-cost systems using modern programmable GPUs, while still being able to scale up to even larger crowds. The ultimate goal is to simulate very large crowds of a million or more subjects in real time on a variety of advanced architectures (a minimal agent-based sketch appears after this list).
  • Developing Efficient Mathematical Methods and Algorithms for Discovering Global Properties of Data. Many of today's grand challenge problems are both data and compute intensive. Extreme data poses significant challenges that require a paradigm shift from a purely compute-centric view towards one that combines data-centric and compute-centric views. In addition, we need algorithms for data discovery, in particular ones that can discover global properties of these data. The focus is on developing network science and optimization-based algorithms for social media analysis, for social and economic problems, and for cancer research (see the graph-property sketch at the end of this list).
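
To make the vertical-integration objective concrete, below is a minimal, illustrative sketch of a Monte Carlo (Ulam–von Neumann) solver for a linear system, one classical example of the stochastic linear-algebra techniques mentioned above. The test matrix, walk counts and the convergence assumption (spectral radius of I − A below one) are assumptions of this sketch, not part of the group's actual codes, and the estimates it produces are noisy.

```python
import numpy as np

def mc_solve(A, b, n_walks=2000, walk_len=15, rng=None):
    """Monte Carlo (Ulam-von Neumann) estimate of the solution of A x = b.

    Assumes the iteration matrix H = I - A has spectral radius < 1, so that
    the Neumann series x = sum_k H^k b converges.  Each component x_i is
    estimated by averaging random-walk scores that sample terms of the series.
    """
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    H = np.eye(n) - A
    # Transition probabilities proportional to |H_ij|; rows of zeros fall
    # back to a uniform distribution (their walks contribute zero weight).
    P = np.abs(H)
    row_sums = P.sum(axis=1, keepdims=True)
    P = np.where(row_sums > 0, P / np.where(row_sums == 0, 1, row_sums), 1.0 / n)
    x = np.zeros(n)
    for i in range(n):                      # estimate each component independently
        total = 0.0
        for _ in range(n_walks):
            state, weight, score = i, 1.0, b[i]
            for _ in range(walk_len):
                nxt = rng.choice(n, p=P[state])
                weight *= H[state, nxt] / P[state, nxt]
                score += weight * b[nxt]
                state = nxt
            total += score
        x[i] = total / n_walks
    return x

if __name__ == "__main__":
    A = np.array([[1.0, 0.2, 0.1],
                  [0.1, 1.0, 0.2],
                  [0.2, 0.1, 1.0]])
    b = np.array([1.0, 2.0, 3.0])
    print("Monte Carlo:", mc_solve(A, b))
    print("Direct:     ", np.linalg.solve(A, b))
```

One reason such methods matter for vertical integration is that each component (and each walk) is estimated independently, which maps naturally onto many-core and distributed architectures.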
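As a toy illustration of uncertainty quantification, the following sketch propagates input uncertainty through a model by plain Monte Carlo sampling and reports a crude correlation-based sensitivity indicator. The model, input distributions and sample size are invented for the example.

```python
import numpy as np

def model(x):
    """Toy model: a nonlinear response to two uncertain inputs."""
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

def propagate(n_samples=100_000, rng=None):
    """Forward uncertainty propagation by plain Monte Carlo sampling.

    The two inputs are assumed normally distributed; the output mean,
    standard deviation and a simple sensitivity indicator (squared
    correlation of each input with the output) are reported.
    """
    rng = rng or np.random.default_rng(1)
    x = rng.normal(loc=[1.0, 0.5], scale=[0.1, 0.2], size=(n_samples, 2))
    y = model(x)
    sens = [np.corrcoef(x[:, j], y)[0, 1] ** 2 for j in range(2)]
    return y.mean(), y.std(), sens

if __name__ == "__main__":
    mean, std, sens = propagate()
    print(f"output mean = {mean:.4f}, std = {std:.4f}")
    print("squared input-output correlations:", sens)
```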
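For the crowd-simulation objective, here is a deliberately simple agent-based sketch (goal seeking plus pairwise separation) written with NumPy. It is not the group's GPU implementation, only an indication of the kind of per-agent update that large-crowd simulators parallelise; the gains, radii and agent count are arbitrary.

```python
import numpy as np

def step(pos, vel, goal, dt=0.05, sep_radius=0.5,
         sep_gain=2.0, goal_gain=1.0, max_speed=1.5):
    """One update of a minimal agent-based crowd model.

    Each agent steers towards a common goal while being pushed away from
    neighbours closer than sep_radius.  All-pairs distances are computed
    densely here for clarity; real large-crowd simulators would use spatial
    hashing and GPU kernels instead.
    """
    diff = pos[:, None, :] - pos[None, :, :]          # pairwise offsets
    dist = np.linalg.norm(diff, axis=-1) + 1e-9
    close = dist < sep_radius
    repulse = (diff / dist[..., None] * close[..., None]).sum(axis=1)
    to_goal = goal - pos
    to_goal /= np.linalg.norm(to_goal, axis=-1, keepdims=True) + 1e-9
    vel = vel + dt * (goal_gain * to_goal + sep_gain * repulse)
    speed = np.linalg.norm(vel, axis=-1, keepdims=True)
    vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
    return pos + dt * vel, vel

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pos = rng.uniform(-5, 5, size=(1000, 2))          # 1000 agents in 2D
    vel = np.zeros_like(pos)
    goal = np.array([10.0, 0.0])
    for _ in range(200):
        pos, vel = step(pos, vel, goal)
    print("mean distance to goal:", np.linalg.norm(goal - pos, axis=-1).mean())
```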
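Finally, a small sketch of what "global properties" of data can mean in practice: whole-graph measures such as the size of the giant component, transitivity and the most central nodes. It assumes the networkx package is available, and the random graph merely stands in for a real social-media or biological network.

```python
import networkx as nx  # assumes the networkx package is installed

def global_properties(G):
    """Compute a few global, whole-graph properties.

    These measures cannot be read off individual records in isolation,
    which is what motivates network-science algorithms for data discovery.
    """
    giant = max(nx.connected_components(G), key=len)
    centrality = nx.betweenness_centrality(G)
    top = sorted(centrality, key=centrality.get, reverse=True)[:5]
    return {
        "nodes": G.number_of_nodes(),
        "edges": G.number_of_edges(),
        "giant_component_fraction": len(giant) / G.number_of_nodes(),
        "transitivity": nx.transitivity(G),
        "top_betweenness_nodes": top,
    }

if __name__ == "__main__":
    # A random graph stands in for, e.g., a social-media interaction network.
    G = nx.erdos_renyi_graph(n=500, p=0.02, seed=3)
    for key, value in global_properties(G).items():
        print(f"{key}: {value}")
```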