BSC and IBM Research Deep Learning Center to boost cognitive computing

16 June 2016

Supercomputers have played an important role in the field of computational science.

In recent decades they have been used for a wide range of computationally intensive tasks in various fields, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations. In recent years we have seen a new trend that is dramatically changing how the design of supercomputing systems and applications moves forward: the emergence of data as the world’s newest, and arguably largest, natural resource.

As pointed out in the Director’s View in the March 2016 newsletter, Cognitive Computing is the next big thing in Big Data, and it represents a great opportunity for each and every one of us at the Barcelona Supercomputing Center. Cognitive computing systems learn and interact naturally with people to extend what either humans or machines could do independently. Cognitive computing is the simulation of human thought processes in a computerized model: self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works. Rather than being programmed to anticipate every possible answer or action needed to perform a function or set of tasks, cognitive computing systems are trained to sense, predict, infer and, in some ways, think. Cognitive systems will help human experts make better decisions by penetrating the complexity of Big Data.

Deep Learning

Cognitive computing systems use machine-learning algorithms. Such systems continually acquire knowledge by mining the data fed into them for information. They refine the way they look for patterns, as well as the way they process data, so that they become capable of anticipating new problems and modeling possible solutions.

In previous artificial intelligence efforts, scientists tackled problems that were difficult for humans but relatively easy for computers, such as large-scale mathematical calculations. In more recent years, scientists have been taking on tasks that are easy for people to perform but hard to describe to a machine, tasks humans solve “without thinking,” such as recognizing spoken words or faces in a crowd. Machine learning has become fundamental to building cognitive systems that can learn and understand the world by themselves. It is shifting from a highly manual process, in which humans had to design good representations for each task of interest, into an automated process in which machines learn more like babies do, through experience, building internal representations that help them make sense of the world.

Deep Learning is not brand new. In the 1980s, the concept of neural networks was its precursor. What is new is the accumulation of many scientific and technical advances, involving multilayer perceptrons and support vector machines, that have yielded breakthroughs in AI applications such as speech recognition, computer vision, and natural language processing. These have become feasible thanks to advances in creating hierarchies of concepts and representations that computers discover on their own. Such hierarchies allow a computer to learn complicated concepts by building them out of simpler ones. This is also how humans learn and build their understanding of the world: they gradually refine their model of the world to better fit what they observe, and discover new ideas from the composition of older ones, new ideas that help better fit the evidence, the data.
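To make the idea of a hierarchy of representations concrete, the sketch below (ours, not the center’s code) trains a tiny multilayer perceptron with NumPy on the XOR problem: the hidden layer learns simple intermediate features and the output layer composes them into a concept that no single layer could capture on its own. All names and sizes are illustrative.

```python
# Minimal sketch: a two-layer perceptron trained on XOR with plain NumPy.
import numpy as np

rng = np.random.default_rng(0)

# XOR is not linearly separable, so a single layer cannot solve it,
# but a hidden layer of simple intermediate features can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Hidden and output layer parameters (sizes chosen only for illustration).
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass: hidden layer builds simple features,
    # output layer combines them into the XOR concept.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: mean-squared-error gradients through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```

Adding more hidden layers follows the same pattern, with each layer refining the representation produced by the one below it.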

Thanks to more powerful computers, the availability of large and varied datasets, and advances in algorithms, it is now becoming possible to cross a threshold that has long held back computer science, making cognitive computing a reality. Once computers truly understand text, speech, images and sounds, they will become indispensable assistants in people’s lives. This will revolutionize the way we interact with computers, helping us live more conveniently in our day-to-day lives and perform more effectively at work. It will enable society to take on some of the grand challenges that matter, such as curing deadly diseases, customizing medicine and spreading knowledge and wealth more broadly.

The BSC-IBM collaboration agreement

In 2016 the Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS) and IBM renewed their agreement to create the IBM-BSC Deep Learning Center. The agreement will provide the framework for conducting joint research and development projects in the Deep Learning domain, an essential component of cognitive computing, with a focus on the development of new algorithms to improve and expand the cognitive capabilities of deep learning systems. Additionally, the center will conduct research on flexible computing architectures, fundamental for big data workloads, such as data-centric systems and applications.

The goals to be pursued under the collaboration agreement revolve around enabling Cognitive Computing by greatly accelerating basic learning tasks that require Pattern Recognition and Machine Learning, with an emphasis on Deep Learning. The agreement will help BSC evolve its scientific simulations in areas related to life and earth sciences and engineering, so that they adopt cognitive computing technologies in their applications and workflows, including cognitive IoT. One scientific area considered strategic for both BSC and IBM is personalized medicine. This is well aligned with the strategy of the Computer Sciences Department within the renewed BSC Severo Ochoa program to provide an infrastructure for Analytics as a Service, as shown in the following illustration.

The projects to be carried out in the framework of the agreement should plan for vertical integration and for exploiting Cognitive applications in the marketplace, encouraging partnerships with end users of Cognitive applications in order to prove an impact in practice. Regarding system and architecture technologies, the areas of interest are those that target system aspects (hardware and software) consistent with the overall goal of enabling Cognitive applications by greatly accelerating Deep Learning and other machine learning kernels. Examples include new algorithms and library enablement for hardware acceleration of Deep Learning (e.g. GPUs, FPGAs), and effective use of near (non-volatile) memory acceleration and of the superior bandwidth and memory coherency features (e.g. NVLink, CAPI) in OpenPOWER architectures, to name a few.
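As a hedged illustration of the kind of accelerator offload mentioned above, the sketch below uses PyTorch, which is an assumption on our part (the agreement does not name a framework): the same dense kernel runs on the CPU or on a GPU accelerator, chosen at runtime.

```python
# Minimal sketch (illustrative only): run a toy deep learning kernel on whatever
# accelerator the runtime can see, falling back to the CPU otherwise.
import torch
import torch.nn as nn

# Pick a CUDA-capable GPU if one is available to the runtime.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy "deep learning kernel": a batch of activations through a dense layer.
layer = nn.Linear(in_features=4096, out_features=4096).to(device)
batch = torch.randn(256, 4096, device=device)

with torch.no_grad():
    out = layer(batch)  # the matrix multiply runs on the selected device

print(f"ran on {out.device}, output shape {tuple(out.shape)}")
```

The same device-selection pattern is what libraries enabled for GPU or FPGA acceleration expose to applications, so that the surrounding workflow does not need to change when the hardware does.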

The agreement also embraces education and the transfer of deep learning skills to academia through different kinds of activities supported by scholarships and grants.

A long collaboration journey

IBM's cooperation with BSC-CNS dates back to 2000, when the Centro de Paralelismo de Barcelona (CEPBA), the precursor of BSC, signed a four-year agreement with IBM to create the CEPBA-IBM Research Institute (CIRI), which specialized in deep computing. Since then IBM has invested more than 10 million euros in joint research, technology transfer and training programs.

The new deep learning center is the fourth big joint project between BSC and IBM in 11 years of fruitful collaboration. The previous ones were the Centro Tecnológico de Supercomputación (2013-2015), the MareIncognito research and development project (2007-2011) and the MareNostrum supercomputer (2005-2007). Through this collaboration, both organizations have conducted numerous joint research activities involving IBM labs in the area of high performance computing.

Authors

  • Elisa Martin Garijo, Chief Technology Officer, IBM Spain, Distinguished Engineer & Member of the IBM Academy of Technology
  • Eduard Ayguadé, BSC Computer Sciences Associate Director, Full Professor at UPC’s Computer Architecture Department