BSC develops a dataset for automatic sign language translation using AI

24 March 2021

The How2Sign dataset will enable the development of technologies for more than 466 million deaf or hard-of-hearing people worldwide.

Amanda Duarte, a researcher in the Emerging Technologies for Artificial Intelligence group at the Barcelona Supercomputing Center (BSC), has developed an extensive dataset for the automatic translation of sign language with artificial intelligence (AI). This new resource, called How2Sign, will be presented at the CVPR 2021 conference, the most impactful scientific conference across all disciplines according to Google Scholar Metrics 2020.

The dataset consists of 80 hours of sign language videos in which professional American Sign Language (ASL) interpreters translate video tutorials (cooking recipes, DIY tips, etc.). Amanda Duarte has spent more than two years recording and preparing the data for its release. In addition, the BSC researcher recorded three hours of video at Carnegie Mellon University's Panoptic Studio, a unique dome-shaped multiview studio equipped with 510 cameras, which allow the interpreters' 3D poses to be reconstructed.

How2Sign is a public resource that will allow researchers in both natural language processing and computer vision to make progress in automatic sign language understanding and production, improving technological accessibility for an estimated 466 million deaf or hard-of-hearing people worldwide. One of the first applications of How2Sign has been the development of software that can transfer sign language gestures from one person to another.

“The main advances in artificial intelligence require three ingredients: algorithms, computation and data. How2Sign is the data that, analyzed by deep neural networks in supercomputing centers, will significantly improve accessibility to technologies such as virtual or robotic assistants”, says Amanda Duarte.

Duarte, an INPhiNIT doctoral fellow of the “la Caixa” Foundation, received funding from Facebook AI and the “la Caixa” Foundation, and collaborated with the Image Processing Group of the Universitat Politècnica de Catalunya (UPC), Carnegie Mellon University and Gallaudet University to build this ongoing dataset collection. Her work has been supervised by Jordi Torres, manager of the Emerging Technologies for AI group at the BSC and professor at the UPC, and Xavier Giró, researcher at the BSC and professor at the UPC.