Distinguished lecture for the 30th anniversary of the ICGI conference
- Dana Angluin (Yale University, USA)
Invited speakers
- Cyril Allauzen (Google, New York)
Cyril Allauzen’s main research interests are in finite-state methods and their applications to text, speech, and natural language processing and machine learning. Before joining Google, he worked as a researcher at AT&T Labs Research and at NYU’s Courant Institute of Mathematical Sciences.
Cyril is an author of the OpenFst Library, the OpenKernel Library, and the GRM Library.
- Ahmed Elnaggar (Technische Universität München, Germany)
Ahmed Elnaggar performs research on the applications of self-supervised deep learning and language models using high-performance computing in several domains, including NLP, biology, software engineering, and speech.
Ahmed has recently contributed to major protein language models:
- Ahmed Elnaggar, Hazem Essam, Wafaa Salah-Eldin, Walid Moustafa, Mohamed Elkerdawy, Charlotte Rochereau, Burkhard Rost. Ankh: Optimized Protein Language Model Unlocks General-Purpose Modelling. arXiv (2023)
- ProtTrans: Towards Cracking the Language of Life's Code Through Self-Supervised Deep Learning and High Performance Computing. IEEE Trans Pattern Anal Mach Intell. (2021)
- Modeling aspects of the language of life through transfer-learning protein sequences. BMC Bioinformatics. (2019)