Posts by Collection

publications

DataComp: In search of the next generation of multimodal datasets

Published as an Oral at NeurIPS 2023 [Paper] [Code]

Citation: Samir Yitzhak Gadre, Gabriel Ilharco, Alex Fang, Jonathan Hayase, Georgios Smyrnis, Thao Nguyen, Ryan Marten, Mitchell Wortsman, Dhruba Ghosh, Jieyu Zhang, Eyal Orgad, Rahim Entezari, Giannis Daras, Sarah Pratt, Vivek Ramanujan, Yonatan Bitton, Kalyani Marathe, Stephen Mussmann, Richard Vencu, Mehdi Cherti, Ranjay Krishna, Pang Wei Koh, Olga Saukh, Alexander Ratner, Shuran Song, Hannaneh Hajishirzi, Ali Farhadi, Romain Beaumont, Sewoong Oh, Alex Dimakis, Jenia Jitsev, Yair Carmon, Vaishaal Shankar, Ludwig Schmidt, "DataComp: In search of the next generation of multimodal datasets", NeurIPS 2023

talks

Athens Crossroad: Machine Learning and Healthcare

In this talk, the latest advances in applying Artificial Intelligence to Medicine are discussed at a roundtable session of the Athens Crossroad conference (12th Congress of the Hellenic Society of Thoracic & Cardiovascular Surgeons), under the theme “Artificial Intelligence”. Use cases of Natural Language Processing algorithms, Brain-Computer Interfaces, and Deep Learning architectures for Image Processing are outlined.

Improving sparse transformer models for efficient self-attention

One disadvantage of using attention layers in a neural network architecture is that the memory and time complexity of the operation is quadratic in the sequence length. This talk addresses the following question: “Can we design attention layers with lower complexity that are still able to discover all dependencies in the input?”. The answer appears to be yes, by modeling the problem of introducing sparsity into the attention layer with Information Flow Graphs.
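To make the complexity argument concrete, below is a minimal NumPy sketch of one common sparsity pattern, a sliding window in which each query attends only to nearby keys, so the cost drops from O(n²) to O(n·w) for window size w. This is an illustrative assumption for exposition only; the talk's Information Flow Graph construction may choose a different sparsity pattern.

```python
import numpy as np

def sliding_window_attention(Q, K, V, window=4):
    """Each query i attends only to keys in [i - window, i + window].
    Cost is O(n * window * d) instead of the dense O(n^2 * d).
    Illustrative sparsity pattern, not the talk's exact construction."""
    n, d = Q.shape
    out = np.zeros_like(V)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = Q[i] @ K[lo:hi].T / np.sqrt(d)      # local attention scores
        weights = np.exp(scores - scores.max())       # numerically stable softmax
        weights /= weights.sum()
        out[i] = weights @ V[lo:hi]                   # weighted sum of local values
    return out

# Toy usage with random inputs
n, d = 16, 8
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, n, d))
print(sliding_window_attention(Q, K, V).shape)  # (16, 8)
```

A fixed local window like this cannot, by itself, connect distant positions; the question in the talk is how to choose sparsity patterns that keep the cost low while still allowing information to flow between all input positions.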
