Portfolio item number 1
Short description of portfolio item number 1
Published in CVPR 2020 [Paper] [Code]
Citation: Giannis Daras, Augustus Odena, Han Zhang, Alexandros G. Dimakis, "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models", CVPR 2020
Published in NeurIPS 2020 [Paper] [Code]
Citation: Giannis Daras, Nikita Kitaev, Augustus Odena, Alexandros G. Dimakis, "SMYRF: Efficient Attention using Asymmetric Clustering", NeurIPS 2020
Published in NeurIPS 2020 Deep Inverse Workshop [Paper]
Citation: Joseph Dean, Giannis Daras, Alexandros G. Dimakis, "Intermediate Layer Optimization for Inverse Problems using Deep Generative Models", NeurIPS 2020 Deep Inverse Workshop
Preprint in arXiv [Paper] [Code]
Citation: Giannis Daras, Joseph Dean, Ajil Jalal, Alexandros G. Dimakis, "Intermediate Layer Optimization for Inverse Problems using Deep Generative Models", arXiv
Presented Google Summer of Code 2018 work on the project "Adding Greek Language to spaCy", carried out under GFOSS, the Open Technologies Alliance.
This talk discusses the latest advances in applying Artificial Intelligence to medicine, presented at a roundtable session of the Athens Crossroad conference (12th Congress of the Hellenic Society of Thoracic & Cardiovascular Surgeons) under the theme "Artificial Intelligence". It outlines use cases of Natural Language Processing algorithms, Brain-Computer Interfaces, and Deep Learning architectures for image processing.
One disadvantage of attention layers in neural network architectures is that their memory and time complexity is quadratic in the sequence length. This talk addresses the question: "Can we design attention layers with lower complexity that can still discover all dependencies in the input?" The answer appears to be yes, by modeling the problem of introducing sparsity to the attention layer with Information Flow Graphs.
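As a rough illustration of why standard attention is quadratic (this is a generic NumPy sketch, not code from the talk or the papers above): the score matrix that attention materializes has one entry per pair of positions, so both memory and time grow as n².

```python
import numpy as np

def dense_attention(q, k, v):
    """Standard scaled dot-product attention.

    q, k, v have shape (n, d). The intermediate score matrix has
    shape (n, n), so memory and time scale quadratically with n.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                            # (n, n)
    # Numerically stable softmax over each row of the score matrix.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                       # (n, d)

n, d = 512, 64
rng = np.random.default_rng(0)
x = rng.standard_normal((n, d))
out = dense_attention(x, x, x)
assert out.shape == (n, d)
```

Sparse-attention approaches (such as those discussed in the talk) aim to avoid materializing the full (n, n) matrix while still letting information flow between all pairs of positions.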