Some novel DL workloads in 2021


Most Deep Learning systems/compilers in 2021 are still trying to optimize workloads like ResNet and VGG. These models remain useful, and one can still publish at decent conferences by proposing new compiler techniques for them (e.g. TASO/PET), but I'm not enthusiastic about competing on this track. Graph Neural Networks bring structured data into Deep Learning, which is good, but unfortunately the applications of large-scale GNNs are mostly limited to recommendation systems. I always expect my work to have a broader positive social impact, and recommendation systems apparently can't offer that.

I started surveying the evolving Deep Learning workloads this semester, trying to find the "future" of Machine Learning Systems. Currently I regard AlphaFold2 and Neural Radiance Fields as two models worth investigating.

In this article I'll introduce the two workloads as concisely as possible and explain their challenges from the perspective of an MLSys researcher.

AlphaFold/RoseTTAFold

TODO Multiple Sequence Alignment

TODO AlphaFold

TODO SE(3)-Transformer

Tensor Field Networks

\[ \textbf{f}(\textbf{x}) = \sum_{j=1}^{N} \textbf{f}_j \delta(\textbf{x} - \textbf{x}_j)\]
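The equation above represents a point cloud as a continuous field: each point contributes its feature vector \(\textbf{f}_j\) at its coordinate \(\textbf{x}_j\) via a Dirac delta. The payoff is that a continuous convolution with a kernel \(k\) collapses into a finite sum over the points, \((k * \textbf{f})(\textbf{x}) = \sum_j k(\textbf{x} - \textbf{x}_j)\,\textbf{f}_j\). A minimal NumPy sketch of that reduction (the Gaussian kernel here is purely illustrative; Tensor Field Networks actually constrain kernels to products of learned radial functions and spherical harmonics to obtain rotation equivariance):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 5, 3                       # 5 points, 3-dimensional features
xs = rng.normal(size=(N, 3))      # coordinates x_j
fs = rng.normal(size=(N, d))      # features f_j attached to each point

def kernel(r):
    # Illustrative isotropic Gaussian; TFN kernels are instead
    # radial-function * spherical-harmonic products.
    return np.exp(-np.sum(r * r))

def conv_at(x):
    # (k * f)(x) = sum_j k(x - x_j) f_j -- the delta functions turn the
    # continuous convolution integral into a sum over the point cloud.
    return sum(kernel(x - xs[j]) * fs[j] for j in range(N))

out = conv_at(np.zeros(3))        # query the convolved field at the origin
print(out.shape)                  # one d-dimensional feature vector
```

In practice the sum runs only over neighbors within a cutoff radius, so the cost per query point is proportional to the neighborhood size rather than \(N\).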

NVIDIA's optimization of SE(3)-Transformer

NVIDIA released an accelerated implementation of the SE(3)-Transformer.

RoseTTAFold

Neural Radiance Field (NeRF)

TODO Volume Rendering

TODO NeRF

Author: expye(Zihao Ye)

Email: expye@outlook.com

Date: 2021-10-28 Thu 00:00

Last modified: 2022-04-26 Tue 23:33

Licensed under CC BY-NC 4.0