Notes on Google's Pathways


Pathways [1] is a distributed machine learning system recently released by Google. With the help of Pathways, Google successfully scaled language models to 540 billion parameters (PaLM) [2]. James Bradbury, lead of the JAX team, claims [3] that PaLM was trained on 6144 TPU v4 chips across two pods and achieved 46.2% end-to-end FLOPs utilization: a remarkable record.
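To get a feel for what 46.2% end-to-end FLOPs utilization means at this scale, here is a rough back-of-the-envelope calculation. It assumes the common definition of utilization (achieved model FLOP/s divided by aggregate peak hardware FLOP/s) and a peak of roughly 275 TFLOP/s per TPU v4 chip in bfloat16; both the definition and the per-chip figure are my assumptions, not from the paper.

```python
# Back-of-the-envelope sketch (not from the paper): aggregate peak compute
# of the PaLM training setup and the implied achieved throughput.
NUM_CHIPS = 6144            # 6144 TPU v4 chips across two pods
PEAK_FLOPS_PER_CHIP = 275e12  # assumed ~275 TFLOP/s bf16 per TPU v4 chip
UTILIZATION = 0.462         # 46.2% end-to-end FLOPs utilization

peak = NUM_CHIPS * PEAK_FLOPS_PER_CHIP      # aggregate peak FLOP/s
achieved = UTILIZATION * peak               # implied achieved FLOP/s

print(f"aggregate peak: {peak:.3e} FLOP/s")
print(f"achieved:       {achieved:.3e} FLOP/s")
```

Under these assumptions the cluster peaks at roughly 1.7 exaFLOP/s, so 46.2% utilization corresponds to sustaining close to 0.8 exaFLOP/s end to end, which is why this number is so notable.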

In this article I'll go through the paper and briefly introduce the design highlights of Pathways.

Why Pathways

System Architecture


Why is it designed this way?

The simple answer is that the underlying hardware decides the system design.

My two cents on Sparsity

I'm working on compilers for sparse workloads in deep learning.


Author: expye (Zihao Ye)



Last modified: 2022-08-12 Fri 09:07

Licensed under CC BY-NC 4.0