Richard Liu | cd34029e91 | 2024-11-05 16:48:44 +00:00
    Refactor TPU requirements file and pin build dependencies (#10010)
    Signed-off-by: Richard Liu <ricliu@google.com>

Michael Green | 1d4cfe2be1 | 2024-11-02 10:06:45 -04:00
    [Doc] Updated tpu-installation.rst with more details (#9926)
    Signed-off-by: Michael Green <mikegre@google.com>

Woosuk Kwon | 211fe91aa8 | 2024-10-30 09:41:38 +00:00
    [TPU] Correctly profile peak memory usage & Upgrade PyTorch XLA (#9438)

Woosuk Kwon | 61f4a93d14 | 2024-09-03 18:35:33 -07:00
    [TPU][Bugfix] Use XLA rank for persistent cache path (#8137)

Woosuk Kwon | eeffde1ac0 | 2024-08-28 13:10:21 -07:00
    [TPU] Upgrade PyTorch XLA nightly (#7967)

Woosuk Kwon | a08df8322e | 2024-08-13 16:31:20 -07:00
    [TPU] Support multi-host inference (#7457)

Woosuk Kwon | 90bab18f24 | 2024-08-10 18:12:22 -07:00
    [TPU] Use mark_dynamic to reduce compilation time (#7340)

Woosuk Kwon | fad5576c58 | 2024-07-27 10:28:33 -07:00
    [TPU] Reduce compilation time & Upgrade PyTorch XLA version (#6856)

Woosuk Kwon | c467dff24f | 2024-07-16 09:56:28 -07:00
    [Hardware][TPU] Support MoE with Pallas GMM kernel (#6457)

Woosuk Kwon | 8c00f9c15d | 2024-06-21 23:09:40 -07:00
    [Docs][TPU] Add installation tip for TPU (#5761)

Woosuk Kwon | 1a8bfd92d5 | 2024-06-12 11:53:03 -07:00
    [Hardware] Initial TPU integration (#5292)