docs: Add documentation for s390x cpu implementation (#14198)

Signed-off-by: Dilip Gowda Bhagavan <dilip.bhagavan@ibm.com>
Co-authored-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Dilip Gowda Bhagavan 2025-03-11 22:32:17 +05:30 committed by GitHub
parent 4bf82d4b90
commit 07964e2f30
3 changed files with 97 additions and 0 deletions


@@ -21,6 +21,7 @@ installation/ai_accelerator
- Intel/AMD x86
- ARM AArch64
- Apple silicon
- IBM Z (S390X)
- <project:installation/ai_accelerator.md>
- Google TPU
- Intel Gaudi


@@ -36,6 +36,16 @@ vLLM is a Python library that supports the following CPU variants. Select your C
::::
::::{tab-item} IBM Z (S390X)
:sync: s390x
:::{include} cpu/s390x.inc.md
:start-after: "# Installation"
:end-before: "## Requirements"
:::
::::
:::::
## Requirements
@@ -75,6 +85,16 @@ vLLM is a Python library that supports the following CPU variants. Select your C
::::
::::{tab-item} IBM Z (S390X)
:sync: s390x
:::{include} cpu/s390x.inc.md
:start-after: "## Requirements"
:end-before: "## Set up using Python"
:::
::::
:::::
## Set up using Python
@@ -123,6 +143,16 @@ Currently, there are no pre-built CPU wheels.
::::
::::{tab-item} IBM Z (s390x)
:sync: s390x
:::{include} cpu/s390x.inc.md
:start-after: "### Build wheel from source"
:end-before: "## Set up using Docker"
:::
::::
:::::
## Set up using Docker
@@ -147,6 +177,10 @@ $ docker run -it \
For ARM or Apple silicon, use `Dockerfile.arm`
::::
::::{tip}
For IBM Z (s390x), use `Dockerfile.s390x` and pass `--dtype float` when running the container
::::
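As a rough sketch only (the image tag is a placeholder, `facebook/opt-125m` is just an example model, and the exact run arguments depend on the entrypoint defined in `Dockerfile.s390x`), a build-and-run on IBM Z might look like:

```console
docker build -f Dockerfile.s390x -t vllm-cpu-s390x .
docker run -it --rm --network=host \
    vllm-cpu-s390x \
    --model=facebook/opt-125m \
    --dtype float
```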
## Supported features
vLLM CPU backend supports the following vLLM features:


@@ -0,0 +1,62 @@
# Installation
vLLM has experimental support for the s390x architecture on the IBM Z platform. For now, users must build vLLM from source to run it natively on IBM Z.
Currently, the CPU implementation for the s390x architecture supports only the FP32 datatype.
:::{attention}
There are no pre-built wheels or images for this device, so you must build vLLM from source.
:::
## Requirements
- OS: `Linux`
- Compiler: `gcc/g++ >= 12.3.0`
- Instruction Set Architecture (ISA): VXE support is required; it is available on IBM z14 and later (a quick check is sketched after this list).
- Python packages that must be built and installed: `pyarrow`, `torch`, and `torchvision`
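A quick way to check for VXE on a given machine (assuming the usual s390x Linux `/proc/cpuinfo` layout, where vector facilities appear in the `features` line):

```console
grep -w vxe /proc/cpuinfo && echo "VXE supported"
```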
## Set up using Python
### Pre-built wheels
### Build wheel from source
Install the following packages from the package manager before building vLLM. For example, on RHEL 9.4:
```console
dnf install -y \
which procps findutils tar vim git gcc g++ make patch cython zlib-devel \
libjpeg-turbo-devel libtiff-devel libpng-devel libwebp-devel freetype-devel harfbuzz-devel \
openssl-devel openblas openblas-devel wget autoconf automake libtool cmake numactl-devel
```
Install Rust >= 1.80, which is needed to install the `outlines-core` and `uvloop` Python packages:
```console
curl https://sh.rustup.rs -sSf | sh -s -- -y && \
. "$HOME/.cargo/env"
```
Execute the following commands to build and install vLLM from source.
::::{tip}
Build the dependencies `torchvision` and `pyarrow` from source before building vLLM.
::::
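A rough sketch of one way to build those dependencies (the packages, pins, and flags below are illustrative, not an official recipe):

```console
# Build tooling for source builds (illustrative)
pip install packaging setuptools wheel cmake ninja

# torchvision from source; requires torch (e.g. the nightly CPU wheel) to be installed first
pip install --no-build-isolation git+https://github.com/pytorch/vision.git

# pyarrow from source needs the Arrow C++ libraries built first (see the Apache Arrow
# build docs); once that build is in place, the Python bindings can be built with e.g.:
#   pip install --no-binary :all: pyarrow
```

With those dependencies in place, continue with the vLLM build below.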
```console
sed -i '/^torch/d' requirements-build.txt # remove torch from requirements-build.txt since we use nightly builds
pip install -v \
    --extra-index-url https://download.pytorch.org/whl/nightly/cpu \
    -r requirements-build.txt \
    -r requirements-cpu.txt
VLLM_TARGET_DEVICE=cpu python setup.py bdist_wheel && \
pip install dist/*.whl
```
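As a quick sanity check after installation (the model below is only an example; `--dtype float` is needed because only FP32 is supported on s390x):

```console
python -c "import vllm; print(vllm.__version__)"
vllm serve facebook/opt-125m --dtype float
```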
## Set up using Docker
### Pre-built images
### Build image from source
## Extra information