
So, you want to run all your models through PyTorch on your AMD GPU, but you don't know how to install it? Look no further: this guide will show you how to install PyTorch with AMD HIP/ROCm support.
Prerequisite: Installing ROCm
To actually use PyTorch you need to install ROCm on your computer. There are numerous ways to do it; I suggest you check this post, where installing ROCm is explained. It's not difficult, give it a try! Note that this guide is expressly aimed at Linux, not Windows.
Install PyTorch in your Python environment
If you followed my guide, you probably have a container, so enter it with this command:

```shell
distrobox enter almalinux-rocm
```

If you don't, well, worse for you! Jokes aside, ROCm can be installed in other ways too, so let's keep going. Now we have to decide where to install the Python packages. I highly suggest installing them in a virtual environment: an isolated workspace that prevents conflicts between packages. For example, you can create a different virtual environment for each of your applications, so that if one program needs a specific version of a package, it will not mess up the packages of other programs. The tool we will use to manage virtual environments is Conda. If you followed my tutorial to set up ROCm, you already have Conda installed; otherwise I suggest installing it. Let's create a generic Python environment:
```shell
conda create -y -n py312 python=3.12
```

And let's activate it with:

```shell
conda activate py312
```

Now let's install the packages with the following commands.
For ROCm 7.0.2:

```shell
pip install --pre torch==2.8.0 torchvision torchaudio==2.8.0 -f https://repo.radeon.com/rocm/manylinux/rocm-rel-7.0.2/ \
  && pip install triton
```

For ROCm 6.4:

```shell
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.4 \
  && pip install triton
```

These commands install the nightly version of the packages. I suggest always trying the nightly version first, because newer builds tend to support recent hardware better and bring performance improvements, but you can also install the stable version.
As you can see, the ROCm version a PyTorch wheel was built against may not match the ROCm version installed on your system; I suggest avoiding such mismatches where possible. You can find the complete list of options on the official PyTorch site.
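Once the install finishes, a quick sanity check (assuming the environment created above is still active) is to ask PyTorch whether it was built against HIP and whether it can see your GPU. Note that ROCm builds of PyTorch reuse the `torch.cuda` namespace, so the usual CUDA-style calls work on AMD hardware too:

```shell
# Print the PyTorch version and the HIP version it was built against.
# On ROCm wheels torch.version.hip is a version string instead of None.
python -c "import torch; print(torch.__version__, torch.version.hip)"

# True means PyTorch can see at least one GPU; the second call names it.
python -c "import torch; print(torch.cuda.is_available())"
python -c "import torch; print(torch.cuda.get_device_name(0))"
```

If the first command prints `None` for the HIP version, you most likely installed a CPU-only or CUDA wheel instead of a ROCm one.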
I also suggest creating a separate virtual environment for each program, such as ComfyUI and Stable Diffusion web UI, and installing PyTorch in each of them.
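As a sketch of that per-program setup (the environment name `comfyui` is just an example, and I assume here that you are on ROCm 6.4), a dedicated environment would look like this:

```shell
# Hypothetical example: a separate environment just for ComfyUI, so its
# PyTorch install cannot conflict with other programs' packages.
conda create -y -n comfyui python=3.12
conda activate comfyui

# Install the ROCm 6.4 nightly wheels inside this environment only.
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.4 \
  && pip install triton
```

Repeat the same pattern with a different environment name for each application you install.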
We have arrived at the end of this guide: you have officially installed PyTorch for your AMD GPU, accelerated by HIP/ROCm. I wish you many iterations per second!
What about Windows?
You can find more information about Windows in the articles listed below.
What to do after installing PyTorch to use your AMD GPU
The possibilities are infinite: you can use Hugging Face Transformers directly in your scripts, or generate images and videos with Stable Diffusion web UI or ComfyUI. You can find out how to install them for AMD GPUs on this blog:
- Install ComfyUI for AMD HIP/ROCm on Windows: learn how to install ComfyUI for AMD HIP/ROCm on Windows and generate images and videos.
- Install PyTorch for AMD HIP/ROCm on Windows: learn how to install PyTorch for AMD HIP/ROCm on Windows and accelerate AI programs.
- How to install ROCm python packages on Windows: learn how to install AMD HIP/ROCm on Windows through Python packages.
- Install axolotl for AMD HIP/ROCm on Linux: learn how to install axolotl on AMD GPUs using HIP/ROCm, with a step-by-step guide for setting up LLM fine-tuning with AMD hardware and open-source tools.
- Install Llama-Factory for AMD HIP/ROCm on Linux: learn how to install Llama-Factory on AMD GPUs using HIP/ROCm, with a step-by-step guide for setting up LLM fine-tuning with AMD hardware and open-source tools.





