
GPU ISSUES

ROBERT MCDOWELL edited this page Dec 27, 2025 · 45 revisions

Help! The torch library is not compatible with my NVIDIA/ROCm/XPU GPU!

Check which toolkit is installed and which version:

# CUDA
nvcc --version

# ROCm
rocminfo

# XPU
sycl-ls
  • Check that your toolkit version is within the min/max versions ebook2audiobook supports, listed in lib/conf.py.
    Once your toolkit version is compatible, you can go to the next step.
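As a minimal sketch of what such a min/max compatibility check looks like (the version bounds below are made-up examples, not the actual values from lib/conf.py):

```python
# Minimal sketch of a min/max toolkit-version check, similar in spirit to
# the one ebook2audiobook performs against lib/conf.py. The bounds used
# below are hypothetical examples, not the real values from lib/conf.py.

def parse_version(v):
    """Turn '12.8' into (12, 8) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def is_supported(detected, vmin, vmax):
    """True if the detected version falls within [vmin, vmax]."""
    return parse_version(vmin) <= parse_version(detected) <= parse_version(vmax)

# e.g. CUDA 12.8 against a hypothetical supported range 11.8 - 12.9
print(is_supported("12.8", "11.8", "12.9"))  # True
print(is_supported("13.0", "11.8", "12.9"))  # False
```

Numeric tuple comparison matters here: a plain string comparison would wrongly rank "12.10" below "12.9".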

Note

Replace [XXX] in the commands below with your GPU type and version.
Example: cu128 for CUDA 12.8, rocm6.2 for ROCm 6.2, etc.
xpu, mps and cpu don't need a version.
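The tag naming above can be sketched as a small helper (hypothetical code that only illustrates the naming convention; this function is not part of ebook2audiobook itself):

```python
# Hypothetical helper illustrating the DEVICE_TAG naming convention
# described above; not part of the ebook2audiobook codebase.

def device_tag(device, version=None):
    if device in ("xpu", "mps", "cpu"):
        return device                              # no version needed
    if device == "cuda":
        return "cu" + version.replace(".", "")     # CUDA 12.8 -> cu128
    if device == "rocm":
        return "rocm" + version                    # ROCm 6.2 -> rocm6.2
    raise ValueError(f"unknown device: {device}")

print(device_tag("cuda", "12.8"))  # cu128
print(device_tag("rocm", "6.2"))   # rocm6.2
print(device_tag("cpu"))           # cpu
```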

Native mode

First of all, completely delete any previous ebook2audiobook folder. Then clone the ebook2audiobook git repo again (or download the latest version) and run it once to create the Python virtual environment.

cd ebook2audiobook

# Windows
set DEVICE_TAG=[XXX] && ebook2audiobook.cmd

# Linux/MacOS
DEVICE_TAG=[XXX] ./ebook2audiobook.sh

Docker mode

  1. Build
# for All CPU/GPU/MPS

# Windows
set DEVICE_TAG=[XXX] && ebook2audiobook.cmd --script_mode build_docker

# Linux/MacOS
DEVICE_TAG=[XXX] ./ebook2audiobook.sh --script_mode build_docker
  2. Run
# GUI mode
        # CPU:
        docker run --rm -it -p 7860:7860 ebook2audiobook:cpu
        # CUDA:
        docker run --gpus all --rm -it -p 7860:7860 ebook2audiobook:cu[118/121/128 etc..]
        # ROCM:
        docker run --device=/dev/kfd --device=/dev/dri --rm -it -p 7860:7860 ebook2audiobook:rocm[6.0/6.1/6.4 etc..]
        # XPU:
        docker run --device=/dev/dri --rm -it -p 7860:7860 ebook2audiobook:xpu
        # JETSON:
        docker run --runtime nvidia --rm -it -p 7860:7860 ebook2audiobook:jetson[60/61 etc...]

# Headless mode
# Windows users: you must use "\" rather than "/" for all paths except the device path
        # CPU:
        docker run --rm -it -v "/my/real/ebooks/folder/absolute/path:/app/ebooks" -v "/my/real/output/folder/absolute/path:/app/audiobooks" -p 7860:7860 ebook2audiobook:cpu --headless --ebook "/app/ebooks/myfile.pdf" [--voice /app/my/voicepath/voice.mp3 etc..]
        # CUDA:
        docker run --gpus all --rm -it -v "/my/real/ebooks/folder/absolute/path:/app/ebooks" -v "/my/real/output/folder/absolute/path:/app/audiobooks" -p 7860:7860 ebook2audiobook:cu[118/122/126 etc..] --headless --ebook "/app/ebooks/myfile.pdf" [--voice /app/my/voicepath/voice.mp3 etc..]
        # ROCM:
        docker run --device=/dev/kfd --device=/dev/dri --rm -it -v "/my/real/ebooks/folder/absolute/path:/app/ebooks" -v "/my/real/output/folder/absolute/path:/app/audiobooks" -p 7860:7860 ebook2audiobook:rocm[6.0/6.1/6.4 etc..] --headless --ebook "/app/ebooks/myfile.pdf" [--voice /app/my/voicepath/voice.mp3 etc..]
        # XPU:
        docker run --device=/dev/dri --rm -it -v "/my/real/ebooks/folder/absolute/path:/app/ebooks" -v "/my/real/output/folder/absolute/path:/app/audiobooks" -p 7860:7860 ebook2audiobook:xpu --headless --ebook "/app/ebooks/myfile.pdf" [--voice /app/my/voicepath/voice.mp3 etc..]
        # JETSON:
        docker run --runtime nvidia --rm -it -v "/my/real/ebooks/folder/absolute/path:/app/ebooks" -v "/my/real/output/folder/absolute/path:/app/audiobooks" -p 7860:7860 ebook2audiobook:jetson[60/61 etc...] --headless --ebook "/app/ebooks/myfile.pdf" [--voice /app/my/voicepath/voice.mp3 etc..]
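The -v mounts above map your real folders onto /app/ebooks and /app/audiobooks inside the container, so --ebook must reference the container-side path. A small sketch of that translation (all paths here are illustrative):

```python
# Sketch: translate a host file path into the container path passed to
# --ebook, given the mount host_dir:/app/ebooks. Paths are examples only.
from pathlib import Path, PurePosixPath

def container_path(host_file, host_dir, mount="/app/ebooks"):
    # The file must live under the mounted host folder,
    # otherwise the container cannot see it at all.
    rel = Path(host_file).relative_to(host_dir)
    return str(PurePosixPath(mount) / rel.as_posix())

print(container_path("/home/me/ebooks/myfile.pdf", "/home/me/ebooks"))
# /app/ebooks/myfile.pdf
```

If the file sits outside the mounted folder, `relative_to` raises an error, which mirrors the real failure mode: the container simply cannot reach files you did not mount.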

Docker compose and Podman compose

  1. Build
# Docker compose
DEVICE_TAG=[XXX] docker-compose up -d --build

# Podman compose
DEVICE_TAG=[XXX] podman compose -f podman-compose.yml up -d --build
  2. Run
# Docker Compose
docker-compose up -d

# Podman Compose
podman compose -f podman-compose.yml up -d
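For reference, DEVICE_TAG typically reaches the compose build via environment-variable interpolation as a build arg. A hypothetical fragment sketching that mechanism (the repo's actual docker-compose.yml / podman-compose.yml may use different keys):

```yaml
# Hypothetical compose fragment; check the repo's real compose files
# for the actual service name, build args, and image tag.
services:
  ebook2audiobook:
    build:
      context: .
      args:
        DEVICE_TAG: ${DEVICE_TAG:-cpu}     # falls back to cpu if unset
    image: ebook2audiobook:${DEVICE_TAG:-cpu}
    ports:
      - "7860:7860"
```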
