
Deployment Options

Bazzite AI workloads run on any platform that supports containers.

Deployment Methods

| Platform | Guide | Description |
|---|---|---|
| Bazzite AI OS | Native Commands | Recommended: `ujust` commands |
| Docker & Podman | Any Platform | Standard container runtimes |
| Podman Quadlets | Systemd Services | Auto-start on boot |
| Kubernetes | k3d Clusters | Local k3d or production |
| HPC (Apptainer) | Research Clusters | SLURM + Apptainer |
| Dev Containers | VS Code / IDEs | IDE integration |
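For the Docker & Podman path, any workload can be launched directly with a standard runtime instead of the `ujust` wrappers. A minimal sketch, assuming the upstream `ollama/ollama` image, its default port 11434, and a named volume for model storage:

```bash
# Run Ollama with a plain container runtime (the ujust recipes manage this for you on Bazzite AI).
# Image, port, and volume below are illustrative defaults, not Bazzite AI's exact configuration.
podman run -d --name ollama \
  -p 11434:11434 \
  -v ollama-models:/root/.ollama \
  docker.io/ollama/ollama:latest

# The same invocation works with Docker by replacing "podman" with "docker".
```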

Quick Start

Workload Services (Ollama, Jupyter, etc.)

| Step | Command | Description |
|---|---|---|
| 1 | `ujust ollama config` | Configure the service |
| 2 | `ujust ollama start` | Start the service |
| 3 | `ujust ollama status` | Check that it is running |
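The three steps can be run back to back; the smoke test at the end is optional and assumes the service publishes Ollama's default API port 11434 on localhost, which may differ on your configuration:

```bash
# Full lifecycle on a Bazzite AI host
ujust ollama config     # interactive configuration
ujust ollama start      # start the service
ujust ollama status     # confirm it is running

# Optional smoke test; assumes Ollama's default API port 11434 is published locally
curl http://localhost:11434/api/tags
```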

HPC Containers (Apptainer)

| Step | Command | Description |
|---|---|---|
| 1 | `ujust apptainer pull` | Download an image |
| 2 | `ujust apptainer shell` | Open an interactive shell |
| 3 | `ujust apptainer gpu` | Run GPU detection |
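On a research cluster without `ujust`, the same workflow can be reproduced with plain Apptainer and SLURM. A sketch, assuming NVIDIA GPUs (hence `--nv`) and an illustrative image reference:

```bash
# Rough equivalents of the ujust apptainer helpers; substitute the container you actually need.
apptainer pull ollama.sif docker://ollama/ollama     # download an image to a .sif file
apptainer shell --nv ollama.sif                      # interactive shell with NVIDIA GPU support
apptainer exec --nv ollama.sif nvidia-smi            # GPU detection inside the container

# Inside a SLURM allocation, request a GPU and run the same container:
srun --gres=gpu:1 apptainer exec --nv ollama.sif nvidia-smi
```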

Kubernetes (k3d)

| Step | Command | Description |
|---|---|---|
| 1 | `ujust k3d config` | Configure the cluster |
| 2 | `ujust k3d create` | Create the cluster |
| 3 | `ujust deploy jupyterhub install` | Deploy an application |
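For reference, or on hosts without the `ujust` recipes, a local cluster and a JupyterHub deployment can be set up by hand. A sketch; the cluster name is illustrative, and the `ujust deploy` recipe may use a different chart or values:

```bash
# Create and verify a local k3d cluster
k3d cluster create bazzite-ai
kubectl get nodes

# Deploy JupyterHub manually via its public Helm chart
helm repo add jupyterhub https://hub.jupyter.org/helm-chart/
helm repo update
helm install jupyterhub jupyterhub/jupyterhub --namespace jupyterhub --create-namespace
```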

GPU Support

| GPU | Setup |
|---|---|
| NVIDIA | `ujust config gpu setup` |
| AMD | Automatic via `/dev/dri` |
| Intel | Automatic via `/dev/dri` |
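When running containers outside the `ujust` recipes, GPUs have to be passed in explicitly. A sketch using Podman; the NVIDIA path assumes the NVIDIA Container Toolkit has generated CDI specs, and the test image is illustrative:

```bash
# NVIDIA: expose GPUs via CDI (requires NVIDIA Container Toolkit CDI specs)
podman run --rm --device nvidia.com/gpu=all docker.io/library/ubuntu:24.04 nvidia-smi

# AMD / Intel: pass the DRI render nodes straight through
podman run --rm --device /dev/dri docker.io/library/ubuntu:24.04 ls -l /dev/dri
```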

Available Workloads

| Workload | Command |
|---|---|
| Ollama | `ujust ollama` |
| Jupyter | `ujust jupyter` |
| ComfyUI | `ujust comfyui` |
| OpenWebUI | `ujust openwebui` |
| FiftyOne | `ujust fiftyone` |
| Jellyfin | `ujust jellyfin` |
| Portainer | `ujust portainer` |
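Each workload is assumed to follow the same config / start / status lifecycle shown for Ollama in the Quick Start; the exact subcommands may differ per workload. A sketch using Jupyter:

```bash
# Assumed pattern, mirroring the Ollama quick start above
ujust jupyter config
ujust jupyter start
ujust jupyter status
```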

See Also