Installation¶
This page explains how to install RAITAP from PyPI and how to select the optional dependency groups you need.
Use `uv add` if you are working inside a managed Python project (one with a `pyproject.toml`). Otherwise, use `uv pip install` or `pip install`.
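The choice above can be sketched as a quick shell check. This is illustrative only: it looks for `pyproject.toml` and prints the matching command instead of running it.

```shell
# Pick the install command that matches the current directory's setup.
# A sketch: echoes the command rather than executing an install.
if [ -f pyproject.toml ]; then
  cmd="uv add raitap"
else
  cmd="pip install raitap"
fi
echo "$cmd"
```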
1. Install RAITAP¶
uv add raitap
pip install raitap
2. Install optional dependencies¶
Execution dependencies¶
RAITAP supports both PyTorch and ONNX models, with both CPU and GPU execution. To avoid conflicts, install only the dependencies that match your setup.
First, choose the right group from this table:
|  | CPU | CUDA | Intel GPU |
|---|---|---|---|
| Torch | `torch-cpu` | `torch-cuda` | `torch-intel` |
| ONNX | `onnx-cpu` | `onnx-cuda` | `onnx-intel` |
Then, adapt the following example command and run it:
uv add "raitap[onnx-cpu]" # replace `onnx-cpu` with your group
pip install "raitap[onnx-cpu]" # replace `onnx-cpu` with your group
Note
CUDA corresponds to NVIDIA GPUs.
- `torch-intel` uses the Intel XPU API directly.
- `onnx-intel` uses the OpenVINO ONNX Runtime.
- Apple MPS support is coming soon.
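After installing a `torch-*` group, you can confirm that PyTorch actually sees your accelerator. The sketch below degrades gracefully when PyTorch is absent:

```python
# Report which accelerators PyTorch can see. Requires one of the torch-*
# groups; prints a hint instead of failing if torch is missing.
try:
    import torch

    cuda_ok = torch.cuda.is_available()
    # torch.xpu is the Intel XPU API used by the torch-intel group;
    # the hasattr guard covers older PyTorch builds that lack it.
    xpu_ok = hasattr(torch, "xpu") and torch.xpu.is_available()
    print(f"CUDA available: {cuda_ok}, Intel XPU available: {xpu_ok}")
except ImportError:
    cuda_ok = xpu_ok = None
    print("PyTorch is not installed; install a torch-* group first.")
```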
Assessment dependencies¶
You can then install the dependencies for the assessment modules you want to use.
For instance, if you want to assess the model’s transparency, run:
uv add "raitap[transparency]"
pip install "raitap[transparency]"
If you plan to use a single underlying framework (here Captum), you can run the following instead:
uv add "raitap[captum]"
pip install "raitap[captum]"
Note
Alibi requires dependency overrides before it can be installed. See Alibi (transparency) for the exact steps.
Combine multiple extras¶
Optional dependency groups can be combined in a single install command. For instance:
uv add "raitap[onnx-cpu,transparency]"
pip install "raitap[onnx-cpu,transparency]"
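After installing combined extras, you can check that the expected packages resolved without importing anything heavyweight. The module names below are assumptions (`onnxruntime` as the runtime behind the `onnx-*` groups); swap in the modules for the extras you chose.

```python
# Check that installed extras are importable, without importing them.
# Module names here are assumptions; adjust them to your chosen groups.
import importlib.util

for mod in ("raitap", "onnxruntime"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'found' if found else 'missing'}")
```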