# llama.cpp

## Available modules

The overview below shows which llama.cpp installations are available on each HPC-UGent Tier-2 cluster, ordered by software version (newest to oldest).

To start using llama.cpp, load one of these modules using a module load command like:

module load llama.cpp/b4595-foss-2023a-CUDA-12.1.1
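As a minimal sketch of a typical session on a cluster where this module is installed (the `module avail` and `module list` commands are standard Lmod commands; which versions they report depends on the cluster you are logged in to):

```shell
# List the llama.cpp versions available on the current cluster
module avail llama.cpp

# Load the CUDA-enabled build (intended for GPU clusters such as accelgor or joltik)
module load llama.cpp/b4595-foss-2023a-CUDA-12.1.1

# Verify that the module and its dependencies are now loaded
module list
```

On CPU-only clusters, load the plain `llama.cpp/b4595-foss-2023a` module instead of the CUDA variant.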

(This data was automatically generated on Wed, 26 Feb 2025 at 15:45:13 CET)

| | accelgor | doduo | donphan | gallade | joltik | shinx |
|---|---|---|---|---|---|---|
| llama.cpp/b4595-foss-2023a-CUDA-12.1.1 | x | - | x | - | x | - |
| llama.cpp/b4595-foss-2023a | x | - | x | - | x | - |