# flash-attention

## Available modules
The overview below shows which flash-attention installations are available per HPC-UGent Tier-2 cluster, ordered by software version (newest to oldest).
To start using flash-attention, load one of these modules with a `module load` command, for example:

```shell
module load flash-attention/2.6.3-foss-2023a-CUDA-12.1.1
```
(This data was automatically generated on Wed, 13 Nov 2024 at 15:44:32 CET)
| | accelgor | doduo | donphan | gallade | joltik | shinx | skitty |
|---|---|---|---|---|---|---|---|
| flash-attention/2.6.3-foss-2023a-CUDA-12.1.1 | x | - | x | - | x | - | - |
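After loading the module, a quick way to confirm the installation is usable is to check that the Python package can be imported. This is a minimal sketch; it assumes the package is exposed under the upstream name `flash_attn`, which is not stated in the table above.

```python
import importlib.util


def flash_attention_available() -> bool:
    """Return True if the flash_attn package can be imported.

    Assumes the module provides the upstream Python package name
    `flash_attn`; returns False if it is not on the Python path.
    """
    return importlib.util.find_spec("flash_attn") is not None


if __name__ == "__main__":
    # Prints True in a session where the module has been loaded,
    # False otherwise (e.g. on a login node without the module).
    print(flash_attention_available())
```

Using `find_spec` rather than a bare `import flash_attn` avoids paying the import cost (and a hard failure) when the package is absent.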