TensorRT
NVIDIA TensorRT is a platform for high-performance deep learning inference.
https://developer.nvidia.com/tensorrt
Available modules
The overview below shows which TensorRT installations are available per target architecture in the HPCC module system, ordered by software version (newest first).
To start using TensorRT, load one of these modules with a `module load` command like:
module load TensorRT/10.4.0-foss-2023a-CUDA-12.3.0
(This data was automatically generated on Thu, 03 Jul 2025 at 12:32:57 EDT)
| Module | gateway | generic | zen2 | zen3 | zen4 | haswell | skylake_avx512 |
|---|---|---|---|---|---|---|---|
| Nodes | Gateway nodes | everywhere (except Grace nodes) | amd20 | amd22 | amd24 | intel16 | intel18, amd20-v100, amd21, intel21 |
| TensorRT/10.4.0-foss-2023a-CUDA-12.3.0 | - | x | - | - | - | - | - |
| TensorRT/8.6.1-foss-2022a-CUDA-11.7.0 | - | - | - | - | - | x | - |
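As a quick sanity check after loading one of the modules above, a short Python sketch like the following (an illustration, not part of the HPCC documentation) can confirm that the TensorRT Python bindings are importable in your session:

```python
def tensorrt_status():
    """Return a one-line status string for the TensorRT Python bindings."""
    try:
        # The loaded module is assumed to put the bindings on sys.path.
        import tensorrt as trt
        return f"TensorRT {trt.__version__} is available"
    except ImportError:
        return "TensorRT bindings not found; load a TensorRT module first"

if __name__ == "__main__":
    print(tensorrt_status())
```

If the import fails, check that the module is loaded in the current shell (`module list`) and that you are using the Python toolchain matching the module's `foss` version.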