Hardware of Cluster Stage 1 of Lichtenberg II
This new system has been in regular operation since 1 December 2020.
Login (8 nodes)
- 2x “Intel® Xeon® Platinum 9242 Processor” (Cascade Lake)
- 96 CPU cores and 2x Intel® AVX-512 units per core
- 768 GByte Main memory (DDR4-2933)
- HPC interconnect: InfiniBand HDR100 (100 GBit/s)
- Hostnames:
logc0001 … logc0008
- Accessible from outside as:
lcluster13 … lcluster20.hrz.tu-darmstadt.de
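
The internal login hostnames presumably map one-to-one, in order, onto the external names (logc0001 to lcluster13, logc0002 to lcluster14, and so on); the lists above only imply this pairing. A minimal Python sketch of that assumed mapping:

```python
# Minimal sketch, not official tooling: pair the internal login hostnames with
# their externally reachable aliases, assuming the in-order mapping implied above.
internal = [f"logc{i:04d}" for i in range(1, 9)]                        # logc0001 … logc0008
external = [f"lcluster{i}.hrz.tu-darmstadt.de" for i in range(13, 21)]  # lcluster13 … lcluster20

for host, alias in zip(internal, external):
    print(f"{host} <-> {alias}")
```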
MPI – Section (630 nodes)
- 2x “Intel® Xeon® Platinum 9242 Processor” (Cascade Lake)
- 96 cores and 2x Intel® AVX-512 units per core
- 384 GByte Main memory (DDR4-2933)
- HPC interconnect: InfiniBand HDR100 (100 GBit/s)
- Hostnames:
mpsc0001 … mpsc0630
MEM – Section (2 nodes)
- 2x “Intel® Xeon® Platinum 9242 Processor” (Cascade Lake)
- 96 cores and 2x Intel® AVX-512 units per core
- 1536 GByte Main memory (DDR4-2933)
- HPC interconnect: InfiniBand HDR100 (100 GBit/s)
- Hostnames:
mpqc0001 … mpqc0002
ACC – Section GPUs (8 nodes)
- 4x “Intel® Xeon® Platinum 8260 Processor” (Cascade Lake)
- 96 cores and 2x Intel® AVX-512 units per core
- 384 GByte Main memory (DDR4-2933)
- 4 nodes with 4x “NVIDIA® Tesla® V100” (Volta generation, GV100 chip) each
- Hostnames:
gvqc0001 … gvqc0004
- 4 nodes with 4x “NVIDIA® A100” (Ampere generation, GA100 chip) each
- Hostnames:
gaqc0001 … gaqc0004
ACC – Section DGX A100 (3 nodes)
- 2x “AMD EPYC™ 7742” Processor
- 128 cores and 2x AVX2 units per core
- 1024 GByte Main memory (DDR4-3200)
- HPC interconnect: 2x InfiniBand HDR200 (200 GBit/s)
- Hostnames:
gaoc0001 … gaoc0003
- 8x “NVIDIA® A100 Tensor Core GPUs” (Ampere generation)
Totals (all compute and login nodes)
- 62,592 cores
- 257 TByte RAM
- 16x Nvidia Volta V100
- 40x Nvidia Ampere A100
- 4x Nvidia Tesla T4
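
The core, RAM, and V100/A100 totals follow directly from the per-node specifications listed above; the 4x Tesla T4 do not appear in any section on this page and therefore cannot be recomputed from it. A small Python sketch of the arithmetic, using only the numbers given here:

```python
# Recompute the totals from the sections above (sketch only; all figures are
# taken verbatim from this page).
sections = {
    # section: (nodes, cores per node, main memory in GByte per node)
    "Login":        (8,   96,  768),
    "MPI":          (630, 96,  384),
    "MEM":          (2,   96,  1536),
    "ACC GPUs":     (8,   96,  384),
    "ACC DGX A100": (3,   128, 1024),
}

cores = sum(n * c for n, c, _ in sections.values())
ram_tbyte = sum(n * m for n, _, m in sections.values()) / 1000  # decimal TByte

v100 = 4 * 4          # 4 nodes with 4x V100 each
a100 = 4 * 4 + 3 * 8  # 4 nodes with 4x A100 each, plus 3 DGX nodes with 8x A100 each

print(cores)       # 62592
print(ram_tbyte)   # 257.28 -> rounded to 257 TByte
print(v100, a100)  # 16 40
```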