Access and User Regulations
General Scientific/Research Usage
For accessing and using the Lichtenberg cluster as a scientist, the following is necessary:
1) Project: A project must be set up to which the computing time used can be billed (booked). Please submit a Project Application for this purpose.
Usually, the project application is submitted by the project manager. NHR Large and NHR Normal projects are scrutinized scientifically (according to the conditions of the NHR4CES Resource Allocation Board (RAB)). Small projects are only checked for technical feasibility.
2) User account: A user account must be activated and assigned to a project, i.e. the project manager specifies who belongs to the project. Each user account is assigned to exactly one person and is managed via the TU-ID. For this purpose, an application for using the high-performance computer must be submitted: Application form
Your user account (2) can be extended even without a valid/running project, so that you retain access to your research data after the end of a project (e.g. to transfer and back up data from the HPC file systems to your own storage).
Lectures and workshops
For using the Lichtenberg cluster in the context of lectures and courses, please find further details under “Lectures and workshops”.
Recommendation: For a smooth and efficient start with the Lichtenberg HPC, we advise all new users to attend the “Introduction to the Lichtenberg High Performance Computer” mentioned below.
News and Events
-
Introduction to the Lichtenberg High Performance Computer
2022/03/10
Attendance is free of charge
For (potential) users of the Lichtenberg supercomputer, an introduction is held on the second Tuesday of each month. Subjects are the available hardware and software and the general use of the (batch) system. It takes place in hybrid mode (in person and as a webinar).
-
New HPC Portal
2025/05/25
for “Small” projects on the Lichtenberg HPC
The new portal uses the same technology as the NHR application portal for NHR-N and NHR-L projects, and supersedes our deprecated old system.
-
Picture: TUDa
TUDa Open Campus on 25 May at the Lichtwiese
2025/05/25
Guided tours through the Lichtenberg II high-performance computer
On 25 May, TU Darmstadt is offering the public exciting insights into research, innovation and teaching at the TUDa Open Campus. The varied and colourful programme for watching, listening, learning and participating offers something for young and old alike. Together with the Institute of Technical Thermodynamics, the HRZ is offering guided tours of the high-performance computer.
-
Preparations for Migration to RedHat EL 9
2025/02/24
Upgrading the cluster's operating system to a new major release
Some login and compute nodes have already been migrated to RHEL 9.4.
-
Solved: Failure of the Cluster-wide File System
2024/11/04
System back to normal and available
+++ Update 17:00: The deadlock could only be fixed by a (hard) reset of several GPFS master servers and a reboot of all compute nodes. Hence, all jobs running at the time of the GPFS lockup were unfortunately lost. Unless you explicitly prohibited it (by using special parameters), the scheduler will restart those jobs on its own. +++
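A minimal sketch of how such an automatic restart can be controlled, assuming the cluster's batch system is Slurm (job name, resources and executable below are placeholders, not taken from this announcement):

#!/bin/bash
#SBATCH --job-name=example_job   # hypothetical job name
#SBATCH --ntasks=1
#SBATCH --time=01:00:00
#SBATCH --no-requeue             # prevent automatic restart of this job after a node or system failure
# Use "#SBATCH --requeue" instead to explicitly allow the scheduler to restart the job.

srun ./example_program           # hypothetical executable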
-
New Defaults for OpenMP and Hybrid Programs
2024/10/24
-
HPC and Housing in L5|08: Downtime
2024/09/30
for operations on the power infrastructure
For the final repair of the 2000 A power rail, the whole HPC cluster will be down.