Using the LUMI web interface

Presenters: Mats Sjöberg (CSC) and Oskar Taubert (CSC)

Content:

  • Introduction to the Open OnDemand web interface
  • Using PyTorch in JupyterLab on LUMI
  • Limitations of the web-based interactive interface and the CLI interface

Extra materials

Q&A

  1. Is it possible to use the built-in interactive breakpoint debugging features of IDEs like VS Code while running a model on LUMI interactively?

    • VS Code and most of its features work either through the web app or with the normal remote development features.

    How do you get the debugging feature to work interactively? I remember I tried, but didn't manage to.

    • Frankly, in LUST we don't use it ourselves, so we cannot really help you here.

    • Note also that VS Code was never meant to be an HPC debugger. For debugging applications on LUMI there are better choices, which we discuss in other courses.

    Any links to tutorials on how to set it up?

    Is there anything straightforward that can be used directly to debug existing Python scripts? The links look like they are more for C.
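
    • For plain command-line debugging of an existing Python script, the standard library debugger pdb works in an interactive shell on a compute node without any extra setup. A minimal sketch; the account, partition and resources in the comments are placeholders to adapt to your own project:

      ```python
      # debug_example.py -- command-line debugging with the standard pdb module.
      # Run it inside an interactive allocation with a terminal attached, e.g.
      #   srun --account=project_XXXXXXXXX --partition=dev-g --gpus=1 --time=00:30:00 --pty \
      #       python debug_example.py
      # (account, partition and resources above are placeholders; adjust to your project.)

      import torch

      def main():
          x = torch.randn(4, 4)
          breakpoint()  # drops into the pdb prompt here: inspect x, step, continue
          y = x @ x.T
          print(y)

      if __name__ == "__main__":
          main()
      ```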

  2. Can I use interactive features and build a UI with tools like Gradio or Streamlit while running on LUMI?

    • In principle, yes.

    • In practice though, it is a good idea to fully decouple anything graphical from computations. Running a GUI app remotely is always painful due to network delays. LUMI really is not a good graphical workstation. In many cases, you'll be happier running the GUI pre- and postprocessing on a machine closer to where you are.

      You have to use either a web browser or a VNC client that connects to a VNC server on LUMI (which is also what the desktop app uses under the hood), so all you see is really a sequence of compressed images sent over the network, with possible lag in mouse pointer movements, etc. When using such interfaces, you really feel how slow light is...
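
    • If you do serve a browser-based UI such as Gradio, only the web page itself travels over the network, so the VNC path is not needed. A minimal sketch of the usual pattern (app running on the compute node, SSH port forwarding to your laptop); the node name, username and port below are placeholders, and this is not an officially supported LUMI workflow:

      ```python
      # app.py -- tiny Gradio app started on a compute node (a sketch, not an
      # officially supported workflow). From your laptop, forward the port, e.g.
      #   ssh -L 7860:nid001234:7860 username@lumi.csc.fi
      # then open http://localhost:7860 in a local browser.

      import gradio as gr

      def greet(name: str) -> str:
          # Stand-in for whatever processing or inference you want to expose.
          return f"Hello, {name}!"

      demo = gr.Interface(fn=greet, inputs="text", outputs="text")

      # Bind to all interfaces so the forwarded connection can reach the app;
      # 7860 is Gradio's default port.
      demo.launch(server_name="0.0.0.0", server_port=7860)
      ```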