Lovelace - University of Plymouth HPC
Lovelace Cluster Public Documentation

Welcome to the University of Plymouth’s public documentation for High Performance Computing and the Lovelace cluster. The information on this site is shared with the intention of being useful to all HPC users. See the link below for the Internal Documentation site.

Note

There is an upcoming HPC training event on Tuesday 24/06/2025 and Thursday 26/06/2025. Please see the event page for more info.

Contents:

  • Login and access the internal site
  • Passwords Expiry and Updates (Login Required)
  • Introduction
    • Acknowledging the Lovelace Cluster
    • Getting help
    • Training
    • Hardware
  • HPC system policy
    • HPC Usage Policy
    • HPC account suspension
    • Acknowledging Lovelace in Publications
    • Allocation policy
  • HPC Support Drop-In Sessions
  • Using LiCO for Jupyter, Rstudio, and More on the Lovelace Cluster (Login Required)
  • Accessing Compute Resources
    • Using Slurm
    • Queues
    • Accounts
    • Output from Slurm jobs
    • Examples of Job Submission Scripts
  • Accessing Software with Modules
    • Using the modules environment
    • Available Compiler Suites
    • Compiling MPI codes
    • Compiler Information and Options
  • Data Transfer Guide
    • scp command
    • rsync command
  • Running Graphical Applications With X11 Forwarding
    • Slurm X11 Forwarding
  • Projects
    • Adding Users To Your Project
  • GNU Parallel
  • Available Software
    • Python
    • NVIDIA® CUDA® Toolkit
    • MATLAB
    • GROMACS
    • Ansys
  • Containerisation
    • Singularity
    • Podman
    • Running Containerised MPI Workloads with Slurm
© Copyright 2025, University of Plymouth & Collaborators.