# RStudio on the Urblauna cluster

RStudio can be run on the Urblauna cluster from within a Singularity container, with an interactive interface provided in the web browser of a Guacamole session. Running RStudio interactively on the clusters is only meant for testing: development must be carried out on the user's workstation, and production runs must be performed from R scripts in batch mode.

The `Rstudio` command is now available in the `r-light` module. You first have to make a reservation with `Sinteractive`, requesting the right amount of resources, and then launch the `Rstudio` command.

## Procedure

```bash
Sinteractive   # specify here the right amount of resources
module load r-light
Rstudio
```

**The procedure below is now deprecated!**

## Preparatory steps on the Curnagl side

A few operations have to be executed on the Curnagl cluster:

- Create a directory in your `/work` project dedicated to being used as an R library, for instance:

  ```bash
  mkdir /work/FAC/FBM/DBC/mypi/project/R_ROOT
  ```

- Optional: install the required R packages, for instance `ggplot2`:

  ```bash
  module load gcc r
  export R_LIBS_USER=/work/FAC/FBM/DBC/mypi/project/R_ROOT
  R
  > install.packages("ggplot2")
  ```

## The batch script

Create a file `rstudio-server.sbatch` with the following contents (it must be on the cluster, but the exact location does not matter):

```bash
#!/bin/bash -l

#SBATCH --account <<ACCOUNT_NAME>>
#SBATCH --job-name rstudio-server
#SBATCH --signal=USR2
#SBATCH --output=rstudio-server.job
#SBATCH --nodes 1
#SBATCH --ntasks 1
#SBATCH --cpus-per-task 1
#SBATCH --mem 8G
#SBATCH --time 02:00:00
#SBATCH --partition interactive
#SBATCH --export NONE

RLIBS_USER_DIR=<<RLIBS_PATH>>
RSTUDIO_CWD=~
RSTUDIO_SIF="/dcsrsoft/singularity/containers/rstudio-4.3.2.sif"

module load python singularityce
module load r
RLIBS_DIR=${R_ROOT}/rlib/R/library
module unload r

# Create temp directory for ephemeral content to bind-mount in the container
RSTUDIO_TMP=$(mktemp --tmpdir -d rstudio.XXX)
mkdir -p -m 700 \
    ${RSTUDIO_TMP}/run \
    ${RSTUDIO_TMP}/tmp \
    ${RSTUDIO_TMP}/var/lib/rstudio-server
mkdir -p ${RSTUDIO_CWD}/.R
cat > ${RSTUDIO_TMP}/database.conf < ${RSTUDIO_TMP}/rsession.sh <&2 <&2
exit $SINGULARITY_EXIT_CODE
```

You need to carefully replace, at the beginning of the file, the following placeholders:

- `<<ACCOUNT_NAME>>`: the project id that was attributed to your PI for the given project
- `<<RLIBS_PATH>>`: the absolute path (e.g. `/work/FAC/.../R_ROOT`) to the folder you created in the preparatory steps

## Running RStudio

Submit a job for running RStudio from within the cluster with:

```bash
[me@urblauna ~]$ sbatch rstudio-server.sbatch
```

Once the job is running (you can check that with `Squeue`), a new file `rstudio-server.job` is automatically created. Its contents give you instructions on how to start a new RStudio remote session from Guacamole. In this script we have reserved 2 hours.
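As a sketch, a typical submit-and-monitor session from the Urblauna login node might look like the following. The job id placeholder `<jobid>` is illustrative, and `scancel` (the standard Slurm cancellation command, not mentioned above) is only needed if you want to release the resources before the 2-hour limit expires:

```bash
# Submit the batch script; Slurm replies with the job id
sbatch rstudio-server.sbatch

# Check the job state; wait until the state column shows "R" (running)
Squeue

# Once the job is running, read the generated file for the
# connection instructions to use in your Guacamole session
cat rstudio-server.job

# Optional: end the session early instead of waiting for the
# 2-hour limit; replace <jobid> with the id printed by sbatch
scancel <jobid>
```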