## Issue

SSH access to a compute node is refused when you have no active job allocated on that node.

## Failing example:
```
[ctchiling@pollux running]# ssh p040
Warning: Permanently added 'p040,10.3.0.80' (ECDSA) to the list of known hosts.
Access denied: user ctchiling (uid=34213) has no active jobs on this node.
Authentication failed.
```

You cannot log in to a compute node unless you have a job running on it.

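Before trying to connect, you can check whether you currently have a running job on the target node. A minimal sketch (the helper name `has_job_on_node` is ours, not part of the cluster setup):

```shell
#!/usr/bin/env bash
# Succeed if the current user has at least one running job whose
# node list mentions the given node (helper name is ours).
has_job_on_node() {
    local node="$1"
    # -h: no header, -t R: running jobs only, -o %N: print node list
    squeue -h -u "$USER" -t R -o %N | grep -qw "$node"
}

if has_job_on_node p040; then
    echo "job found on p040, ssh should be allowed"
else
    echo "no job on p040, ssh will be refused"
fi
```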
## Working example:

**First, log in to Licallo:**
```
ctchiling@pcp-ctchiling:~$ ssh licallo
ctchiling@licallo's password:
Last login: Thu Feb 18 14:02:31 2021 from 10.254.1.241
              _ _
  _ __   ___ | | |_   ___  __
 | '_ \ / _ \| | | | | \ \/ /
 | |_) | (_) | | | |_| |>  <
 | .__/ \___/|_|_|\__,_/_/\_\
 |_|

Welcome to CentOS Linux 7 (Core) (GNU/Linux 3.10.0-1127.el7.x86_64 x86_64)

You can get several information on: https://pollux/

System information as of Thu Feb 18 16:59:01 CET 2021

System Uptime: 99 days 20 hours 37 min 31 sec
System Load:   0.03, 0.03, 0.05
Processes:     515
Local Users:   14
```

**Then, go to the SLURM directory:**

```
17:01:47 [ctchiling@pollux ~]# cd view/HPC/SLURM
```

**Then, create a directory named `running` and copy the sequential template into it as `sleep.slurm`:**

```
# mkdir running
# cd running
# cp ../sequential/sequential.slurm ./sleep.slurm
# nano sleep.slurm
```

In this file, write:

```
  GNU nano 2.3.1                File: sleep.slurm

#!/usr/bin/env bash
#SBATCH --job-name=waste
#SBATCH --partition=seq
#SBATCH --time=0:20:0
#SBATCH --output=%x.%j.out
#SBATCH --error=%x.%j.err

echo "This job was launched from $SLURM_SUBMIT_HOST in $SLURM_SUBMIT_DIR"

sleep 300
```

The job simply sleeps for 5 minutes (300 seconds), giving you time to connect to the node it runs on.

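In the `--output` and `--error` patterns, `%x` expands to the job name and `%j` to the numeric job ID. A quick sketch of the filename this produces (the job ID is illustrative):

```shell
#!/usr/bin/env bash
# %x expands to the job name, %j to the job ID assigned by SLURM.
job_name="waste"       # from #SBATCH --job-name=waste
job_id="33691086"      # illustrative job ID
out_file="${job_name}.${job_id}.out"
echo "$out_file"       # waste.33691086.out
```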

**Then, submit the job and check that it is running:**

```
[ctchiling@pollux running]# sbatch sleep.slurm
Submitted batch job 33691086

[ctchiling@pollux running]# squeue -u ctchiling
     JOBID PARTITION     NAME      USER ST   TIME  NODES NODELIST(REASON)
  33691090       x40    waste  ctchilin  R   0:08      1 x011
```

Now you can SSH into the x011 node:

```
[ctchiling@pollux ~]$ ssh x011
Last login: Thu Feb 18 17:25:49 2021 from pollux
[ctchiling@x011 ~]$
```
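Depending on the SLURM version installed on the cluster, you may also be able to open a shell inside an existing job allocation without plain SSH, via `srun --jobid`. This is a sketch under the assumption that your SLURM release supports the `--overlap` flag; verify this on your cluster before relying on it:

```shell
#!/usr/bin/env bash
# Attach an interactive shell to an existing job allocation.
# Assumes a recent SLURM release with --overlap; verify locally.
attach_to_job() {
    local jobid="$1"
    srun --jobid="$jobid" --overlap --pty bash
}

# Usage: attach_to_job 33691086
```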