
Video: https://mediahub.qut.edu.au/media/t/0_d0bsv333

Launching a Batch Job - Constructing a Job Script

While it is possible to submit a batch job entirely from the command line, saving the job parameters and commands in a text file is very handy for documenting your use of the HPC. In a job script, lines beginning with #PBS provide instructions to PBS; they are not run as commands. A small script looks like this:

#!/bin/bash -l
#PBS -l select=1:ncpus=1:mem=2gb
#PBS -l walltime=00:10:00

echo $(hostname)

This job requests one node (select=1), one CPU (ncpus=1) and 2gb of memory (mem=2gb), and will run for a maximum of 10 minutes (walltime=00:10:00).

Notice how the options after #PBS are the same as the qsub command line?

This script is very basic: it runs the hostname command, which outputs the name of the computer the job is running on, and echoes the result to standard output.
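As a point of comparison, the same request could be made entirely on the qsub command line, with no script at all. This is a sketch assuming a reasonably recent PBS Professional, where `--` lets you submit an executable directly; the script-based approach is usually preferable because it documents what you ran.

```shell
# One-off submission from the command line: the -l options mirror the
# #PBS lines in the script above, and "--" introduces the executable to run.
qsub -l select=1:ncpus=1:mem=2gb -l walltime=00:10:00 -- /bin/hostname
```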

While the name of the file is not important, I like to save my PBS job scripts as {name}.pbs to easily identify them in a file listing. Use training01.pbs here.

Let's use nano to create the file; on the HPC, nano is provided by a module:

module load nano

Create the file:

nano training01.pbs

Launching a Batch Job - Submitting a Job Script

Since all the options are contained in the job script, the qsub line is short:

qsub training01.pbs

And you will see a job number printed on the screen. Use qjobs to check on the status of the job.

Checking on the Job Status

To quickly check on your jobs that are queued and running, use the qjobs command:

qjobs

You will get a summary of your queued and running jobs. Finished jobs are not displayed.

An alternative way to list your jobs:

qstat -u $USER

Get more details about a particular job:

qstat -f {jobid}
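A variation worth knowing (assuming PBS Professional; check man qstat on your system) is the -x flag, which includes recently finished jobs that qstat would otherwise no longer show:

```shell
# Full details of a job that has already completed, e.g. its exit status
# and resources used. Replace {jobid} with a real job id.
qstat -xf {jobid}
```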

Checking the Output

Since we told the job to print the name of the node it was running on, how do we see it? By default, PBS saves the output of the commands run in the job into two files, named {job name}.o{job id} and {job name}.e{job id}.

Let's examine these files:

# find the files by listing the contents of the folder sorted by reverse date
ls -ltr
# the 'e' file is empty; view the 'o' file (press Tab to complete the job id)
cat training01.pbs.o{tab}
cl4n018
PBS Job 5228698.pbs
CPU time  : 00:00:00
Wall time : 00:00:02
Mem usage : 0b

We can see that in this case the job ran on the cl4n018 node, used no measurable CPU time or memory, and lasted for 2 seconds. The two files hold the standard output and the standard error of the commands. Changing the names of the files, and merging them into one, is possible with more options.

More options in job scripts

We have just scratched the surface of what you can specify when you submit and run jobs. A few useful ones are:

  • Be notified about the job: use the -m option to be sent an email when the job begins (b), when it ends (e), and if it is aborted (a): #PBS -m abe

  • Give the job a name: to find your job in a long list, give it a meaningful name with the -N option: #PBS -N MyJob01

  • Merge the error file into the standard output file: #PBS -j oe

  • Override the email address: to send the job notification email to another address, use the -M option, eg #PBS -M bob@bob.com
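Putting those options together, a job script header might look like the sketch below. The job name and email address are placeholders; with -j oe, only a single MyJob01.o{job id} file is produced.

```shell
#!/bin/bash -l
#PBS -N MyJob01
#PBS -l select=1:ncpus=1:mem=2gb
#PBS -l walltime=00:10:00
#PBS -j oe
#PBS -m abe
#PBS -M bob@bob.com

# The job's commands go below the #PBS lines.
echo "Job running on $(hostname)"
```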

Another example

From the Introduction to the Unix Shell for HPC users course, let's run the goostats.sh analysis as a job.

First, change to the folder:

cd ~/workshop/2024-2/shell-lesson-data/north-pacific-gyre

Then create a job script with the following contents:

#!/bin/bash -l
#PBS -N GooStatsRun01
#PBS -l select=1:ncpus=1:mem=2gb
#PBS -l walltime=00:30:00
#PBS -m abe

# Change to the folder the job was submitted from.
cd $PBS_O_WORKDIR

# Calculate stats for each data file.
for datafile in NENE*A.txt NENE*B.txt
do
    echo "$datafile"
    bash goostats.sh "$datafile" "stats-$datafile"
done

Save this script as do-goostats.pbs.
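Before submitting, it can be worth a cheap pre-flight check: bash -n parses a script without executing any of its commands, so shell typos are caught before the job sits in the queue. The sketch below checks a small throwaway script; on the HPC you would point it at do-goostats.pbs instead.

```shell
# Write a minimal job script, then syntax-check it without running it.
cat > /tmp/check-demo.pbs <<'EOF'
#!/bin/bash -l
#PBS -l select=1:ncpus=1:mem=2gb
echo "hello from $(hostname)"
EOF

# bash -n parses only; no commands are executed.
bash -n /tmp/check-demo.pbs && echo "syntax OK"
```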

Now submit to the scheduler:

qsub do-goostats.pbs

And check the status:

qjobs

When the job has run, check the output:

ls -ltr
cat GooStatsRun01.o{job_id}
...
CPU time  : 00:00:00
Wall time : 00:00:33
Mem usage : 4648kb
cat GooStatsRun01.e{job_id}
{Empty File}

Tricks and Tips

When the job starts, PBS logs on to the node as you, and your working directory will be your home folder. If your data is in a sub-folder or in a shared folder, you can use this to automatically change to that folder:

cd $PBS_O_WORKDIR

$PBS_O_WORKDIR is a special environment variable created by PBS. It holds the folder you were in when you ran the qsub command.
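A slightly more defensive variant of that cd is sketched below: if the folder is missing (for example, on a node where a shared filesystem failed to mount), the job aborts instead of silently running its commands from your home folder. PBS sets PBS_O_WORKDIR inside a real job; the fallback to $PWD here is only so the snippet can be tried outside the scheduler.

```shell
# Fall back to the current directory when not running under PBS.
PBS_O_WORKDIR="${PBS_O_WORKDIR:-$PWD}"

# Abort the job if the folder cannot be entered.
cd "$PBS_O_WORKDIR" || exit 1
echo "working directory: $PWD"
```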
