Parallel execution

To run a calculation in parallel, one first creates a Parallel object that describes the parameters of the parallel execution.

class wannierberri.parallel.Parallel(num_cpus=None, npar_k=0, ray_init={}, cluster=False, progress_step_percent=1)[source]

Bases: object

a class to store parameters of parallel evaluation

Parameters:
  • num_cpus (int) – number of parallel processes. If None, the number is chosen automatically by Ray (one per CPU)

  • npar_k (int) – additional parallelisation over k-points inside the FFT grid

  • cluster (bool) – set to True to use a multi-node Ray cluster (see also the wannierberri.cluster module)

  • ray_init (dict) – parameters to be passed to ray.init(). Use only if you know what you are doing.

  • progress_step_percent (int or float) – the progress (and the estimated time to completion) will be printed after each progress_step_percent of the work is completed

class wannierberri.parallel.Serial(npar_k=None, progress_step_percent=1)[source]

Bases: Parallel

a class defining serial execution (although npar_k is still allowed)

Parameters:
  • npar_k (int) – additional parallelisation over k-points inside the FFT grid

  • progress_step_percent (int or float) – the progress (and the estimated time to completion) will be printed after each progress_step_percent of the work is completed

NOTE: Ray produces many temporary files while running. By default they are stored under /tmp. See the Ray documentation for more information about temporary files.

If you are using a cluster, you may not have permission to delete files under /tmp. In that case, store them in a directory under your control by adding ray_init={'_temp_dir': Your_Path}. Keep Your_Path short: long paths cause problems (see the "temp_dir too long" bug report).
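One way to build such a ray_init dictionary with a short path, sketched using only the standard library (the Parallel call is commented out as a hypothetical usage):

```python
import os
import tempfile

# A short path under your control for Ray's temporary files.
temp_dir = os.path.join(tempfile.gettempdir(), "ray_wb")
ray_init = {"_temp_dir": temp_dir}

# Hypothetical usage:
# parallel = wannierberri.Parallel(num_cpus=4, ray_init=ray_init)
print(ray_init)
```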

Multi-node mode

When more than one node is employed on a cluster, the nodes first need to be connected into a Ray cluster. This can be done with a script suggested by gregSchwartz18 here

Such a script can be generated and submitted automatically. For example, if your cluster uses SLURM:

python -m wannierberri.cluster --batch-system slurm --exp-name wb-2nodes --num-nodes 2  --partition cmt --command "python -u example.py 2-nodes" --submit

Or, if you are using PBS:

python -m wannierberri.cluster --batch-system pbs --exp-name wb-2nodes --num-nodes 2  --partition cmt --command "python -u example.py 2-nodes" --submit

For more information, run:

python -m wannierberri.cluster -h