University of Calgary

Storm QuickStart Guide

About this QuickStart Guide

This QuickStart guide gives an overview of the Storm cluster at the University of Calgary.

It is intended to be read by new account holders getting started on Storm, covering such topics as the Storm hardware and performance characteristics, available software, usage policies and how to log in and run jobs. 

For Storm-related questions not answered here, please write to .


History

A large equipment donation from HP Labs, along with additional contributions from project partners Alberta Innovation & Science (AI&S), Western Economic Diversification (WED) Canada, HP Canada, and the University of Calgary, led to the installation of a cluster with more than 1000 cores at the University of Calgary in the fall of 2008. The primary purpose of the acquisition was research into grid and utility computing. However, a portion of the cluster was made available for general scientific computing as the Terminus cluster. In the fall of 2012, the system underwent a major reconfiguration under a new name, Storm.

As of this writing (2012-11-19), Storm consists of only 30 4-core nodes.  However, by January 2013 all of the existing Terminus nodes will have been incorporated into Storm, bringing the total core count to over 700.

Intended use

The Storm cluster is intended for researchers (including faculty, postdoctoral fellows, research assistants and graduate students) who need more computing power than is available on their own desktop machines. Compute Canada resources, such as WestGrid, should be considered as a first alternative, but Storm may be a better choice in some cases, due to licensing issues or longer queue waiting times on WestGrid systems.  Please keep in mind that Storm uses fairly old hardware, so run-time performance may be only 50-60% of that on recent WestGrid machines.  Also, jobs requiring more than 8 GB of RAM on a single compute node are not suitable for Storm.

Storm can be used for a wide variety of calculations.  Researchers can run multiple low-memory serial jobs or multi-node distributed-memory parallel jobs.  Software running on Storm can be set up to access central campus license servers.


Accounts

Accounts on the Storm cluster use the same credentials (user name and password) as for University of Calgary computing accounts offered by Information Technologies for email and other services. If you would like a Storm account but do not yet have a University of Calgary email address, please visit first.

Once you have a University of Calgary computing account, to request a Storm account, please write to with the subject line "Storm account requested (username)", filling in your University of Calgary computing account username in the brackets. In the body of the message, briefly indicate that you would like your Storm account activated and give your name and department. Also, please include a few sentences indicating the nature of your proposed work.



Hardware

Storm is a cluster based on HP C7000 chassis. Each chassis houses sixteen BL465c G1 CTO Blades. Each blade (compute node) contains two dual-core 2.4 GHz AMD Opteron processors. As mentioned above, the number of cores available was 120 in late October 2012, but it will increase to over 700 by January 2013.

About half the nodes currently deployed have 4 GB of RAM and the other half have 8 GB each.


Interconnect

The compute nodes communicate via InfiniBand, a high-bandwidth, low-latency network.


Storage

There is about 10 TB of disk space allocated for home directories and global scratch space. Each user has a subdirectory in /global/scratch.



Compilers

GNU, Portland Group and Intel compilers are available. The setup of the environment for using the compilers is handled through the module command. An overview of modules on WestGrid is largely applicable to Storm.

To list available modules, type:

module avail

To see currently loaded modules, type:

module list

By default, modules are installed on Storm to set up Intel compilers and to support parallel programming with MPI (including the determination of which compilers are used by the wrapper scripts mpicc, mpif90, etc.).
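As a sketch, a typical session for setting up and using a compiler might look like the following. The module name "intel" is an assumption for illustration; substitute whatever names actually appear in the output of module avail on Storm.

```shell
# See what is available, then load a compiler module before building.
# "intel" below is an assumed module name, shown only as an example.
module avail
module load intel
module list

# With the MPI environment loaded, the wrapper scripts select the
# appropriate underlying compiler automatically:
mpicc -O2 hello_mpi.c -o hello_mpi
mpif90 -O2 hello_mpi.f90 -o hello_mpi_f
```

These commands only have an effect in a shell on Storm itself, where the module system is installed.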

Application software

Look for installed software under /global/software. Write to if you need additional software installed.

Using Storm

To log in to Storm, connect to using an ssh (secure shell) client. For more information about connecting and setting up your environment, the WestGrid QuickStart Guide for New Users may be helpful.
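As a minimal sketch, logging in from a Linux or Mac terminal looks like the following. The host name shown is a placeholder only, not the real Storm address, and "username" stands for your University of Calgary account name.

```shell
# Placeholder host name; replace it with the actual Storm login address.
ssh username@storm.example.ucalgary.ca

# Files can be transferred the same way with scp, for example into
# your subdirectory of the global scratch space:
scp input.dat username@storm.example.ucalgary.ca:/global/scratch/username/
```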

The Storm login node may be used for short interactive runs during development. Production runs should be submitted as batch jobs. Batch jobs are submitted through TORQUE and scheduled using Maui (similar to Moab used on WestGrid). Processors may also be reserved for interactive sessions, in a similar manner to batch jobs.

Most of the information on the Running Jobs page on the WestGrid web site is also relevant for submitting and managing batch jobs and reserving processors for interactive work on Storm.

There is a 7-day maximum walltime limit for jobs on Storm.

A user may run a maximum of 100 jobs at one time, using a maximum of 150 processor equivalents.  It may be possible to accommodate large jobs by special request during maintenance periods.
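To illustrate, a minimal TORQUE batch script might look like the sketch below. All resource values and the executable name are examples only; adjust them to your own job, keeping within the walltime, memory and job-count limits above.

```shell
#!/bin/bash
#PBS -N example_job              # job name (arbitrary)
#PBS -l nodes=2:ppn=4            # two 4-core nodes, 8 processes in total
#PBS -l walltime=24:00:00        # must be within the 7-day limit
#PBS -l pmem=1gb                 # per-process memory; nodes have 4 or 8 GB

cd $PBS_O_WORKDIR                # start in the directory qsub was run from
mpiexec -n 8 ./hello_mpi         # hypothetical MPI executable
```

The script would be submitted with qsub, and qstat shows its status in the queue. An interactive session can be reserved in a similar manner, for example with qsub -I -l nodes=1:ppn=4,walltime=1:00:00.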


Send Storm-related questions to

Updated 2015-04-07 - Updated link to IT account page.


