As datasets increase in size, the need for more memory in your available compute hardware becomes more pressing. Applications such as R, Stata, Python and MATLAB are often used to process these datasets and can require significant amounts of memory to load a dataset, process it and generate results.
The Computational Shared Facility (CSF) contains a number of 16-core high-memory compute nodes with 256GB or 512GB of RAM. These are ideal for jobs with moderate memory requirements and will very likely provide far more memory than a local workstation.
In addition, the Research Lifecycle Programme recently funded compute nodes containing 1.5TB of RAM. These nodes contain 32 CPU cores, so they offer approximately 46GB of RAM per core. For example, if you run an 8-core job on one of these nodes, your job will have access to approximately 375GB of RAM. If you request all 32 cores then you can use all 1.5TB of RAM!
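The per-core figure is simply the node's total RAM divided by its core count, scaled by the cores you request. A quick illustrative calculation, using the (rounded) 1500GB total from this announcement:

```shell
# RAM available to a job on a 1.5TB (~1500GB), 32-core node:
# memory is granted in proportion to the number of cores requested.
cores=8
echo "$(( 1500 * cores / 32 ))GB available to an ${cores}-core job"
```

Requesting 8 of the 32 cores therefore yields a quarter of the node's memory, matching the approximately 375GB figure above.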
Running your jobs on these nodes is simple: only one extra line is needed in your jobscript. All that we ask is that you only run jobs on these nodes that genuinely require the extra RAM! Please email us if you would like to use the 1.5TB compute nodes. For more technical information on how to run high-memory jobs on the CSF, please see our documentation.
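As a sketch of what such a jobscript might look like, assuming an SGE-style batch scheduler: the parallel environment name (`smp.pe`) and the resource flag selecting the 1.5TB nodes (`mem1500`) are illustrative assumptions here, so check the CSF documentation for the exact names before submitting:

```shell
#!/bin/bash --login
#$ -cwd                 # run the job from the current working directory
#$ -pe smp.pe 8         # illustrative: request 8 cores (pe name is an assumption)
#$ -l mem1500           # hypothetical extra line selecting the 1.5TB nodes

# Load and run your application, e.g. an R analysis
module load apps/R
R CMD BATCH analysis.R
```

The single `-l` resource line is the only addition compared with an ordinary jobscript; everything else stays the same.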