ANSYS RSM Cluster (ARC) Job Submission from Rescale Desktops

This tutorial outlines how to set up a Rescale ARC cluster running the ANSYS RSM Cluster scheduler. This allows jobs to be submitted from Rescale desktops to Linux compute nodes.

Limitations

  • Due to cloud provider compatibility, ARC is available only on certain core types
  • ARC will not automatically shut down the provisioned Linux compute cluster. As a result, the user will have to manually terminate the cluster. To be safe, a wall time should be set on the provisioned compute cluster
  • ARC transmits input and output data over the network. Jobs with large input or output may run longer than expected due to the time required to transfer files
  • ARC is supported on ANSYS versions 19.0 and higher

This tutorial presents an example using an ANSYS Fluent Workbench project. To obtain the Workbench project archive (.wbpz) for this tutorial, click on the Import Workbench Project button below, then click on the Save option in the top right corner of the job submission page to keep a copy of the file in your Rescale cloud files.

The steps are as follows:

In order to submit a job to an ARC cluster from Workbench, you will first need to spin up a compute cluster in the form of a Rescale job, and then send the job to that running cluster.

First, click on the + New Job icon in the top left corner of the dashboard. Name the job, jump to the Software Settings page, and select ANSYS RSM Cluster (ARC). For this tutorial, we will select Version 19.1. No input files are needed here.


arc-cluster-software


You can leave the default command in the Command window. Select the license option below.


command-window


Next, proceed to the Hardware Settings page and select the hardware configuration that you would like to run your simulation on. Also, set a suitable wall time so that the ARC cluster is terminated after the simulation completes.


specify-hardware


Once done, hit Submit. Your cluster will take a few minutes to spin up and configure itself. Use the live-tailing section to monitor the process_output.log file. Once the log looks like the one in the screenshot below and the cluster_config.areg file is present, your cluster is ready to receive jobs.
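If you prefer to script this step rather than use the web UI, the same cluster job can be created and submitted through the Rescale REST API. The snippet below is a minimal sketch in Python using the requests library; the API token, analysis code string, core type, and hardware values are placeholders, and the field names should be checked against the Rescale API documentation for your account.

```python
import requests

API = "https://platform.rescale.com/api/v2"
HEADERS = {"Authorization": "Token YOUR_API_TOKEN"}  # placeholder API token

# Illustrative definition of the ARC cluster job described above. The
# analysis "code" string, core type, core count, and wall time are
# placeholders; match them to the Software and Hardware Settings you chose.
cluster_job = {
    "name": "ARC cluster",
    "jobanalyses": [{
        "analysis": {"code": "ansys_rsm_cluster", "version": "19.1"},
        "command": "...",                  # keep the default ARC start command
        "hardware": {
            "coreType": "emerald",         # an ARC-supported core type
            "coresPerSlot": 4,
            "walltime": 4,                 # hours, so the cluster shuts down
        },
    }],
}

# Create the job, then submit it to start provisioning the cluster.
resp = requests.post(f"{API}/jobs/", json=cluster_job, headers=HEADERS)
resp.raise_for_status()
job_id = resp.json()["id"]
requests.post(f"{API}/jobs/{job_id}/submit/", headers=HEADERS).raise_for_status()
print(f"Submitted ARC cluster job {job_id}")
```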

From this step, three pieces of information are required for the subsequent steps: the submit host, the user name, and the password.

Click on process_output.log on the job status page to obtain this information.


log-message


For this tutorial, the ARC cluster login details obtained from process_output.log are shown below:

  • Submit host: ip-10-23-2-142
  • User name: udev-brian_cCZzXS
  • Password: cCzXS

We will now set up a job on the Rescale platform and attach the Workbench archive file. The purpose of this step is to make the Workbench archive file available on the Rescale Desktop.

First, click on the + New Job icon in the top left corner of the dashboard. Name the job. Click on Upload from this computer and browse to the location where the Workbench archive is saved. Select the file and click Open.


attach-wb-file-job


Jump to the Software Settings page and select Bring Your Own Software as your software of choice.


attach-wb-file-software


You can leave the default selection for Hardware. Save the job.

Go to the Desktops tab at the top and click + New Custom Desktop. Under the Add Software tab, select ANSYS Fluent Desktop and enter your license information. We will use Version 19.1 for this tutorial.

Under Add Jobs, select the job created in the previous step.


fluent-desktop-setup


Once the server has been started and the loading grid has disappeared, click Connect > Connect using In-Browser Desktop. You will now be taken to the remote desktop. Please double click the ANSYS Workbench shortcut icon on the desktop to launch ANSYS Workbench.

  • Go to Start Menu > View All Programs and open RSM Configuration
  • Once RSM is open, click on + to add a new HPC resource

rsm-configuration


  • On the HPC Resource tab, provide a name for the HPC resource. Enter the submit host using the information printed in your cluster's process_output.log. Select linux-64 from the drop-down list. Click Apply.

hpc-resources


  • On the File Management tab, select the RSM internal file transfer method and specify $HOME/work/shared as the staging directory path on the cluster. Click Apply.

file-management


  • On the Queue tab, click on the Import/Refresh HPC Queues button. You will be prompted to enter the credentials for your compute cluster. Fill in the credentials using the information from process_output.log.

queue-tab



credentials


  • After your credentials have been verified, you should see a Default and a local queue appear. Click Apply. The Default queue is the one that will utilize all the nodes of your cluster.

rsm-queue


Open ANSYS Workbench on the Rescale Desktop. Click on File > Restore Archive. Browse to Desktop > attached_jobs, click on the job folder, and browse to the Workbench archive file. Select the archive file and click Open. Once the project is open, right-click on the Parameter Set and click on Properties. This will open the Properties window.

Make the selections in the Properties window as shown below.


parameter-set-properties


Click on Save to save all the settings. Next, right-click on the Parameter Set and click on Update All Design Points. You will notice that the status bar at the bottom is updated. Click on Monitor Jobs to see the status of the simulation.


update-design-points


job-monitor-button


The Job Monitor window is displayed. Each row in the top pane represents a design point. Clicking on a row displays its details in the Details pane below.


job-monitor-details


Once all the design points are solved, double-click on the Parameter Set in the ANSYS Workbench window and the results will be displayed.


results
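For repeated runs, the restore-and-update steps above can also be replayed as an ANSYS Workbench journal (IronPython) instead of clicking through the GUI, for example via File > Scripting > Run Script File. The snippet below is a minimal sketch; the archive and project paths are placeholders.

```python
# Minimal ANSYS Workbench journal sketch; the file paths are placeholders.
# Restore the archived project attached to the desktop.
Unarchive(
    ArchivePath=r"C:\Users\<user>\Desktop\attached_jobs\<job folder>\project.wbpz",
    ProjectPath=r"C:\Users\<user>\Desktop\project.wbpj",
    Overwrite=True)

# With the Parameter Set properties pointed at the RSM queue configured above,
# updating the design points submits them to the ARC cluster.
UpdateAllDesignPoints()

# Save the project so the updated design point results are kept.
Save(Overwrite=True)
```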