Data hosting is provided by SDSC. You can manually download all data files from:
For the sake of data integrity, we provide checksums for each data file.
Apart from the AIRS data set, each modality has been packed into a single bzipped tarball. The AIRS data set consists of four parts, each of which contains a number of consecutive time steps: one for the first (am) and one for the second (pm) half of each day. In order to use AIRS conveniently, unpack the parts and then manually move the data to a common ./airs directory.
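The unpack-and-merge step above can be sketched as follows. Note that the archive names (airs_part*.tar.bz2) and their internal layout are assumptions; adjust the patterns to the files you actually downloaded.

```shell
# Sketch only: the actual AIRS part file names and layout may differ.
mkdir -p airs
for part in airs_part*.tar.bz2; do
    tar xjf "$part"              # unpack each AIRS part (bzip2 tarball)
done
# collect the unpacked time steps in the common ./airs directory
mv airs_part*/* airs/
```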
Note that the directory also contains an additional support file featuring coastlines and a matching earth texture, both in legacy VTK format.
For Linux and macOS users, the data directory contains a download script that automatically handles the entire process. It takes as input the local target directory to which it should unpack the data and takes care of everything else. Proceed as follows:
Download the script via curl:
curl -O https://cloud.sdsc.edu/v1/AUTH_sciviscontest/2014/data/download.sh
Make it locally executable:
chmod 750 download.sh
Execute the script. In this example it will download the data to the directory my_local_data_dir, which will be created inside the current working directory:
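Based on the script and directory names used above, the invocation presumably looks like this (the target directory is just an example; pick any name you like):

```shell
# download.sh takes the local target directory as its only argument
./download.sh my_local_data_dir
```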
In case of a failed download, the script automatically retries the data file in question up to ten times. If the script still fails during download, just re-run it; it will resume with the first incomplete file. If that does not work either, please fall back to manual download.
As a final remark, the script relies on md5 checksums, specifically on the md5sum tool. If that tool is not available on your machine (e.g. on some versions of macOS), either install it or comment out every line in the download script that refers to md5sum.
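If md5sum is available, you can also verify downloaded files by hand. A minimal sketch, assuming the provided checksums are collected in a file of "&lt;md5&gt;  &lt;filename&gt;" lines (the name checksums.md5 is hypothetical):

```shell
# Verify all listed files against their recorded md5 digests (GNU coreutils)
md5sum -c checksums.md5
# On macOS without md5sum, compute a digest with the BSD tool and compare manually:
# md5 somedatafile.tar.bz2
```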