S3 object storage – Brage

    What is Brage and S3?
    Brage (brage.it.ntnu.no) is an object storage service based on Dell ECS. It uses the Amazon S3 API for access. Its primary use case is backing up large amounts of scientific data, 100 terabytes or more, which is either to be archived, used by a web application, or accessed by users who do not have an NTNU account.

    How Do I get access to Brage & S3?

    Please send an email to: help@hpc.ntnu.no and make an inquiry.

    How do I upload or download data from NTNU S3/brage.it.ntnu.no?

    All data access is done through the S3 API, so you need an access key ID and a secret access key. With these you can use either S3 Browser (Windows) and equivalent clients, or awscli.
    We have an example of how to use the Cyberduck S3 browser (on macOS) for S3.

    Another good client for S3 is Rclone, which also comes with its own experimental GUI. Rclone can also serve your S3 files with its built-in web server.
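    As a minimal sketch of the built-in web server: once a remote is configured (the remote name brage and the bucket name below are placeholders; see the rclone section further down for how a remote is set up), a bucket can be served over HTTP like this:

    rclone serve http brage:my-bucket --addr :8080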

    Different keys with different access rights may be created per bucket or object.

    Where can I find documentation for S3?
    Generally you should look up the AWS S3 API documentation, but in addition Dell has its own documentation for their S3 implementation online.
    We also have a local copy of the ECS Data Access Guide, but the latest information is always available on Dell's websites.

    awscli

    NOTE: There is an issue with the latest AWS CLI release. It gives the error XAmzContentSHA256Mismatch when uploading a file:

    upload failed: ./test.txt to s3://backet00/test.txt An error occurred (XAmzContentSHA256Mismatch) when calling the PutObject operation: The Content-SHA256 you specified did not match what we received

    Install an older version. For example, 2.15.27 works.

    Article: Installing past releases of the AWS CLI version 2
    https://docs.aws.amazon.com/cli/latest/userguide/getting-started-version.html

    Linux - installation example on IDUN HPC login node

    Commands to install an older version:

    curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64-2.15.27.zip" -o "awscliv2.zip"
    unzip awscliv2.zip
    ./aws/install --bin-dir /cluster/home/USERNAME/.local/bin/ --install-dir /cluster/home/USERNAME/aws-cli-2.15.27/
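    A minimal follow-up, assuming the --bin-dir path used above (replace USERNAME with your own account): add the bin directory to your PATH and verify the installed version.

    export PATH="/cluster/home/USERNAME/.local/bin:$PATH"
    aws --version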

    macOS - installation

    Download the older version: https://awscli.amazonaws.com/AWSCLIV2-2.15.27.pkg
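    If you prefer the terminal over the graphical installer, the downloaded package can be installed with the standard macOS installer tool (a sketch, assuming the .pkg keeps the file name from the URL and was saved to the current directory):

    sudo installer -pkg AWSCLIV2-2.15.27.pkg -target /
    aws --version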

    awscli - command examples

    Configure a profile with credentials, create a bucket, and list your buckets.

    $ aws configure --profile brage
    AWS Access Key ID [None]: AK................F3
    AWS Secret Access Key [None]: Vl....................................ii
    
    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3 mb s3://my-first-bucket 
    make_bucket: my-first-bucket
    
    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3api list-buckets 
    {
        "Buckets": [
            {
                "Name": "my-first-bucket",
                "CreationDate": "2025-11-27T12:39:17.041000+00:00"
            }
        ],
        "Owner": {
            "DisplayName": "urn:ecs:iam::support:root",
            "ID": "urn:ecs:iam::support:root"
        }
    }

    Create, upload, list, and download a file:

    
    $ echo "test" > test.txt
    
    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3 cp test.txt s3://my-first-bucket
    upload: ./test.txt to s3://my-first-bucket/test.txt
    
    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3 ls s3://my-first-bucket
    2025-11-27 13:43:05         12 test.txt
    
    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3 cp s3://my-first-bucket/test.txt ./test.txt
    download: s3://my-first-bucket/test.txt to ./test.txt
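    For whole directories, the sync subcommand copies only new or changed files. A small sketch, where ./mydata and the data/ prefix are just illustrative names:

    $ aws --profile brage --endpoint=https://brage.it.ntnu.no s3 sync ./mydata s3://my-first-bucket/data/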

    awscli - configuration file examples

    This example configuration is for when you use only one profile. With this configuration you can skip the --profile and --endpoint options and use shorter commands:

    aws s3 cp test.txt s3://my-first-bucket

    File .aws/config

    [default]
    services = default_service
    
    [services default_service]
    s3 =
       endpoint_url = https://brage.it.ntnu.no/
    
    [profile brage]
    services = brage_service
    
    [services brage_service]
    s3 =
       endpoint_url = https://brage.it.ntnu.no/

    File .aws/credentials

    [default]
    aws_access_key_id = AK................F3
    aws_secret_access_key = Vl....................................ii
    
    [brage]
    aws_access_key_id = AK................F3
    aws_secret_access_key = Vl....................................ii

    rclone (experience from a Brage S3 user)

    Configuring rclone - make sure to select type [s3] and provider [Other] (lots of people get the second choice wrong and wind up trying to configure for AWS).
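    The resulting remote in rclone.conf looks roughly like this (a sketch; the remote name brage and the key values are placeholders):

    [brage]
    type = s3
    provider = Other
    access_key_id = AK................F3
    secret_access_key = Vl....................................ii
    endpoint = https://brage.it.ntnu.no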

    If you're working with multiple remotes, rclone can configure a "meta" remote of type combine, which groups several remotes as subfolders of a single remote. This is very handy for the researchers here, who may have as many as 6 different projects, each with their own credentials for a specific bucket; see the sketch below.
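    A sketch of how such a combine remote might look in rclone.conf, assuming two hypothetical per-project remotes brage-proj1 and brage-proj2 have already been configured as above:

    [projects]
    type = combine
    upstreams = proj1=brage-proj1:bucket-a proj2=brage-proj2:bucket-b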

    For transferring data from Forskning to Brage, I was able to get a typical speed of around 5.8 Gbps, with occasional spikes over 7.5 Gbps (on a 10 Gbit connection), with the command below (both Forskning and Brage set up as rclone remotes; performance was lower when using the Windows mount point for Forskning):

    rclone sync src dest -PLM --multi-thread-streams=12 --transfers=12 --checkers=24 --max-backlog=50000

    For mounting buckets as pseudo-filesystems, including with full local caching, I've suggested this to the researchers (who are used to point and click in Windows Explorer!):

    rclone mount brage:bucket-name q: --network-mode --vfs-cache-mode full --transfers 6

    The rclone docs are really quite good - I found it much easier to grok how [move] worked with rclone than I did with the equivalent operations for rsync.
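    For reference, a minimal move example in the same spirit (the bucket and prefix names are just placeholders): move everything under one prefix to another, deleting the source objects as they are transferred.

    rclone move brage:bucket-name/old-prefix brage:bucket-name/new-prefix -P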
