Documentation

Overview

Archon Cloud is a cloud object storage service, designed for high durability, resilience, and security. It achieves this by encoding objects into a proprietary type-agnostic format, splitting the resulting data into shards, and distributing the shards to multiple cloud storage providers.

Archon Cloud is currently in beta, and is appropriate to test drive with 100GB or less of data. Once Archon Cloud launches, it will handle large datasets and archives, and will be ideal for 1TB+ inventories.

The target price for Archon Cloud is $0.0045/GB/mo of storage, with free egress*. This allows users to cut cloud storage costs by over 80% compared to other cloud storage services.

*Egress is free for downloads up to the amount of storage you have in your account per month. Egress thereafter will be charged $0.035/GB.
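
As a rough worked example of the target pricing: storing 1TB (1,000GB) would run about 1,000 × $0.0045 ≈ $4.50 per month, with up to 1TB of egress per month included for free before the $0.035/GB rate applies.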


Account Details

Registration Steps

  1. Click Here to sign up for your 30-day free trial of the Archon Cloud Beta.
  2. Fill out the form and press Submit. You will then be prompted to enter a verification code, which can be found in the verification email sent to your email address.
  3. You will receive an email with your verification link and code. If it does not arrive immediately, allow a few minutes and check your Spam folder.
  4. Either click the ‘Verify Email Now’ button in the email, or copy and paste the 48-character verification code into the prompt from Step 2 and press ‘Verify’.
  5. Once logged in, you can create buckets to begin uploading and managing files.

API Token

Once you register for an account, you will be given an Account API Token. The API Token is how your account authorizes storage requests with Archon Cloud, so take note of it and do not share it. It is available at the top of your Account Overview page.
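
The clients described under Usage below pass the token as a token query parameter on each request. As a minimal sketch (the bucket and file names here are placeholders, and YOUR_API_TOKEN stands in for your real token):

$ curl -T report.pdf 'https://acs.archon.cloud/my-bucket/report.pdf?token=YOUR_API_TOKEN'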

Buckets

  1. Buckets have no limit on the number of objects that can be stored.
  2. You can store all of your objects in a single bucket or organize them across several buckets with no degradation in performance.
  3. Buckets cannot be nested, meaning buckets cannot be created within buckets.
  4. Object storage focuses on GET, PUT, LIST, and DELETE operations. Create and delete buckets in a separate initialization or setup routine that you run less often, and avoid bucket create or delete calls on the high-availability code path of your application (see the sketch after this list).
  5. Archon is compliant with Amazon S3, including its global namespace pattern, meaning bucket names must be globally unique across all users. This is similar to how DNS names or email addresses work, where each must be unique. Therefore, you must choose a unique bucket name when creating Archon buckets. Valid syntax for bucket names is as follows:

    Not Accepted:

    • Periods (.)

    Accepted, but not recommended:

    • Capital Letters
    • Underscores (_)
    • Ending with a dash (-)

    Valid namespaces and best practices:

    • Between 6 and 63 characters
    • e.g., joe-smith-archon-bucket, company-bucket-2, 123-partners-llc
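
Because ACS aims for S3 compatibility, bucket creation and deletion should map to S3-style PUT and DELETE requests against the bucket itself. The sketch below is an assumption based on that compatibility and on the token query parameter used for uploads in the Usage section; the bucket name and token are placeholders. Run calls like these from a setup or teardown script rather than your application's request-serving path:

$ curl -X PUT 'https://acs.archon.cloud/joe-smith-archon-bucket?token=YOUR_API_TOKEN'
$ curl -X DELETE 'https://acs.archon.cloud/joe-smith-archon-bucket?token=YOUR_API_TOKEN'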

Usage

The Archon Cloud Service (ACS) aims to be AWS S3 compatible. Files may be uploaded and downloaded with clients that support HTTP PUT and GET requests, respectively. Errors return an S3-like XML response.

Uploads are restricted to users with a verified account. Files may only be uploaded to buckets belonging to that user.

Files may be downloaded from any bucket by any user, regardless of uploader.
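
For example, an upload attempted without a valid token would be rejected with an S3-like XML error body. The status code and fields below are illustrative assumptions modeled on S3's error format:

HTTP/1.1 403 Forbidden
Content-Type: application/xml

<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>AccessDenied</Code>
  <Message>Access Denied</Message>
  <Resource>/sample-bucket/other/lorem-ipsum.txt</Resource>
</Error>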

Below are examples using some common clients. The examples assume the user has the account API token PDL4S56NVC57WRGHFAAZPI4B, owns the bucket sample-bucket, and has the following local files:

files
├── books
│   ├── dracula.txt
│   └── ulysses.txt
└── other
    └── lorem-ipsum.txt

HTTP

ACS clients use HTTP requests to upload and download files. A file may be uploaded with a PUT request like

PUT /sample-bucket/other/lorem-ipsum.txt?token=PDL4S56NVC57WRGHFAAZPI4B HTTP/1.1
Host: acs.archon.cloud
Content-Type: text/plain
Content-Length: 446

Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.

which produces a response like

HTTP/1.1 204 No Content
Etag: edc715389af2498a623134608ba0a55b
Date: Thu, 05 Mar 2020 23:00:57 GMT

Similarly, a file may be downloaded with a GET request like

GET /sample-bucket/other/lorem-ipsum.txt HTTP/1.1
Host: acs.archon.cloud

which produces a response like

HTTP/1.1 200 OK
Content-Type: text/plain
Etag: edc715389af2498a623134608ba0a55b
Last-Modified: Thu, 05 Mar 2020 23:00:55 GMT
Date: Thu, 05 Mar 2020 23:06:54 GMT
Content-Length: 446

Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
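
Assuming ACS also follows S3's DeleteObject semantics and requires the account token for deletes as it does for uploads (an assumption; these docs only show PUT and GET), a file could be removed with a DELETE request like

DELETE /sample-bucket/other/lorem-ipsum.txt?token=PDL4S56NVC57WRGHFAAZPI4B HTTP/1.1
Host: acs.archon.cloud

which would produce a response like

HTTP/1.1 204 No Content
Date: Thu, 05 Mar 2020 23:10:12 GMT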

curl

curl is a popular tool to execute HTTP requests from the command line. A file may be uploaded to ACS with the command

$ curl -T files/books/dracula.txt 'https://acs.archon.cloud/sample-bucket/dracula.txt?token=PDL4S56NVC57WRGHFAAZPI4B'

It may be downloaded with the command

$ curl https://acs.archon.cloud/sample-bucket/dracula.txt
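
curl's -o flag saves the download to a local file instead of writing it to stdout, and a small shell loop can upload a whole directory; the books/ key prefix below is just one way to mirror the local layout:

$ curl -o dracula.txt https://acs.archon.cloud/sample-bucket/dracula.txt
$ for f in files/books/*.txt; do curl -T "$f" "https://acs.archon.cloud/sample-bucket/books/$(basename "$f")?token=PDL4S56NVC57WRGHFAAZPI4B"; done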

AWS CLI

The official AWS command line client may be used to upload files to and download files from ACS if properly configured. The credentials file (usually ~/.aws/credentials) should have an entry like

[ACS]
aws_access_key_id=PDL4S56NVC57WRGHFAAZPI4B
aws_secret_access_key=PDL4S56NVC57WRGHFAAZPI4B

Then, files may be uploaded like

$ aws s3 --profile=ACS --endpoint-url=https://acs.archon.cloud cp files/books/dracula.txt s3://sample-bucket/dracula.txt

and downloaded like

$ aws s3 --profile=ACS --endpoint-url=https://acs.archon.cloud cp s3://sample-bucket/dracula.txt ./dracula.txt
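
The same profile and endpoint work with other aws s3 subcommands, such as listing the bucket or syncing the whole files/ directory (sync only transfers new or changed files). These rely on ACS's S3 list support, which these docs imply but do not demonstrate directly:

$ aws s3 --profile=ACS --endpoint-url=https://acs.archon.cloud ls s3://sample-bucket
$ aws s3 --profile=ACS --endpoint-url=https://acs.archon.cloud sync files/ s3://sample-bucket/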

rclone

Rclone is "rsync for cloud storage." It may be used to upload files to and download files from ACS if properly configured. The configuration file (usually ~/.config/rclone/rclone.conf) should have an entry like

[acs]
type = s3
provider = "Archon Cloud Storage"
env_auth = false
access_key_id = PDL4S56NVC57WRGHFAAZPI4B
secret_access_key = PDL4S56NVC57WRGHFAAZPI4B
endpoint = https://acs.archon.cloud
acl = public-read
bucket_acl = public-read
force_path_style = true

Then, entire directories may be synced to ACS like

$ rclone copy files/ acs:sample-bucket

and synced from ACS like

$ rclone copy acs:sample-bucket files/

In both cases, only files that don't exist or have been modified will be uploaded or downloaded.
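
Other standard rclone subcommands should work against the same remote, for example listing objects or verifying that local and remote copies match (both assume ACS supports the S3 list and ETag operations rclone uses):

$ rclone ls acs:sample-bucket
$ rclone check files/ acs:sample-bucket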