Using rclone with DreamObjects

Overview

Rclone is an rsync-like tool for Windows, macOS, Linux, and other operating systems, designed to sync files to and from cloud storage such as DreamObjects.

It features:

  • MD5/SHA1 hashes checked at all times for file integrity
  • Timestamps preserved on files
  • Partial syncs supported on a whole file basis
  • Copy mode to just copy new/changed files
  • Sync (one way) mode to make a directory identical
  • Check mode to check for file hash equality
  • Can sync to and from network, e.g., two different cloud accounts
  • Optional encryption (Crypt)
  • Optional FUSE mount (rclone mount)

Configure rclone

  1. Download and install rclone for your computer's operating system from rclone's downloads page at https://rclone.org/downloads/.
  2. After you download it, navigate to the directory the file was downloaded to and extract the archive. (NOTE: On Windows, you can extract the rclone files in your Downloads folder, then copy the directory path from the Explorer address bar.)
  3. Open your command prompt and navigate to the directory to which the files were extracted.
  4. In your terminal or command prompt, run the following to begin configuring rclone using the interactive configuration tool:
    [user@localhost]$ rclone config
    No remotes found - make a new one
    n) New remote
    s) Set configuration password
    q) Quit config
    n/s/q> n
  5. Enter 'n' to set up a new configuration profile (i.e. a remote).

    name > dreamobjects
  6. Pick a name you want to assign to this connection. This example uses the name 'dreamobjects'.

    Type of storage to configure.
    Enter a string value. Press Enter for the default ("").
    Choose a number from below, or type in your own value
     1 / A stackable unification remote, which can appear to merge the contents of several remotes
       \ "union"
     2 / Alias for an existing remote
       \ "alias"
     3 / Amazon Drive
       \ "amazon cloud drive"
     4 / Amazon S3 Compliant Storage Providers (AWS, Ceph, Dreamhost, IBM COS, Minio)
       \ "s3"
     5 / Backblaze B2
       \ "b2"
     6 / Box
       \ "box"
     7 / Cache a remote
       \ "cache"
     8 / Dropbox
       \ "dropbox"
     9 / Encrypt/Decrypt a remote
       \ "crypt"
    10 / FTP Connection
       \ "ftp"
    11 / Google Cloud Storage (this is not Google Drive)
       \ "google cloud storage"
    12 / Google Drive
       \ "drive"
    13 / Hubic
       \ "hubic"
    14 / JottaCloud
       \ "jottacloud"
    15 / Local Disk
       \ "local"
    16 / Mega
       \ "mega"
    17 / Microsoft Azure Blob Storage
       \ "azureblob"
    18 / Microsoft OneDrive
       \ "onedrive"
    19 / OpenDrive
       \ "opendrive"
    20 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
       \ "swift"
    21 / Pcloud
       \ "pcloud"
    22 / QingCloud Object Storage
       \ "qingstor"
    23 / SSH/SFTP Connection
       \ "sftp"
    24 / Webdav
       \ "webdav"
    25 / Yandex Disk
       \ "yandex"
    26 / http Connection
       \ "http"
    Storage> 4
  7. Configure the type of storage as Amazon S3 (option 4).
    Choose your S3 provider.
    Enter a string value. Press Enter for the default ("").
    Choose a number from below, or type in your own value
     1 / Amazon Web Services (AWS) S3
       \ "AWS"
     2 / Ceph Object Storage
       \ "Ceph"
     3 / Digital Ocean Spaces
       \ "DigitalOcean"
     4 / Dreamhost DreamObjects
       \ "Dreamhost"
     5 / IBM COS S3
       \ "IBMCOS"
     6 / Minio Object Storage
       \ "Minio"
     7 / Wasabi Object Storage
       \ "Wasabi"
     8 / Any other S3 compatible provider
       \ "Other"
    provider> 4
  8. Choose #4 for DreamHost.
    Get AWS credentials from runtime (environment variables or EC2/ECS meta data if no env vars).
    Only applies if access_key_id and secret_access_key is blank.
    Enter a boolean value (true or false). Press Enter for the default ("false").
    Choose a number from below, or type in your own value
     1 / Enter AWS credentials in the next step
       \ "false"
     2 / Get AWS credentials from the environment (env vars or IAM)
       \ "true"
    env_auth> 1
  9. Select option 1.
    AWS Access Key ID.
    Leave blank for anonymous access or runtime credentials.
    Enter a string value. Press Enter for the default ("").
    access_key_id> YOUR_ACCESS_KEY
    AWS Secret Access Key (password)
    Leave blank for anonymous access or runtime credentials.
    Enter a string value. Press Enter for the default ("").
    secret_access_key> YOUR_SECRET_KEY
  10. Replace YOUR_ACCESS_KEY and YOUR_SECRET_KEY with your actual Access Key and its corresponding Secret Key. These can be found on the DreamObjects page in your DreamHost panel.

    Region to connect to.
    Leave blank if you are using an S3 clone and you don't have a region.
    Enter a string value. Press Enter for the default ("").
    Choose a number from below, or type in your own value
    1 / Use this if unsure. Will use v4 signatures and an empty region.
    \ ""
    2 / Use this only if v4 signatures don't work, eg pre Jewel/v10 CEPH.
    \ "other-v2-signature"
    region>
  11. Press Enter to leave the region blank and continue.
    Endpoint for S3 API.
    Required when using an S3 clone.
    Enter a string value. Press Enter for the default ("").
    Choose a number from below, or type in your own value
    1 / Dream Objects endpoint
    \ "objects-us-west-1.dream.io"
    endpoint> objects-us-east-1.dream.io
  12. Enter DreamObjects' hostname (objects-us-east-1.dream.io) as the endpoint for the S3 API.
    Location constraint - must be set to match the Region.
    Leave blank if not sure. Used when creating buckets only.
    Enter a string value. Press Enter for the default ("").
    location_constraint>
  13. As with the region, leave this option blank and press Enter to continue.
    Canned ACL used when creating buckets and storing or copying objects.
    
    For more info visit https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
    
    Note that this ACL is applied when server side copying objects as S3
    doesn't copy the ACL from the source but rather writes a fresh one.
    Enter a string value. Press Enter for the default ("").
    Choose a number from below, or type in your own value
     1 / Owner gets FULL_CONTROL. No one else has access rights (default).
       \ "private"
     2 / Owner gets FULL_CONTROL. The AllUsers group gets READ access.
       \ "public-read"
       / Owner gets FULL_CONTROL. The AllUsers group gets READ and WRITE access.
     3 | Granting this on a bucket is generally not recommended.
       \ "public-read-write"
     4 / Owner gets FULL_CONTROL. The AuthenticatedUsers group gets READ access.
       \ "authenticated-read"
       / Object owner gets FULL_CONTROL. Bucket owner gets READ access.
     5 | If you specify this canned ACL when creating a bucket, Amazon S3 ignores it.
       \ "bucket-owner-read"
       / Both the object owner and the bucket owner get FULL_CONTROL over the object.
     6 | If you specify this canned ACL when creating a bucket, Amazon S3 ignores it.
       \ "bucket-owner-full-control"
    acl> private
  14. Set the canned ACL based on how you want to use rclone. This example uses "private".
    Edit advanced config? (y/n)
    y) Yes
    n) No
    y/n> n
  15. Choose whether you want to edit the advanced config. This example does not.
    Remote config
    --------------------
    [dreamobjects]
    type = s3
    provider = Dreamhost
    env_auth = false
    access_key_id = YOUR_ACCESS_KEY
    secret_access_key = YOUR_SECRET_KEY
    endpoint = objects-us-east-1.dream.io
    acl = private
    --------------------
    y) Yes this is OK
    e) Edit this remote
    d) Delete this remote
    y/e/d> y
  16. Enter 'y' to save the remote.
    Current remotes:
    
    Name                 Type
    ====                 ====
    dreamobjects         s3
    
    e) Edit existing remote
    n) New remote
    d) Delete remote
    r) Rename remote
    c) Copy remote
    s) Set configuration password
    q) Quit config
    e/n/d/r/c/s/q> q
  17. Review the list of current remotes. If everything looks correct, enter 'q' to quit the configuration wizard. If you would rather script this setup instead of using the wizard, see the sketch below.
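
If you prefer to set up the remote without walking through the interactive wizard, newer versions of rclone can also create it non-interactively. The command below is only a sketch using the same remote name, placeholder keys, and endpoint from the steps above; the exact parameter syntax (key=value pairs versus space-separated key value pairs) depends on your rclone version, so check rclone config create --help on your system first:

[user@localhost]$ rclone config create dreamobjects s3 provider=Dreamhost env_auth=false access_key_id=YOUR_ACCESS_KEY secret_access_key=YOUR_SECRET_KEY endpoint=objects-us-east-1.dream.io acl=private

You can confirm the result with rclone config show dreamobjects, and rclone config file prints the location of the configuration file itself.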

Using rclone

The following are some basic commands you can use with rclone.

With a remote configured, you can list the buckets in it with this command:

[user@localhost]$ rclone lsd dreamobjects:
          -1 2024-03-29 02:19:25        -1 samplebucket
          -1 2024-03-29 22:06:53        -1 anotherbucket
          -1 2024-03-29 21:33:25        -1 greatbucket

Make a new bucket:

[user@localhost]$ rclone mkdir dreamobjects:mynewbucket
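
If you only want to upload new or changed files without deleting anything already in the bucket, rclone's copy command (the "copy mode" mentioned in the overview) is the safer choice. This sketch reuses the example directory and bucket names from this article; substitute your own paths:

[user@localhost]$ rclone copy /home/localuser/directory dreamobjects:mynewbucket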

Sync /home/localuser/directory to the remote bucket, deleting any excess files in the bucket:

[user@localhost]$ rclone sync /home/localuser/directory dreamobjects:mynewbucket
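
Because sync deletes files in the destination that are not in the source, it can be worth previewing the operation first. rclone's --dry-run flag shows what would be copied or deleted without making any changes; this sketch simply reuses the paths from the example above:

[user@localhost]$ rclone sync --dry-run /home/localuser/directory dreamobjects:mynewbucket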

List the contents of a bucket:

[user@localhost]$ rclone ls dreamobjects:mynewbucket
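
Two other features listed in the overview, check mode and the optional FUSE mount, can also be used with this remote. The commands below are sketches based on the example names in this article; the mount point path is hypothetical and must be an existing empty directory, and rclone mount requires FUSE on Linux/macOS or WinFsp on Windows:

[user@localhost]$ rclone check /home/localuser/directory dreamobjects:mynewbucket
[user@localhost]$ rclone mount dreamobjects:mynewbucket /home/localuser/mountpoint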

For more examples of how to use the software, see rclone's official documentation at https://rclone.org/docs/.
