Getting Started with IPWorks Cloud

Requirements: IPWorks Cloud

IPWorks Cloud provides easy-to-use components for accessing cloud storage services including Amazon S3, Google Drive, Dropbox, Box.com, OneDrive, Wasabi, and more. The toolkit consists of components for each individual service, and a CloudStorage component with a unified API for accessing multiple services.

Contents

Amazon S3

The S3 Component allows easy access to Amazon's Simple Storage Service (S3) to manage the S3 resources. Capabilities include managing buckets, managing objects, and strong encryption support.

To begin, first create an Amazon S3 account service. Consult the Amazon S3 documentation for instructions on this process.

Authentication

Authentication is performed using the AccessKey and SecretKey provided by Amazon.

s3 = new S3();
s3.AccessKey = S3_ACCESS_KEY;
s3.SecretKey = S3_SECRET_KEY;

Managing Buckets

The ListBuckets method will return all the buckets for a provided account. When called the following fields will be populated:

  • BucketName
  • CreationDate
  • OwnerId
  • OwnerName
  • OtherData

s3.ListBuckets();
for (int i = 0; i < s3.Buckets.Count; i++) {
  Console.WriteLine(s3.Buckets[i].Name);
  Console.WriteLine(s3.Buckets[i].CreationDate);
  Console.WriteLine(s3.Buckets[i].OwnerDisplayName);
}

The S3 component allows for deletion of buckets by calling the DeleteBucket method and allows for the creation of a new bucket by calling the CreateBucket method.

Other features for managing buckets include:

  • UpdateBucketACL will update the access policy of the bucket.
  • GetBucketLocation will return the location value of the bucket.

Managing Objects

The ListObjects method will return all the objects within a given bucket. The bucket specified is set by the Bucket property. ObjectPrefix, ObjectDelimiter, and ObjectMarker can be used to filter or control the objects listed from the ListObjects method.

When called the following fields will be populated:

  • ObjectName
  • ObjectModifiedDate
  • ObjectSize
  • ETag
  • OwnerId
  • OwnerName
  • UploadId
  • OtherData

s3.Bucket = "TEST_BUCKET";

//set prefix
s3.ObjectPrefix = "photos/2016/";

s3.ListObjects();
for (int i = 0; i < s3.Objects.Count; i++) {
  Console.WriteLine(s3.Objects[i].Name);
  Console.WriteLine(s3.Objects[i].LastModified);
  Console.WriteLine(s3.Objects[i].Size);
}

The S3 component allows for deletion of objects by calling the DeleteObject method and allows for the creation of a new object by calling the CreateObject.

In order to retrieve a desired object, call GetObject. This will store the object in the file specified by the LocalFile property. A bucket name must also be set using the Bucket property.

s3.Bucket = "TEST_BUCKET";
s3.LocalFile = "C:\\testFile.txt";
s3.GetObject("testObject");

Other features for managing objects include:

  • UpdateObjectACL updates the access policy of an object.
  • GetTorrent retrieves an object from a bucket as a torrent.
  • GetObjectInfo stores the meta-data in the ParsedHeaders property.
  • AddUserMetaData allows for custom meta data to be associated with an object. Before calling CreateObject use the AddUserMetaData to add up to 2K of user meta data.
  • GetLink creates an authenticated link to allow access to objects.

Managing Uploads

StartMultiPartUpload begins a multipart upload. This method will initiate a multipart upload and return the UploadId associated with the upload. There is no expiration of the upload. The upload must be completed by calling CompleteMultipartUpload or aborted by calling AbortMultipartUpload.

The UploadId returned by this method is used to reference the upload when uploading parts via UploadPart and other methods such as AbortMultipartUpload, CompleteMultipartUpload, and ListParts.

//start to upload parts
s3.Bucket = "TEST_BUCKET";
String uploadId = s3.StartMultipartUpload("test_file.dat");

//list the current multipart uploads
s3.ListMultipartUploads();
for (int i = 0; i < s3.Objects.Count; i++) {
  Console.WriteLine(s3.Objects[i].Name);
  Console.WriteLine(s3.Objects[i].UploadId);
  Console.WriteLine(s3.Objects[i].Size);
}
s3.CompleteMultipartUpload("test_file.dat", uploadId);

The ListParts method lists all the parts in a current multipart upload. The PartList event will be fire once for each part in the current upload as reported by the server. The Parts collection will also be populated. Parts may be inspected to determine various information such as ETag, PartNumber, ObjectName, etc.

By default, only the first 1000 parts will be returned. To determine if the results are paged check IsPaged. If the results are paged continue to call ListParts to obtain the next set of parts until IsPaged returns false. The Parts collection will then contain all of the parts for the upload. To change the default maximum number of parts to be returned set MaxParts.

s3.Bucket = "TEST_BUCKET";
String uploadId = s3.StartMultipartUpload("test_file.dat");

//upload parts 
s3.UploadPart("test_file.dat", 1, uploadId);
s3.UploadPart("test_file.dat", 2, uploadId);

//list all the parts 
s3.ListParts("test_file.dat", uploadId);
for (int i = 0; i < s3.Parts.Count; i++) {
  Console.WriteLine(s3.Parts[i].ObjectName);
  Console.WriteLine(s3.Parts[i].PartNumber);
  Console.WriteLine(s3.Parts[i].Size);
}
String uploadId = s3.StartMultipartUpload("test_file.dat");

Additional Functionality

The S3 component offers advanced functionality. For instance:

  • Encrypt and decrypt files using EncryptionAlgorithm and EncryptionPassword.
  • Manage bucket and object ACLs with UpdateBucketACL and UpdateObjectACL.
  • Use CopyObject to copy objects on the server.
  • And more!

Azure Blob

The Blob component provides an easy to use interface to Microsoft's Azure Blob Service, which allows you to store text and binary data. The Blob service offers the following three resources: the storage account, containers, and blobs. Within your storage account, containers provide a way to organize sets of blobs.

To begin, first sign up for the Azure Blob Service. Consult the Microsoft Azure documentation for instructions on this process.

Authentication

Authentication is performed using the Account and AccessKey provided by Microsoft Azure.

blob = new Azureblob();
blob.Account = BLOB_ACCOUNT;
blob.AccessKey = BLOB_ACCESS_KEY;

Managing Containers

The ListContainers method will return all the containers for the provided account. Prefix can be used to filter the containers listed by this method.

If there are more than MaxResults results, Marker will be populated with the marker identifying the position in the results. Subsequent ListContainers calls will return the next portion of results. If Marker is an empty string, the end of the list has been reached.

When called the following fields will be populated:

  • Name
  • ETag
  • LastModified
  • URL

blob.ListContainers();
for (int i = 0; i < blob.Containers.Count; i++) {
  Console.WriteLine(blob.Containers[i].Name);
  Console.WriteLine(blob.Containers[i].LastModified);
  Console.WriteLine(blob.Containers[i].URL);
}

The Blob component allows for deletion of containers by calling the DeleteContainer method, and allows for the creation of a new container by calling the CreateContainer method.

Other features for managing containers include:

  • GetContainerACL/SetContainerACL for working with container access policies
  • GetContainerMetadata/SetContainerMetadata for working with container metadata

Managing Blobs

The ListBlobs method will list the blobs within the container specified by ContainerName. Prefix and BlobDelimiter can be used to filter the blobs listed by this method.

If there are more than MaxResults results, Marker will be populated with the marker identifying the position in the results. Subsequent ListBlobs calls will return the next portion of results. If Marker is an empty string, the end of the list has been reached.

When called the following fields will be populated:

  • Name
  • ETag
  • LastModified
  • URL
  • BlobType
  • LeaseStatus
  • CacheControl
  • ContentEncoding
  • ContentLanguage
  • ContentLength
  • ContentMD5

blob.ContainerName = "TEST_CONTAINER";
blob.ListBlobs();
for (int i = 0; i < blob.Blobs.Count; i++) {
  Console.WriteLine(blob.Blobs[i].Name);
  Console.WriteLine(blob.Blobs[i].LastModified);
  Console.WriteLine(blob.Blobs[i].ContentLength);
}

The Blob component allows for deletion of blobs by calling the DeleteBlob method, and allows for the creation of a new blob by calling the CreateBlob method.

In order to retrieve a desired blob, call GetBlob. This will store the blob in the file specified by LocalFile, unless it is set to an empty string, in which case it will be stored in BlobData.

Other features for managing containers include:

  • CopyBlob to copy a source blob to a destination blob within the storage account
  • GetBlobMetadata/SetBlobMetadata for working with blob metadata
  • LeaseBlob for working with blob leases

Managing Blocks

The ListBlocks method will list the blocks that have been uploaded as part of the blob specified by the blobName parameter. Prefix and BlobDelimiter can be used to filter the blobs listed by this method.

There are two types of block lists maintained for a blob, Committed and Uncommitted. The committed block list contains a list of blocks that have been successfully committed using the PutBlockList method. The uncommitted block list contains a list of blocks that have been uploaded, using the CreateBlock method, but have not been committed.

The blockListType parameter allows you to specify which of these lists should be returned. The possible values are:

  • 0 - Only the committed block list is retrieved
  • 1 - Only the uncommitted block list is retrieved
  • 2 - Both the committed and the uncommitted block lists are retrieved

When called the following fields will be populated:

  • BlockType
  • Id
  • Size

blob.ListBlocks("BLOB_NAME", 2);
for (int i = 0; i < blob.Blobs.Count; i++) {
  Console.WriteLine(blob.Blobs[i].BlockType);
  Console.WriteLine(blob.Blobs[i].Id);
  Console.WriteLine(blob.Blobs[i].Size);
}

The Blob component allows for the creation of a new block by calling the CreateBlock method. The AddBlock method is used to add a block to the list of blocks that will be committed to form a blob. The PutBlockList method is used to commit the current block list in order to create/update a blob.

Additional Functionality

The Blob component offers additional functionality, such as:

  • GetLink to create a link to access a blob
  • CreateSnapshot to create snapshots of blobs
  • And more!

Box.com

The Box component provides a simple interface to working with Box.com. Capabilities include uploading and downloading files, strong encryption support, creating folders, moving and copying resources, and more.

Authentication

This component supports authentication via OAuth 2.0. First, perform OAuth authentication using the OAuth component or a separate process. Once complete you should have an authorization string which looks like: Bearer ya29.AHES6ZSZEJzATdZYjeihDn5W-VrXSsxEZu5p0pclxGdKKQ

Assign this value to the Authorization property before attempting any operations. Consult the documentation for the service for more information about supported scope values and more details on OAuth authentication.

Listing Resources

ListResources() lists resources within the specified folder. Calling this method will fire the ResourceList event once for each resource, and will also populate the Resources collection.

If there are still more resources available to list when this method returns, the ResourceMarker property will be populated. Continue to call this method until ResourceMarker is empty to accumulate all pages of results in the Resources collection.

// ResourceList event handler.
box.OnResourceList += (s, e) => {
  Console.WriteLine(e.Name);
};

do {
  box.ListResources("d:123456");

  for (int i = 0; i < box.Resources.Count; i++) {
    // Process resources here.
  }
} while (!string.IsNullOrEmpty(box.ResourceMarker));

Downloading Files

The DownloadFile() method downloads file resources.

If a stream has been specified using SetDownloadStream(), the file data will be sent through it. If a stream is not specified, and LocalFile is set, the file will be saved to the specified location; otherwise, the file data will be held by ResourceData.

To download and decrypt an encrypted file, set EncryptionAlgorithm and EncryptionPassword before calling this method.

In the simplest use-case, downloading a file looks like this:

box.LocalFile = "../MyFile.zip";
box.DownloadFile(box.Resources[0].Id);

Resuming Downloads
The component also supports resuming failed downloads by using the StartByte property. If a download is interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

string downloadFile = "../MyFile.zip";
box.LocalFile = downloadFile;
box.DownloadFile(box.Resources[0].Id);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially download file
box.StartByte = new FileInfo(downloadFile).Length;
box.DownloadFile(box.Resources[0].Id);

Resuming Encrypted File Downloads
Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt.

If LocalFile is set when beginning an encrypted download, the component creates a temporary file in TempPath to hold the encrypted data until the download is complete. If the download is interrupted, DownloadTempFile will be populated with the path of the temporary file that holds the partial data.

To resume, DownloadTempFile must be populated, along with StartByte, to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

box.LocalFile = "../MyFile.zip";
box.EncryptionPassword = "password";
box.DownloadFile(box.Resources[0].Id);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially download temp file
box.StartByte = new FileInfo(box.Config("DownloadTempFile")).Length;
box.DownloadFile(box.Resources[0].Id);

Uploading Files

The UploadFile() method uploads new file resources.

If SetUploadStream() has been used to set an upload stream, it will take priority as the file data source. If LocalFile is set the file will be uploaded from the specified path. If LocalFile is not set the data in ResourceData will be used.

To encrypt the file before uploading it, set EncryptionAlgorithm and EncryptionPassword.

Box offers two ways to upload a file. For smaller files a simple upload option is provided to upload data in one request. This is the default option. For larger files (must be larger than 20 MB), uploads can be fragmented into multiple pieces, allowing resuming of uploads that may be interrupted.

Simple
By default the component uses the simple upload mechanism. When doing a simple upload, the HashSimpleUploads setting is applicable.

box.LocalFile = "../MyFile.zip";
box.UploadFile("MyFile.zip", "");

Resumable
To enable resumable uploads set UseResumableUpload to True. This is recommended for large files (must be larger than 20 MB). The component will automatically fragment the specified file into smaller pieces and upload each individually.

When UseResumableUpload is set to True and UploadFile() is called, a resumable upload session is started by the component. ResumeURL is populated with a URL identifying the session (this value may be needed for additional operations if the upload does not complete normally).

During a resumable upload, the FragmentComplete event fires after each fragment is uploaded to indicate overall progress. The component also updates StartByte as necessary to indicate the current offset in the file.

If the upload is interrupted for any reason, resuming it is easy. First, verify that ResumeURL is populated (if the same instance of the component is used, it should already be populated, and no special action should be needed). If uploading from a stream, be sure to reset its position to where it was the first time the upload was started (typically the beginning). Call PollUploadStatus() to populate the correct values for StartByte and UploadFragmentSize. Then call UploadFile() again to resume the upload at the specified StartByte offset.

Note that if the upload is not resumed after some time the upload session will expire. PollUploadStatus() may be used to check the status of a resumable upload, including when it will expire (which is stored in the UploadExpDate configuration setting). An interrupted upload can be aborted explicitly using the AbortUpload() method.

box.LocalFile = "../MyFile.zip";
box.UploadFile("MyFile.zip", "");

//The transfer is interrupted and UploadFile() above fails. Later, resume the download.
//Using the same instance StartByte and ResumeURL are already populated from the previous
//upload attempt.
box.UploadFile("MyFile.zip", "");
MemoryStream uploadStream = new MemoryStream(File.ReadAllBytes("../MyFile.zip"));
box.SetUploadStream(uploadStream);
box.UploadFile("MyFile.zip", "");

//The transfer is interrupted and UploadFile() above fails. Later, resume the download.
//Using the same instance StartByte and ResumeURL are already populated from the previous 
//upload attempt.
//You MUST reset the stream's position to where it was when you first started the upload!
uploadStream.Position = 0;
box.UploadFile("MyFile.zip", "");

Additional Functionality

The Box component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using the EncryptionAlgorithm and EncryptionPassword properties.
  • Basic file and folder manipulation and organization using methods such as CopyResource(), CreateFolder(), DeleteResource(), MoveResource(), and RestoreResource().
  • Support for resource sharing using CreateLink() and RevokeLink().
  • Resource metadata management with CreateMetadata(), ListMetadata(), UpdateMetadata(), and DeleteMetadata().
  • Advanced resource listing using the Search() method.
  • Retrieval of account and space usage details using GetAccountInfo().
  • File version handling with ListVersions() and PromoteVersion().
  • And more!

CloudStorage

The CloudStorage component provides a single interface that can be used to work with a variety of services. Capabilities include uploading and downloading files, strong encryption support, creating folders, and more.

By supporting multiple providers with a single API code may be written once and used to support multiple services. The following providers are currently supported by this component:

  • Amazon S3
  • Azure Blob
  • Box.com
  • DigitalOcean
  • Dropbox
  • Google Cloud Storage
  • Google Drive
  • HadoopDFS
  • OneDrive
  • Wasabi

To begin, first create an account and register your application with the desired provider(s). Consult each provider's service documentation for instructions on this process.

Authentication

Depending on the provider, authentication is either handled by setting the Authorization property or by configuring various fields on the Account property. Please refer to the CloudStorage component's documentation for more information on how to authenticate for each provider.

Selecting a Provider

To specify the provider simply set ServiceProvider. This tells the component to which service requests will be made.

Listing Files and Folders

ListDirectory lists files and folder the path specified by RemotePath. A file mask may optionally be supplied in RemoteFile.

The directory entries are provided through the DirList event and also via the DirList property.

cloudstorage.RemotePath = "MyFolder";
cloudstorage.ListDirectory();
for (int i = 0; i < cloudstorage.DirList.Count; i++) {
  Console.WriteLine(cloudstorage.DirList[i].FileName);
  Console.WriteLine(cloudstorage.DirList[i].FileSize);
  Console.WriteLine(cloudstorage.DirList[i].FileTime);
  Console.WriteLine(cloudstorage.DirList[i].IsDir);
}

Optionally set RemoteFile to a file mask to list only specific files. For instance:

cloudstorage.RemoteFile = "*.txt";
cloudstorage.ListDirectory();

Downloading Files

The Download method downloads a specific file.

Set RemoteFile to the name the file to download before calling this method. If RemoteFile only specifies a filename it will be downloaded from the path specified by RemotePath. RemoteFile may also be set to an absolute path.

The file will be downloaded to the stream specified (if any) by SetDownloadStream. If a stream is not specified and LocalFile is set the file will be saved to the specified location. If a stream is not specified and LocalFile is not set the file data will be held by ResourceData.

To decrypt an encrypted file set EncryptionAlgorithm and EncryptionPassword before calling this method.

cloudstorage.RemotePath = "My Folder";
cloudstorage.RemoteFile = "MyFile.zip";
cloudstorage.LocalFile = "../MyFile.zip";
cloudstorage.Download();

Resuming Downloads

The component also supports resuming failed downloads by using the StartByte property. If the download was interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

cloudstorage.RemotePath = myRemoteFolder;
cloudstorage.RemoteFile = myRemoteFile;
cloudstorage.LocalFile = downloadFile;
cloudstorage.Download();

//The transfer is interrupted and Download() above fails. Later, resume the download:

//Get the size of the partially download file
cloudstorage.StartByte = new FileInfo(downloadFile).Length; 
cloudstorage.RemotePath = myRemoteFolder;
cloudstorage.RemoteFile = myRemoteFile;
cloudstorage.LocalFile = downloadFile;
cloudstorage.Download();

Resuming Encrypted File Downloads

Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt. When beginning an encrypted download if LocalFile is set the component will create a temporary file in TempPath to hold the encrypted data until it is complete.

If the download is interrupted DownloadTempFile will be populated with the temporary file holding the partial data. When resuming, DownloadTempFile must be populated along with StartByte to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

cloudstorage.RemotePath = myRemoteFolder;
cloudstorage.RemoteFile = myRemoteFile;
cloudstorage.LocalFile = downloadFile;
cloudstorage.EncryptionPassword = "password";
cloudstorage.Download();

//The transfer is interrupted and Download() above fails. Later, resume the download:

//Get the size of the partially download temp file
cloudstorage.StartByte = new FileInfo(cloudstorage.Config("DownloadTempFile")).Length; 
cloudstorage.RemotePath = myRemoteFolder;
cloudstorage.RemoteFile = myRemoteFile;
cloudstorage.LocalFile = downloadFile;
cloudstorage.EncryptionPassword = "password";
cloudstorage.Download();

Uploading Files

The Upload method is used to upload files. If SetUploadStream is used to set an upload stream the data to upload is taken from the stream instead.

RemoteFile should be set to either a relative or absolute path. If RemoteFile is not an absolute path it will be uploaded relative to RemotePath.

To encrypt a file before uploading set EncryptionAlgorithm and EncryptionPassword.

Note: Resuming uploads is not currently supported.

//Upload with a relative path
cloudstorage.LocalFile = "C:\localfile.txt"
cloudstorage.RemoteFile = "remotefile.txt"
cloudstorage.Upload()

//Upload with an absolute path
cloudstorage.LocalFile = "C:\localfile2.txt"
cloudstorage.RemoteFile = "/folder/remotefile2.txt"
cloudstorage.Upload()

Additional Functionality

The CloudStorage component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using EncryptionAlgorithm and EncryptionPassword.
  • DeleteFile provides a way to delete files.
  • MakeDirectory and RemoveDirectory support creating and deleting folders.
  • RenameFile allows renaming of existing files on the server.
  • And more!

Dropbox

The Dropbox component provides a simple interface to working with Dropbox. Capabilities include uploading and downloading files, strong encryption support, creating folders, moving and copying resources, and more.

Authentication

This component supports authentication via OAuth 2.0. First, perform OAuth authentication using the OAuth component or a separate process. Once complete you should have an authorization string which looks like: Bearer ya29.AHES6ZSZEJzATdZYjeihDn5W-VrXSsxEZu5p0pclxGdKKQ

Assign this value to the Authorization property before attempting any operations. Consult the documentation for the service for more information about supported scope values and more details on OAuth authentication.

Addressing Resources

Dropbox typically allows resources to be addressed in multiple ways:

  • Using a path (e.g., /path/to/resource.txt).
  • Using a resource Id (e.g, id:xxxxx).
  • Using an Id-based relative path (e.g., id:xxxxx/relative/path/test.txt, where the Id is that of a folder resource).
  • For certain methods, using a revision Id (e.g., rev:xxxxx).
The documentation for this component's methods will always note which of the above options are acceptable for each applicable method parameter.

Listing Resources

ListResources() lists resources within the specified folder. Calling this method will fire the ResourceList event once for each resource, and will also populate the Resources collection.

If there are still more resources available to list when this method returns, the ResourceMarker property will be populated. Continue to call this method until ResourceMarker is empty to accumulate all pages of results in the Resources collection.

// ResourceList event handler.
dropbox.OnResourceList += (s, e) => {
  Console.WriteLine(e.Name);
};

do {
  dropbox.ListResources("/work_files/serious_business/cats");

  for (int i = 0; i < dropbox.Resources.Count; i++) {
    // Process resources here.
  }
} while (!string.IsNullOrEmpty(dropbox.ResourceMarker));

Downloading Files

The DownloadFile() method downloads file resources.

If a stream has been specified using SetDownloadStream(), the file data will be sent through it. If a stream is not specified, and LocalFile is set, the file will be saved to the specified location; otherwise, the file data will be held by ResourceData.

To download and decrypt an encrypted file, set EncryptionAlgorithm and EncryptionPassword before calling this method.

In the simplest use-case, downloading a file looks like this:

dropbox.LocalFile = "../MyFile.zip";
dropbox.DownloadFile(dropbox.Resources[0].Id);

Resuming Downloads
The component also supports resuming failed downloads by using the StartByte property. If a download is interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

string downloadFile = "../MyFile.zip";
dropbox.LocalFile = downloadFile;
dropbox.DownloadFile(dropbox.Resources[0].Id);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially download file
dropbox.StartByte = new FileInfo(downloadFile).Length;
dropbox.DownloadFile(dropbox.Resources[0].Id);

Resuming Encrypted File Downloads
Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt.

If LocalFile is set when beginning an encrypted download, the component creates a temporary file in TempPath to hold the encrypted data until the download is complete. If the download is interrupted, DownloadTempFile will be populated with the path of the temporary file that holds the partial data.

To resume, DownloadTempFile must be populated, along with StartByte, to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

dropbox.LocalFile = "../MyFile.zip";
dropbox.EncryptionPassword = "password";
dropbox.DownloadFile(dropbox.Resources[0].Id);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially download temp file
dropbox.StartByte = new FileInfo(dropbox.Config("DownloadTempFile")).Length;
dropbox.DownloadFile(dropbox.Resources[0].Id);

Uploading Files

The UploadFile() method uploads new file resources.

If SetUploadStream() has been used to set an upload stream, it will take priority as the file data source. If LocalFile is set the file will be uploaded from the specified path. If LocalFile is not set the data in ResourceData will be used.

To encrypt the file before uploading it, set EncryptionAlgorithm and EncryptionPassword.

Dropbox offers two ways to upload a file. For smaller files a simple upload option is provided to upload data in one request. This is the default option. For larger files, uploads can be fragmented into multiple pieces, allowing resuming of uploads that may be interrupted.

Simple
By default the component uses the simple upload mechanism.

dropbox.LocalFile = "../MyFile.zip";
dropbox.UploadFile("/MyFile.zip");

Resumable
To enable resumable uploads set UseResumableUpload to True. This is recommended for large files. The component will automatically fragment the specified file into smaller pieces and upload each individually. FragmentSize may be set to specify the size of the fragment if desired. The default fragment size is 10 MB.

When UseResumableUpload is set to True and UploadFile() is called, a resumable upload session is started by the component. UploadSessionId is populated with a resumable upload session Id identifying the session (this value may be needed for additional operations if the upload does not complete normally).

During a resumable upload, the FragmentComplete event fires after each fragment is uploaded to indicate overall progress. The component also updates StartByte as necessary to indicate the current offset in the file.

If the upload is interrupted for any reason, resuming it is easy. First, verify that UploadSessionId and StartByte are populated (if the same instance of the component is used, they should already be populated, and no special action should be needed). If uploading from a stream, be sure to reset its position to where it was the first time the upload was started (typically the beginning). Then call UploadFile() again to resume the upload at the specified StartByte offset.

Note that if the upload is not resumed after some time the upload session will expire.

dropbox.LocalFile = "../MyFile.zip";
dropbox.UploadFile("MyFile.zip");

//The transfer is interrupted and UploadFile() above fails. Later, resume the download.
//Using the same instance StartByte and ResumeURL are already populated from the previous
//upload attempt.
dropbox.UploadFile("MyFile.zip");
MemoryStream uploadStream = new MemoryStream(File.ReadAllBytes("../MyFile.zip"));
dropbox.SetUploadStream(uploadStream);
dropbox.UploadFile("MyFile.zip");

//The transfer is interrupted and UploadFile() above fails. Later, resume the download.
//Using the same instance StartByte and ResumeURL are already populated from the previous 
//upload attempt.
//You MUST reset the stream's position to where it was when you first started the upload!
uploadStream.Position = 0;
dropbox.UploadFile("MyFile.zip");

Additional Functionality

The Dropbox component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using the EncryptionAlgorithm and EncryptionPassword properties.
  • Basic file and folder manipulation and organization using methods such as CopyResource(), CreateFolder(), DeleteResource(), and MoveResource().
  • Support for resource sharing using CreateLink(), ListSharedLinks(), and RevokeLink().
  • Change tracking with ListChanges() and WaitForChanges().
  • Advanced resource listing using the Search() method.
  • Retrieval of account and space usage details using GetAccountInfo().
  • File revision handling with ListRevisions() and RestoreResource().
  • And more!
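As a hedged sketch of the file-organization methods listed above (the exact parameter lists are assumptions; consult each method's documentation for the actual signatures), basic organization might look like:

```csharp
// Create a folder, copy a file into it, then delete the original.
// (Parameter shapes are assumptions, not confirmed signatures.)
dropbox.CreateFolder("/archive");
dropbox.CopyResource("/MyFile.zip", "/archive/MyFile.zip");
dropbox.DeleteResource("/MyFile.zip");
```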

Google Drive

The GoogleDrive component provides an easy-to-use interface for Google Drive. Capabilities include uploading and downloading files, file and folder manipulation and organization, Google Team Drive support, strong client-side file encryption functionality, and more.

Authentication

This component supports authentication via OAuth 2.0. First, perform OAuth authentication using the OAuth component or a separate process. Once complete you should have an authorization string which looks like: Bearer ya29.AHES6ZSZEJzATdZYjeihDn5W-VrXSsxEZu5p0pclxGdKKQ

Assign this value to the Authorization property before attempting any operations. Consult the documentation for the service for more information about supported scope values and more details on OAuth authentication.
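For example, assigning the authorization string obtained above (the token here is the placeholder value shown earlier, not a real credential):

```csharp
// Assign the OAuth authorization string before performing any operations.
googledrive.Authorization = "Bearer ya29.AHES6ZSZEJzATdZYjeihDn5W-VrXSsxEZu5p0pclxGdKKQ";
```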

Listing Resources

The ListResources() method is used to list resources within the scope specified by the ListResourcesScope property. The ListChildren(), ListParents(), and GetResourceInfo() methods are also available for finer-grained control.

// ResourceList event handler.
googledrive.OnResourceList += (s, e) => {
  Console.WriteLine(e.Name);
};

// List all of the current user's resources.
googledrive.ListResourcesScope = GoogledriveListResourcesScopes.lrsUser;
do {
  googledrive.ListResources();

  for (int i = 0; i < googledrive.Resources.Count; i++) {
    // Process resources here.
  }
} while (!string.IsNullOrEmpty(googledrive.ResourceMarker));

// List all of the resources in the specified team drive.
googledrive.TeamDrive = "qwerty1234567890";
googledrive.ListResourcesScope = GoogledriveListResourcesScopes.lrsTeamDrive;
do {
  googledrive.ListResources();

  for (int i = 0; i < googledrive.Resources.Count; i++) {
    // Process resources here.
  }
} while (!string.IsNullOrEmpty(googledrive.ResourceMarker));

Downloading Files

The DownloadFile() method is used to download files.

Downloading an Encrypted File
To decrypt an encrypted file, set EncryptionAlgorithm and EncryptionPassword before calling this method.

googledrive.LocalFile = "../MyFile.zip";
googledrive.DownloadFile(googledrive.Resources[0].Id, "");

Resuming Downloads
The component also supports resuming failed downloads by using the StartByte property. If a download is interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

string downloadFile = "../MyFile.zip";
googledrive.LocalFile = downloadFile;
googledrive.DownloadFile(googledrive.Resources[0].Id, "");

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded file
googledrive.StartByte = new FileInfo(downloadFile).Length;
googledrive.DownloadFile(googledrive.Resources[0].Id, "");

Resuming Encrypted File Downloads
Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt.

If LocalFile is set when beginning an encrypted download, the component creates a temporary file in TempPath to hold the encrypted data until the download is complete. If the download is interrupted, DownloadTempFile will be populated with the path of the temporary file that holds the partial data.

To resume, DownloadTempFile must be populated, along with StartByte, to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

googledrive.LocalFile = "../MyFile.zip";
googledrive.EncryptionPassword = "password";
googledrive.DownloadFile(googledrive.Resources[0].Id, "");

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded temp file
googledrive.StartByte = new FileInfo(googledrive.Config("DownloadTempFile")).Length;
googledrive.DownloadFile(googledrive.Resources[0].Id, "");

Uploading Files

The UploadFile() method is used to upload files.

Google Drive offers two ways to upload a file. For smaller files a simple upload option is provided to upload data in one request. This is the default option. For larger files, uploads can be fragmented into multiple pieces, allowing resuming of uploads that may be interrupted.

Simple
By default the component uses the simple upload mechanism.

googledrive.LocalFile = "../MyFile.zip";
googledrive.UploadFile("MyFile.zip", "");

Resumable
To enable resumable uploads set UseResumableUpload to True. This is recommended for large files. The component will automatically fragment the specified file into smaller pieces and upload each individually. FragmentSize may be set to specify the size of the fragment if desired. The default fragment size is 10 MB.

When UseResumableUpload is set to True and UploadFile() is called, a resumable upload session is started by the component. ResumeURL is populated with a URL identifying the session (this value may be needed for additional operations if the upload does not complete normally).

During a resumable upload, the FragmentComplete event fires after each fragment is uploaded to indicate overall progress. The component also updates StartByte as necessary to indicate the current offset in the file.

If the upload is interrupted for any reason, resuming it is easy. First, verify that ResumeURL and StartByte are populated (if the same instance of the component is used, they should already be populated, and no special action should be needed). If uploading from a stream, be sure to reset its position to where it was the first time the upload was started (typically the beginning). Then call UploadFile() again to resume the upload at the specified StartByte offset.

Note that if the upload is not resumed after some time, the upload session will expire. GetUploadStatus() may be used to check the status of a resumable upload.

googledrive.LocalFile = "../MyFile.zip";
googledrive.UploadFile("MyFile.zip", "");

//The transfer is interrupted and UploadFile() above fails. Later, resume the upload.
//Using the same instance, StartByte and ResumeURL are already populated from the
//previous upload attempt.
googledrive.UploadFile("MyFile.zip", "");

Or, when uploading from a stream:

MemoryStream uploadStream = new MemoryStream(File.ReadAllBytes("../MyFile.zip"));
googledrive.SetUploadStream(uploadStream);
googledrive.UploadFile("MyFile.zip", "");

//The transfer is interrupted and UploadFile() above fails. Later, resume the upload.
//Using the same instance, StartByte and ResumeURL are already populated from the
//previous upload attempt.
//You MUST reset the stream's position to where it was when you first started the upload!
uploadStream.Position = 0;
googledrive.UploadFile("MyFile.zip", "");

Team Drive Support

The GoogleDrive component has full support for Google Team Drives. For the most common use-cases (such as those described above), there is very little difference when using a team drive versus a personal Google Drive ("My Drive").

For more information about how to use the component with Google Team Drives, refer to the team-drive-specific documentation sections for commonly used methods such as ListResources(), MoveResource(), and UpdatePermissions(), and browse the documentation for team-drive-specific methods and properties such as AddTeamDriveMember(), ListTeamDrives(), and TeamDrive.

Additional Functionality

The GoogleDrive component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using the EncryptionAlgorithm and EncryptionPassword properties.
  • Basic file and folder manipulation and organization using methods such as CopyResource(), CreateFolder(), DeleteResource(), MoveResource(), and UpdateResource().
  • Enumeration and manipulation of parent-child relationships using the AddParents(), ListChildren(), ListParents(), and RemoveParents() methods.
  • Resource trashing and deletion using DeleteResource(), TrashResource(), and RestoreResource().
  • Control over permissions using ListPermissions() and UpdatePermissions().
  • Change tracking with ListChanges().
  • And more!
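As a hedged sketch of trashing and restoring a resource (the parameter shape is an assumption modeled on the download examples above; consult each method's documentation):

```csharp
// Move a resource to the trash, then restore it.
// (Passing the resource Id is an assumption, not a confirmed signature.)
googledrive.TrashResource(googledrive.Resources[0].Id);
googledrive.RestoreResource(googledrive.Resources[0].Id);
```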

HadoopDFS

The HadoopDFS component offers an easy-to-use API compatible with any Hadoop distributed file system (HDFS) cluster that exposes Hadoop's standard WebHDFS REST API. Capabilities include uploading and downloading files, strong encryption support, creating folders, file manipulation and organization, and more.

Authentication

First, set the URL property to the base WebHDFS URL of the server (see the URL property's documentation for more details).

Depending on how the server is configured, there are a few different authentication methods that might be used, or the server might not require authentication at all. Refer to the AuthScheme property's documentation for more information about configuring the component to authenticate correctly.

Addressing Resources

HDFS addresses resources (files, directories, and symlinks) using Linux-style absolute paths. Unless otherwise specified, the component always works in terms of absolute paths, and will always prepend a forward slash (/) to any path passed to it that does not already start with one.
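For example, per the path behavior described above, these two calls address the same directory:

```csharp
// Both calls refer to the same absolute path;
// a leading slash is prepended automatically if missing.
hdfs.ListResources("/work_files/cats");
hdfs.ListResources("work_files/cats");
```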

Listing Directory Contents

ListResources() lists resources (files, directories, and symlinks) within the specified directory. Calling this method will fire the ResourceList event once for each resource, and will also populate the Resources collection.

// ResourceList event handler.
hdfs.OnResourceList += (s, e) => {
  Console.WriteLine(e.Name);
};

hdfs.ListResources("/work_files/serious_business/cats");

for (int i = 0; i < hdfs.Resources.Count; i++) {
  // Process resources here.
}

Downloading Files

The DownloadFile() method downloads files.

If a stream has been specified using SetDownloadStream(), the file data will be sent through it. If a stream is not specified, and LocalFile is set, the file will be saved to the specified location; otherwise, the file data will be held by ResourceData.

To download and decrypt an encrypted file, set EncryptionAlgorithm and EncryptionPassword before calling this method.

In the simplest use-case, downloading a file looks like this:

hdfs.LocalFile = "../MyFile.zip";
hdfs.DownloadFile(hdfs.Resources[0].Path);

Resuming Downloads
The component also supports resuming failed downloads by using the StartByte property. If a download is interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

string downloadFile = "../MyFile.zip";
hdfs.LocalFile = downloadFile;
hdfs.DownloadFile(hdfs.Resources[0].Path);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded file
hdfs.StartByte = new FileInfo(downloadFile).Length;
hdfs.DownloadFile(hdfs.Resources[0].Path);

Resuming Encrypted File Downloads
Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt.

If LocalFile is set when beginning an encrypted download, the component creates a temporary file in TempPath to hold the encrypted data until the download is complete. If the download is interrupted, DownloadTempFile will be populated with the path of the temporary file that holds the partial data.

To resume, DownloadTempFile must be populated, along with StartByte, to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

hdfs.LocalFile = "../MyFile.zip";
hdfs.EncryptionPassword = "password";
hdfs.DownloadFile(hdfs.Resources[0].Path);

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded temp file
hdfs.StartByte = new FileInfo(hdfs.Config("DownloadTempFile")).Length;
hdfs.DownloadFile(hdfs.Resources[0].Path);

Uploading Files

The UploadFile() method uploads new files.

If SetUploadStream() has been used to set an upload stream, it will take priority as the file data source. If LocalFile is set the file will be uploaded from the specified path. If LocalFile is not set the data in ResourceData will be used.

To encrypt the file before uploading it, set EncryptionAlgorithm and EncryptionPassword.

hdfs.LocalFile = "../MyFile.zip";
hdfs.UploadFile("/MyFile.zip");
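To encrypt the file before uploading it, set the encryption properties first, mirroring the encrypted-download examples above:

```csharp
// Encrypt the file data client-side before it is uploaded.
hdfs.EncryptionPassword = "password";
hdfs.LocalFile = "../MyFile.zip";
hdfs.UploadFile("/MyFile.zip");
```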

Additional Functionality

The HadoopDFS component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using the EncryptionAlgorithm and EncryptionPassword properties.
  • Basic file and folder manipulation and organization using methods such as AppendFile(), DeleteResource(), MakeDirectory(), MoveResource(), and TruncateFile().
  • Advanced file and directory manipulation with SetFileReplication(), SetOwner(), SetPermission(), and SetTimes().
  • Retrieval of both general file/directory information, as well as directory quota information, using GetResourceInfo() and GetDirSummary().
  • Execute any arbitrary WebHDFS operation with ease using the DoCustomOp() method.
  • And more!
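As a hedged sketch of basic directory manipulation with the methods listed above (the parameter lists are assumptions; consult each method's documentation for the actual signatures):

```csharp
// Create a directory, then move a file into it.
// (Parameter shapes are assumptions, not confirmed signatures.)
hdfs.MakeDirectory("/archive");
hdfs.MoveResource("/MyFile.zip", "/archive/MyFile.zip");
```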

OneDrive

The OneDrive component provides a simple interface to working with Microsoft OneDrive. Capabilities include uploading and downloading files, strong encryption support, creating folders, moving and copying resources, OneDrive for Business and SharePoint Online support, and more.

Authentication

This component supports authentication via OAuth 2.0. First, perform OAuth authentication using the OAuth component or a separate process. Once complete you should have an authorization string which looks like: Bearer ya29.AHES6ZSZEJzATdZYjeihDn5W-VrXSsxEZu5p0pclxGdKKQ

Assign this value to the Authorization property before attempting any operations. Consult the documentation for the service for more information about supported scope values and more details on OAuth authentication.

Note: There are a couple of extra factors to consider when doing OAuth for OneDrive; please refer to the Authorization property documentation for more information.

Listing Resources

ListResources() lists resources within the folder resource currently selected by RemoteId or RemotePath. Calling this method will fire the ResourceList event once for each resource, and will also populate the Resources collection.

If there are still more resources available to list when this method returns, the ResourceMarker property will be populated. Continue to call this method until ResourceMarker is empty to accumulate all pages of results in the Resources collection.

// ResourceList event handler.
onedrive.OnResourceList += (s, e) => {
  Console.WriteLine(e.Name);
};

// (Assume that the RemoteId property isn't set here; it takes precedence if it is.)
onedrive.RemotePath = "/work_files/serious_business/cats";
do {
  onedrive.ListResources();

  for (int i = 0; i < onedrive.Resources.Count; i++) {
    // Process resources here.
  }
} while (!string.IsNullOrEmpty(onedrive.ResourceMarker));

Downloading Files

The DownloadFile() method downloads the file resource currently selected by RemoteId or RemotePath.

If a stream has been specified using SetDownloadStream(), the file data will be sent through it. If a stream is not specified, and LocalFile is set, the file will be saved to the specified location; otherwise, the file data will be held by ResourceData.

To download and decrypt an encrypted file, set EncryptionAlgorithm and EncryptionPassword before calling this method.

In the simplest use-case, downloading a file looks like this:

onedrive.LocalFile = "../MyFile.zip";
onedrive.RemoteId = onedrive.Resources[0].Id;
onedrive.DownloadFile();

Resuming Downloads
The component also supports resuming failed downloads by using the StartByte property. If a download is interrupted, set StartByte to the appropriate offset before calling this method to resume the download.

string downloadFile = "../MyFile.zip";
onedrive.LocalFile = downloadFile;
onedrive.RemoteId = onedrive.Resources[0].Id;
onedrive.DownloadFile();

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded file
onedrive.StartByte = new FileInfo(downloadFile).Length;
onedrive.DownloadFile();

Resuming Encrypted File Downloads
Resuming encrypted file downloads is only supported when LocalFile was set in the initial download attempt.

If LocalFile is set when beginning an encrypted download, the component creates a temporary file in TempPath to hold the encrypted data until the download is complete. If the download is interrupted, DownloadTempFile will be populated with the path of the temporary file that holds the partial data.

To resume, DownloadTempFile must be populated, along with StartByte, to allow the remainder of the encrypted data to be downloaded. Once the encrypted data is downloaded it will be decrypted and written to LocalFile.

onedrive.LocalFile = "../MyFile.zip";
onedrive.EncryptionPassword = "password";
onedrive.RemoteId = onedrive.Resources[0].Id;
onedrive.DownloadFile();

//The transfer is interrupted and DownloadFile() above fails. Later, resume the download:

//Get the size of the partially downloaded temp file
onedrive.StartByte = new FileInfo(onedrive.Config("DownloadTempFile")).Length;
onedrive.DownloadFile();

Uploading Files

The UploadFile() method uploads new file resources to the folder resource currently selected by RemoteId or RemotePath.

If SetUploadStream() has been used to set an upload stream, it will take priority as the file data source. If LocalFile is set the file will be uploaded from the specified path. If LocalFile is not set the data in ResourceData will be used.

To encrypt the file before uploading it, set EncryptionAlgorithm and EncryptionPassword.

OneDrive offers two ways to upload a file. For smaller files a simple upload option is provided to upload data in one request. This is the default option. For larger files, uploads can be fragmented into multiple pieces, allowing resuming of uploads that may be interrupted.

Simple
By default the component uses the simple upload mechanism.

onedrive.LocalFile = "../MyFile.zip";
onedrive.UploadFile("MyFile.zip");

Resumable
To enable resumable uploads set UseResumableUpload to True. This is recommended for large files. The component will automatically fragment the specified file into smaller pieces and upload each individually. FragmentSize may be set to specify the size of the fragment if desired. The default fragment size is 10 MB.

When UseResumableUpload is set to True and UploadFile() is called, a resumable upload session is started by the component. ResumeURL is populated with a URL identifying the session (this value may be needed for additional operations if the upload does not complete normally).

During a resumable upload, the FragmentComplete event fires after each fragment is uploaded to indicate overall progress. The component also updates StartByte as necessary to indicate the current offset in the file.

If the upload is interrupted for any reason, resuming it is easy. First, verify that ResumeURL and StartByte are populated (if the same instance of the component is used, they should already be populated, and no special action should be needed). If uploading from a stream, be sure to reset its position to where it was the first time the upload was started (typically the beginning). Then call UploadFile() again to resume the upload at the specified StartByte offset.

Note that if the upload is not resumed after some time, the upload session will expire. PollUploadStatus() may be used to check the status of a resumable upload, including when it will expire (which is stored in the UploadExpDate configuration setting). An interrupted upload can be aborted explicitly using the AbortUpload() method.

onedrive.LocalFile = "../MyFile.zip";
onedrive.UploadFile("MyFile.zip");

//The transfer is interrupted and UploadFile() above fails. Later, resume the upload.
//Using the same instance, StartByte and ResumeURL are already populated from the
//previous upload attempt.
onedrive.UploadFile("MyFile.zip");

Or, when uploading from a stream:

MemoryStream uploadStream = new MemoryStream(File.ReadAllBytes("../MyFile.zip"));
onedrive.SetUploadStream(uploadStream);
onedrive.UploadFile("MyFile.zip");

//The transfer is interrupted and UploadFile() above fails. Later, resume the upload.
//Using the same instance, StartByte and ResumeURL are already populated from the
//previous upload attempt.
//You MUST reset the stream's position to where it was when you first started the upload!
uploadStream.Position = 0;
onedrive.UploadFile("MyFile.zip");

Additional Functionality

The OneDrive component offers advanced functionality beyond simple uploads and downloads. For instance:

  • Encrypt and decrypt files using the EncryptionAlgorithm and EncryptionPassword properties.
  • Basic file and folder manipulation and organization using methods such as CopyResource(), CreateFolder(), DeleteResource(), MoveResource(), and UpdateResource().
  • Creation of resource sharing links using CreateLink().
  • Change tracking with ListChanges().
  • Advanced resource listing using the Search() method.
  • Support for OneDrive for Business and SharePoint Online functionality, including drive selection using ListDrives(), Drive, and other API members.
  • And more!
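As a hedged sketch of folder creation using the RemotePath selection model described above (CreateFolder's parameter shape is an assumption; consult the method documentation):

```csharp
// Select the parent folder, then create a subfolder within it.
// (Passing the new folder's name is an assumption, not a confirmed signature.)
onedrive.RemotePath = "/work_files";
onedrive.CreateFolder("archive");
```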

Wasabi

Wasabi has an API that is fully compatible with that of Amazon S3, allowing you to store arbitrary data using the same bucket-and-object paradigm that S3 uses. But Wasabi is more than just an S3 alternative; it's faster, cheaper, and it adds additional functionality on top of the S3 API. Whether you're doing simple bucket-and-object data manipulation, or leveraging Wasabi-exclusive API features, the Wasabi component makes it easy to access the Wasabi service quickly and securely.

Since Wasabi's API is a superset of the Amazon S3 API, refer to the Amazon S3 section for information about basic bucket and object manipulation. This section will cover Wasabi-exclusive functionality.

To begin, first create a Wasabi service account and obtain an access and secret key. Consult the Wasabi documentation for instructions on this process.

Authentication

Authentication is performed using the AccessKey and SecretKey provided by Wasabi.

wasabi = new Wasabi();
wasabi.AccessKey = WASABI_ACCESS_KEY;
wasabi.SecretKey = WASABI_SECRET_KEY;

Renaming

Wasabi provides the ability to rename buckets, objects, and folders.

Renaming Buckets
Renaming buckets is simple, and doesn't require setting the Bucket property or calling the ListBuckets method first. The new bucket name cannot already be in use.

wasabi.RenameBucket("MyBucket", "MyRenamedBucket");

Renaming Objects and Folders
Renaming objects and folders is easy too, and again doesn't require you to call the ListObjects method first. If you have versioning enabled on the owning bucket, or if the OverwriteOnRename configuration setting is set to True (the default), objects and folders will be overwritten automatically if there is a naming conflict. The following code shows how to rename an object:

wasabi.Bucket = "MyBucket";
wasabi.RenameObject("cats.jpg", "dogs.jpg");

Note that, due to how Wasabi's data structure works, "folders" are really just name prefixes formatted to mimic a traditional file system. Because of this, the RenameFolder method can rename multiple levels of "folders" at once, since it is actually just renaming a prefix. The following code shows this.

// Assume we start with these objects in our bucket:
// "/pictures/animals/cats/cat1.jpg"
// "/pictures/animals/cats/cat2.jpg"
// "/pictures/animals/cats/cat3.jpg"
// "/pictures/animals/cats/cat4.jpg"
// "/pictures/animals/dogs/dog4.jpg"
// "/pictures/animals/dogs/dog5.jpg"
wasabi.Bucket = "MyBucket";

// If we do this...
wasabi.RenameFolder("/pictures/animals/cats/", "pictures/pets/dogs");

// We will end up with these objects:
// "/pictures/pets/dogs/cat1.jpg"
// "/pictures/pets/dogs/cat2.jpg"
// "/pictures/pets/dogs/cat3.jpg"
// "/pictures/pets/dogs/cat4.jpg"
// "/pictures/pets/dogs/dog4.jpg"
// "/pictures/pets/dogs/dog5.jpg"

// If we did this instead...
wasabi.RenameFolder("/pictures/animals/cats/cat", "pictures/pets/dogs/dog");

// We would instead be left with these objects (Note how the original dog4.jpg was overwritten
// with the original cat4.jpg due to the prefix rename including part of the "filename"):
// "/pictures/pets/dogs/dog1.jpg"
// "/pictures/pets/dogs/dog2.jpg"
// "/pictures/pets/dogs/dog3.jpg"
// "/pictures/pets/dogs/dog4.jpg"
// "/pictures/pets/dogs/dog5.jpg"

Object Composition

Object composition is a server-side method of building an object by concatenating multiple existing objects together. This feature of Wasabi is a simpler alternative to the typical multipart-upload method of building objects.

A composed object looks and acts just like a regular object. However, its data is determined by its component objects (that is, the objects that make it up) and the order in which they are composed together.

The Wasabi component's ComposeObjects method is used to do object composition. To use this method, you must first specify the names of the objects you wish to compose together using the Objects property. When you call the method, the ComposedObjectName parameter determines what name is given to the resulting composed object. See the following code snippet for examples of how you can compose objects:

// Upload some objects to start with. Let's assume we have some text files to upload.
for (int i = 1; i <= 3; i++) {
  wasabi.LocalFile = "file" + i + ".txt";
  wasabi.CreateObject("file" + i + ".txt");
}

// Compose a new object.
SetObjectNames("file1.txt", "file2.txt", "file3.txt");
wasabi.ComposeObjects("composed1.txt");

// Compose a new object using a composed object.
SetObjectNames("file1.txt", "composed1.txt");
wasabi.ComposeObjects("composed2.txt");

// Compose onto an existing composed object. 
// (Similar to appending, but all server-side, no uploading needed.)
SetObjectNames("file1.txt", "file2.txt");
wasabi.ComposeObjects("composed3.txt");
SetObjectNames("composed3.txt", "file3.txt");
wasabi.ComposeObjects("composed3.txt");

// Compose a new object using the same object twice.
SetObjectNames("file1.txt", "file1.txt");
wasabi.ComposeObjects("composed4.txt");

// Compose a composed object onto itself.
SetObjectNames("composed4.txt", "composed4.txt");
wasabi.ComposeObjects("composed4.txt");

Assume we have this helper method for the above code:

void SetObjectNames(params string[] names) {
  wasabi.Objects.Clear();
  foreach (string name in names) wasabi.Objects.Add(new WasabiObject(name));
}

As you can see, object composition is a powerful and flexible feature. However, there are some things to keep in mind when doing object composition:

  • The order of the object names in the Objects property is the order that they will be concatenated in when they are composed.
  • Wasabi does not allow composing objects from multiple buckets together.
  • Wasabi will not allow you to specify more than 32 object names in a single compose request.
  • Wasabi will not allow you to compose more than 1024 original objects together. This is a transitive limit; you cannot circumvent it by composing together composed objects.

Wasabi does not make copies of data when composing objects; it uses references, so there are no extra data charges. Even if the original objects are deleted, a single copy of the data is kept until there are no longer any composed objects that reference it. Refer to the Wasabi documentation for more information.

Compliance

Compliance is a Wasabi feature that prevents objects from being modified or deleted before a specified time. The following tables and examples show what compliance settings are available at the bucket- and object-levels, what they do, and how they are used. Be sure to review the documentation for each of the individual settings for more information about them.

Bucket Compliance Settings

  • Status — Whether or not compliance is enabled for a bucket. Either "enabled" or "disabled".
  • LockTime — If not "off" (the default), the time at which the compliance settings for a bucket were/should be locked.
  • RetentionDays — An integer representing the minimum number of days to retain objects in a bucket. Defaults to 0.
  • ConditionalHold — Whether or not newly created objects in a bucket should be placed under conditional hold. Defaults to "false".
  • DeleteAfterRetention — Whether or not to automatically delete objects after their retention time has passed. Defaults to "false".

(Note that there is also a read-only IsLocked field, which is set based on the value of LockTime.)

Compliance starts at the bucket level; it must be turned on for a bucket in order for the objects within that bucket to be under compliance. The component makes this easy to do:

wasabi.Bucket = "MyBucket";
wasabi.BucketCompliance = new WasabiBucketCompliance("enabled", "", 10, "true", "");
wasabi.UpdateCompliance("");

This code turns on compliance settings for the bucket called "MyBucket", setting them so that objects are placed under conditional hold, and must be retained for 10 days after the conditional hold is released. The LockTime and DeleteAfterRetention settings are left alone, so they will default to "off" and "false" (respectively).

Important: Locking the compliance settings on a bucket using the LockTime setting is a one-way action. You cannot unlock a bucket's compliance settings without contacting the Wasabi support team. Refer to the Wasabi documentation for more information.

Object Compliance Settings

  • RetentionTime — An ISO 8601 date-time before which an object cannot be deleted. Defaults to the time at which the object was created (or at which compliance was turned on) plus the value of the bucket's RetentionDays setting at that time.
  • LegalHold — Whether or not an object is under legal hold, preventing it from being deleted. Defaults to "false".
  • ConditionalHold — Whether or not an object is under conditional hold, preventing it from being deleted and its retention period from beginning. Default depends on the bucket's ConditionalHold setting.

(Note that there is also a read-only Hash field, containing an SHA-256 hash of the object that can be used to verify that it hasn't changed while under compliance.)

When an object's bucket has compliance enabled, all objects in that bucket are under compliance, and you can query any object's compliance settings using the GetCompliance method.
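For example, querying an object's compliance settings might look like the following (the parameter shape and the RetentionTime field name follow the UpdateCompliance usage and the table above; both are assumptions):

```csharp
wasabi.Bucket = "MyBucket";
wasabi.GetCompliance("MyObject");
// The ObjectCompliance property is now populated with the object's settings.
Console.WriteLine(wasabi.ObjectCompliance.RetentionTime);
```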

Any individual object can have its compliance settings updated as well. The LegalHold setting can be toggled to prevent (or allow) deletion; the object's ConditionalHold, if "true", can be released by setting it to "false", at which time the object will begin its retention period. The object's RetentionTime can also be extended (but not reduced). For example, if you wanted to release an object from conditional hold:

wasabi.Bucket = "MyBucket";
wasabi.ObjectCompliance = new WasabiObjectCompliance("", "false", "");
wasabi.UpdateCompliance("MyObject");

Further Information
Compliance is a powerful tool, but it must be handled carefully. These notes, and the documentation for the compliance-related properties, methods, and types, do not supersede Wasabi's documentation, and reviewing that documentation is highly recommended to gain a full understanding of how compliance works.


We appreciate your feedback.  If you have any questions, comments, or suggestions about this article please contact our support team at kb@nsoftware.com.