Getting Started with Cloud Storage Cmdlets
NetCmdlets includes a set of cmdlets that provide simple command-line tools for accessing cloud storage services, including Amazon S3, Google Drive, Dropbox, Box.com, OneDrive, Wasabi, and more.
The cmdlets provide built-in transport-layer security, and also support authentication via OAuth 2.0, allowing for securely authenticated connections.
Each storage provider is accessed through the same three cmdlets, allowing for a unified interface. This article will cover the basic usage of each cmdlet and explain the authentication requirements for each individual cloud storage provider.
- OAuth Authentication
- Non-OAuth Authentication
The following Cloud Storage providers require OAuth authentication:
- Box
- Dropbox
- Google Drive
- OneDrive
The OAuth authentication flow is similar for each of these providers; however, one important difference when authenticating with OneDrive is discussed at the end of this section.
OAuth Flow Overview
By design, user interaction is required to fetch an OAuth token for the first time. The user's OAuth credentials are sent to an OAuth server, where the user must manually authenticate. If authentication succeeds, the server encodes the resulting authorization in a token and returns that token to the requesting application (in this case, the cloud storage cmdlets). The cmdlets then use this token to authenticate with the cloud storage provider.
These authentication tokens are valid for a limited period of time; however, the OAuth server also returns a refresh token that can be used to refresh expired authentication tokens. Refreshing a token does not require user interaction.
Performing OAuth Authentication with the Cmdlets
The cloud storage cmdlets cannot bypass the requirement that users manually authenticate when generating their first OAuth token. However, once the first token is generated, the cmdlets automatically cache the authentication token and refresh token so that no user interaction is required for subsequent calls to the same cloud storage provider. The OAuth token information is cached at the location specified by the OAuthCacheDir parameter, and caching can be disabled by passing an empty string to this parameter.
The OAuthClientId and OAuthClientSecret parameters are required to fetch the initial OAuth authentication token. These values should be set to the OAuth credentials issued by the particular cloud storage provider. Once the OAuth authentication token has been cached, OAuthClientId and OAuthClientSecret are not required to authenticate; however, when the authentication token expires and must be refreshed, these parameters are necessary to perform the token refresh. As such, it is recommended to pass these credentials on every call to a cloud storage provider to avoid unexpected errors when the token expires.
The cloud storage cmdlets use an embedded web server to listen for responses from the OAuth server. Depending on the particular OAuth server, it may be necessary to register ahead of time the port to which these responses should be sent. The OAuthWebServerPort parameter controls the port on which the embedded web server listens, and it should match the port the OAuth server expects.
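As a sketch, a first-time call to Google Drive might look like the following (the ServiceProvider value, cache directory, and port shown here are illustrative placeholders, and $googleid/$googlesecret are assumed to hold the OAuth credentials issued by Google):

```powershell
#first call: a browser opens for manual authentication, then the token is cached in OAuthCacheDir
Get-CloudStorage -ServiceProvider GoogleDrive -OAuthClientId $googleid -OAuthClientSecret $googlesecret -OAuthCacheDir "C:/oauthcache" -OAuthWebServerPort 7777 -List "/"
```

Subsequent calls that use the same OAuthCacheDir reuse the cached token without user interaction; passing the client ID and secret again lets the cmdlet refresh the token silently when it expires.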
Enabling SSL for the OAuth Embedded Web Server (Required for OneDrive)
OneDrive's OAuth servers require that token responses be sent to an SSL-enabled server. In order to host an SSL-enabled server, the embedded web server must have an SSL certificate, which is specified using the SSL certificate parameters of the cloud storage cmdlets.
Note: Depending on whether the certificate is trusted, the browser used for manual authentication may warn that the redirect target (the embedded web server) is not trusted when the initial OAuth token is fetched. This warning can be safely ignored.
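As an illustrative sketch only (the certificate-related parameter names below are assumptions and may differ in your NetCmdlets version; the PFX path, password variable, and port are placeholders):

```powershell
#hypothetical OneDrive example with an SSL-enabled embedded web server
#CertStoreType/CertStore/CertPassword are assumed parameter names; verify them with Get-Help Get-CloudStorage
Get-CloudStorage -ServiceProvider OneDrive -OAuthClientId $onedriveid -OAuthClientSecret $onedrivesecret -OAuthWebServerPort 443 -CertStoreType PFXFile -CertStore "C:/certs/localhost.pfx" -CertPassword $certpw -List "/"
```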
The following Cloud Storage providers do not require OAuth, and instead authenticate in a different way:
- Amazon S3
- Azure Blob
- Wasabi
Instead of using an OAuth token, each of these providers authenticates via a set of credentials that is roughly equivalent to a username/password combination. The specific parameters required to authenticate with each of these providers are listed below:
These parameters should be set to the appropriate credentials from the storage provider. The required parameters should be passed during each cmdlet operation, as no caching is performed.
Get-CloudStorage

The Get-CloudStorage cmdlet is used to list remote directories as well as download remote files and folders.
To list a remote directory, pass the desired directory path to the List parameter. Besides the authentication parameters mentioned in prior sections, no other parameters are required.
Here is an example of listing the root directory for Amazon S3 (listing all buckets) with Get-CloudStorage:
Get-CloudStorage -ServiceProvider Amazons3 -AmazonS3AccessKey $accesskey -AmazonS3SecretKey $secretkey -List "/"
Downloading Files and Folders
To download a file or folder, the RemoteFile parameter should be set to the remote path and file name of the target file, and LocalFile should be set to the local path and file name where the downloaded content will be stored. Get-CloudStorage supports downloading a single file, downloading multiple files via wildcards, and downloading an entire directory. Each of these cases has different restrictions on the values for RemoteFile and LocalFile.
| Operation | RemoteFile Requirements | LocalFile Requirements |
| --- | --- | --- |
| Download a single file | Must be set to a single remote file name | Must be set to a single local file name |
| Download multiple files | Must be set to a remote path with wildcards | Must be set to the local directory where the files will be stored |
| Download a directory | Must be set to a remote directory | Must be set to the local directory where the downloaded directory will be stored |
When downloading a single file, the name of the LocalFile can be different from the name of the RemoteFile. In the other two cases, downloaded files will have the same name as the corresponding files on the remote server.
Here are a few examples of downloading files with Get-CloudStorage:
#download all text files from a subfolder of Box
Get-CloudStorage -ServiceProvider Box -OAuthClientId $boxid -OAuthClientSecret $boxsecret -RemoteFile "test/*.txt" -LocalFile "C:/test"

#download a single file from Wasabi
Get-CloudStorage -ServiceProvider Wasabi -WasabiAccessKey $accesskey -WasabiSecretKey $secretkey -RemoteFile "apple.txt" -LocalFile "C:/test/downloaded_apple.txt"

#download an entire directory from Dropbox
Get-CloudStorage -ServiceProvider Dropbox -OAuthClientId $dropboxid -OAuthClientSecret $dropboxsecret -RemoteFile "folder/subfolder" -LocalFile "C:/test"
Send-CloudStorage

The Send-CloudStorage cmdlet is used to create new directories and upload local files and folders.
Creating a New Directory
To create a new directory, the MakeDirectory parameter should be set to the remote path (including the name of the new directory) where the directory should be created.
Here is an example of creating a new directory in Dropbox with Send-CloudStorage:
Send-CloudStorage -ServiceProvider Dropbox -OAuthClientId $dropboxid -OAuthClientSecret $dropboxsecret -MakeDirectory "newDir"
Uploading Files and Folders
To upload a file or folder, the LocalFile parameter should be set to the local path and file name of the target file, and RemoteFile should be set to the remote path and file name where the uploaded content will be stored. Send-CloudStorage supports uploading a single file and uploading multiple files via wildcards. Each case has different restrictions on the values for RemoteFile and LocalFile.
| Operation | RemoteFile Requirements | LocalFile Requirements |
| --- | --- | --- |
| Upload a single file | Must be set to a single remote file name | Must be set to a single local file name |
| Upload multiple files | Must be set to the remote directory where the uploaded files will be stored | Must be set to a local path with wildcards |
When uploading a single file, the name of the RemoteFile can be different from the name of the LocalFile. When uploading multiple files, each uploaded file will have the same name as the corresponding file on the local machine.
Here are a few examples of uploading files with Send-CloudStorage:
#upload all text files to a folder in Box
Send-CloudStorage -ServiceProvider Box -OAuthClientId $boxid -OAuthClientSecret $boxsecret -LocalFile "C:/test/*.txt" -RemoteFile "uploadDir"

#upload a single file to Wasabi
Send-CloudStorage -ServiceProvider Wasabi -WasabiAccessKey $accesskey -WasabiSecretKey $secretkey -LocalFile "C:/test/uploadMe.txt" -RemoteFile "uploadDir/myUpload.txt"
Remove-CloudStorage

The Remove-CloudStorage cmdlet is used to delete files and directories from cloud storage servers.
The RemoteFile parameter should be set to the target file or folder to delete. Besides authentication, no other parameters are required.
Only empty folders can be deleted, so files within a folder must be deleted in a separate call to the cmdlet. Multiple files can be deleted by including a wildcard (*) in the file name; each file matching the pattern will be deleted.
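Per the rule above, removing a non-empty folder therefore takes two calls: first delete the files it contains using a wildcard, then delete the now-empty folder itself (the Dropbox folder name "oldFolder" below is a placeholder):

```powershell
#step 1: delete every file inside the folder via a wildcard
Remove-CloudStorage -ServiceProvider Dropbox -OAuthClientId $dropboxid -OAuthClientSecret $dropboxsecret -RemoteFile "oldFolder/*"

#step 2: delete the now-empty folder
Remove-CloudStorage -ServiceProvider Dropbox -OAuthClientId $dropboxid -OAuthClientSecret $dropboxsecret -RemoteFile "oldFolder"
```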
Here are a few examples of deleting files and folders with Remove-CloudStorage:
#remove a single file from Amazon S3
Remove-CloudStorage -ServiceProvider Amazons3 -AmazonS3AccessKey $accesskey -AmazonS3SecretKey $secretkey -RemoteFile "test/delete_me.txt"

#remove all text files from a subfolder in Box
Remove-CloudStorage -ServiceProvider Box -OAuthClientId $boxid -OAuthClientSecret $boxsecret -RemoteFile "test/*.txt"

#remove a Wasabi directory/bucket (must be empty to succeed)
Remove-CloudStorage -ServiceProvider Wasabi -WasabiAccessKey $accesskey -WasabiSecretKey $secretkey -RemoteFile "myBucket"
We appreciate your feedback. If you have any questions, comments, or suggestions about this article please contact our support team at firstname.lastname@example.org.