Import & Export

Export Files

Export files from the CloudConvert service to external storage services or temporary download URLs.

URL

To create temporary URLs for downloading files, create a job with an export/url task.

Please note that all tasks are deleted automatically after 24 hours, which means the created URLs are available for 24 hours only.

Task Parameters

operation
string required
Value is export/url.
input
string or array of task names required

The input task name(s) for this task.

inline
boolean

This option makes the export URLs return the Content-Disposition inline header, which tells the browser to display the file instead of downloading it.

archive_multiple_files
boolean

By default, multiple files produce multiple export URLs. When this option is enabled, a single export URL with a ZIP file is created instead.

Use the Job Builder to generate and try out export/url jobs.
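As a minimal sketch, an export/url task might be embedded in a job request like this. The task names ("export-1", "convert-1") are hypothetical placeholders, and the payload shape assumes the CloudConvert v2 jobs endpoint:

```python
# Sketch of a job payload containing an export/url task.
# "convert-1" stands for whatever earlier task produces the file(s) to export.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/url",
            "input": "convert-1",
            "inline": False,                 # download instead of inline display
            "archive_multiple_files": True,  # bundle multiple outputs into one ZIP URL
        }
    }
}

# Creating the job would then be an authenticated POST, e.g.:
#   requests.post("https://api.cloudconvert.com/v2/jobs",
#                 json=job, headers={"Authorization": "Bearer <API_KEY>"})
```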

S3

To export files to an S3-compatible object storage, create a job with an export/s3 task.

Task Parameters

operation
string required
Value is export/s3.
input
string or array of task names required

The input task name(s) for this task.

bucket
string required

The Amazon S3 bucket where to store the file(s).

region
string required

Specify the Amazon S3 region of the bucket, e.g. us-west-2 or eu-west-1.

endpoint
string

Use a custom S3 API endpoint. By default, the endpoint is derived from the configured region. This makes it possible to use other S3-compatible storage services (e.g. DigitalOcean).

key
string

S3 key for storing the file (the filename in the bucket, including path). If there are multiple files to export, printf style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).

key_prefix
string

As an alternative to key, you can specify a key prefix for exporting files.

access_key_id
string required

The Amazon S3 access key ID. It needs the s3:PutObject permission. When using an ACL other than private, it also needs the s3:PutObjectAcl permission.

secret_access_key
string required

The Amazon S3 secret access key.

session_token
string

Authenticate using temporary credentials from the AWS Security Token Service.

acl
string

S3 ACL for storing the files. Possible values include: private, public-read, public-read-write, authenticated-read, bucket-owner-read, bucket-owner-full-control. Defaults to private.

cache_control
string

The S3 Cache-Control header, which specifies the lifetime of the file, for example: max-age=172800.

content_disposition
string

Specify the Content-Disposition header for the file, for example: attachment or inline.

content_type
string

Specify the Content-Type header for the file, for example: application/pdf. By default, the correct Content-Type is set automatically based on the MIME type.

metadata
dictionary

Object of additional S3 metadata.

server_side_encryption
string

The server-side encryption algorithm to use when storing this object in S3. Possible values include AES256 and aws:kms.

tagging
dictionary

Object of S3 tags to add to the keys.

Use the Job Builder to generate and try out export/s3 jobs.
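A sketch of an export/s3 task within a job payload, following the same v2 jobs shape. Bucket name, credentials, and task names are hypothetical placeholders:

```python
# Sketch of an export/s3 task exporting multiple output files to a bucket.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/s3",
            "input": "convert-1",
            "bucket": "my-bucket",
            "region": "eu-west-1",
            # printf-style placeholder: produces myfile-1.pdf, myfile-2.pdf, ...
            "key": "exports/myfile-%d.pdf",
            "access_key_id": "<ACCESS_KEY_ID>",
            "secret_access_key": "<SECRET_ACCESS_KEY>",
            "acl": "private",  # the default; other ACLs need s3:PutObjectAcl
        }
    }
}
```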

Azure Blob Storage

To export files to an Azure blob container, create a job with an export/azure/blob task.

Task Parameters

operation
string required
Value is export/azure/blob.
input
string or array of task names required

The input task name(s) for this task.

storage_account
string required

The name of the Azure storage account (the string before .blob.core.windows.net).

storage_access_key
string

The Azure secret access key. Required only if you are not providing a SAS token.

sas_token
string

The Azure SAS token.

container
string required

Azure container name.

blob
string

Blob name for storing the file (the filename in the container, including path). If there are multiple files to export, printf style placeholders are possible (e.g. myfile-%d.pdf produces the output files myfile-1.pdf, myfile-2.pdf and so on).

blob_prefix
string

As an alternative to blob, you can specify a blob prefix for exporting files.

metadata
dictionary

Object of additional Azure metadata.

Use the Job Builder to generate and try out export/azure/blob jobs.
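A sketch of an export/azure/blob task authenticated with a SAS token. All values are hypothetical placeholders:

```python
# Sketch of an export/azure/blob task.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/azure/blob",
            "input": "convert-1",
            "storage_account": "mystorageaccount",  # before .blob.core.windows.net
            "sas_token": "<SAS_TOKEN>",             # or storage_access_key instead
            "container": "exports",
            "blob": "myfile-%d.pdf",  # printf-style placeholder for multiple files
        }
    }
}
```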

Google Cloud Storage

To export files to a Google Cloud Storage bucket, create a job with an export/google-cloud-storage task.

Task Parameters

operation
string required
Value is export/google-cloud-storage.
input
string or array of task names required

The input task name(s) for this task.

project_id
string required

The Google Cloud Project ID (api-project-...).

bucket
string required

The Google Cloud Storage Bucket name.

client_email
string required

The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).

private_key
string required

The private key of the service account.

file
string

Filename of the file to create (the filename in the bucket, including path).

file_prefix
string

As an alternative to file, you can specify a file prefix for exporting files.

Use the Job Builder to generate and try out export/google-cloud-storage jobs.
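A sketch of an export/google-cloud-storage task authenticated with a service account. Project ID, bucket, and credentials are hypothetical placeholders:

```python
# Sketch of an export/google-cloud-storage task.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/google-cloud-storage",
            "input": "convert-1",
            "project_id": "api-project-123456789",
            "bucket": "my-bucket",
            "client_email": "exporter@api-project-123456789.iam.gserviceaccount.com",
            "private_key": "-----BEGIN PRIVATE KEY-----\n<KEY MATERIAL>",
            "file": "exports/myfile.pdf",  # filename in the bucket, including path
        }
    }
}
```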

OpenStack

To export files to OpenStack Object Storage (Swift), create a job with an export/openstack task.

Task Parameters

operation
string required
Value is export/openstack.
input
string or array of task names required

The input task name(s) for this task.

auth_url
string required

The URL of the OpenStack Identity endpoint (without version).

region
string required

Specify the OpenStack region.

container
string required

The name of the OpenStack Storage container.

username
string required

The OpenStack username.

password
string required

The OpenStack password.

file
string

File name of the file to create (the filename in the container, including path).

file_prefix
string

As an alternative to file, you can specify a file prefix for exporting files.

Use the Job Builder to generate and try out export/openstack jobs.
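A sketch of an export/openstack task. The identity endpoint, region, and credentials are hypothetical placeholders:

```python
# Sketch of an export/openstack (Swift) task.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/openstack",
            "input": "convert-1",
            "auth_url": "https://identity.example.com",  # no version suffix
            "region": "RegionOne",
            "container": "exports",
            "username": "<USERNAME>",
            "password": "<PASSWORD>",
            "file_prefix": "exports/",  # or use "file" for an explicit name
        }
    }
}
```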

SFTP

To export files to your SFTP server, create a job with an export/sftp task.

Task Parameters

operation
string required
Value is export/sftp.
input
string or array of task names required

The input task name(s) for this task.

host
string required

The SFTP server hostname.

port
integer

The SFTP port. Defaults to 22.

username
string required

The SFTP username.

password
string

The SFTP password.

private_key
string

As an alternative to a password, you can provide a private key.

file
string

File name of the file to create (the filename on the server, including path).

path
string

As an alternative to file, you can specify a path for exporting files.

Use the Job Builder to generate and try out export/sftp jobs.
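A sketch of an export/sftp task using key-based authentication. Host, username, and key are hypothetical placeholders:

```python
# Sketch of an export/sftp task.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/sftp",
            "input": "convert-1",
            "host": "sftp.example.com",
            "port": 22,  # the default
            "username": "<USERNAME>",
            "private_key": "-----BEGIN RSA PRIVATE KEY-----\n<KEY MATERIAL>",
            "path": "uploads/",  # or "file" for an explicit filename
        }
    }
}
```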

Upload

To upload files to an arbitrary URL via HTTP PUT, create a job with an export/upload task.

This can be used to upload to AWS S3 using presigned URLs, for example.

Task Parameters

operation
string required
Value is export/upload.
input
string or array of task names required

The input task name(s) for this task.

url
string required

The URL to send the PUT request to.

headers
dictionary

Object of additional headers to send with the PUT request. Can be used to access URLs that require authorization.

Use the Job Builder to generate and try out export/upload jobs.
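A sketch of an export/upload task that PUTs the output to a presigned S3 URL, as mentioned above. The URL and header values are hypothetical placeholders:

```python
# Sketch of an export/upload task targeting a presigned URL.
job = {
    "tasks": {
        "export-1": {
            "operation": "export/upload",
            "input": "convert-1",
            # A presigned PUT URL generated beforehand by your own backend.
            "url": "https://my-bucket.s3.amazonaws.com/myfile.pdf?X-Amz-Signature=<SIG>",
            "headers": {"Content-Type": "application/pdf"},
        }
    }
}
```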