Import & Export

Import Files

Import files to the CloudConvert service as input files for other operations.

URL

To import a file by downloading it from a URL, create a job with an import/url task.

Task Parameters

operation
string required
Value is import/url.
url
string required

The URL to the file.

filename
string

The filename of the input file, including extension. If none is provided, we will try to detect the filename from the URL.

headers
dictionary

Object of additional headers to send with the download request. Can be used to access URLs that require authorization.

Use the Job Builder to generate and try out import/url jobs.
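For illustration, a job with a single import/url task can be created by sending a JSON body like the one built below to the POST /v2/jobs endpoint (authenticated with your API key as a Bearer token). The task name import-1, the URL, and the header value are placeholders invented for this sketch.

```python
import json

# All values below are placeholders for illustration.
job = {
    "tasks": {
        "import-1": {  # arbitrary task name; later tasks reference it as their input
            "operation": "import/url",
            "url": "https://example.com/files/report.pdf",
            "filename": "report.pdf",
            "headers": {"Authorization": "Bearer <token>"},  # optional
        }
    }
}

# Request body for POST https://api.cloudconvert.com/v2/jobs
body = json.dumps(job)
```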

Base64 String

To import a file by providing a base64 encoded string with the file content, create a job with an import/base64 task.

Task Parameters

operation
string required
Value is import/base64.
file
string required

Base64 encoded file content.

filename
string required

The filename of the input file, including extension.

Do not use this import method for files >10MB. Files embedded as base64 strings blow up the request payload, which might cause issues. For bigger files, use an asynchronous import method like import/upload or import/url.
Use the Job Builder to generate and try out import/base64 jobs.
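Since the file content must be base64 encoded, Python's base64 module can prepare the task payload. This is a sketch with placeholder content; the task name import-1 and filename are invented for illustration.

```python
import base64

content = b"Hello CloudConvert!"  # placeholder file content

job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/base64",
            "file": base64.b64encode(content).decode("ascii"),
            "filename": "hello.txt",
        }
    }
}
```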

Raw String

To import a file by providing a raw string with the file content, create a job with an import/raw task.

Task Parameters

operation
string required
Value is import/raw.
file
string required

File content as string.

filename
string required

The filename of the input file, including extension.

Do not use this import method for files >10MB. Files embedded as raw strings blow up the request payload, which might cause issues. For bigger files, use an asynchronous import method like import/upload or import/url.
Use the Job Builder to generate and try out import/raw jobs.
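For small text files, the content can be embedded directly as a string. A minimal sketch of an import/raw task with placeholder values (task name, content, and filename are invented):

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/raw",
            "file": "id,name\n1,Alice\n2,Bob\n",  # raw file content as a string
            "filename": "users.csv",
        }
    }
}
```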

Upload

Allow your users to directly upload input files to CloudConvert, without temporarily storing them on your server.

First, create a job with an import/upload task:

Task Parameters

operation
string required
Value is import/upload.
redirect
string
Optionally redirect user to this URL after upload.

Uploading

The job creation response contains a form object in the result key, which can then be used to upload the file:

{
  "data": {
    "id": "9a160154-58e2-437f-9b6b-19d63b1f59e3",
    "tag": "myjob-123",
    "status": "waiting",
    "created_at": "2018-09-19T14:42:58+00:00",
    "started_at": "2018-09-19T14:42:58+00:00",
    "tasks": [
      {
        "id": "c85f3ca9-164c-4e89-8ae2-c08192a7cb08",
        "operation": "import/upload",
        "status": "waiting",
        "message": "Waiting for upload",
        "code": null,
        "created_at": "2018-09-19T14:42:58+00:00",
        "started_at": null,
        "ended_at": null,
        "payload": {},
        "result": {
          "form": {
            "url": "https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/",
            "parameters": {
              "expires": 1545444403,
              "max_file_count": 1,
              "max_file_size": 10000000000,
              "signature": "d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121"
            }
          }
        }
      }
    ],
    "links": {
      "self": "https://api.cloudconvert.com/v2/jobs/Xh56hvvMhG"
    }
  }
}

As shown below, you can use the url and parameters to allow browser-based uploads. Please note that all POST parameters are required and that their number and names might vary. file always needs to be the last POST parameter.

Alternatively, our SDKs have built-in methods for uploading files:

<form action="https://upload.cloudconvert.com/d660c0df-d15e-468a-9554-917e0f0f3ef1/"
      method="POST"
      enctype="multipart/form-data">
    <input type="hidden" name="expires" value="1545444403">
    <input type="hidden" name="max_file_count" value="1">
    <input type="hidden" name="max_file_size" value="10000000000">
    <input type="hidden" name="signature" value="d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121">
    <input type="file" name="file">
    <input type="submit">
</form>

The job stays in the waiting status until an input file has been uploaded, then continues automatically.

Do not hardcode the keys/values of parameters because they change dynamically! The keys and values on this page are just examples.
Use the Job Builder to generate and try out import/upload jobs.
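The two steps can be sketched as follows. The form values below are hypothetical (copied in the shape of the example response above); in practice they must be taken from the job creation response, never hardcoded.

```python
# Step 1: the request body for creating a job with an import/upload task
# (the actual POST to /v2/jobs is omitted here).
job = {"tasks": {"upload-1": {"operation": "import/upload"}}}

# Step 2: take the "form" object from the job creation response and build
# the multipart POST fields. Every parameter must be sent, and "file" must
# be the last field.
form = {  # hypothetical values in the shape of a real response
    "url": "https://upload.cloudconvert.com/<id>/",
    "parameters": {
        "expires": 1545444403,
        "max_file_count": 1,
        "max_file_size": 10000000000,
        "signature": "d0db9b5e4ff7283xxfe0b1e3ad6x1db95c616121",
    },
}

fields = [(key, str(value)) for key, value in form["parameters"].items()]
fields.append(("file", b"<file bytes>"))  # file always comes last
```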

S3

To import a file from your S3 compatible object storage, create a job with an import/s3 task.

Task Parameters

operation
string required
Value is import/s3.
bucket
string required

The Amazon S3 bucket from which to download the file.

region
string required

Specify the Amazon S3 region, e.g. us-west-2 or eu-west-1.

endpoint
string

Use a custom S3 API endpoint. By default, the endpoint is built from the configured region. This makes it possible to use other S3 compatible storage services (e.g. DigitalOcean).

key
string

S3 key of the input file (the filename in the bucket, including path).

key_prefix
string

As an alternative to key, you can specify a key prefix for importing multiple files at once.

access_key_id
string required

The Amazon S3 access key ID. It needs to have the s3:GetObject permission.

secret_access_key
string required

The Amazon S3 secret access key.

session_token
string

Authenticate using temporary credentials (AWS Security Token Service).

Use the Job Builder to generate and try out import/s3 jobs.
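A sketch of an import/s3 task payload. The bucket, key, and credentials are placeholders invented for illustration:

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/s3",
            "bucket": "my-input-bucket",
            "region": "eu-west-1",
            "key": "inputs/report.pdf",  # or key_prefix for multiple files
            "access_key_id": "<AWS_ACCESS_KEY_ID>",
            "secret_access_key": "<AWS_SECRET_ACCESS_KEY>",
        }
    }
}
```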

Azure Blob Storage

To import a file from an Azure blob container, create a job with an import/azure/blob task.

Task Parameters

operation
string required
Value is import/azure/blob.
storage_account
string required

The name of the Azure storage account (the string before .blob.core.windows.net).

storage_access_key
string

The Azure secret key. Required only if you are not providing a SAS token.

sas_token
string

The Azure SAS token.

container
string required

Azure container name.

blob
string

Azure blob name of the input file (the filename in the container, including path).

blob_prefix
string

As an alternative to blob, you can specify a blob prefix for importing multiple files at once.

Use the Job Builder to generate and try out import/azure/blob jobs.
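A sketch of an import/azure/blob task using a SAS token (a storage access key could be provided instead). The account, container, and blob names are placeholders:

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/azure/blob",
            "storage_account": "mystorageaccount",  # from mystorageaccount.blob.core.windows.net
            "sas_token": "<SAS_TOKEN>",  # or storage_access_key instead
            "container": "inputs",
            "blob": "reports/report.pdf",
        }
    }
}
```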

Google Cloud Storage

To import a file from a Google Cloud Storage bucket, create a job with an import/google-cloud-storage task.

Task Parameters

operation
string required
Value is import/google-cloud-storage.
project_id
string required

The Google Cloud Project ID (api-project-...).

bucket
string required

The Google Cloud Storage Bucket name.

client_email
string required

The client email of the service account to use (...@api-project-....iam.gserviceaccount.com).

private_key
string required

The private key of the service account.

file
string

Filename of the input file (the filename in the bucket, including path).

file_prefix
string

As an alternative to file, you can specify a file prefix for importing multiple files at once.

Use the Job Builder to generate and try out import/google-cloud-storage jobs.
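A sketch of an import/google-cloud-storage task, here using file_prefix to import every file under a prefix. The project ID, bucket, service account, and key are placeholders:

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/google-cloud-storage",
            "project_id": "api-project-123456789",
            "bucket": "my-input-bucket",
            "client_email": "importer@api-project-123456789.iam.gserviceaccount.com",
            "private_key": "-----BEGIN PRIVATE KEY-----\n...",
            "file_prefix": "inputs/",  # import all files under this prefix
        }
    }
}
```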

OpenStack

To import a file from OpenStack Object Storage (Swift), create a job with an import/openstack task.

Task Parameters

operation
string required
Value is import/openstack.
auth_url
string required

The URL of the OpenStack Identity endpoint (without version).

region
string required

Specify the OpenStack region.

container
string required

The name of the OpenStack Storage container.

username
string required

The OpenStack username.

password
string required

The OpenStack password.

file
string

File name of the input file (the filename in the container, including path).

file_prefix
string

As an alternative to file, you can specify a file prefix for importing multiple files at once.

Use the Job Builder to generate and try out import/openstack jobs.
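A sketch of an import/openstack task payload. The Identity endpoint, region, container, and credentials are placeholders invented for illustration:

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/openstack",
            "auth_url": "https://auth.example.com",  # Identity endpoint, without version
            "region": "RegionOne",
            "container": "inputs",
            "username": "<OS_USERNAME>",
            "password": "<OS_PASSWORD>",
            "file": "reports/report.pdf",
        }
    }
}
```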

SFTP

To import a file from your SFTP server, create a job with an import/sftp task.

Task Parameters

operation
string required
Value is import/sftp.
host
string required

The SFTP server hostname.

port
integer

The SFTP port. Defaults to 22.

username
string required

The SFTP username.

password
string

The SFTP password.

private_key
string

As an alternative to password, you can provide a private key.

file
string

File name of the input file (the filename on the server, including path).

path
string

As an alternative to file, you can specify a path for importing multiple files at once.

Use the Job Builder to generate and try out import/sftp jobs.
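A sketch of an import/sftp task, here authenticating with a private key instead of a password. The host, username, key, and file path are placeholders:

```python
job = {
    "tasks": {
        "import-1": {  # arbitrary task name
            "operation": "import/sftp",
            "host": "sftp.example.com",
            "port": 22,  # optional, defaults to 22
            "username": "importer",
            "private_key": "-----BEGIN RSA PRIVATE KEY-----\n...",  # instead of password
            "file": "uploads/report.pdf",
        }
    }
}
```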