Start an import job

This call lets you queue a job to import a file or folder from a volume into a project on the Platform. Essentially, you are importing an item from your cloud storage provider (Amazon Web Services or Google Cloud Storage) via the volume onto the Platform.

If successful, an alias is created on the Platform. Aliases behave like ordinary Platform files: they can be copied, executed, and modified, and they refer back to the corresponding item on the given volume.

If you want to import multiple files, it is recommended to do so in bulk, taking the API rate limit into account.
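One simple way to stay within a rate limit when importing many files is to group the items client-side before issuing requests. The sketch below is purely illustrative (the batch size and file names are assumptions, not values prescribed by the API):

```python
from typing import Iterator


def batches(items: list, size: int) -> Iterator[list]:
    """Yield successive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]


# Hypothetical example: group 250 file locations into batches of 100
# so that each bulk request stays well within the API rate limit.
locations = [f"samples/november/file_{i}.fastq" for i in range(250)]
grouped = list(batches(locations, 100))
# grouped holds 3 batches: 100 + 100 + 50 items
```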

Learn more about using the Volumes API for Amazon S3 and for Google Cloud Storage.


If you are using Seven Bridges Platform EU, please use the following endpoint:

Example request body

   "overwrite": true
     "source": {
       "volume": "rfranklin/my_volume",
       "location": "samples/november/"
     "destination": {
         "parent": "5bcdc233e4b0cbdd7a82e7cc"


See a list of Seven Bridges Platform-specific response codes that may be contained in the body of the response.

Response body

The response object contains information about the import job. The information is structured using the following key-value pairs:

| Key | Data type of value | Description of value |
| --- | --- | --- |
| id | String | ID of this import job. |
| state | String | The state of this import job. Possible values are PENDING (the import is queued), RUNNING (the import is running), COMPLETED (the import has completed successfully), and FAILED (the import has failed). |
| source | Object | Import source, as passed when this job was started. |
| destination | Object | Import destination, as passed when this job was started. |
| result | Object | File object that was imported. |
| error | Object | In case of an error in the import job, a standard API error is returned here. |
| started_on | String | Time when the import job started. |
| finished_on | String | Time when the import job ended. |
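Because state moves through PENDING and RUNNING before reaching a terminal COMPLETED or FAILED, callers typically poll the job until it finishes. A minimal sketch, assuming only the response shape documented above; the `get_job` callable, its wrapper around the actual HTTP request, and all names here are hypothetical:

```python
import time


def wait_for_import(get_job, job_id, poll_interval=5.0, max_polls=120):
    """Poll an import job until it reaches a terminal state.

    `get_job` is any callable returning the job's response body as a
    dict with at least a "state" key (e.g. a thin wrapper around a GET
    on the import-job resource).
    """
    for _ in range(max_polls):
        job = get_job(job_id)
        state = job["state"]
        if state == "COMPLETED":
            return job.get("result")   # the imported File object
        if state == "FAILED":
            raise RuntimeError(job.get("error"))
        time.sleep(poll_interval)      # PENDING or RUNNING: keep waiting
    raise TimeoutError(f"import job {job_id} did not finish in time")


# Illustrative stand-in for a real API call: the fake job reaches
# COMPLETED on the third poll.
_states = iter(["PENDING", "RUNNING", "COMPLETED"])


def fake_get_job(job_id):
    return {"state": next(_states), "result": {"id": "file-123"}}


result = wait_for_import(fake_get_job, "job-1", poll_interval=0.0)
```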