## How This Helps
The local file upload workflow lets you create datasets directly from files on your machine without needing cloud storage. It’s a four-step process: create the dataset, open a transaction, upload your files, then trigger processing.
**Note:** Use `status_new` for all status checks. The `status` field is being retired. See Retrieve Dataset Status.

## Prerequisites
- A Visual Layer Cloud account with API access.
- A valid JWT token. See Authentication.
- Images or video files available on your local machine.
For large datasets (hundreds of files or more), uploading from an S3 bucket is simpler and more reliable. See Create a Dataset from S3. Archives (`.zip`, `.tar`, `.tar.gz`) are not supported for initial dataset creation; use individual files or S3 instead.

## Upload Workflow
Creating a dataset from local files follows a four-step ingestion process:

1. Create the dataset to get a `dataset_id`.
2. Open a transaction to get a `transaction_id`.
3. Upload your files to the transaction, in one or more requests, each with multiple files.
4. Trigger processing to start indexing.
### Step 1: Create the Dataset

Create a new empty dataset and receive a `dataset_id`.
Example
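A minimal sketch in Python with `requests`. The base URL, the `/dataset` path, and the `dataset_name` and `id` field names are assumptions, not confirmed routes; check them against the API reference.

```python
import requests

# Assumption: base URL and path are illustrative; adjust to your deployment.
BASE_URL = "https://app.visual-layer.com/api/v1"


def create_dataset(token: str, name: str) -> str:
    """Create an empty dataset and return its dataset_id."""
    resp = requests.post(
        f"{BASE_URL}/dataset",
        headers={"Authorization": f"Bearer {token}"},
        data={"dataset_name": name},  # field name assumed
    )
    resp.raise_for_status()
    return resp.json()["id"]  # response field name assumed
```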
Response

The response body includes the new `dataset_id`. Save it; all subsequent steps require it.
### Step 2: Open a Transaction

Open a file upload transaction to receive a `transaction_id`.
Example
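A sketch of opening the transaction. The endpoint path is assumed from the workflow description; confirm the exact route.

```python
import requests

BASE_URL = "https://app.visual-layer.com/api/v1"  # assumed; adjust to your deployment


def open_transaction(token: str, dataset_id: str) -> str:
    """Open a file-upload transaction and return its transaction_id."""
    resp = requests.post(
        f"{BASE_URL}/dataset/{dataset_id}/transaction",  # path assumed
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()["transaction_id"]
```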
Response

The response body includes the `transaction_id` to use for all upload requests.
### Step 3: Upload Files

Upload files to the open transaction. Each request uses the `files` form field. You can send multiple files per request and make multiple requests to the same `transaction_id` before triggering processing.
#### Single Request (few files)
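A sketch of a single multipart request. Every file goes under the `files` field name; the endpoint path here is an assumption.

```python
from pathlib import Path

import requests

BASE_URL = "https://app.visual-layer.com/api/v1"  # assumed


def as_multipart(paths):
    """Build multipart tuples; each file uses the `files` field (not `files[]`)."""
    return [("files", (p.name, p.open("rb"))) for p in paths]


def upload_batch(token: str, dataset_id: str, transaction_id: str, paths: list[Path]) -> None:
    """Send one request carrying several files to the open transaction."""
    parts = as_multipart(paths)
    try:
        resp = requests.post(
            f"{BASE_URL}/dataset/{dataset_id}/transaction/{transaction_id}/upload",  # path assumed
            headers={"Authorization": f"Bearer {token}"},
            files=parts,
        )
        resp.raise_for_status()  # expect HTTP 202 Accepted
    finally:
        for _, (_, handle) in parts:
            handle.close()
```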
A successful upload returns HTTP 202 Accepted.
#### Large Batches (hundreds of files)

Split uploads across multiple requests to the same transaction, sending batches of approximately 50 files per request to avoid hitting request size limits. Do not call `process_files` until all batches are uploaded.
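The batching loop can be sketched as follows. The `upload_files` callable stands in for whatever performs the single-request upload; only the batch arithmetic here is concrete.

```python
BATCH_SIZE = 50  # approximate per-request limit suggested above


def chunked(items, size=BATCH_SIZE):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def upload_in_batches(upload_files, paths):
    """Run the single-request upload once per batch, against the same transaction."""
    for batch in chunked(list(paths)):
        upload_files(batch)
```

With 120 files and the default batch size, this issues three requests of 50, 50, and 20 files to the same `transaction_id`.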
### Step 4: Trigger Processing

Once all files are uploaded, trigger ingestion to start indexing.

Example
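A sketch of the trigger call. The `process_files` path is inferred from this guide's wording; confirm the exact route.

```python
import requests

BASE_URL = "https://app.visual-layer.com/api/v1"  # assumed


def trigger_processing(token: str, dataset_id: str) -> None:
    """Start ingestion once every batch has been uploaded."""
    resp = requests.post(
        f"{BASE_URL}/dataset/{dataset_id}/process_files",  # path inferred from this guide
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()  # expect HTTP 202; processing continues asynchronously
```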
A successful trigger returns HTTP 202 Accepted. Processing runs asynchronously.
## Monitor Dataset Status

Poll the dataset status endpoint to track progress. The dataset moves through `INDEXING` and reaches `READY` when complete.
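A polling sketch that reads `status_new` (per the note at the top of this page). The status endpoint path is an assumption.

```python
import time

import requests

BASE_URL = "https://app.visual-layer.com/api/v1"  # assumed


def wait_until_ready(token: str, dataset_id: str,
                     poll_seconds: float = 10.0, timeout: float = 3600.0) -> str:
    """Poll status_new until the dataset leaves INDEXING and reaches READY."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        resp = requests.get(
            f"{BASE_URL}/dataset/{dataset_id}/status",  # path assumed
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()
        status = resp.json()["status_new"]  # status_new, not the retired status field
        if status == "READY":
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"dataset {dataset_id} not READY within {timeout}s")
```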
## Python Example (with batched upload)
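A sketch of the complete workflow, combining the four steps with batched uploads. The base URL, every endpoint path, and the response field names are assumptions drawn from this guide, not confirmed routes; adapt them to your deployment.

```python
from pathlib import Path
import time

import requests

BASE_URL = "https://app.visual-layer.com/api/v1"  # assumed
BATCH_SIZE = 50  # approximately 50 files per request, per the guidance above


def chunked(items, size=BATCH_SIZE):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def run(token: str, name: str, folder: str) -> None:
    headers = {"Authorization": f"Bearer {token}"}

    # Step 1: create the dataset.
    r = requests.post(f"{BASE_URL}/dataset", headers=headers,
                      data={"dataset_name": name})
    r.raise_for_status()
    dataset_id = r.json()["id"]  # field name assumed

    # Step 2: open a transaction.
    r = requests.post(f"{BASE_URL}/dataset/{dataset_id}/transaction", headers=headers)
    r.raise_for_status()
    transaction_id = r.json()["transaction_id"]

    # Step 3: upload in batches of ~50 files to the same transaction.
    paths = sorted(p for p in Path(folder).glob("*") if p.is_file())
    for batch in chunked(paths):
        parts = [("files", (p.name, p.open("rb"))) for p in batch]
        try:
            r = requests.post(
                f"{BASE_URL}/dataset/{dataset_id}/transaction/{transaction_id}/upload",
                headers=headers,
                files=parts,
            )
            r.raise_for_status()  # expect 202 Accepted per batch
        finally:
            for _, (_, handle) in parts:
                handle.close()

    # Step 4: trigger processing, then poll status_new until READY.
    r = requests.post(f"{BASE_URL}/dataset/{dataset_id}/process_files", headers=headers)
    r.raise_for_status()
    while True:
        r = requests.get(f"{BASE_URL}/dataset/{dataset_id}/status", headers=headers)
        r.raise_for_status()
        if r.json()["status_new"] == "READY":
            break
        time.sleep(10)


if __name__ == "__main__":
    run("YOUR_JWT", "my-local-dataset", "./images")
```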
This example combines all four steps, with batched uploads suitable for large file sets.

## Response Codes

See Error Handling for the error response format and Python handling patterns.

| HTTP Code | Meaning |
|---|---|
| 200 / 202 | Request accepted successfully. |
| 400 | Bad Request — missing parameters or unsupported file format. |
| 401 | Unauthorized — check your JWT token. |
| 404 | Dataset or transaction not found. |
| 422 | Unprocessable — check that the `files` field name is correct (not `files[]`). |