Uploads API

Upload models to Xerotier using chunked, directory, or archive upload modes. The Uploads API follows the OpenAI-compatible pattern with Xerotier extensions for multi-file and archive workflows.

Overview

The Uploads API supports three upload modes for getting model files into Xerotier:

  • Single chunked upload -- Upload a single file in parts (chunks). Create an upload session, upload each chunk with a SHA256 checksum, then complete the upload. Interrupted uploads can be resumed.
  • Directory upload -- Upload an entire model directory (multiple files). Provide a file manifest on creation and upload each file individually. Large files within the directory are automatically chunked.
  • Archive upload -- Upload a compressed archive (tar, tar.gz, tar.bz2) containing the model. The archive is extracted server-side after upload completes.

All upload modes support resumability, SHA256 integrity verification, and automatic model record creation on completion. Uploaded content is stored using the platform's two-tier storage architecture. For details on storage tiers, encryption, retention, and billing, see Storage.

Create Upload

POST /v1/uploads

Creates a new chunked upload session for a single file.

Request Body

| Parameter | Type | Description |
| --- | --- | --- |
| `purpose` (required) | string | The intended purpose. Currently only `"model"` is supported. |
| `filename` (required) | string | Name of the file being uploaded. |
| `bytes` (required) | integer | Total file size in bytes. Must be greater than 0. |
| `mime_type` (optional) | string | MIME type of the file. Defaults to `"application/octet-stream"`. |
```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads \
  -H "Authorization: Bearer xero_myproject_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "purpose": "model",
    "filename": "model.safetensors",
    "bytes": 10737418240,
    "mime_type": "application/octet-stream"
  }'
```

Response

```json
{
  "id": "00000000-1111-0000-1111-000000000000",
  "object": "upload",
  "bytes": 10737418240,
  "created_at": 1700000000,
  "filename": "model.safetensors",
  "purpose": "model",
  "status": "pending",
  "expires_at": 1700086400,
  "upload_type": "single",
  "chunk_size": 104857600,
  "total_chunks": 103,
  "uploaded_chunks": 0,
  "progress": 0
}
```
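The derived fields in this response can be reproduced client-side. A minimal sketch of the chunk math, using the values from the example above (`chunk_size` is assigned by the server; the helper names are illustrative, not part of any SDK):

```python
import math

def chunk_plan(total_bytes: int, chunk_size: int) -> int:
    """Number of parts needed to cover the file."""
    return math.ceil(total_bytes / chunk_size)

def progress_pct(uploaded_chunks: int, total_chunks: int) -> float:
    """Upload progress as a percentage, rounded to two decimals."""
    return round(100 * uploaded_chunks / total_chunks, 2)

# A 10 GiB file with the server-assigned 100 MiB chunk size needs 103 parts
total = chunk_plan(10737418240, 104857600)
print(total, progress_pct(50, total))
```

This matches the `total_chunks` and `progress` values shown in the responses on this page.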

List Uploads

GET /v1/uploads

Lists upload sessions for the authenticated project.

Query Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `limit` (optional) | integer | Maximum number of sessions to return (1-100, default 20). |
| `status` (optional) | string | Filter by upload status (e.g., `"pending"`, `"completed"`, `"cancelled"`). |
```bash
curl "https://api.xerotier.ai/proj_ABC123/v1/uploads?limit=10" \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Response

```json
{
  "object": "list",
  "data": [
    {
      "id": "00000000-1111-0000-1111-000000000000",
      "object": "upload",
      "bytes": 10737418240,
      "created_at": 1700000000,
      "filename": "model.safetensors",
      "purpose": "model",
      "status": "pending",
      "expires_at": 1700086400,
      "upload_type": "single",
      "chunk_size": 104857600,
      "total_chunks": 103,
      "uploaded_chunks": 50,
      "progress": 48.54
    }
  ],
  "first_id": "00000000-1111-0000-1111-000000000000",
  "last_id": "00000000-1111-0000-1111-000000000000",
  "has_more": false
}
```

Get Upload

GET /v1/uploads/{upload_id}

Retrieves the current status and progress of an upload session.

```bash
curl https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000 \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Response

```json
{
  "id": "00000000-1111-0000-1111-000000000000",
  "object": "upload",
  "bytes": 10737418240,
  "created_at": 1700000000,
  "filename": "model.safetensors",
  "purpose": "model",
  "status": "pending",
  "expires_at": 1700086400,
  "upload_type": "single",
  "chunk_size": 104857600,
  "total_chunks": 103,
  "uploaded_chunks": 75,
  "progress": 72.82
}
```

Add Part

POST /v1/uploads/{upload_id}/parts

Uploads a single chunk (part) to an active upload session. Each chunk must include a SHA256 checksum for integrity verification.

Headers

| Header | Description |
| --- | --- |
| `X-Chunk-Checksum` (required) | SHA256 hex digest of the chunk data. |

Query Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `part_number` (required) | integer | Zero-based chunk index. Can also be provided via the `X-Part-Number` header. |

The request body is the raw binary chunk data. Maximum chunk size is 200 MB.
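Slicing the file and computing each `X-Chunk-Checksum` is plain client-side work; a small sketch (the helper name `iter_parts` is illustrative, not part of any SDK):

```python
import hashlib
from typing import Iterator, Tuple

def iter_parts(data: bytes, chunk_size: int) -> Iterator[Tuple[int, bytes, str]]:
    """Yield (part_number, chunk, sha256 hex digest) for each chunk of the payload."""
    for part_number, start in enumerate(range(0, len(data), chunk_size)):
        chunk = data[start:start + chunk_size]
        yield part_number, chunk, hashlib.sha256(chunk).hexdigest()

# Each tuple maps to one POST .../parts?part_number={n}
# with the digest sent as the X-Chunk-Checksum header.
for n, chunk, digest in iter_parts(b"hello world", chunk_size=4):
    print(n, len(chunk), digest[:8])
```

Streaming the file with `f.read(chunk_size)` (as in the client example below) avoids holding the whole payload in memory; the digest logic is identical.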

```bash
curl -X POST "https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000/parts?part_number=0" \
  -H "Authorization: Bearer xero_myproject_your_api_key" \
  -H "X-Chunk-Checksum: a1b2c3d4e5f6..." \
  --data-binary @chunk_000.bin
```

Response

```json
{
  "id": "part_0",
  "object": "upload.part",
  "created_at": 1700000100,
  "upload_id": "00000000-1111-0000-1111-000000000000",
  "chunk_index": 0,
  "bytes_received": 104857600,
  "checksum": "a1b2c3d4e5f6..."
}
```

Complete Upload

POST /v1/uploads/{upload_id}/complete

Marks the upload as complete. All chunks must have been uploaded before calling this endpoint. For archive uploads, the server extracts the archive. For directory uploads, all expected files must be uploaded. On success, a model record is created automatically.

```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000/complete \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Response

```json
{
  "id": "00000000-1111-0000-1111-000000000000",
  "object": "upload",
  "status": "completed",
  "bytes": 10737418240,
  "created_at": 1700000000,
  "filename": "model.safetensors",
  "upload_type": "single",
  "model": {
    "id": "660e8400-e29b-41d4-a716-446655440000",
    "name": "model.safetensors",
    "format": "safetensors",
    "size_bytes": 10737418240,
    "status": "ready",
    "context_length": 4096,
    "architecture": "llama",
    "quantization": null
  }
}
```

Cancel Upload

POST /v1/uploads/{upload_id}/cancel

Cancels an in-progress upload session and cleans up all uploaded chunks.

```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000/cancel \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Response

```json
{
  "id": "00000000-1111-0000-1111-000000000000",
  "object": "upload",
  "bytes": 10737418240,
  "created_at": 1700000000,
  "filename": "model.safetensors",
  "purpose": "model",
  "status": "cancelled",
  "expires_at": 1700086400
}
```

Delete Upload

DELETE /v1/uploads/{upload_id}

Deletes an upload session. This is functionally identical to cancelling the upload.

```bash
curl -X DELETE https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000 \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Resume Upload

POST /v1/uploads/{upload_id}/resume

Retrieves information needed to resume an interrupted upload. Returns the next expected chunk index and a list of any missing chunks that need to be re-uploaded.

```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/00000000-1111-0000-1111-000000000000/resume \
  -H "Authorization: Bearer xero_myproject_your_api_key"
```

Response

```json
{
  "id": "00000000-1111-0000-1111-000000000000",
  "next_chunk_index": 52,
  "uploaded_chunks": 50,
  "missing_chunks": [5, 10]
}
```
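A resume loop can combine the two fields: re-send `missing_chunks`, then continue from `next_chunk_index`. A sketch of that selection (`total_chunks` comes from the upload session; the ordering is an assumption — the API only requires that every chunk eventually arrives):

```python
def chunks_to_send(next_chunk_index: int, missing_chunks: list, total_chunks: int) -> list:
    """Chunk indices still owed: missing ones first, then the untried tail."""
    return list(missing_chunks) + list(range(next_chunk_index, total_chunks))

# Values from the example response, with total_chunks=103 from the session
todo = chunks_to_send(52, [5, 10], 103)
print(todo[:4], len(todo))
```

Each index in `todo` is then re-uploaded via the Add Part endpoint exactly as in the initial pass.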

Directory Uploads

POST /v1/uploads/directory

Creates a directory upload session for uploading an entire model directory (multiple files). You provide a file manifest describing all files, and the server returns per-file upload paths.

Request Body

| Parameter | Type | Description |
| --- | --- | --- |
| `model_name` (required) | string | Name for the model being uploaded. |
| `files` (required) | array | Array of file manifest entries, each with `relative_path` (string) and `size` (integer, bytes). |
| `description` (optional) | string | Description of the model. |
| `workload_type` (optional) | string | Workload type hint for the model. |
| `quantization` (optional) | string | Runtime quantization method to apply. |
```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/directory \
  -H "Authorization: Bearer xero_myproject_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model_name": "Qwen3-0.6B",
    "files": [
      {"relative_path": "config.json", "size": 1234},
      {"relative_path": "model.safetensors", "size": 1234000000}
    ]
  }'
```

Response

```json
{
  "id": "770e8400-e29b-41d4-a716-446655440000",
  "object": "upload",
  "bytes": 1234001234,
  "created_at": 1700000000,
  "filename": "Qwen3-0.6B",
  "purpose": "model",
  "status": "pending",
  "expires_at": 1700086400,
  "upload_type": "directory",
  "chunk_size": 104857600,
  "uploaded_chunks": 0,
  "progress": 0,
  "files": [
    {
      "relative_path": "config.json",
      "upload_path": "v1/uploads/770e8400-.../files/config.json",
      "size": 1234,
      "requires_chunking": false,
      "total_chunks": 0,
      "status": "pending"
    },
    {
      "relative_path": "model.safetensors",
      "upload_path": "v1/uploads/770e8400-.../files/model.safetensors",
      "size": 1234000000,
      "requires_chunking": true,
      "total_chunks": 12,
      "chunk_url": "v1/uploads/770e8400-.../file-chunks",
      "status": "pending"
    }
  ],
  "chunk_upload_url": "v1/uploads/770e8400-.../file-chunks"
}
```
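The `files` manifest can be built by walking the local model directory. A standard-library sketch (`build_manifest` is an illustrative helper, not an SDK function; the server decides per-file chunking from the sizes you report):

```python
import os

def build_manifest(root: str) -> list:
    """File manifest entries for POST /v1/uploads/directory."""
    entries = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            entries.append({
                # Paths are reported relative to the model root, with "/" separators
                "relative_path": os.path.relpath(path, root).replace(os.sep, "/"),
                "size": os.path.getsize(path),
            })
    return entries
```

The resulting list goes into the `files` field of the request body, alongside `model_name`.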

Upload Individual Files

POST /v1/uploads/{upload_id}/files/{relative_path}

Uploads a single file (for small files that do not require chunking). Include the X-Chunk-Checksum header with the SHA256 hex digest.

```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/770e8400-.../files/config.json \
  -H "Authorization: Bearer xero_myproject_your_api_key" \
  -H "X-Chunk-Checksum: abc123..." \
  --data-binary @config.json
```

Upload File Chunks

POST /v1/uploads/{upload_id}/file-chunks/{chunk_index}?relative_path={path}

Uploads a single chunk of a large file within a directory upload. The relative_path query parameter identifies which file the chunk belongs to.
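No curl example is shown for this endpoint, so here is a hedged sketch of how a client might enumerate the requests for one large file. The URL pattern follows the endpoint signature above; the assumption that each chunk also carries an `X-Chunk-Checksum` header mirrors the Add Part endpoint:

```python
import hashlib
from urllib.parse import quote

def file_chunk_requests(upload_id: str, relative_path: str, data: bytes, chunk_size: int):
    """Yield (url, checksum, chunk) for each chunk of a directory-upload file."""
    for index, start in enumerate(range(0, len(data), chunk_size)):
        chunk = data[start:start + chunk_size]
        url = (f"/v1/uploads/{upload_id}/file-chunks/{index}"
               f"?relative_path={quote(relative_path, safe='')}")
        yield url, hashlib.sha256(chunk).hexdigest(), chunk
```

Each tuple maps to one POST with the chunk as the raw request body.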

Complete File

POST /v1/uploads/{upload_id}/file-complete

Assembles chunks for a file in directory mode. Send a JSON body with "relative_path" to identify the file.

Archive Uploads

POST /v1/uploads/archive

Creates an archive upload session. Upload a compressed archive containing the model files. The archive is extracted server-side when the upload completes.

Request Body

| Parameter | Type | Description |
| --- | --- | --- |
| `model_name` (required) | string | Name for the model being uploaded. |
| `archive_size` (required) | integer | Total archive size in bytes. Must be greater than 0. |
| `archive_format` (required) | string | Archive format. Supported: `"tar"`, `"tar.gz"`, `"tar.bz2"`. |
| `description` (optional) | string | Description of the model. |
| `workload_type` (optional) | string | Workload type hint for the model. |
| `quantization` (optional) | string | Runtime quantization method to apply. |
```bash
curl -X POST https://api.xerotier.ai/proj_ABC123/v1/uploads/archive \
  -H "Authorization: Bearer xero_myproject_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model_name": "Qwen3-0.6B",
    "archive_size": 1234567890,
    "archive_format": "tar.gz"
  }'
```

Response

```json
{
  "id": "880e8400-e29b-41d4-a716-446655440000",
  "object": "upload",
  "bytes": 1234567890,
  "created_at": 1700000000,
  "filename": "Qwen3-0.6B",
  "purpose": "model",
  "status": "pending",
  "expires_at": 1700086400,
  "upload_type": "archive",
  "chunk_size": 104857600,
  "total_chunks": 12,
  "uploaded_chunks": 0,
  "progress": 0
}
```

After creating the archive upload session, upload chunks using the Add Part endpoint, then call Complete Upload to trigger server-side extraction.
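Before creating the session, a client needs `archive_size` (and, for planning, the chunk count). A standard-library sketch that packages a local model directory and derives both (`package_model` is an illustrative helper; the 100 MiB default mirrors the `chunk_size` returned above):

```python
import math
import os
import tarfile

def package_model(src_dir: str, out_path: str, chunk_size: int = 104857600):
    """Create a tar.gz archive of src_dir; return (archive_size, total_chunks)."""
    with tarfile.open(out_path, "w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))
    archive_size = os.path.getsize(out_path)
    return archive_size, math.ceil(archive_size / chunk_size)
```

The returned `archive_size` goes into the create request with `"archive_format": "tar.gz"`; the chunk count tells you how many Add Part calls to expect.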

Client Examples

Python (requests)

```python
import requests
import hashlib

headers = {
    "Authorization": "Bearer xero_myproject_your_api_key",
    "Content-Type": "application/json"
}
base = "https://api.xerotier.ai/proj_ABC123/v1"

# Step 1: Create an upload session
upload = requests.post(f"{base}/uploads", headers=headers, json={
    "purpose": "model",
    "filename": "model.safetensors",
    "bytes": 10737418240,
    "mime_type": "application/octet-stream"
}).json()
print(f"Upload created: {upload['id']}")
print(f"Chunk size: {upload['chunk_size']}, Total chunks: {upload['total_chunks']}")

# Step 2: Upload chunks
chunk_size = upload["chunk_size"]
with open("model.safetensors", "rb") as f:
    part_number = 0
    while True:
        data = f.read(chunk_size)
        if not data:
            break
        checksum = hashlib.sha256(data).hexdigest()
        requests.post(
            f"{base}/uploads/{upload['id']}/parts?part_number={part_number}",
            headers={
                "Authorization": headers["Authorization"],
                "X-Chunk-Checksum": checksum
            },
            data=data
        )
        part_number += 1
        print(f"Uploaded chunk {part_number}/{upload['total_chunks']}")

# Step 3: Complete the upload
result = requests.post(
    f"{base}/uploads/{upload['id']}/complete",
    headers=headers
).json()
print(f"Upload complete: {result['status']}")

# List uploads
uploads = requests.get(f"{base}/uploads?limit=10", headers=headers).json()
for u in uploads["data"]:
    print(f"{u['id']} - {u['filename']} ({u['status']}, {u['progress']}%)")

# Resume an interrupted upload
resume = requests.post(
    f"{base}/uploads/{upload['id']}/resume",
    headers=headers
).json()
print(f"Next chunk: {resume['next_chunk_index']}, Missing: {resume['missing_chunks']}")
```

Node.js (fetch)

```javascript
import { readFile } from "node:fs/promises";
import { createHash } from "node:crypto";

const base = "https://api.xerotier.ai/proj_ABC123/v1";
const headers = {
  "Authorization": "Bearer xero_myproject_your_api_key",
  "Content-Type": "application/json"
};

// Step 1: Create an upload session
const createRes = await fetch(`${base}/uploads`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    purpose: "model",
    filename: "model.safetensors",
    bytes: 10737418240,
    mime_type: "application/octet-stream"
  })
});
const upload = await createRes.json();
console.log(`Upload created: ${upload.id}`);

// Step 2: Upload chunks
const fileData = await readFile("model.safetensors");
const chunkSize = upload.chunk_size;
for (let i = 0; i < upload.total_chunks; i++) {
  const start = i * chunkSize;
  const chunk = fileData.subarray(start, start + chunkSize);
  const checksum = createHash("sha256").update(chunk).digest("hex");
  await fetch(`${base}/uploads/${upload.id}/parts?part_number=${i}`, {
    method: "POST",
    headers: {
      "Authorization": headers.Authorization,
      "X-Chunk-Checksum": checksum
    },
    body: chunk
  });
  console.log(`Uploaded chunk ${i + 1}/${upload.total_chunks}`);
}

// Step 3: Complete the upload
const completeRes = await fetch(`${base}/uploads/${upload.id}/complete`, {
  method: "POST",
  headers
});
const result = await completeRes.json();
console.log(`Upload complete: ${result.status}`);

// List uploads
const listRes = await fetch(`${base}/uploads?limit=10`, { headers });
const uploads = await listRes.json();
uploads.data.forEach(u =>
  console.log(`${u.id} - ${u.filename} (${u.progress}%)`)
);
```

Error Handling

| HTTP Status | Error Code | Description |
| --- | --- | --- |
| 400 | `invalid_request` | Missing or invalid parameters (bad filename, zero bytes, invalid archive format, missing checksum, etc.). |
| 401 | `authentication_error` | Invalid or missing API key. |
| 403 | `forbidden` | Free tier model limit exceeded or storage quota exceeded. |
| 404 | `not_found` | Upload session not found. |
| 413 | `content_too_large` | Individual chunk exceeds the 200 MB limit. |
| 429 | `rate_limit_exceeded` | Too many requests. Check the Retry-After header. |
| 503 | `service_unavailable` | Storage service temporarily unavailable. |
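For 429 and 503 responses a client can back off and retry (and, for interrupted chunked uploads, fall back to the Resume endpoint). A sketch of the delay calculation, treating `Retry-After` as seconds; the exponential-backoff base and cap are client-side assumptions, not API values:

```python
def retry_delay(headers: dict, attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Honor a Retry-After header when present, else exponential backoff."""
    retry_after = headers.get("Retry-After")
    if retry_after is not None:
        return float(retry_after)
    return min(cap, base * (2 ** attempt))
```

A typical loop sleeps for `retry_delay(response.headers, attempt)` after each failed attempt and gives up after a fixed number of tries.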