# Data Stores
## What are Data Stores?
Data stores are where Uploadista saves your uploaded files. Think of them as the “hard drive” for your uploads - they handle storing the actual file content, whether that’s on cloud services like AWS S3, Azure Blob Storage, Google Cloud Storage, or your local filesystem.
Uploadista automatically handles the complexity of each storage provider, including resumable uploads, chunked transfers for large files, and progress tracking. You just configure where to store files, and Uploadista takes care of the rest.
## When to Use Each Store
| Store | Best For | Not Recommended For |
|---|---|---|
| AWS S3 | Production apps, global CDN delivery, large files | Local development |
| Azure Blob | Microsoft ecosystem, enterprise apps | Small projects |
| Google Cloud Storage | GCP ecosystem, AI/ML workflows | Non-GCP infrastructure |
| Filesystem | Local development, single-server apps | Multi-server production |
Quick Decision Guide:
- Starting a new project? Use Filesystem for development, S3 for production
- Already on AWS? Use S3
- Already on Azure? Use Azure Blob Storage
- Already on GCP? Use Google Cloud Storage
- Running on Cloudflare Workers? Use S3-compatible storage (like Cloudflare R2)
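
One common way to act on this guidance is to pick the store from the environment. Here is a minimal sketch using the store factories covered in the Quick Start below; the bucket name, region, and URLs are placeholders:

```typescript
import { s3Store } from "@uploadista/data-store-s3";
import { fileStore } from "@uploadista/data-store-filesystem";

// Filesystem locally, S3 in production (placeholder values)
const dataStore =
  process.env.NODE_ENV === "production"
    ? s3Store({
        deliveryUrl: "https://my-bucket.s3.amazonaws.com",
        s3ClientConfig: { region: "us-east-1", bucket: "my-uploads" },
      })
    : fileStore({
        directory: "./uploads",
        deliveryUrl: "http://localhost:3000/files",
      });
```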
## Quick Start
Here’s the simplest way to get started with each storage provider:
**AWS S3**

```typescript
import { s3Store } from "@uploadista/data-store-s3";

const dataStore = s3Store({
  deliveryUrl: "https://my-bucket.s3.amazonaws.com",
  s3ClientConfig: {
    region: "us-east-1",
    bucket: "my-uploads",
  },
});
```

Required environment variables:

```bash
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-east-1
```

**Azure Blob Storage**

```typescript
import { azureStore } from "@uploadista/data-store-azure";

const dataStore = azureStore({
  deliveryUrl: "https://myaccount.blob.core.windows.net",
  containerName: "uploads",
  connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING,
});
```

Required environment variable:

```bash
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
```

**Google Cloud Storage**

```typescript
import { gcsStoreNodejs } from "@uploadista/data-store-gcs";

const dataStore = gcsStoreNodejs({
  bucketName: "my-uploads",
  keyFilename: "./service-account.json",
});
```

Or with environment credentials:

```bash
GOOGLE_APPLICATION_CREDENTIALS=./service-account.json
```

**Local Filesystem**

```typescript
import { fileStore } from "@uploadista/data-store-filesystem";

const dataStore = fileStore({
  directory: "./uploads",
  deliveryUrl: "http://localhost:3000/files",
});
```

No environment variables needed - files are stored locally.
## Available Data Stores
### AWS S3
The most popular choice for production deployments. Supports multipart uploads for large files (up to 5TB), automatic retries, and CDN integration via CloudFront.
Installation:
```bash
npm install @uploadista/data-store-s3 @aws-sdk/client-s3
# or
pnpm add @uploadista/data-store-s3 @aws-sdk/client-s3
# or
yarn add @uploadista/data-store-s3 @aws-sdk/client-s3
```

Configuration options:
| Option | Description | Default |
|---|---|---|
| `deliveryUrl` | URL where files will be accessible | Required |
| `s3ClientConfig.bucket` | S3 bucket name | Required |
| `s3ClientConfig.region` | AWS region | Required |
| `partSize` | Chunk size for multipart uploads | 5 MB |
| `maxConcurrentPartUploads` | Parallel upload threads | 60 |
Features:
- Multipart uploads for files over 5MB
- Resumable uploads (interrupted uploads can continue)
- Up to 10,000 parts per file
- Automatic retry on network failures
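
Because a single upload is limited to 10,000 parts, the configured `partSize` determines the largest file you can store. A quick back-of-the-envelope check (plain arithmetic, not Uploadista API):

```typescript
const maxParts = 10_000; // S3 multipart upload limit

// Default 5 MB parts cap a single upload at roughly 48.8 GB
const defaultPartSize = 5 * 1024 * 1024;
console.log((defaultPartSize * maxParts) / 1024 ** 3); // ≈ 48.8 (GB)

// Reaching S3's 5 TB object limit needs parts of at least ~525 MB
const fiveTB = 5 * 1024 ** 4;
console.log(Math.ceil(fiveTB / maxParts / 1024 ** 2)); // 525 (MB per part)
```

So for very large files, raise `partSize` accordingly (see Large File Optimization below).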
### Azure Blob Storage
Enterprise-grade storage for Microsoft Azure deployments. Uses block blobs for efficient large file handling.
Installation:
```bash
npm install @uploadista/data-store-azure @azure/storage-blob
# or
pnpm add @uploadista/data-store-azure @azure/storage-blob
# or
yarn add @uploadista/data-store-azure @azure/storage-blob
```

Configuration options:
| Option | Description | Default |
|---|---|---|
| `deliveryUrl` | URL where files will be accessible | Required |
| `containerName` | Azure container name | Required |
| `connectionString` | Azure connection string | Required |
| `blockSize` | Chunk size for block uploads | 4 MB |
Features:
- Block blob uploads for large files
- Up to 50,000 blocks per file
- Supports both connection string and OAuth authentication
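
Putting the options from the table together, a configuration that raises the block size for larger files might look like this (a sketch; the 8 MB value and the account/container names are illustrative):

```typescript
import { azureStore } from "@uploadista/data-store-azure";

const dataStore = azureStore({
  deliveryUrl: "https://myaccount.blob.core.windows.net",
  containerName: "uploads",
  connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING,
  blockSize: 8 * 1024 * 1024, // 8 MB blocks instead of the 4 MB default
});
```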
### Google Cloud Storage
Native integration with Google Cloud Platform. Supports both Node.js and REST-based implementations for different runtime environments.
Installation:
```bash
npm install @uploadista/data-store-gcs @google-cloud/storage
# or
pnpm add @uploadista/data-store-gcs @google-cloud/storage
# or
yarn add @uploadista/data-store-gcs @google-cloud/storage
```

Two implementations available:
- `gcsStoreNodejs()` - Best for Node.js servers, uses official Google SDK
- `gcsStoreRest()` - Works anywhere (including edge runtimes), uses REST API
Configuration options:
| Option | Description | Default |
|---|---|---|
| `bucketName` | GCS bucket name | Required |
| `keyFilename` | Path to service account JSON | Optional |
| `credentials` | Service account object | Optional |
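
For runtimes without filesystem access, you can pass the service account as a `credentials` object instead of a key file. This sketch assumes `gcsStoreRest()` accepts the same options as `gcsStoreNodejs()`, and the `GCS_SERVICE_ACCOUNT` variable is a placeholder:

```typescript
import { gcsStoreNodejs, gcsStoreRest } from "@uploadista/data-store-gcs";

// Node.js servers: official Google SDK, credentials passed inline
const nodeStore = gcsStoreNodejs({
  bucketName: "my-uploads",
  credentials: JSON.parse(process.env.GCS_SERVICE_ACCOUNT!),
});

// Edge runtimes: REST-based implementation (assumed to take the same options)
const edgeStore = gcsStoreRest({
  bucketName: "my-uploads",
  credentials: JSON.parse(process.env.GCS_SERVICE_ACCOUNT!),
});
```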
### Local Filesystem
Simple file-based storage for development and single-server deployments.
Installation:
```bash
npm install @uploadista/data-store-filesystem
# or
pnpm add @uploadista/data-store-filesystem
# or
yarn add @uploadista/data-store-filesystem
```

Configuration options:
| Option | Description | Default |
|---|---|---|
| `directory` | Local folder path | Required |
| `deliveryUrl` | URL prefix for file access | Required |
Limitations:
- Not suitable for multi-server deployments
- No built-in CDN support
- Files lost if server disk fails
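
Because the filesystem store only writes files to disk, something must serve the `directory` at the configured `deliveryUrl`. A minimal sketch using Node’s built-in `http` module (this server is not part of Uploadista; the paths mirror the Quick Start example):

```typescript
import { createServer } from "node:http";
import { createReadStream, statSync } from "node:fs";
import { join, normalize } from "node:path";

const uploadsDir = join(process.cwd(), "uploads");

// Serve ./uploads at http://localhost:3000/files/<name>
createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost:3000");
  if (!url.pathname.startsWith("/files/")) {
    res.writeHead(404).end();
    return;
  }

  try {
    // Strip the /files prefix and block path traversal
    const relative = normalize(decodeURIComponent(url.pathname.slice("/files/".length)));
    const filePath = join(uploadsDir, relative);
    if (!filePath.startsWith(uploadsDir)) throw new Error("outside uploads dir");
    if (!statSync(filePath).isFile()) throw new Error("not a file");
    createReadStream(filePath).pipe(res);
  } catch {
    res.writeHead(404).end();
  }
}).listen(3000);
```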
## Common Patterns
### Using with CloudFront CDN
For production S3 deployments, serve files through CloudFront for better performance:
```typescript
const dataStore = s3Store({
  // Point to CloudFront, not S3 directly
  deliveryUrl: "https://d123456.cloudfront.net",
  s3ClientConfig: {
    region: "us-east-1",
    bucket: "my-uploads",
  },
});
```

### Using Cloudflare R2 (S3-compatible)
Cloudflare R2 uses the S3 API, so you can use the S3 store:
```typescript
const dataStore = s3Store({
  deliveryUrl: process.env.R2_DELIVERY_URL,
  s3ClientConfig: {
    bucket: process.env.R2_BUCKET,
    region: "auto",
    endpoint: process.env.R2_ENDPOINT,
    credentials: {
      accessKeyId: process.env.R2_ACCESS_KEY_ID,
      secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
    },
  },
});
```

### Large File Optimization
For files over 100MB, increase the chunk size:
const dataStore = s3Store({ deliveryUrl: "https://cdn.example.com", s3ClientConfig: { region: "us-east-1", bucket: "my-uploads" }, partSize: 50 * 1024 * 1024, // 50MB chunks maxConcurrentPartUploads: 4, // Fewer parallel uploads for stability});Troubleshooting
### “Access Denied” errors
- S3: Check your IAM user has `s3:PutObject` and `s3:GetObject` permissions
- Azure: Verify the connection string includes write access
- GCS: Ensure the service account has Storage Object Admin role
### Uploads failing for large files
- Increase `partSize` to reduce the number of parts
- Check your storage quota hasn’t been exceeded
- Verify network stability for long uploads
### Files not accessible after upload
- Verify `deliveryUrl` points to the correct endpoint
- Check bucket/container permissions allow public read (or use signed URLs)
- For S3, ensure the bucket policy allows `GetObject`
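
If you don’t want the bucket to be publicly readable, signed URLs are the usual alternative. A minimal sketch for S3 using the AWS SDK’s presigner; this runs alongside Uploadista rather than through it, and the bucket and key values are placeholders:

```typescript
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const client = new S3Client({ region: "us-east-1" });

// Temporary (1 hour) link to a private object in the uploads bucket
const url = await getSignedUrl(
  client,
  new GetObjectCommand({ Bucket: "my-uploads", Key: "path/to/uploaded-file" }),
  { expiresIn: 3600 },
);
```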
## Related Concepts
- Upload Engine - How files get to data stores
- Uploadista Server - Configure your server
- KV Stores - Where upload metadata is stored
- Flows Engine - Process files after upload