Data Stores

Data stores are where Uploadista saves your uploaded files. Think of them as the “hard drive” for your uploads - they handle storing the actual file content, whether that’s on cloud services like AWS S3, Azure Blob Storage, Google Cloud Storage, or your local filesystem.

Uploadista automatically handles the complexity of each storage provider, including resumable uploads, chunked transfers for large files, and progress tracking. You just configure where to store files, and Uploadista takes care of the rest.

Store                | Best For                                          | Not Recommended For
-------------------- | ------------------------------------------------- | ------------------------
AWS S3               | Production apps, global CDN delivery, large files | Local development
Azure Blob           | Microsoft ecosystem, enterprise apps              | Small projects
Google Cloud Storage | GCP ecosystem, AI/ML workflows                    | Non-GCP infrastructure
Filesystem           | Local development, single-server apps             | Multi-server production

Quick Decision Guide:

  • Starting a new project? Use Filesystem for development, S3 for production
  • Already on AWS? Use S3
  • Already on Azure? Use Azure Blob Storage
  • Already on GCP? Use Google Cloud Storage
  • Running on Cloudflare Workers? Use S3-compatible storage (like Cloudflare R2)

Here’s the simplest way to get started. The example below uses AWS S3; each provider’s section covers its specific options:

import { s3Store } from "@uploadista/data-store-s3";

const dataStore = s3Store({
  deliveryUrl: "https://my-uploads.s3.amazonaws.com",
  s3ClientConfig: {
    region: "us-east-1",
    bucket: "my-uploads",
  },
});

Required environment variables:

AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-east-1
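
The AWS SDK picks these up from the environment automatically. If you prefer not to rely on environment variables, s3ClientConfig also accepts an explicit credentials object, the same pattern used for Cloudflare R2 later on. A minimal sketch:

import { s3Store } from "@uploadista/data-store-s3";

const dataStore = s3Store({
  deliveryUrl: "https://my-uploads.s3.amazonaws.com",
  s3ClientConfig: {
    region: process.env.AWS_REGION!,
    bucket: "my-uploads",
    // Explicit credentials instead of the SDK's default environment lookup
    credentials: {
      accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
    },
  },
});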

AWS S3

The most popular choice for production deployments. Supports multipart uploads for large files (up to 5TB), automatic retries, and CDN integration via CloudFront.

Installation:

npm install @uploadista/data-store-s3 @aws-sdk/client-s3

Configuration options:

Option                   | Description                        | Default
------------------------ | ---------------------------------- | --------
deliveryUrl              | URL where files will be accessible | Required
s3ClientConfig.bucket    | S3 bucket name                     | Required
s3ClientConfig.region    | AWS region                         | Required
partSize                 | Chunk size for multipart uploads   | 5 MB
maxConcurrentPartUploads | Parallel upload threads            | 60

Features:

  • Multipart uploads for files over 5MB
  • Resumable uploads (interrupted uploads can continue)
  • Up to 10,000 parts per file
  • Automatic retry on network failures

Azure Blob Storage

Enterprise-grade storage for Microsoft Azure deployments. Uses block blobs for efficient large file handling.

Installation:

npm install @uploadista/data-store-azure @azure/storage-blob

Configuration options:

Option           | Description                        | Default
---------------- | ---------------------------------- | --------
deliveryUrl      | URL where files will be accessible | Required
containerName    | Azure container name               | Required
connectionString | Azure connection string            | Required
blockSize        | Chunk size for block uploads       | 4 MB
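
A minimal configuration sketch using the options above. The factory name azureStore is an assumption (check the package's exports), and the account, container, and environment variable values are placeholders:

import { azureStore } from "@uploadista/data-store-azure"; // export name assumed

const dataStore = azureStore({
  deliveryUrl: "https://myaccount.blob.core.windows.net/my-uploads",
  containerName: "my-uploads",
  connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING!, // connection string from your storage account
  blockSize: 4 * 1024 * 1024, // 4 MB blocks (the documented default)
});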

Features:

  • Block blob uploads for large files
  • Up to 50,000 blocks per file
  • Supports both connection string and OAuth authentication

Google Cloud Storage

Native integration with Google Cloud Platform. Supports both Node.js and REST-based implementations for different runtime environments.

Installation:

npm install @uploadista/data-store-gcs @google-cloud/storage

Two implementations available:

  • gcsStoreNodejs() - Best for Node.js servers, uses official Google SDK
  • gcsStoreRest() - Works anywhere (including edge runtimes), uses REST API

Configuration options:

Option      | Description                  | Default
----------- | ---------------------------- | ---------
bucketName  | GCS bucket name              | Required
keyFilename | Path to service account JSON | Optional
credentials | Service account object       | Optional
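
A hedged sketch of both factories, assuming each accepts the options above; the bucket name, key path, and environment variable are placeholders:

import { gcsStoreNodejs, gcsStoreRest } from "@uploadista/data-store-gcs";

// Node.js servers: backed by the official Google SDK
const dataStore = gcsStoreNodejs({
  bucketName: "my-uploads",
  keyFilename: "./service-account.json", // or pass `credentials` directly
});

// Edge runtimes: REST-based implementation
const edgeDataStore = gcsStoreRest({
  bucketName: "my-uploads",
  credentials: JSON.parse(process.env.GCS_SERVICE_ACCOUNT_JSON!), // assumed env var holding the service account JSON
});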

Filesystem

Simple file-based storage for development and single-server deployments.

Installation:

npm install @uploadista/data-store-filesystem

Configuration options:

Option      | Description                | Default
----------- | -------------------------- | --------
directory   | Local folder path          | Required
deliveryUrl | URL prefix for file access | Required
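
A minimal sketch; the factory name filesystemStore is an assumption, and the paths are placeholders for local development:

import { filesystemStore } from "@uploadista/data-store-filesystem"; // export name assumed

const dataStore = filesystemStore({
  directory: "./uploads",                       // local folder where files are written
  deliveryUrl: "http://localhost:3000/uploads", // URL prefix used to serve them
});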

Limitations:

  • Not suitable for multi-server deployments
  • No built-in CDN support
  • Files lost if server disk fails

For production S3 deployments, serve files through CloudFront for better performance:

const dataStore = s3Store({
  // Point to CloudFront, not S3 directly
  deliveryUrl: "https://d123456.cloudfront.net",
  s3ClientConfig: {
    region: "us-east-1",
    bucket: "my-uploads",
  },
});

Cloudflare R2 uses the S3 API, so you can use the S3 store:

const dataStore = s3Store({
  deliveryUrl: process.env.R2_DELIVERY_URL!,
  s3ClientConfig: {
    bucket: process.env.R2_BUCKET!,
    region: "auto",
    endpoint: process.env.R2_ENDPOINT!,
    credentials: {
      accessKeyId: process.env.R2_ACCESS_KEY_ID!,
      secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
    },
  },
});

For files over 100MB, increase the chunk size:

const dataStore = s3Store({
  deliveryUrl: "https://cdn.example.com",
  s3ClientConfig: { region: "us-east-1", bucket: "my-uploads" },
  partSize: 50 * 1024 * 1024,  // 50MB chunks
  maxConcurrentPartUploads: 4, // Fewer parallel uploads for stability
});

Troubleshooting

Uploads fail with permission errors:

  • S3: Check your IAM user has s3:PutObject and s3:GetObject permissions
  • Azure: Verify the connection string includes write access
  • GCS: Ensure the service account has the Storage Object Admin role

Large uploads fail or stall:

  • Increase partSize to reduce the number of parts
  • Check your storage quota hasn’t been exceeded
  • Verify network stability for long uploads

Uploaded files aren’t accessible:

  • Verify deliveryUrl points to the correct endpoint
  • Check bucket/container permissions allow public read (or use signed URLs)
  • For S3, ensure the bucket policy allows s3:GetObject