# Utility Nodes

Transform data flow and control processing logic.
## Plugin Requirements

### Available Plugin

| Plugin | Package | Operations | Requirements |
|---|---|---|---|
| zipPlugin | @uploadista/flow-utility-zipjs | Create ZIP archives | None |
### Installation (for Zip Node only)

```bash
npm install @uploadista/flow-utility-nodes @uploadista/flow-utility-zipjs
# or: pnpm add @uploadista/flow-utility-nodes @uploadista/flow-utility-zipjs
# or: yarn add @uploadista/flow-utility-nodes @uploadista/flow-utility-zipjs
```

```ts
import { createUploadistaServer } from "@uploadista/server";
import { zipPlugin } from "@uploadista/flow-utility-zipjs";

const uploadista = await createUploadistaServer({
  // ...
  plugins: [zipPlugin()],
});
```

## Conditional Node

Routes inputs based on file properties. The node has two output ports: `"true"` (condition matches) and `"false"` (condition doesn't match).
Package: @uploadista/flow-utility-nodes
```ts
import { createConditionalNode } from '@uploadista/flow-utility-nodes';

// Route images > 1MB to compression, others pass through
const conditionalNode = createConditionalNode("size-router", {
  field: "size",
  operator: "greaterThan",
  value: 1024 * 1024, // 1MB
});

// Route by MIME type
const mimeRouter = createConditionalNode("mime-router", {
  field: "mimeType",
  operator: "contains",
  value: "image",
});

// Route by file extension
const extensionRouter = createConditionalNode("ext-router", {
  field: "extension",
  operator: "equals",
  value: "pdf",
});
```

### Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| `field` | `"mimeType" \| "size" \| "width" \| "height" \| "extension"` | Yes | File property to evaluate |
| `operator` | `"equals" \| "notEquals" \| "greaterThan" \| "lessThan" \| "contains" \| "startsWith"` | Yes | Comparison operator |
| `value` | `string \| number` | Yes | Value to compare against |
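The node's evaluation internals aren't part of the public API, but the operator semantics in the table above can be sketched with a hypothetical `evaluateCondition` helper (the `FileProps` shape is an assumption for illustration):

```typescript
// Hypothetical sketch of operator semantics; not the actual engine code.
type Field = "mimeType" | "size" | "width" | "height" | "extension";
type Operator =
  | "equals" | "notEquals" | "greaterThan"
  | "lessThan" | "contains" | "startsWith";

interface FileProps {
  mimeType: string;
  size: number;
  width?: number;
  height?: number;
  extension: string;
}

function evaluateCondition(
  file: FileProps,
  field: Field,
  operator: Operator,
  value: string | number,
): boolean {
  const actual = file[field];
  switch (operator) {
    case "equals": return actual === value;
    case "notEquals": return actual !== value;
    case "greaterThan": return Number(actual) > Number(value);
    case "lessThan": return Number(actual) < Number(value);
    case "contains": return String(actual).includes(String(value));
    case "startsWith": return String(actual).startsWith(String(value));
  }
}

const photo: FileProps = { mimeType: "image/png", size: 2_000_000, extension: "png" };
evaluateCondition(photo, "size", "greaterThan", 1024 * 1024); // matches -> "true" port
```

A `true` result routes the file to the `"true"` output port, a `false` result to `"false"`.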
### Output Ports

The conditional node has two output ports that you connect using `sourcePort` in your edges:

| Port | Description |
|---|---|
| `"true"` | Files that match the condition |
| `"false"` | Files that don't match the condition |
### Edge Configuration

Use `sourcePort` to specify which branch to connect:

```ts
import { createFlow, createInputNode } from "@uploadista/core";
import { createConditionalNode, createPassthroughNode } from "@uploadista/flow-utility-nodes";
import { createResizeNode } from "@uploadista/flow-images-nodes";

const flow = createFlow({
  flowId: "conditional-routing",
  name: "Conditional Routing",
  nodes: {
    input: createInputNode("input"),
    "is-large": createConditionalNode("is-large", {
      field: "size",
      operator: "greaterThan",
      value: 1024 * 1024, // 1MB
    }),
    resize: createResizeNode("resize", { width: 800 }),
    passthrough: createPassthroughNode("store-small"),
  },
  edges: [
    { source: "input", target: "is-large" },
    // Large files (condition TRUE) go to resize
    { source: "is-large", target: "resize", sourcePort: "true" },
    // Small files (condition FALSE) go to passthrough
    { source: "is-large", target: "passthrough", sourcePort: "false" },
  ],
});
```

### Use Cases

- Route images to resize, documents to compress
- Size-based routing (process large files differently)
- Format-specific processing paths
- Filter files by type before processing
## Merge Node

Combine multiple inputs into a batch.
Package: @uploadista/flow-utility-nodes
```ts
import { createMergeNode } from '@uploadista/flow-utility-nodes';

// Concatenate 3 files into one
const mergeNode = createMergeNode("file-merger", {
  strategy: "concat",
  inputCount: 3,
});

// Batch 5 uploads before processing
const batchNode = createMergeNode("batch-collector", {
  strategy: "batch",
  inputCount: 5,
  separator: "\n",
});
```

### Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `strategy` | `"concat" \| "batch"` | No | `"batch"` | Merge strategy |
| `inputCount` | `number` (2-10) | No | `2` | Number of inputs to wait for |
| `separator` | `string` | No | `"\n"` | Separator for the `concat` strategy |
### Strategies

| Strategy | Description |
|---|---|
| `batch` | Collect files into an array for downstream processing |
| `concat` | Combine file contents into a single file |
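The difference between the two strategies is what downstream nodes receive. A minimal sketch of the semantics, with file contents modeled as plain strings (`mergeConcat` and `mergeBatch` are illustrative helpers, not part of the package, which operates on uploaded files):

```typescript
// Illustrative only: models merge semantics over plain strings.

// "concat": produce one combined file, parts joined by the separator
function mergeConcat(contents: string[], separator = "\n"): string {
  return contents.join(separator);
}

// "batch": pass the inputs through together as a single array
function mergeBatch<T>(inputs: T[]): T[] {
  return [...inputs];
}

mergeConcat(["a", "b", "c"], "\n"); // "a\nb\nc" — one file
mergeBatch(["a", "b", "c"]);        // ["a", "b", "c"] — one array of files
```

With `batch`, a downstream node sees every file individually; with `concat`, it sees a single merged file.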
### Use Cases

- Batch upload processing
- Combine files before archiving
- Wait for multiple inputs before proceeding
- Aggregate results from parallel processing
## Multiplex Node

Split a single input into multiple outputs for parallel processing.
Package: @uploadista/flow-utility-nodes
```ts
import { createMultiplexNode } from '@uploadista/flow-utility-nodes';

// Send to 3 different destinations
const multiplexNode = createMultiplexNode("multi-output", {
  outputCount: 3,
  strategy: "copy",
});

// Duplicate to 2 storage backends
const backupNode = createMultiplexNode("backup-splitter", {
  outputCount: 2,
  strategy: "copy",
});
```

### Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `outputCount` | `number` (1-10) | Yes | - | Number of output copies |
| `strategy` | `"copy" \| "split"` | No | `"copy"` | `copy` duplicates the file, `split` divides it |
### Strategies

| Strategy | Description |
|---|---|
| `copy` | Duplicate the file to each output path |
| `split` | Divide the file into parts (for large files) |
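The node's actual chunking rules are internal, but the `split` strategy's core idea, dividing a file's bytes into roughly equal parts, can be sketched as follows (`splitBytes` is a hypothetical helper for illustration):

```typescript
// Illustrative sketch of the "split" strategy: divide a byte buffer into
// outputCount roughly equal parts (the last part may be shorter).
function splitBytes(data: Uint8Array, outputCount: number): Uint8Array[] {
  const partSize = Math.ceil(data.length / outputCount);
  const parts: Uint8Array[] = [];
  for (let i = 0; i < outputCount; i++) {
    parts.push(data.slice(i * partSize, (i + 1) * partSize));
  }
  return parts;
}

const bytes = new Uint8Array(10);
splitBytes(bytes, 3).map((p) => p.length); // part lengths: 4, 4, 2
```

By contrast, `copy` duplicates the whole file to each output, which the performance table below notes is cheap because outputs can share references rather than copy bytes.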
### Use Cases

- Multi-destination delivery (S3 + backup)
- Create multiple sizes simultaneously
- Parallel processing paths
- Redundant storage
## Passthrough Node

Output files without transformation. Acts as a sink to capture output at a specific point in the flow.
Package: @uploadista/flow-utility-nodes
```ts
import { createPassthroughNode } from '@uploadista/flow-utility-nodes';

// Store file as-is after conditional routing
const passthroughNode = createPassthroughNode("store-as-is");

// Optionally disable keepOutput (default is true)
const intermediateNode = createPassthroughNode("intermediate", {
  keepOutput: false,
});
```

### Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `keepOutput` | `boolean` | No | `true` | Whether to include output in flow results |
### Use Cases

- Capture output after conditional routing without transformation
- Create an output sink on a specific branch
- Store original file alongside transformed versions
- Terminal node for conditional branches that don’t need processing
### Example: Conditional with Passthrough

Route images to resize, store non-images as-is:

```ts
import { createConditionalNode, createPassthroughNode } from '@uploadista/flow-utility-nodes';
import { createResizeNode } from '@uploadista/flow-images-nodes';

// Route based on MIME type
const conditionalNode = createConditionalNode("is-image", {
  field: "mimeType",
  operator: "startsWith",
  value: "image/",
});

// Resize images
const resizeNode = createResizeNode("resize", { width: 800 });

// Store non-images without transformation
const passthroughNode = createPassthroughNode("store-non-image");

// Flow structure:
// Input -> Conditional
//   |-- TRUE (images): Resize -> Output
//   |-- FALSE (non-images): Passthrough -> Output
```

## Zip Node

Create ZIP archives from inputs.
Package: @uploadista/flow-utility-nodes
```ts
import { createZipNode } from '@uploadista/flow-utility-nodes';

// Archive multiple files
const zipNode = createZipNode("archiver", {
  zipName: "backup.zip",
  includeMetadata: true,
  inputCount: 5,
});

// Simple archive without metadata
const simpleZip = createZipNode("simple-archive", {
  zipName: "files.zip",
});

// High compression archive
const compressedZip = createZipNode("compressed-archive", {
  zipName: "archive.zip",
  compressionLevel: 9,
});
```

### Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `zipName` | `string` | No | `"archive.zip"` | Output ZIP filename |
| `includeMetadata` | `boolean` | No | `false` | Include file metadata in archive |
| `inputCount` | `number` (2-10) | No | `2` | Number of files to archive |
| `compressionLevel` | `number` (0-9) | No | `6` | Compression level (0 = none, 9 = max) |
### Compression Levels

| Level | Description | Use Case |
|---|---|---|
|---|---|---|
| 0-3 | Fast, minimal compression | Network bottleneck, speed priority |
| 6 | Balanced (default) | General use |
| 9 | Maximum compression | Cold storage, size priority |
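ZIP entries are DEFLATE-compressed, so Node's built-in `zlib` (not the zip plugin itself) can demonstrate the size/speed trade-off behind these levels:

```typescript
import { deflateSync } from "node:zlib";

// Highly repetitive data shows the spread between levels clearly.
const data = Buffer.from("hello world ".repeat(10_000));

const stored = deflateSync(data, { level: 0 }).length; // no compression
const fast = deflateSync(data, { level: 1 }).length;   // speed priority
const max = deflateSync(data, { level: 9 }).length;    // size priority

console.log({ original: data.length, stored, fast, max });
```

Level 0 stores the bytes essentially uncompressed (plus a small framing overhead), while levels 1 and 9 shrink repetitive data dramatically; for already-compressed inputs like JPEGs the levels make far less difference.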
### Use Cases

- Archive multiple files for download
- Batch delivery
- Backup creation
- Package multiple processed files
## Use Case Patterns

### Pattern 1: Smart Routing

Route different file types to appropriate processors. Use `sourcePort: "true"` or `sourcePort: "false"` in edges to connect conditional branches:

```
Input -> Conditional (is image?)
  |-- TRUE: Resize -> Optimize
  |-- FALSE: Conditional (is PDF?)
        |-- TRUE: Extract Text
        |-- FALSE: Passthrough (store directly)
```

### Pattern 2: Batch Processing
Collect files before processing:

```
Input 1 -+
Input 2 -+-- Merge (batch 3) -> Process -> Output
Input 3 -+
```

### Pattern 3: Multi-Destination
Store to multiple backends:

```
Input -> Multiplex -+-- Store to S3 (primary)
                    +-- Store to GCS (backup)
                    +-- Send to Webhook
```

### Pattern 4: Archive Pipeline
Process and archive:

```
Input -> Multiplex -+-- Optimize WebP
                    +-- Optimize JPEG -> Merge -> Zip -> Store
                    +-- Optimize PNG
```

## Performance
| Node | Speed | Memory |
|---|---|---|
| Conditional | Instant | Minimal |
| Merge | Instant (waits for inputs) | Depends on batch size |
| Multiplex | Instant | Minimal (references only) |
| Passthrough | Instant | Minimal |
| Zip | 100-500ms | Depends on total file size |
## Related

- Plugins Concept - How plugins work
- Flow Nodes Overview - All available nodes