Documentation

Pipeline Builder & Automation

Build asset processing pipelines with the visual builder or text DSL. Chain select, filter, transform, and output stages to automate repetitive workflows across your entire library.

  • Visual Builder
  • Text DSL
  • Cron Scheduling
  • Event Triggers

Quick Start

Use a Template

  1. Navigate to Pipelines in the sidebar.
  2. Click New Pipeline and choose from 8 templates (e.g., "Weekly Client Deck", "Auto-Tag New Uploads").
  3. Customise any stages if needed, then run.

Build Custom

  1. Click New Pipeline, then Blank Pipeline.
  2. Add a select stage to choose which assets enter the pipeline.
  3. Add filter stages to narrow down, transform stages to modify, and an output stage to deliver results.

Stage Types

Every pipeline is a sequence of stages. There are 29 stage types across four categories. Each stage receives assets from the previous stage and passes its output to the next.
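The chaining model can be sketched as plain function composition: each stage maps a list of assets to a new list, and stages run in sequence. Everything below (asset fields, stage factories) is illustrative, not Numonic's internal API:

```python
# Sketch of the pipeline model: each stage is a function that takes a
# list of assets and returns a new list, chained in order. The asset
# shape and stage definitions here are hypothetical.

def select_by_tag(tag):
    def stage(assets):
        return [a for a in assets if tag in a["tags"]]
    return stage

def filter_by_type(file_type):
    def stage(assets):
        return [a for a in assets if a["type"] == file_type]
    return stage

def transform_add_tag(tag):
    def stage(assets):
        return [{**a, "tags": a["tags"] | {tag}} for a in assets]
    return stage

def run_pipeline(assets, stages):
    for stage in stages:          # stages execute in order; each one
        assets = stage(assets)    # receives the previous stage's output
    return assets

library = [
    {"name": "logo.png", "type": "image", "tags": {"client"}},
    {"name": "brief.pdf", "type": "pdf", "tags": {"client"}},
]

result = run_pipeline(library, [
    select_by_tag("client"),
    filter_by_type("image"),
    transform_add_tag("archived"),
])
```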

Select (7 types)

Choose which assets enter the pipeline.

  • By collection
  • By tag
  • By date range
  • By search query

Filter (8 types)

Narrow down selected assets.

  • By file type
  • By dimensions
  • By metadata presence
  • By tag match

Transform (9 types)

Modify assets or metadata.

  • Add/remove tags
  • Strip metadata
  • Generate thumbnails
  • Update fields

Output (5 types)

Deliver results.

  • Export to folder
  • Create collection
  • Send webhook notification
  • Generate report

Stage Execution Order

Stages execute left-to-right in the visual builder (top-to-bottom in the DSL). Each stage receives the output of the previous stage as its input. A pipeline must start with at least one select stage and end with at least one output stage. Filter and transform stages are optional and can appear in any order between them.
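For example, a pipeline satisfying these ordering rules, written in the DSL syntax described under Text DSL below (the specific tag values here are illustrative):

select tag:"client" | filter type:image | transform add-tag tag:"reviewed" | output export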

Text DSL

For power users, the text DSL offers a shorthand syntax for defining pipelines. It is faster for complex pipelines and easier to share via copy-paste.

Example: Strip metadata before sharing

select collection:"Client Work" | filter type:image | transform strip-metadata preset:share | output export

Selects all images from the "Client Work" collection, strips metadata using the "share" preset, and exports them.

Syntax Reference

|

Pipe. Separates stages. Assets flow from the stage on the left to the stage on the right.

type:value

Parameter. Each stage accepts key-value parameters. Use quotes for values containing spaces.

stage keyword

Stage type. The first word after a pipe is the stage category: select, filter, transform, or output.

Example: Tag and archive old assets

select date-range end:"2025-01-01" | filter type:image | transform add-tag tag:"archived" | output collection name:"Archive 2024"

Selects images created before 2025, tags them as "archived", and moves them into a new collection.

Built-in Templates

Numonic ships with 8 templates covering the most common automation workflows. Each template can be cloned and customised to fit your requirements.

Weekly Client Deck

Collects tagged client assets each Friday and exports them as a structured folder ready for presentation.

Auto-Tag New Uploads

Runs AI classification on every new upload and applies tags based on detected content, style, and colour palette.

Metadata Cleanup

Scans assets for missing or inconsistent metadata fields and applies corrections from configurable rules.

Export Batch

Selects assets from a collection, applies format conversion and resizing, then exports to a target folder.

Archive Old Assets

Identifies assets not accessed within a configurable period and moves them to an archive collection.

Privacy Strip for Sharing

Removes EXIF data, generation prompts, and internal metadata before exporting assets for external sharing.

Collection Summary Report

Generates a CSV or JSON summary of a collection, including file counts, tag distributions, and storage usage.

Tag Standardisation

Normalises tag casing, merges duplicates, and applies your organisation's tag taxonomy across all assets.

Customising Templates

After selecting a template, every stage is fully editable. Add, remove, or reorder stages as needed. You can also convert any customised template to DSL for version control or sharing with your team.

Scheduling and Triggers

Pipelines can run on demand, on a recurring schedule, or in response to events. Schedules and triggers can be combined for maximum flexibility.

Cron Scheduling

Set a recurring schedule using standard cron expressions. The pipeline runs automatically in the background at each scheduled time.

0 9 * * 1

Every Monday at 9:00 AM

0 0 * * *

Daily at midnight

0 9 1 * *

First day of every month at 9:00 AM
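The matching rule behind these expressions can be sketched in a few lines. This simplified checker handles only "*" and single numbers, which is enough for the examples above; it illustrates standard cron semantics, not Numonic's scheduler internals:

```python
from datetime import datetime

# Simplified five-field cron matcher (minute hour day-of-month month
# day-of-week). Handles only "*" and single numbers. Real cron also
# supports lists, ranges, and steps, and ORs day-of-month with
# day-of-week when both are restricted.

def cron_matches(expr, dt):
    minute, hour, dom, month, dow = expr.split()
    fields = [
        (minute, dt.minute),
        (hour, dt.hour),
        (dom, dt.day),
        (month, dt.month),
        (dow, dt.isoweekday() % 7),  # cron convention: 0 = Sunday
    ]
    return all(f == "*" or int(f) == actual for f, actual in fields)

# 2025-06-02 is a Monday, so "0 9 * * 1" matches 09:00 that day.
print(cron_matches("0 9 * * 1", datetime(2025, 6, 2, 9, 0)))  # True
```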

Event Triggers

Trigger a pipeline automatically when specific events occur in your workspace. Configure via the automation rules UI or API.

New asset uploaded

Fires when any asset is uploaded to the workspace.

Tag added or removed

Fires when a tag is added to or removed from an asset. Optionally filter by specific tag names.

Asset moved to collection

Fires when an asset is added to a specified collection.

Asset deleted

Fires when an asset is permanently deleted. Useful for audit logging or backup triggers.

Combining Schedules and Triggers

A single pipeline can have both a cron schedule and event triggers. For example, the "Auto-Tag New Uploads" template uses an event trigger for immediate tagging on upload, while also running a nightly cron to catch any assets that were imported via bulk upload or API.
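A rule combining both might look like the following JSON when managed via the API. The field names and values here are illustrative assumptions, not the documented schema; see the API Reference for the actual format:

{
  "pipeline": "auto-tag-new-uploads",
  "triggers": [
    { "event": "asset_uploaded" }
  ],
  "schedule": "0 2 * * *"
}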

API Management

Pipelines and automation rules can be created, updated, triggered, and deleted via REST API endpoints. This enables integration with external tools, CI/CD systems, and custom scripts.

Key Endpoints

GET /api/v1/pipelines

List all pipelines in your workspace.

POST /api/v1/pipelines

Create a new pipeline from JSON or DSL definition.

POST /api/v1/pipelines/:id/run

Trigger a pipeline run on demand.

GET /api/v1/pipelines/:id/runs

Retrieve the run history and status for a pipeline.
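Triggering a run on demand can be sketched with only the Python standard library. The base URL, token, and Bearer auth scheme below are assumptions, not documented values; the request is built but not sent:

```python
import json
import urllib.request

# Sketch of an on-demand pipeline run via the REST API. The host,
# token, and auth header scheme are placeholders -- check the API
# Reference for the documented values.

BASE_URL = "https://example.com/api/v1"   # placeholder host
TOKEN = "YOUR_API_TOKEN"                  # placeholder credential

def build_run_request(pipeline_id):
    """Build (but do not send) a POST request for an on-demand run."""
    return urllib.request.Request(
        f"{BASE_URL}/pipelines/{pipeline_id}/run",
        data=json.dumps({}).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_run_request("pl_123")
# urllib.request.urlopen(req) would send it against a real workspace.
```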

For full endpoint documentation, including request and response schemas, authentication, and rate limits, see the API Reference.

Automate Your Workflow

Stop repeating the same steps manually. Build a pipeline once and let it run on schedule, on events, or via API.