Use a Template
1. Navigate to Pipelines in the sidebar.
2. Click New Pipeline and choose from 8 templates (e.g., "Weekly Client Deck", "Auto-Tag New Uploads").
3. Customise any stages if needed, then run.
Build asset processing pipelines with the visual builder or text DSL. Chain select, filter, transform, and output stages to automate repetitive workflows across your entire library.
Every pipeline is a sequence of stages. There are 29 stage types across four categories; each stage receives assets from the previous stage and passes its output to the next:

- **Select** — Choose which assets enter the pipeline.
- **Filter** — Narrow down selected assets.
- **Transform** — Modify assets or metadata.
- **Output** — Deliver results.
Stages execute left-to-right in the visual builder (top-to-bottom in the DSL). Each stage receives the output of the previous stage as its input. A pipeline must start with at least one select stage and end with at least one output stage. Filter and transform stages are optional and can appear in any order between them.
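Under these rules, the smallest valid pipeline is a single select stage piped straight into a single output stage, with no filter or transform stages in between (reusing the stage names from the examples in this section):

```
select collection:"Client Work" | output export
```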
For power users, the text DSL offers a shorthand syntax for defining pipelines. It is faster for complex pipelines and easy to share via copy-paste.
```
select collection:"Client Work" | filter type:image | transform strip-metadata preset:share | output export
```

Selects all images from the "Client Work" collection, strips metadata using the "share" preset, and exports them.
- **Pipe (`|`)** — Separates stages. Assets flow from the stage on the left to the stage on the right.
- **Stage type** — The first word after a pipe is the stage category: select, filter, transform, or output.
- **Parameter** — Each stage accepts key-value parameters (`key:value`). Use quotes for values containing spaces.
```
select date-range end:"2025-01-01" | filter type:image | transform add-tag tag:"archived" | output collection name:"Archive 2024"
```

Selects images created before 2025, tags them as "archived", and moves them into a new collection.
Numonic ships with 8 templates covering the most common automation workflows. Each template can be cloned and customised to fit your requirements.
- **Weekly Client Deck** — Collects tagged client assets each Friday and exports them as a structured folder ready for presentation.
- **Auto-Tag New Uploads** — Runs AI classification on every new upload and applies tags based on detected content, style, and colour palette.
- Scans assets for missing or inconsistent metadata fields and applies corrections from configurable rules.
- Selects assets from a collection, applies format conversion and resizing, then exports to a target folder.
- Identifies assets not accessed within a configurable period and moves them to an archive collection.
- Removes EXIF data, generation prompts, and internal metadata before exporting assets for external sharing.
- Generates a CSV or JSON summary of a collection, including file counts, tag distributions, and storage usage.
- Normalises tag casing, merges duplicates, and applies your organisation's tag taxonomy across all assets.
After selecting a template, every stage is fully editable. Add, remove, or reorder stages as needed. You can also convert any customised template to DSL for version control or sharing with your team.
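For instance, a customised bulk-export template converted to DSL might look something like this sketch — the `convert` and `resize` transform names, their parameters, and the collection name are hypothetical, not confirmed stage types:

```
select collection:"Spring Campaign" | transform convert format:webp | transform resize max-width:2048 | output export
```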
Pipelines can run on demand, on a recurring schedule, or in response to events. Combine schedules and event triggers for maximum flexibility.
Set a recurring schedule using standard cron expressions. The pipeline runs automatically in the background at each scheduled time.
- `0 9 * * 1` — Every Monday at 9:00 AM
- `0 0 * * *` — Daily at midnight
- `0 9 1 * *` — First day of every month at 9:00 AM
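To make the five-field format concrete, here is a minimal sketch (not Numonic's implementation) that checks whether a datetime matches a cron expression, supporting only `*` and plain numbers in each field:

```python
from datetime import datetime

def cron_matches(expr: str, dt: datetime) -> bool:
    """Check a datetime against a 5-field cron expression.

    Supports only '*' and plain integers in each field:
    minute, hour, day-of-month, month, day-of-week (0 = Sunday).
    """
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError("expected 5 cron fields")
    # cron's day-of-week uses Sunday = 0; Python's weekday() uses Monday = 0
    actual = [dt.minute, dt.hour, dt.day, dt.month, (dt.weekday() + 1) % 7]
    return all(f == "*" or int(f) == a for f, a in zip(fields, actual))

# "0 9 * * 1" means every Monday at 9:00 AM:
print(cron_matches("0 9 * * 1", datetime(2025, 1, 6, 9, 0)))  # Monday 6 Jan -> True
print(cron_matches("0 9 * * 1", datetime(2025, 1, 7, 9, 0)))  # Tuesday -> False
```

Real cron implementations also accept ranges (`1-5`), lists (`1,15`), and steps (`*/10`), which this sketch omits.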
Trigger a pipeline automatically when specific events occur in your workspace. Configure via the automation rules UI or API.
- **New asset uploaded** — Fires when any asset is uploaded to the workspace.
- **Tag added or removed** — Fires when a tag is added to or removed from an asset. Optionally filter by specific tag names.
- **Asset moved to collection** — Fires when an asset is added to a specified collection.
- **Asset deleted** — Fires when an asset is permanently deleted. Useful for audit logging or backup triggers.
A single pipeline can have both a cron schedule and event triggers. For example, the "Auto-Tag New Uploads" template uses an event trigger for immediate tagging on upload, while also running a nightly cron to catch any assets that were imported via bulk upload or API.
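As a sketch, a pipeline definition combining both trigger kinds might look like the following JSON — the field names and event identifier here are illustrative assumptions, not the documented schema:

```json
{
  "name": "Auto-Tag New Uploads",
  "triggers": [
    { "type": "event", "event": "asset.uploaded" },
    { "type": "cron", "expression": "0 0 * * *" }
  ]
}
```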
Pipelines and automation rules can be created, updated, triggered, and deleted via REST API endpoints. This enables integration with external tools, CI/CD systems, and custom scripts.
- `GET /api/v1/pipelines` — List all pipelines in your workspace.
- `POST /api/v1/pipelines` — Create a new pipeline from a JSON or DSL definition.
- `POST /api/v1/pipelines/:id/run` — Trigger a pipeline run on demand.
- `GET /api/v1/pipelines/:id/runs` — Retrieve the run history and status for a pipeline.
For full endpoint documentation, including request and response schemas, authentication, and rate limits, see the API Reference.