
Documentation Index

Fetch the complete documentation index at: https://docs.ntropii.com/llms.txt

Use this file to discover all available pages before exploring further.

The ntro CLI is a thin layer over the Python SDK. Every command parses arguments, calls the SDK, and formats output. No business logic lives in the CLI itself.

Install

pip install ntro-cli
ntro --help

First-time setup

ntro auth login
ntro auth whoami      # Confirm the key works
ntro tenant list      # Confirm you can see your tenants
auth login writes ~/.ntro/config.toml — see API keys and Configure environment for the full flow.
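For reference, a hypothetical sketch of what that file might look like (key names are assumptions drawn from the flags above, not the documented schema):

```toml
# Hypothetical layout of ~/.ntro/config.toml; key names are assumptions
default_connection = "production"

[connections.production]
host = "https://api.ntropii.com/v1"
api_key = "ntro_prod_xxx"
default_tenant = "acme-fund-admin"
```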

Global flags

Global flags go before the subcommand:
ntro [OPTIONS] COMMAND [ARGS]...
Flag                     Description
-c, --connection NAME    Connection from config.toml (env: NTRO_DEFAULT_CONNECTION_NAME)
--host URL               Override the API host (env: NTRO_HOST)
-o, --output FORMAT      text (default — Rich tables) or json
--debug                  Enable debug logging
--log-level LEVEL        DEBUG, INFO, WARN, ERROR
ntro -c production -o json tenant list | jq '.[].slug'

Command groups

ntro auth

ntro auth login                           # Interactive setup
ntro auth login --no-interactive \        # CI/CD setup
  --name production \
  --host https://api.ntropii.com/v1 \
  --api-key ntro_prod_xxx \
  --default-tenant acme-fund-admin

ntro auth list                            # All configured connections
ntro auth test                            # Test the active connection
ntro auth test -c staging                 # Test a specific connection
ntro auth set-default production          # Change the default
ntro auth whoami                          # Current user identity
ntro integration

Used when bringing your own data platform (Snowflake or Microsoft Fabric). Skip this if you’re using managed-postgres — Ntropii provisions and manages the database for you.
# Register a Snowflake account
ntro integration add snowflake \
  --name "Acme Snowflake UK" \
  --account acme-fund.eu-west-2 \
  --warehouse fund_ops \
  --user ntropii \
  --password <secret> \
  --region UK-South

# Or use --json for the full payload
ntro integration add snowflake --json @./snowflake-config.json

ntro integration list
ntro integration info <id>
ntro integration test <id>
ntro integration discover <id>            # List schemas in the warehouse
ntro tenant

Every tenant must declare its data-platform strategy at creation. The --data-platform flag accepts managed-postgres, snowflake, or microsoft-fabric.
# Managed Postgres — Ntropii provisions the database
ntro tenant create \
  --name "Acme Fund" \
  --slug acme-fund \
  --data-platform managed-postgres

# BYO — register the config first, then bind it
ntro integration add snowflake --name "..." ...
# → dpc_abc123

ntro tenant create \
  --name "Acme Fund" \
  --slug acme-fund \
  --data-platform snowflake \
  --data-platform-config dpc_abc123

ntro tenant list
ntro tenant info acme-fund
The CLI fails fast if --data-platform-config is supplied with managed-postgres, or omitted for a BYO platform. The API enforces the same checks server-side.
ntro entity

ntro entity create \
  --name "Acme Commercial SPV 1" \
  --slug acme-commercial-spv1 \
  --tenant acme-fund-admin \
  --type real-estate-spv \
  --jurisdiction Jersey \
  --currency GBP

ntro entity list
ntro entity list --tenant acme-fund-admin
Tenant resolution: --tenant flag → NTRO_TENANT env var → default_tenant in config.
ntro workflow

# Register a workflow definition
ntro workflow create \
  --name nav-monthly \
  --description "Monthly NAV pipeline" \
  --schedule "0 8 5 * *" \
  --timezone Europe/London

ntro workflow list
ntro workflow info <id>

# Push a new version artifact
ntro workflow push <workflow-id> ./nav-monthly-v2.zip

# Deploy a version to a tenant
ntro workflow deploy --workflow <id> --version <version-id> --tenant acme-fund-admin
ntro workflow deploy-status <deployment-id>

# Trigger a run
ntro workflow run nav-monthly \
  --tenant acme-fund-admin \
  --entity acme-spv1 \
  --period 2026-03

ntro workflow run nav-monthly --tenant acme-fund-admin --wait     # Poll until complete
ntro workflow run nav-monthly --tenant acme-fund-admin --dry-run
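--wait presumably polls run status until a terminal state is reached; a minimal sketch of such a loop (the status values and the poll callback are assumptions, not the CLI's internals):

```python
import time

TERMINAL = {"succeeded", "failed", "cancelled"}

def wait_for_run(get_status, interval: float = 0.01) -> str:
    """Poll get_status() until it returns a terminal state (illustrative loop)."""
    while True:
        status = get_status()
        if status in TERMINAL:
            return status
        time.sleep(interval)

# Fake status source standing in for repeated `ntro run status <task-id>` calls
statuses = iter(["queued", "running", "running", "succeeded"])
print(wait_for_run(lambda: next(statuses)))  # succeeded
```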
ntro workflow test is covered in Testing locally.
ntro run

ntro run status <task-id>                 # Single run status + step progress
ntro run list                             # Scheduled / active runs
ntro run history \                        # History for an entity
  --tenant acme-fund-admin \
  --entity acme-spv1
ntro run incoming                         # Queued runs
ntro run pending                          # Runs awaiting human action
Note the grammatical split: in ntro workflow run <name>, run is a verb (execute the workflow); in ntro run status <id>, run is a noun (inspect an existing run).

The --json pattern

Write commands accept --json for complex payloads:
# Inline
ntro tenant create --json '{"name":"Acme","slug":"acme","dataPlatformConfigId":"dpc_123"}'

# From file (@ prefix, Databricks CLI convention)
ntro integration add databricks --json @./databricks.json
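The @ convention can be sketched like so (the helper name is hypothetical; the behavior, inline JSON unless the value starts with @, follows the examples above):

```python
import json
import os
import tempfile

def load_json_arg(value: str) -> dict:
    """Resolve a --json argument: '@path' reads a file, anything else parses inline."""
    if value.startswith("@"):
        with open(value[1:]) as f:
            return json.load(f)
    return json.loads(value)

print(load_json_arg('{"name": "Acme"}'))  # {'name': 'Acme'}

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    f.write('{"slug": "acme"}')
print(load_json_arg("@" + f.name))        # {'slug': 'acme'}
os.unlink(f.name)
```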

Output formats

# Default: Rich tables and panels
ntro tenant list

# JSON for scripting
ntro -o json tenant list | jq '.[].slug'
ntro -o json auth whoami | jq .email
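Where jq isn't available, the JSON output can be filtered with Python's stdlib; the payload below is a made-up sample of what ntro -o json tenant list might emit:

```python
import json

# Hypothetical sample of `ntro -o json tenant list` output
payload = '[{"name": "Acme Fund", "slug": "acme-fund"}, {"name": "Beta Fund", "slug": "beta-fund"}]'

slugs = [t["slug"] for t in json.loads(payload)]
print(slugs)  # ['acme-fund', 'beta-fund']
```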

MCP server

The same command surface, exposed to coding agents over the Model Context Protocol (MCP).

Testing locally

Use ntro workflow test for the design-time inner loop.