This quick start uses a realistic scenario instead of placeholder demo data: Lighthouse Commerce runs its application on Neon PostgreSQL and wants an analytics copy of the orders table in a separate PostgreSQL destination. The team wants to run one manual validation first, then move to an hourly schedule.
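
The steps below assume an orders table in the source database with the columns used by the transform in step 8. This guide does not define the table itself, so the following DDL is only an illustrative sketch of a plausible shape; substitute your real schema:

-- Hypothetical source table, for illustration only
CREATE TABLE public.orders (
  id            BIGINT PRIMARY KEY,
  customer_id   BIGINT NOT NULL,
  order_status  TEXT,
  total_amount  NUMERIC(12,2),
  created_at    TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at    TIMESTAMPTZ NOT NULL DEFAULT now()
);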

Before you begin

Have these values ready:
  • Source host, database, username, password, and schema for the operational PostgreSQL database
  • Destination host, database, username, password, and schema for the reporting database
  • A source user with read access and a destination user with write access (a grant sketch follows this list)
  • SSL enabled for both connections if they are cloud-hosted
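
If the two database users do not exist yet and you manage roles directly with SQL, the grants below are a minimal sketch. The role names mf_readonly and mf_writer and the analytics schema match the connection forms used later in this guide; adapt them to your own access policy:

-- Source database: read-only role for the operational schema (sketch)
CREATE ROLE mf_readonly LOGIN PASSWORD 'change-me';
GRANT USAGE ON SCHEMA public TO mf_readonly;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO mf_readonly;

-- Destination database: write-capable role and reporting schema (sketch)
CREATE ROLE mf_writer LOGIN PASSWORD 'change-me';
CREATE SCHEMA IF NOT EXISTS analytics;
GRANT USAGE, CREATE ON SCHEMA analytics TO mf_writer;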

Step by step

  1. Sign in and switch to the correct organization.
  2. Open Connections and create a Source connection with PostgreSQL.
  3. Fill the form with real values such as:
    • Connection Name: Neon Production Orders
    • Host: ep-lighthouse-prod-123456.us-east-2.aws.neon.tech
    • Port: 5432
    • Database: app
    • Username: mf_readonly
    • Schema: public
    • SSL Mode: require
  4. Click Test Connection and save the source after the test succeeds.
  5. Create a second Destination connection with values such as:
    • Connection Name: Analytics Postgres
    • Host: analytics-db.internal.lighthouse.io
    • Database: warehouse
    • Username: mf_writer
    • Schema: analytics
    • SSL Mode: require
  6. Go to Data Pipelines → + New Pipeline, name it Orders to Analytics, choose Neon Production Orders as the source, and click Create & open canvas.
  7. On the canvas, click ⚙️ on the Source node:
    • Click Discover schema to load available tables.
    • Tick Include next to public.orders.
    • Click Preview to verify rows.
  8. Optionally add a SQL transform. Use {{ source }} as the table reference:
SELECT
  id,
  customer_id,
  LOWER(order_status)          AS order_status,
  total_amount,
  created_at,
  updated_at
FROM {{ source }}
WHERE order_status IS NOT NULL
  9. Click ⚙️ on the Destination node. In the Config tab:
    • Connection: Analytics Postgres
    • Final delivery schema: analytics
    • Sync mode: FULL_TABLE
    • Write mode: Upsert (a note on upsert semantics follows these steps)
    • Click Validate config, then open the Preview tab to confirm the target schema.
  10. Click ▷ on the Destination node to start the first manual run.
  11. Click the history icon in the top bar and confirm analytics.orders_live contains the expected rows (a sample verification query follows these steps).
  12. Once the manual run succeeds, open the Scheduling tab in the Destination panel and set an hourly schedule.
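
A note on the Upsert write mode in step 9: it merges each delivered row into the target table by key instead of appending duplicates. The statements the pipeline actually issues are not shown in the UI; in plain PostgreSQL terms, and assuming id is the key of the delivered table, the behaviour is conceptually similar to:

-- Conceptual equivalent only; the pipeline's real statements may differ
INSERT INTO analytics.orders_live
  (id, customer_id, order_status, total_amount, created_at, updated_at)
VALUES
  (1001, 42, 'shipped', 129.99, now(), now())   -- example row
ON CONFLICT (id) DO UPDATE SET
  customer_id  = EXCLUDED.customer_id,
  order_status = EXCLUDED.order_status,
  total_amount = EXCLUDED.total_amount,
  created_at   = EXCLUDED.created_at,
  updated_at   = EXCLUDED.updated_at;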
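
To back up the check in step 11 with SQL, connect to the destination database and run a quick count and freshness query. The table name analytics.orders_live comes from the run history view, and the columns match the transform in step 8:

SELECT
  COUNT(*)        AS delivered_rows,
  MAX(updated_at) AS most_recent_update
FROM analytics.orders_live;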

What success looks like

You now have a working pipeline that uses the same flow most teams follow in production:
  • reusable source and destination connections
  • a named pipeline shell
  • a selected source table
  • an optional SQL transform (for PostgreSQL-to-PostgreSQL pipelines)
  • a configured destination branch
  • saved configuration, run history, and scheduling
From here, you can expand to more tables, more branches, or another supported SQL destination.