This quick start uses a realistic scenario instead of placeholder demo data. Scenario: Lighthouse Commerce runs its application on Neon PostgreSQL and wants an analytics copy of the orders table in a separate PostgreSQL destination. The team wants one manual validation run first and an hourly schedule after that.
Before you begin
Have these values ready:
- Source host, database, username, password, and schema for the operational PostgreSQL database
- Destination host, database, username, password, and schema for the reporting database
- A source user with read access and a destination user with write access
- SSL enabled for both connections if they are cloud-hosted
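If these users don't exist yet, here is a minimal sketch of the grants involved, using the role, database, and schema names from the examples below (adjust names and passwords to your environment):

```sql
-- On the source cluster: read-only role for the operational database.
CREATE ROLE mf_readonly LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE app TO mf_readonly;
GRANT USAGE ON SCHEMA public TO mf_readonly;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO mf_readonly;

-- On the destination cluster: write role for the reporting database.
CREATE ROLE mf_writer LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE warehouse TO mf_writer;
GRANT USAGE, CREATE ON SCHEMA analytics TO mf_writer;
```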
Step by step
- Sign in and switch to the correct organization.
- Open Connections and create a Source connection with PostgreSQL.
- Fill the form with real values such as:
  - Connection Name: Neon Production Orders
  - Host: ep-lighthouse-prod-123456.us-east-2.aws.neon.tech
  - Port: 5432
  - Database: app
  - Username: mf_readonly
  - Schema: public
  - SSL Mode: require
- Click Test Connection and save the source after the test succeeds.
- Create a second Destination connection with values such as:
  - Connection Name: Analytics Postgres
  - Host: analytics-db.internal.lighthouse.io
  - Database: warehouse
  - Username: mf_writer
  - Schema: analytics
  - SSL Mode: require
- Go to Data Pipelines → + New Pipeline, name it Orders to Analytics, choose Neon Production Orders as the source, and click Create & open canvas.
- On the canvas, click ⚙️ on the Source node:
- Click Discover schema to load available tables.
- Tick Include next to public.orders.
- Click Preview to verify rows.
- Optionally add a SQL transform, using {{ source }} as the table reference, as in the sketch below.
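A minimal sketch of such a transform; the column names are illustrative, and {{ source }} resolves to the table selected above (public.orders):

```sql
-- Project and lightly normalize the columns the analytics team needs.
SELECT
    id,
    customer_id,
    total_amount,
    LOWER(status) AS status,
    created_at
FROM {{ source }};
```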
- Click ⚙️ on the Destination node. In the Config tab:
  - Connection: Analytics Postgres
  - Final delivery schema: analytics
  - Sync mode: FULL_TABLE
  - Write mode: Upsert
- Click Validate config, then open the Preview tab to confirm the target schema.
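Conceptually, the Upsert write mode behaves like PostgreSQL's INSERT ... ON CONFLICT: new rows are inserted, and rows whose key already exists are updated in place. A sketch of the equivalent statement, assuming id is the key column (the column names are illustrative, not MantrixFlow's actual implementation):

```sql
-- Insert a row, or update it if a row with the same id already exists.
INSERT INTO analytics.orders_live (id, customer_id, total_amount, status, created_at)
VALUES (1001, 42, 99.50, 'shipped', NOW())
ON CONFLICT (id) DO UPDATE SET
    customer_id  = EXCLUDED.customer_id,
    total_amount = EXCLUDED.total_amount,
    status       = EXCLUDED.status,
    created_at   = EXCLUDED.created_at;
```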
- Click ▷ on the Destination node to start the first manual run.
- Click the history icon in the top bar and confirm analytics.orders_live contains the expected rows.
- Once the manual run succeeds, open the Scheduling tab in the Destination panel and set an hourly schedule.
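To double-check the result outside the UI, you can compare row counts directly; after a successful FULL_TABLE run the two counts should match:

```sql
-- Run against the source database:
SELECT COUNT(*) FROM public.orders;

-- Run against the destination database:
SELECT COUNT(*) FROM analytics.orders_live;
```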
What success looks like
You now have a working pipeline that uses the same flow most teams follow in production:
- reusable source and destination connections
- a named pipeline shell
- a selected source table
- an optional SQL transform (for PostgreSQL-to-PostgreSQL pipelines)
- a configured destination branch
- saved configuration, run history, and scheduling