Track Product Analytics
Collect events from your app, transform them into metrics, and query the results
Collect pageview events from your app via HTTP. Transform them into daily metrics. Query the results. No Kafka, no external tools.
1. Define the Event Schema
-- models/raw/events.sql
-- @kind: events
event_name VARCHAR NOT NULL,
page_url VARCHAR,
user_id VARCHAR,
session_id VARCHAR,
received_at TIMESTAMPTZ
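Clients send the first four columns only; `received_at` is presumably stamped by the daemon on receipt (the curl example in step 4 omits it). A hypothetical client-side check mirroring this schema, in Python — `validate_event` is not part of ondatrasql:

```python
# Mirror the raw.events schema for client-side validation.
# Only event_name is NOT NULL; received_at is assumed server-stamped.
REQUIRED = {"event_name"}
OPTIONAL = {"page_url", "user_id", "session_id"}

def validate_event(event: dict) -> list[str]:
    """Return a list of problems; an empty list means the event fits the schema."""
    problems = []
    for field in REQUIRED:
        if not event.get(field):
            problems.append(f"missing required field: {field}")
    for field in sorted(set(event) - REQUIRED - OPTIONAL):
        problems.append(f"unknown field: {field}")
    return problems
```

Rejecting malformed events before they leave the app keeps the raw table clean without any server-side coordination.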
2. Configure Ports
Add to .env:
COLLECT_PORT=8080
COLLECT_ADMIN_PORT=8081
The public port receives events. The admin port is used by ondatrasql run to flush buffered events into the pipeline's raw tables.
3. Start the Daemon
ondatrasql daemon
Event daemon starting...
Public: :8080 (POST /collect/{schema}/{table})
Admin: 127.0.0.1:8081 (flush endpoints)
Store: .ondatra/events
Models: raw.events
4. Send Events
From your app, tracking script, or curl:
curl -X POST localhost:8080/collect/raw/events \
-H "Content-Type: application/json" \
-d '{"event_name":"pageview","page_url":"/pricing","user_id":"u42","session_id":"s1"}'
Events are buffered durably on disk. They survive crashes.
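From application code, the same POST needs nothing beyond the standard library. A minimal sender sketch mirroring the curl example above; `build_request` and `track` are hypothetical helpers, not part of ondatrasql:

```python
import json
import urllib.request

# Same endpoint as the curl example: /collect/{schema}/{table}.
COLLECT_URL = "http://localhost:8080/collect/raw/events"

def build_request(event: dict) -> urllib.request.Request:
    """Build the POST request without sending it."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        COLLECT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def track(event: dict) -> None:
    """Send one event; the daemon buffers it durably before responding."""
    with urllib.request.urlopen(build_request(event), timeout=2) as resp:
        resp.read()
```

Separating request construction from sending keeps the payload logic testable without a running daemon.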
5. Transform with SQL
Sessions:
-- models/staging/sessions.sql
-- @kind: view
SELECT
session_id,
user_id,
MIN(received_at) AS started_at,
MAX(received_at) AS ended_at,
COUNT(*) AS page_views
FROM raw.events
WHERE event_name = 'pageview'
GROUP BY session_id, user_id
Daily metrics:
-- models/mart/daily_traffic.sql
-- @kind: merge
-- @unique_key: day
SELECT
DATE_TRUNC('day', started_at) AS day,
COUNT(DISTINCT session_id) AS sessions,
COUNT(DISTINCT user_id) AS visitors,
SUM(page_views) AS total_views
FROM staging.sessions
GROUP BY 1
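Both models are plain group-bys: pageviews roll up into sessions, sessions roll up by day. An in-memory Python sketch of the same logic (simplified: timestamps are assumed already parsed, and each session is assumed to carry one `user_id`):

```python
from collections import defaultdict
from datetime import datetime

def daily_traffic(events: list[dict]) -> dict:
    """Replicate staging.sessions then mart.daily_traffic in memory."""
    # Stage 1: group pageviews into sessions (MIN(received_at), COUNT(*)).
    sessions: dict[str, dict] = {}
    for e in events:
        if e["event_name"] != "pageview":
            continue
        s = sessions.setdefault(e["session_id"], {
            "user_id": e["user_id"],
            "started_at": e["received_at"],
            "page_views": 0,
        })
        s["started_at"] = min(s["started_at"], e["received_at"])
        s["page_views"] += 1

    # Stage 2: roll sessions up by the day they started.
    days = defaultdict(lambda: {"sessions": 0, "visitors": set(), "total_views": 0})
    for s in sessions.values():
        d = days[s["started_at"].date()]
        d["sessions"] += 1
        d["visitors"].add(s["user_id"])
        d["total_views"] += s["page_views"]
    return {day: {**v, "visitors": len(v["visitors"])} for day, v in days.items()}
```

The SQL versions do the same work, but incrementally and with the merge model keyed on day so reruns upsert instead of duplicating.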
6. Run the Pipeline
In a separate terminal (same project directory):
ondatrasql run
Running 3 models...
[OK] raw.events (events, flush, 3 rows, 380ms — event flush)
[OK] staging.sessions (view, create, 0 rows, 158ms — first run)
[OK] mart.daily_traffic (merge, backfill, 1 row, 182ms — first run)
Events are flushed from the buffer into raw.events. The view resolves live. The mart is materialized.
7. Query
ondatrasql sql "SELECT * FROM mart.daily_traffic ORDER BY day DESC LIMIT 7"
What You Get
- Events buffered durably — no data loss on crash
- Automatic flush on pipeline run
- One command runs everything — flush, transform, materialize
- No Kafka, no Airflow, no external database
Ondatra Labs