dbt Pipeline¶
dbt powers Kiket’s tenant-scoped analytics (we ship the Fusion binary, exposed as the dbt executable). Each organisation gets its own schema (e.g. analytics_org_acme) populated by dbt models so dashboards and alerts can query data safely.
Project layout¶
The dbt project lives in `analytics/dbt/`:
- `dbt_project.yml` – project configuration; points models at `analytics_org_<slug>` schemas.
- `models/staging/` – raw tables wrapped with tenancy filters.
- `models/marts/` – fact/dimension models consumed by dashboards.
- `macros/` – helpers for tenant schema resolution and filters.
- `profiles/` – example `profiles.yml` for local use.
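For illustration, the tenant schema resolution in `macros/` could be done by overriding dbt's built-in `generate_schema_name` macro. The file name and the `org_slug` var below are assumptions, not the actual implementation:

```sql
-- macros/tenant_schema.sql (hypothetical file name)
{% macro generate_schema_name(custom_schema_name, node) -%}
    {#- Route every model into the per-tenant schema, e.g. analytics_org_acme.
        org_slug is assumed to be passed per run, e.g. via --vars. -#}
    analytics_org_{{ var('org_slug') }}
{%- endmacro %}
```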
Automated Pipeline¶
dbt models are refreshed automatically on a scheduled cadence:
- The analytics runner invokes `dbt run` for every tenant (emits OpenTelemetry logs).
- Optionally generates `dbt docs` and publishes documentation artifacts.
- Logs, manifests, run results, and catalogs are stored for audit and debugging.
- Failures are surfaced via monitoring alerts.
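As a minimal sketch of the per-tenant loop described above (the tenant list, the `--vars` key, and the helper name are assumptions; the real runner also handles telemetry, artifact storage, and alerting):

```python
def build_tenant_commands(slugs, generate_docs=False):
    """Build the dbt invocations a runner would execute, one tenant at a time."""
    commands = []
    for slug in slugs:
        # Each run targets that tenant's own analytics_org_<slug> schema.
        commands.append(["dbt", "run", "--vars", f"org_slug: {slug}"])
        if generate_docs:
            # Optionally publish documentation artifacts for the tenant.
            commands.append(["dbt", "docs", "generate", "--vars", f"org_slug: {slug}"])
    return commands

cmds = build_tenant_commands(["acme", "globex"], generate_docs=True)
for cmd in cmds:
    print(" ".join(cmd))
```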
Running dbt¶
Workspace administrators can opt in to the analytics pipeline; each run materialises models into the organisation schema based on configuration (schedules, tenants). When enabled, the system refreshes dbt models automatically on the published cadence; no manual steps are required. Manual refreshes can be triggered via the platform's analytics settings (API/CLI routes are described in the admin runbook).
Profiles & credentials¶
The analytics pipeline uses a dedicated read-only database user with schema-level permissions. The connection string is managed securely by the platform.
Set the following environment variables when invoking dbt directly:
- `DBT_HOST`, `DBT_PORT`, `DBT_DATABASE`
- `DBT_USER` / `DBT_PASSWORD` (defaults to the app credentials unless overridden)
- `DBT_SHARED_SCHEMA` (`analytics_shared`)
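A hedged example of setting these variables for a direct invocation (all values below are placeholders; the read-only user name is an assumption):

```shell
# Example values only; point these at your own database.
export DBT_HOST=localhost
export DBT_PORT=5432
export DBT_DATABASE=kiket
export DBT_USER=analytics_ro        # read-only user name is an assumption
export DBT_PASSWORD=change-me
export DBT_SHARED_SCHEMA=analytics_shared

# With the variables set, a direct run might look like:
#   dbt run --profiles-dir analytics/dbt/profiles
echo "dbt will connect to $DBT_HOST:$DBT_PORT/$DBT_DATABASE"
```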
Each run writes to analytics_org_<slug> so data remains isolated per organisation.
Scheduling & cadence controls¶
Every organisation stores analytics preferences in analytics_settings:
- `enabled` lets admins pause scheduled analytics completely.
- `refresh_interval_hours` overrides the plan-based default (starter 24h, team 12h, business 6h, enterprise 1h).
- `last_run_at` is maintained automatically to prevent back-to-back executions before the interval elapses.
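The cadence check these settings imply can be sketched roughly as follows (function and parameter names are illustrative, not the platform's actual API):

```python
from datetime import datetime, timedelta, timezone

# Plan-based default refresh intervals in hours, as documented above.
PLAN_DEFAULT_HOURS = {"starter": 24, "team": 12, "business": 6, "enterprise": 1}

def is_run_due(plan, last_run_at, enabled=True, refresh_interval_hours=None, now=None):
    """Return True when the next scheduled run may start.

    refresh_interval_hours (from analytics_settings) overrides the plan default;
    last_run_at blocks back-to-back executions before the interval elapses.
    """
    if not enabled:
        return False
    now = now or datetime.now(timezone.utc)
    hours = refresh_interval_hours or PLAN_DEFAULT_HOURS[plan]
    if last_run_at is None:
        return True  # never run before
    return now - last_run_at >= timedelta(hours=hours)

now = datetime(2024, 1, 1, 12, tzinfo=timezone.utc)
print(is_run_due("team", now - timedelta(hours=13), now=now))
print(is_run_due("team", now - timedelta(hours=11), now=now))
```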
When a run succeeds, the runner records a `UsageEvent[analytics_dbt_run]` (model id, status, duration) so Ops and Billing can trace high-cost tenants.
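A sketch of what such an event payload could contain, based only on the fields named above (the builder function and any extra keys are assumptions):

```python
from datetime import datetime, timezone

def build_usage_event(model_id, status, duration_ms):
    """Hypothetical shape of an analytics_dbt_run usage event."""
    return {
        "event": "analytics_dbt_run",
        "model_id": model_id,
        "status": status,
        "duration_ms": duration_ms,
        # Timestamp key is an assumption; included for traceability.
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

event = build_usage_event("marts.fct_tickets", "success", 5400)
```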
Adding models¶
When you ship new analytics content (via definition repositories or templates), dbt models and dashboards are bundled together. Kiket validates those assets during sync and publishes them automatically.
Troubleshooting¶
- Missing data – confirm the relevant dbt models are included in your definition repositories and that the latest sync completed successfully.
- Delayed refreshes – check the analytics usage dashboard for recent run status; large datasets may take longer to rebuild.
- Permission errors – ensure the analytics role has access to the required schemas; contact support if the default role was rotated or revoked.
- Changelog comparisons – use the analytics export automation to produce automated-versus-human changelog evaluations, which writes `docs-site/data/changelog_evaluations.json` for downstream docs dashboards (see Changelog Evaluation Metrics).