Accelerator
A landing zone tailored for AI, data-driven, data mesh, and cloud projects, so teams can experiment fast without building governance from scratch.
Cloud bills double overnight. Privacy audits land badly. Every workload reinvents the same governance, and engineers ship slower for it. By the time anyone notices the pattern, the architecture has already hardened around it, and remediation means changing things teams now depend on.
The order is predictable. Privacy gets addressed first because regulation forces the conversation. FinOps gets addressed last, after the first billing shock. AI is now landing in the same sequence, except the spend curve is steeper than the regulatory one.
“Surveyed companies reported average monthly AI budgets rising 36% in 2025, with only 51% confident they can measure the ROI of that spend.”
— CloudZero, The State of AI Costs, May 2025
The Data Landing Zone closes these gaps before the first workload lands.
Data platforms usually pick one audience to serve well: finance, compliance, or engineering. The DLZ aims to serve all three.
Cost attribution is part of provisioning. Budgets and anomaly alerts ship as code alongside the workload they monitor, and every resource carries the tags needed to roll spend up to a team, product, or environment. Chargeback becomes a query rather than a reconciliation project.
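A minimal sketch of what "chargeback becomes a query" means in practice. The tag names and line items here are hypothetical stand-ins; in the DLZ the equivalent rollup would run as a query over the centralised CUR data rather than an in-memory list.

```python
from collections import defaultdict

# Hypothetical CUR-style line items carrying the mandatory tags.
line_items = [
    {"cost": 12.40, "tags": {"team": "search", "env": "prod"}},
    {"cost": 3.10,  "tags": {"team": "search", "env": "dev"}},
    {"cost": 20.00, "tags": {"team": "ml",     "env": "prod"}},
]

def rollup(items, tag_key):
    """Sum cost per value of one cost-allocation tag (team, env, ...)."""
    totals = defaultdict(float)
    for item in items:
        totals[item["tags"][tag_key]] += item["cost"]
    return dict(totals)

print(rollup(line_items, "team"))
```

Because every resource carries the same tag set, the same one-liner rolls spend up by team, product, or environment just by changing `tag_key`.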
The controls regulators look for are owned by the platform: GDPR alignment, data classification, access logging, audit trails. When an audit lands, the evidence is already in the same place it was last quarter.
Engineers get a paved road. Common requests like a new account, a new pipeline, or access to a dataset resolve through self-service against guardrails the platform already enforces. Policy is checked at deploy and audit time, so PRs don’t run into governance walls in the middle of code review.
Design principle. The compliant path should be the convenient one. When following policy is the fastest way to ship, engineers do it by default, and working around it stops being worth the effort.
Rather than arriving after the first billing shock, cost management ships with the DLZ ready to use, in four layers.
Every service emits cost and usage data in a consistent form. CUR 2.0 is the system of record, written to a dedicated FinOps account and queryable from Athena.
A mandatory tag set is enforced at deploy time. Tag policies, SCPs, and DLZ-side validation all check it before a resource ships. Anything that can’t be attributed to an owner doesn’t reach production.
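The actual enforcement is done by tag policies, SCPs, and the DLZ's own validation; the sketch below only illustrates the shape of that check. The mandatory tag names are assumptions, not the platform's real tag set.

```python
# Hypothetical mandatory cost-allocation tags; the real set is defined
# by the platform and enforced at deploy time.
MANDATORY_TAGS = {"team", "product", "env", "cost-center"}

def validate_tags(resource_tags: dict) -> list:
    """Return the mandatory tags a resource is missing (empty list = pass)."""
    return sorted(MANDATORY_TAGS - resource_tags.keys())

missing = validate_tags({"team": "ml", "env": "prod"})
# A non-empty result means the resource cannot be attributed to an
# owner, so the deploy is blocked before it reaches production.
print(missing)
```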
Budgets alert when usage approaches the threshold and stop spend when it crosses the limit. An LLM gateway applies per-team and per-project token caps in real time, so a runaway agent burns minutes of budget rather than days.
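The token-cap behaviour can be sketched as a small accounting layer in front of the model endpoint. This is an illustration of the idea, not the gateway's real interface; class and method names are invented for the example.

```python
from collections import defaultdict

class TokenGateway:
    """Minimal sketch of per-team token caps checked before each call."""

    def __init__(self, caps):
        self.caps = caps              # team -> tokens allowed per window
        self.used = defaultdict(int)  # team -> tokens consumed so far

    def allow(self, team: str, tokens: int) -> bool:
        # Block the request before it reaches the model if it would
        # push the team over its cap.
        if self.used[team] + tokens > self.caps.get(team, 0):
            return False
        self.used[team] += tokens
        return True

gw = TokenGateway({"ml": 1000})
assert gw.allow("ml", 600)        # within cap: request goes through
assert not gw.allow("ml", 500)    # would exceed the cap: blocked
```

Because the check runs per request, a runaway agent is stopped at its first over-cap call rather than after the bill arrives.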
Each role gets the view it needs: engineers see service-level burn, product owners see feature cost, finance sees the chargeback rollup. Cost-per-unit-of-work (dollars per inference, per document, per query) turns “we should optimise” into a number.
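Cost-per-unit-of-work is just total attributed cost divided by units processed; the figures below are made up for illustration.

```python
def cost_per_unit(total_cost: float, units: int) -> float:
    """Dollars per unit of work: per inference, per document, per query."""
    return total_cost / units if units else float("inf")

# Example: $3.72 of attributed spend across 1,240 inferences.
print(f"${cost_per_unit(3.72, 1240):.4f} per inference")
```

Tracked over time, this is the number that tells a team whether an optimisation actually moved the needle.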
Lessons in the defaults
Years of operational lessons about regulators, billing, and incident response are encoded as defaults in the platform. New workloads inherit them without anyone having to remember to apply them.
All-in-one CDK component
A CDK construct published in TypeScript and Python that orchestrates AWS accounts and the resources needed for a multi-account data platform.
FinOps as a platform concern
Mandatory cost-allocation tags enforced at deploy time. Budgets and anomaly controls as code. CUR centralised in a dedicated FinOps account. An LLM gateway that blocks token spend before it reaches the model.
Standards & best practices
Aligned with the AWS Well-Architected Framework for multi-account strategy, security, and compliance.
Ease of use
Data engineers can deploy it with the defaults. Cloud engineers can customise the parts their organisation needs tuned.