How a Fortune 200 global real estate leader scaled data trust with Telmai
Discover how a Fortune 200 global real estate leader scaled data reliability by leveraging Telmai’s ML-powered data quality monitoring across their data lakehouse
Overview
A globally recognized Fortune 200 real estate services firm offers a broad range of services across capital markets, leasing, and work dynamics. Operating in over 80 countries, its business units manage billions in assets, complex global portfolios, and high-touch tenant relationships, relying heavily on data to make timely and informed decisions. As part of its enterprise-wide digital transformation, the organization recognized the need to modernize how data is captured and operationalized to support business growth and expansion.

To support this transformation, data teams began consolidating data across regions and verticals into a centralized platform, creating a layer of data products that sits atop this core platform. These data products enabled teams to self-serve insights and scale analytics without relying solely on engineering.
However, with data flowing in from hundreds of systems and pipelines, the lack of centralized visibility into data health trends became more than just a technical hurdle. Without a unified approach to monitoring data quality, inconsistencies and anomalies often went undetected until they disrupted or delayed critical business operations, leaving teams and stakeholders questioning the reliability of data-driven insights.
To address this, the organization partnered with Telmai to establish end-to-end visibility into data health by proactively monitoring their data pipelines and restoring trust in the data driving enterprise-wide decisions.
Notable takeaways:
- Empowered teams to self-serve data quality insights without engineering dependencies
- Integrated natively with existing Delta Lake infrastructure for immediate time-to-value
- Accelerated trust in data products by delivering granular insights across structured and semi-structured datasets in real time
- Federated visibility, enabling centralized governance while supporting decentralized validation and self-serve insights across global data product teams
- Enterprise-grade security, deployed within their Azure virtual network (VNet) to meet strict compliance and access control requirements
Challenge: manual data quality processes couldn’t keep pace with a complex, evolving lakehouse environment built on Delta Lake
Despite investing in a centralized data platform, the organization struggled with limited visibility into the reliability of the data powering its decisions. Data quality issues remained deeply embedded in day-to-day operations, and the governance team lacked the tools to address them at scale.
Each business unit had managed data quality independently, applying custom logic and rules that resulted in fragmented visibility across the organization. The homegrown rules engine supported basic validations, but it was limited to static checks. It struggled to keep up with new data sources arriving in semi-structured formats, lacked profiling capabilities, offered no view into data drift or historical trends, and provided little context about the overall health of the data moving through the system.
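For illustration, a static check in a homegrown rules engine typically looks like the hypothetical sketch below (field names and values are invented): it asserts a fixed condition on a single field, but says nothing about distributions, drift over time, or the shape of nested, semi-structured payloads.

```python
# Hypothetical sketch of a static, rule-based check similar in spirit to a
# homegrown validation engine: it asserts a fixed condition on one field.
# It cannot profile distributions, track drift over time, or inspect the
# structure of nested, semi-structured payloads.

def lease_amount_is_valid(record: dict) -> bool:
    """Static check: lease amount must be present and non-negative."""
    amount = record.get("lease_amount")
    return amount is not None and amount >= 0

# A record that passes the static check even though its nested payload has
# silently changed shape upstream -- the kind of issue the rule never sees.
record = {
    "lease_amount": 12_500,
    "tenant": {"id": "T-1001", "region": "EMEA"},  # new nested field added upstream
}

print(lease_amount_is_valid(record))  # True -- the schema drift goes undetected
```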
The central data governance team, tasked with connecting technical systems to business users, found itself increasingly constrained. Without the ability to evaluate or explain the state of the data independently, the team remained reactive. “We were being brought into conversations but couldn’t offer answers fast enough,” noted the head of governance.
Without the ability to independently monitor data health, the data governance team became increasingly reliant on engineering to investigate and resolve data quality issues, which created bottlenecks that delayed critical business processes. This dependency not only slowed resolution times but also undermined confidence in data-driven initiatives. Meanwhile, undetected data issues often surfaced too late, forcing the data governance teams into reactive firefighting rather than proactive prevention.
Manual, rules-based processes could no longer keep pace with the complexity of a centralized Delta Lakehouse environment with deeply nested, semi-structured, and fast-evolving data. The organization needed a modern solution that could continuously monitor data health across systems, detect anomalies in real time, and empower teams to act proactively.
Evaluation: augmenting the in-house rules engine to scale data quality without engineering overhead
To overcome this challenge, the governance team launched a focused search for a solution that could augment their existing in-house rules engine and extend its capabilities where automation and visibility were lacking.
To define their requirements, the team established strict criteria to ensure seamless adoption. The solution needed to integrate natively with their Delta Lake architecture, handle complex nested and semi-structured datasets, and deliver real-time monitoring without requiring extensive configuration or customization.
Usability was just as critical, as the platform needed to empower non-technical users to explore and investigate datasets without relying on SQL or custom scripts.
Several vendors were evaluated, and each fell short of these demands. Some offered basic alerting but relied heavily on manual rule configuration or lacked comprehensive profiling capabilities. Others struggled with schema evolution, proving too rigid for an enterprise-scale Delta Lakehouse with evolving data structures and complex data modeling needs. Many platforms were designed primarily for engineering teams and offered limited accessibility for governance users, making them unsuitable for a team focused on delivering fast, business-aligned data quality insights without technical bottlenecks.
Solution: proactive, automated data quality monitoring integrated into a modern lakehouse environment
After evaluating multiple vendors, the organization selected Telmai because it could integrate directly into its existing data stack and provide scalable, real-time data observability with minimal engineering overhead.
Built for open table formats like Delta Lake and Apache Iceberg, Telmai’s open architecture and native support for structured and semi-structured data enable comprehensive observability while ensuring long-term architectural flexibility within the company’s modern data lakehouse environment. What stood out was Telmai’s ability to provide out-of-the-box profiling, offering granular visibility into both row- and column-level data health attributes in real time.
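For a sense of what row- and column-level health attributes look like in practice, the sketch below computes two basic ones (null rate and distinct count) per column of a Delta table with PySpark. The table path and chosen metrics are assumptions for illustration, not Telmai’s implementation.

```python
# Illustrative sketch of column-level profiling on a Delta table using PySpark.
# Assumes a Spark session with Delta Lake support is already configured; the
# table path and the metrics shown are examples, not Telmai's own logic.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

df = spark.read.format("delta").load("/lake/curated/leases")  # hypothetical path

# Basic health attributes per column: completeness (null rate) and cardinality.
profile = df.select(
    *[
        F.round(F.avg(F.col(c).isNull().cast("double")), 4).alias(f"{c}_null_rate")
        for c in df.columns
    ],
    *[F.countDistinct(F.col(c)).alias(f"{c}_distinct_count") for c in df.columns],
)

profile.show(truncate=False)
```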
One of the most immediate benefits of Telmai was the accelerated time-to-value.
The governance team leveraged Telmai to onboard data and deliver actionable data quality insights within hours of deployment, replacing manual processes that previously took weeks to complete.
Teams could define and manage rule-based validations aligned with business logic and reuse them across their Delta Lakehouse architecture, bringing structure and transparency to what was previously a fragmented and ad hoc process. This repositioned data governance from a reactive support function to a strategic enabler, continuously assessing data reliability and supporting business units in scaling data products with trust built in from the outset.
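As a rough sketch of what a reusable, business-aligned validation can look like in such a lakehouse environment (table paths, column names, and thresholds below are hypothetical, and the code is illustrative rather than Telmai’s rule format):

```python
# Hypothetical sketch of a reusable, business-aligned validation applied across
# multiple Delta tables. Paths, columns, and thresholds are invented examples.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("validation-sketch").getOrCreate()

def completeness_rule(df: DataFrame, column: str, min_ratio: float) -> bool:
    """Business rule: at least `min_ratio` of rows must have `column` populated."""
    total = df.count()
    populated = df.filter(F.col(column).isNotNull()).count()
    return total > 0 and populated / total >= min_ratio

# The same rule is reused across different data products in the lakehouse.
for path, column in [
    ("/lake/curated/leases", "tenant_id"),       # hypothetical table and column
    ("/lake/curated/transactions", "deal_id"),   # hypothetical table and column
]:
    df = spark.read.format("delta").load(path)
    print(path, "passed" if completeness_rule(df, column, 0.99) else "failed")
```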
Why Telmai
While various vendors were considered, ultimately Telmai was selected for the following reasons:
- Ease of integration with modern data lakehouse architectures without complexity or overhead
- Out-of-the-box, AI-driven profiling and anomaly detection that catches unpredictable errors without prior knowledge of the data
- Real-time visibility into data health across structured and semi-structured datasets
- Democratizes data quality efforts across the organization without requiring technical expertise
- Designed to scale with enterprise growth while being cost-effective
See what’s possible with Telmai
Request a demo to see the full power of Telmai’s data observability tool for yourself.