Telmai Announces $2.8 Million Seed From .406 Ventures, Zetta Venture Partners, Y Combinator & Others

Mona & Max

We have some exciting news to share: we’ve closed our Seed round with .406 Ventures, Zetta Venture Partners, Y Combinator, and some exceptional angel investors, including Manish Sood (Founder and CTO, Reltio), Sam Ramji (Chief Strategy Officer, DataStax), and Venkat Varadachary (CEO, Zenon | former EVP/CDAO at Amex).

We feel fortunate to partner with Graham Brooks and Jocelyn Goldfein on our journey. They are the best partners we could have asked for as a data company with an ML-first approach.

This is a big milestone for our team, and we wanted to take this moment to share our thoughts on Telmai’s opportunity and our plans for the future.

Telmai is a no-code, real-time data quality analysis and monitoring platform. Our product automatically and proactively detects data quality issues as data is being ingested. Our human-in-the-loop machine learning engine lets Telmai users define what counts as correct versus incorrect data, and those definitions in turn enable proactive, autonomous monitoring and alerting on data quality issues.

Company momentum

Since inception in November 2020, we’ve been working hard to build visionary products and spread the word about our unique approach to data reliability. We have been fortunate to work with early design partners like Dun & Bradstreet and Myers-Holum, who share our vision for the future of data quality. Thanks to partners like them, and the thousands of prospective users we’ve been fortunate to interview, Telmai launched its MVP this month (October 2021).

More important than our category-leading product is our world-class team. We have been fortunate to recruit a lean but exceptional technical group who share a deep belief in Telmai’s vision of the future.

The next chapter

This seed funding round has enabled us to grow our team and to continue building Telmai’s leading data-quality platform for the modern data stack.
We are excited to continue this journey with an incredible group of institutional and angel investors who believe in our mission.

Investor Quotes

"As an investor focused on data and analytics for the past 15 years, our portfolio and our network of Chief Data Officers have continually highlighted data quality and trust as key challenges to realizing the promise of big data analytics and machine learning.

The biggest challenge around data quality is the historic gap between engineering-focused data quality tools and business users, who understand the data, how it is being used and the expectations around data reliability. Telmai is not only eliminating that gap but also addressing both IT and business' most pressing needs in an infinitely-scalable, integrated data observability tool that establishes trust in data.”

- Graham Brooks, Partner, .406 Ventures

“The AI and machine learning platform shift is well under way, and ensuring data quality - efficiently, in production, and at scale - will be a prerequisite for every product and every business decision in the decades to come. We're delighted to partner with Telm.ai's founders, who acutely understand the challenge of maintaining high quality data and have built an industry leading product, itself powered by data and machine learning, to help their customers achieve it.”


- Jocelyn Goldfein, Managing Director, Zetta Venture Partners


We want to thank our customers and all the people who made it possible for us to reach this point – we are beyond excited to continue Telmai’s journey.


Onwards 🚀

– Mona & Max


Data profiling helps organizations understand their data, identify issues and discrepancies, and improve data quality. It is an essential part of any data-related project; without it, poor data quality can undermine critical business decisions, customer trust, sales, and financial opportunities.

To get started, there are four main steps in building a complete and ongoing data profiling process:

  1. Data Collection
  2. Discovery & Analysis
  3. Documenting the Findings
  4. Data Quality Monitoring
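As a rough illustration, the four steps above can be sketched in a few lines of Python; the records, field names, and checks below are invented for the example, not part of any real profiling API:

```python
# A minimal sketch of the four-step profiling process; the records,
# field names, and checks are invented for illustration.

# 1. Data collection: inlined here; in practice the records would be
#    read from a warehouse, lake, file, or stream.
records = [
    {"state": "CA", "amount": 120.0},
    {"state": "California", "amount": 80.5},
    {"state": "NY", "amount": -5.0},
]

# 2. Discovery & analysis: compute simple per-field facts.
profile = {
    "row_count": len(records),
    "negative_amounts": sum(1 for r in records if r["amount"] < 0),
    "non_abbreviated_states": sum(1 for r in records if len(r["state"]) != 2),
}

# 3. Documenting the findings: keep only the checks that fired.
findings = {k: v for k, v in profile.items() if k != "row_count" and v > 0}

# 4. Data quality monitoring: rerun this on each new batch and alert
#    whenever any finding appears.
alert = bool(findings)
print(profile, findings, alert)
```

In a real pipeline, step 4 would run continuously rather than once, comparing each batch's profile against an established baseline.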

We'll explore each of these steps in detail and discuss how they contribute to the overall goal of ensuring accurate and reliable data. Before we get started, let's remind ourselves what data profiling is.

What are the different kinds of data profiling?

Data profiling falls into three major categories: structure discovery, content discovery, and relationship discovery. While they all help in gaining a deeper understanding of the data, the types of insights they provide are different:

Structure discovery checks whether data is consistent, correctly formatted, and well structured. For example, if you have a ‘Date’ field, structure discovery helps you see the various patterns of dates (e.g., YYYY-MM-DD or YYYY/DD/MM) so you can standardize your data into one format.
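One simple way to surface mixed date layouts is to mask each digit run with a placeholder and count the distinct masks. This is a sketch using only Python's standard library; the masking scheme is an illustrative choice, not a standard algorithm:

```python
import re

# Illustrative sketch: map each run of digits to a placeholder so that
# mixed date layouts in a column become visible as distinct patterns.
def date_pattern(value: str) -> str:
    # 4-digit runs become 'YYYY'; 2-digit runs become 'NN'
    # (month vs. day is ambiguous without more context).
    return re.sub(r"\d+", lambda m: "YYYY" if len(m.group()) == 4 else "NN", value)

dates = ["2023-01-15", "2023/15/01", "01-15-2023"]
patterns = {date_pattern(d) for d in dates}
print(patterns)  # three distinct layouts -> the column needs standardizing
```

A single well-formed column would yield exactly one pattern; more than one pattern is a signal to standardize.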

Structure discovery also examines basic statistics in the data, such as minimum and maximum values, means, medians, and standard deviations.
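Those baseline statistics are straightforward to compute; here is a sketch with Python's `statistics` module over an invented numeric column:

```python
import statistics

# Invented sample column; the outlier (100.0) inflates the mean and
# standard deviation, which is what these baselines help reveal.
amounts = [10.0, 12.5, 11.0, 13.5, 100.0]

summary = {
    "min": min(amounts),
    "max": max(amounts),
    "mean": statistics.mean(amounts),
    "median": statistics.median(amounts),
    "stdev": statistics.stdev(amounts),
}
print(summary)
```

The gap between the mean (29.4) and the median (12.5) already hints at an outlier before any row-level inspection.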

Content discovery looks more closely at individual attributes and data values to check for data quality issues. This can help you find null values, empty fields, duplicates, incomplete values, outliers, and anomalies.

For example, if you are profiling address information, content discovery helps you see whether your ‘State’ field contains two-letter abbreviations, fully spelled-out state names, both, or potentially some typos.
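A toy sketch of that ‘State’ check in Python; the lookup sets are truncated samples for illustration, not a complete list of US states:

```python
# Content-discovery sketch: classify each 'State' value as an
# abbreviation, a spelled-out name, a missing value, or a likely typo.
# The lookup sets are truncated samples, not a full US state list.
ABBREVIATIONS = {"CA", "NY", "TX"}
FULL_NAMES = {"California", "New York", "Texas"}

def classify_state(value):
    if value in ABBREVIATIONS:
        return "abbreviation"
    if value in FULL_NAMES:
        return "full name"
    if not value:
        return "missing"
    return "possible typo"

values = ["CA", "California", "Calfornia", "", "NY"]
labels = [classify_state(v) for v in values]
print(labels)
```

Tallying these labels over a whole column shows at a glance whether the field mixes conventions or contains suspect values.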

Content discovery can also validate databases against predefined rules. This helps improve data quality by identifying instances where the data does not conform to those rules. For example, a transaction amount should never be less than $0.
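Rule-based validation of this kind can be sketched as a list of predicates applied to each row; the rules and transactions below are invented for the example:

```python
# Sketch of rule-based validation: each rule pairs a predicate with a
# message, and rows violating a rule are collected for review.
rules = [
    (lambda row: row["amount"] >= 0, "transaction amount below $0"),
    (lambda row: row["currency"] in {"USD", "EUR"}, "unknown currency"),
]

transactions = [
    {"id": 1, "amount": 25.0, "currency": "USD"},
    {"id": 2, "amount": -3.0, "currency": "USD"},
    {"id": 3, "amount": 7.0, "currency": "JPY"},
]

violations = [
    (row["id"], message)
    for row in transactions
    for check, message in rules
    if not check(row)
]
print(violations)  # [(2, 'transaction amount below $0'), (3, 'unknown currency')]
```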

Relationship discovery uncovers how different datasets are related to each other: for example, key relationships between database tables, or lookup cells in a spreadsheet. Understanding relationships is most critical when designing a new database schema, a data warehouse, or an ETL flow that joins tables and data sets based on those key relationships.
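A minimal sketch of relationship discovery as a referential-integrity check between two invented tables, finding foreign keys that do not resolve:

```python
# Relationship-discovery sketch: verify that every customer_id in the
# orders table resolves to an id in the customers table.
# Table contents are invented for illustration.
customers = [{"id": 1}, {"id": 2}]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 2},
    {"order_id": 12, "customer_id": 99},  # dangling reference
]

customer_ids = {c["id"] for c in customers}
orphans = [o["order_id"] for o in orders if o["customer_id"] not in customer_ids]
print(orphans)  # [12]
```

Dangling references like order 12 would break any downstream join that assumes the key relationship holds.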

Data Observability vs. Data Quality

Data Observability: Leverages ML and statistical analysis to learn from the data and identify potential issues, and can also validate data against predefined rules.
Data Quality: Uses predefined metrics from a known set of policies to understand the health of the data.

Data Observability: Detects issues, investigates their root cause, and helps remediate.
Data Quality: Detects and helps remediate.

Data Observability: Examples include continuous monitoring, alerting on anomalies or drifts, and operationalizing the findings into data flows.
Data Quality: Examples include data validation, data cleansing, and data standardization.

Data Observability: Low-code / no-code to accelerate time to value and lower cost.
Data Quality: Ongoing maintenance, tweaking, and testing of data quality rules add to its cost.

Data Observability: Enables both business and technical teams to participate in data quality and monitoring initiatives.
Data Quality: Designed mainly for technical teams who can implement ETL workflows or open-source data validation software.


Start your data observability today

Connect your data and start generating a baseline in less than 10 minutes. 

No sales call needed



Telmai is a platform for data teams to proactively detect and investigate anomalies in real time.
© 2023 Telm.ai. All rights reserved.