SQL-based Transformations and JSON Columns in Imply Polaris

Oct 12, 2022
Timmy Freese

Transformations

There’s no “I” in “Team”, but there is a “T” in “ETL”. An important feature of any database is the ability to transform data during ingestion and query. Imply Polaris supports both. As an example, consider the situation where a company records its Total Revenue in a ‘total_revenue’ column and its Total Expenses in a ‘total_expenses’ column. If an analyst at the company wants to know the company’s Net Profits, they can implement a simple transformation of ‘total_revenue - total_expenses’.

Older versions of Imply let users perform transformations during ingestion using a native ingestion spec and during query using SQL. Now, Imply also enables SQL-based transformations at ingestion time. This much-simplified user flow carries through to Imply’s DBaaS, Polaris.

To transform data using SQL expressions during ingestion, users can POST an ingestion job using the Polaris API or use the UI to enter an expression such as ‘total_revenue - total_expenses’ while editing the table’s schema. Additionally, when working with a rollup table, you can apply a variety of aggregation functions on measures, including MIN, MAX, SUM, and COUNT.
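As a sketch, the ingestion-time transformation and rollup aggregations described above might look like the following SQL expressions. The column names (‘total_revenue’, ‘total_expenses’, ‘net_profit’) are hypothetical, carried over from the example earlier in this post:

```sql
-- Input expression for a derived "net_profit" column, entered per-column
-- in the Polaris UI (or supplied in an ingestion job via the API):
"total_revenue" - "total_expenses"

-- On a rollup table, measures are aggregated at ingestion time, e.g.:
SUM("total_revenue")   -- total revenue per rollup time bucket
MIN("total_expenses")  -- smallest expense value in the bucket
MAX("total_expenses")  -- largest expense value in the bucket
COUNT(*)               -- number of raw rows rolled up into the bucket
```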

JSON Columns

In addition to transforming your data at ingestion, you can also work with nested data in Polaris. The world is a messy place, and capturing events in a fully structured way is often infeasible. Sometimes data arrives in a semi-structured form, and users will want to give it additional structure. Polaris supports nested data with the JSON datatype. Additionally, users can add more structure to this JSON datatype using the JSON_VALUE transformation and other SQL JSON functions, available via both the API and the UI. For more information about nested data in Polaris, see Ingest nested data in the Polaris documentation.
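As a brief sketch (the ‘shipping’ column and its fields are hypothetical), JSON_VALUE extracts a primitive value from a nested JSON column using a JSONPath expression:

```sql
-- "shipping" is a hypothetical JSON column holding values such as:
--   {"address": {"city": "Austin", "zip": "78701"}}
-- JSON_VALUE pulls out the primitive at the given path:
JSON_VALUE("shipping", '$.address.zip')
```

The same expression can be used at ingestion time to materialize a flat column from nested input, or at query time to drill into the JSON column directly.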

Enabling real-time analytics with transformations and JSON columns in Polaris is one of the many ways in which we are powering the next generation of modern analytics applications.
