Timeplus
This tutorial shows how to integrate Upstash Kafka with Timeplus.
Timeplus is a streaming-first data analytics platform. Leveraging the open source streaming engine Proton, it provides end-to-end capabilities that help teams process streaming and historical data quickly and intuitively, and it is accessible to organizations of all sizes and industries. It enables data engineers and platform engineers to unlock the value of streaming data using SQL.
Upstash Kafka Setup
Create a Kafka cluster using Upstash Console or Upstash CLI by following Getting Started.
Create two topics by following the creating topic steps. Name the first topic `input`, since we are going to stream from this topic to Timeplus. Name the second topic `output`; it will receive the stream back from Timeplus.
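If you prefer scripting this step instead of clicking through the Console, Upstash's management (Developer) REST API can also create topics. The snippet below is only a sketch: it assumes the `POST /v2/kafka/topic` endpoint authenticated with your account email and API key, so check the Upstash API reference for the exact fields.

```bash
# create the input topic via the Upstash Developer API (assumed endpoint and fields);
# repeat the call with "name": "output" for the second topic
curl -X POST "https://api.upstash.com/v2/kafka/topic" \
  -u "$UPSTASH_EMAIL:$UPSTASH_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "input",
        "partitions": 1,
        "retention_time": 86400000,
        "retention_size": 1073741824,
        "max_message_size": 1048576,
        "cleanup_policy": "delete",
        "cluster_id": "<your-cluster-id>"
      }'
```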
Create a Kafka Source in Timeplus
Besides the open source engine Proton, Timeplus also offers Timeplus Cloud, a fully managed cloud service with SOC 2 Type 1 security compliance. To use Timeplus Cloud, create an account and set up a new workspace.
After creating the workspace, click **Data Ingestion** in the menu bar, click **Add Data**, and choose the **Apache Kafka** source.
In the wizard, specify the Kafka broker as `<name>-<region>-kafka.upstash.io:9092`. Enable all security options, choose **SASL SCRAM 256**, and type the username and password.
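Optionally, you can sanity-check the broker address and credentials from your terminal before continuing. The sketch below assumes `kcat` is installed and that the `UPSTASH_KAFKA_USERNAME`/`UPSTASH_KAFKA_PASSWORD` environment variables hold the values from the Upstash Console:

```bash
# list cluster metadata; success confirms the broker, TLS, and SCRAM credentials all work
kcat -L \
  -b "<name>-<region>-kafka.upstash.io:9092" \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=SCRAM-SHA-256 \
  -X sasl.username="$UPSTASH_KAFKA_USERNAME" \
  -X sasl.password="$UPSTASH_KAFKA_PASSWORD"
```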
Click the **Next** button. In the dropdown list, you should see all available Kafka topics; choose the `input` topic. Leave `JSON` as the **Read As** option. Choose **Earliest** if you have already created messages in the topic; otherwise use the default value, **Latest**.
Click the **Next** button; Timeplus will start loading messages from the `input` topic.
Let's go to the Upstash UI and post a JSON message to the `input` topic.
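For example, a flat JSON object with four fields works well. The field names below are illustrative, not prescribed by the tutorial; just make sure to include an IP address field so we can mask it in a later step:

```json
{
  "requestedUrl": "https://www.example.com/signup",
  "method": "POST",
  "ip": "203.0.113.7",
  "requestDuration": 120
}
```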
Right after the message is posted, you should see it in the Timeplus UI. Since you specified the JSON format, those 4 key/value pairs are read as 4 columns. Choose a name for the data stream, say `input`, and accept the default options.
Click the **Next** button to review the settings. Finally, click the **Create the source** button.
A green notification message will confirm that the source has been created.
Run Streaming SQL
Click the **Query** menu on the left and type a streaming SQL query. A minimal one simply tails the `input` stream:
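```sql
-- in Timeplus, a plain SELECT over a stream is unbounded: it keeps emitting new events
SELECT * FROM input
```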
Go back to the Upstash UI and post a few more messages to the topic; you will see those live events appear in the query result.
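If you'd rather produce messages from the command line than from the Upstash UI, each cluster also exposes a Kafka REST endpoint. This is a hedged sketch assuming the POST `/produce/<topic>` form of that API and the REST credentials shown on your cluster page:

```bash
# produce one JSON event to the input topic over the Upstash Kafka REST API
curl -u "$UPSTASH_KAFKA_REST_USERNAME:$UPSTASH_KAFKA_REST_PASSWORD" \
  -d '{"requestedUrl":"https://www.example.com/login","method":"GET","ip":"203.0.113.42","requestDuration":87}' \
  "https://<name>-<region>-rest-kafka.upstash.io/produce/input"
```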
Apply Streaming ETL and Write Data to Upstash Kafka
Cancel the previous streaming SQL and use the following one to mask the IP addresses.
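Assuming the column names from the sample message above, one way to mask the address is to replace it with its MD5 digest, rendered as a hex string via `lower(hex(md5(...)))`:

```sql
-- pass the other columns through unchanged; replace the raw IP with its MD5 hex digest
SELECT requestedUrl, method, lower(hex(md5(ip))) AS ip, requestDuration FROM input
```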
Click the **Send as Sink** button. Use the default **Kafka** output type and specify the broker, the topic name (`output`), the username, and the password.
Click the **Send** button to create the sink.
Go back to the Upstash UI. Create a few more messages in the `input` topic, and you should see them arrive in the `output` topic with the raw IP addresses masked.
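You can verify this from the terminal as well; the sketch below reuses the connection settings from the earlier `kcat` example to consume the transformed events:

```bash
# read the output topic from the beginning and print the masked events
kcat -C -t output -o beginning \
  -b "<name>-<region>-kafka.upstash.io:9092" \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=SCRAM-SHA-256 \
  -X sasl.username="$UPSTASH_KAFKA_USERNAME" \
  -X sasl.password="$UPSTASH_KAFKA_PASSWORD"
```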
Congratulations! You just set up a streaming ETL pipeline with a single line of SQL in Timeplus Cloud. Learn more about Timeplus at https://docs.timeplus.com/ or join the community at https://timeplus.com/slack.