Learn how to use Confluent ksqlDB and the Kafka Streams API to transform, enrich, filter, and aggregate streams of real-time data.
December 25 @ 9:00 am – December 27 @ 5:00 pm UTC+8
About the event
Manipulating real-time data is becoming increasingly important for organizations of all sizes. With expertise in Apache Kafka® Streams and ksqlDB, you can help meet this demand.
Learn to identify patterns and use cases for real-time data and stream processing to unlock their potential. Take a deep dive into the architecture of Apache Kafka® Streams and master building apps for real-time data transformation, aggregation, and more. Get hands-on with ksqlDB, where the simplicity of SQL meets the power of Kafka Streams, letting you write powerful queries over streaming data.
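To give a feel for what that looks like in practice, here is a minimal sketch of a Kafka Streams topology that transforms, filters, and aggregates a stream of events. The topic names, application id, and broker address are illustrative assumptions, not taken from the course material.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class PageViewCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "page-view-counts");   // illustrative app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed local broker

        StreamsBuilder builder = new StreamsBuilder();

        // A stream of page-view events keyed by user id (topic name is made up for this sketch).
        KStream<String, String> views =
                builder.stream("page-views", Consumed.with(Serdes.String(), Serdes.String()));

        // Transform and filter, then count views per page.
        KTable<String, Long> countsPerPage = views
                .mapValues(page -> page.toLowerCase())
                .filter((userId, page) -> !page.startsWith("/internal"))
                .groupBy((userId, page) -> page, Grouped.with(Serdes.String(), Serdes.String()))
                .count();

        // The continuously updated counts are written back to Kafka as a changelog stream.
        countsPerPage.toStream()
                .to("page-view-counts-output", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The same kind of logic can also be expressed declaratively in ksqlDB, which the course explores alongside the Java API.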
After attending this event, you’ll be able to:
- Develop real-time applications with the Kafka Streams API.
- Explore ksqlDB’s fault-tolerant, high-performance stream processing capabilities.
- Learn to enrich real-time streaming data by filtering, transforming, aggregating, and joining data streams (a join-based enrichment sketch follows this list).
- Gain insights into testing, securing, deploying, and monitoring applications.
- Connect with fellow developers, architects, and data scientists.
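As an illustration of the join-based enrichment mentioned above, the sketch below joins a stream of orders against a table of customer reference data. The topic names and the string-concatenating value joiner are assumptions made purely for illustration.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class OrderEnrichment {
    public static StreamsBuilder buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        // Orders keyed by customer id (topic names are illustrative assumptions).
        KStream<String, String> orders =
                builder.stream("orders", Consumed.with(Serdes.String(), Serdes.String()));

        // Customer reference data, materialized as a table for lookups.
        KTable<String, String> customers =
                builder.table("customers", Consumed.with(Serdes.String(), Serdes.String()));

        // Enrich each order with the matching customer record (inner stream-table join).
        KStream<String, String> enriched =
                orders.join(customers, (order, customer) -> order + " | " + customer);

        enriched.to("orders-enriched", Produced.with(Serdes.String(), Serdes.String()));
        return builder;
    }
}
```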
Target Audience: This event will cater to application developers, architects, DevOps engineers, and data scientists aiming to create impactful real-time applications.
Prerequisites: Familiarity with Java (or similar languages like C# or Python) and Kafka architecture is recommended. To make the most of this event, consider taking the recommended prerequisites: Confluent Fundamentals for Apache Kafka® and Confluent Developer Skills for Building Apache Kafka®.
Don’t miss this opportunity to become a real-time data maestro. Reserve your spot today!
Course outline:
- Motivation and Use Cases for Real-Time Streaming
- High-Level Comparison of Kafka Streams and ksqlDB
- Stream Processing Concepts
- Kafka Streams’ Place in the Kafka Ecosystem
- High-Level Architecture Design
- Kafka Streams Data Types
- Get streams of data into and out of Kafka with Kafka Connect and REST Proxy
- Maintain data formats and ensure compatibility with Schema Registry and Avro
- Build real-time streaming applications with Confluent ksqlDB & Kafka Streams
- Unit Tests
- Integration Tests
- Stress Tests
- End-to-end Tests
- Sample Use Cases
- End-to-end Examples
- Interacting with ksqlDB (see the client sketch after this outline)
- Data Manipulation
- Aggregations
- Testing
- Parallelism
- Elasticity
- Fault tolerance
- Capacity planning
- Troubleshooting
- ksqlDB-specific considerations
- Security Overview
- Access Control
- Examples
- ksqlDB-specific considerations
- JMX
- Confluent Control Center
- ksqlDB-specific Considerations
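For the "Interacting with ksqlDB" topic above, the following minimal sketch uses the ksqlDB Java client to submit a persistent query. The server address, stream name, and column names are assumptions for a local demo setup, not part of the course material.

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;

public class CreateHighValueOrders {
    public static void main(String[] args) throws Exception {
        // Connect to a ksqlDB server; host and port assume a default local installation.
        ClientOptions options = ClientOptions.create()
                .setHost("localhost")
                .setPort(8088);
        Client client = Client.create(options);

        // A persistent query: continuously filter an existing ORDERS stream into a new stream.
        // The stream and column names are made up for this sketch.
        String sql =
                "CREATE STREAM HIGH_VALUE_ORDERS AS "
              + "SELECT ORDER_ID, CUSTOMER_ID, TOTAL "
              + "FROM ORDERS "
              + "WHERE TOTAL > 100 "
              + "EMIT CHANGES;";

        client.executeStatement(sql).get();
        client.close();
    }
}
```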
Throughout the course, you will work through hands-on lab exercises that reinforce stream processing concepts.
Exercises include:
- Anatomy of a Kafka Streams Application
- Joining Two Streams
- Using the Kafka Streams Processor API
- Testing a Kafka Streams Application (see the unit-test sketch after this list)
- Using ksqlDB
- Using the ksqlDB REST API
- Scaling a Kafka Streams Application
- Securing a Kafka Streams Application
- Getting Metrics from a Kafka Streams Application
- Using JConsole to monitor a Kafka Streams Application
- Monitoring a Kafka Streams Application in Confluent Control Center
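To show the flavor of the testing exercise, here is a minimal sketch using Kafka Streams' TopologyTestDriver (from the kafka-streams-test-utils artifact). The topology, topic names, and expected output are assumptions for illustration only.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class FilterTopologyTest {
    public static void main(String[] args) {
        // Topology under test: drop records with empty values (topic names are illustrative).
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("input", Consumed.with(Serdes.String(), Serdes.String()))
               .filter((key, value) -> value != null && !value.isEmpty())
               .to("output", Produced.with(Serdes.String(), Serdes.String()));

        // The test driver runs the topology in-process; no broker is started,
        // so the application id and bootstrap servers are just placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-topology-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> in =
                    driver.createInputTopic("input", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                    driver.createOutputTopic("output", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("k1", "hello");
            in.pipeInput("k2", "");

            // Only the non-empty record should make it through the filter.
            System.out.println(out.readKeyValuesToList()); // expect a single KeyValue(k1, hello)
        }
    }
}
```

In the actual lab this would typically be written as a JUnit test with assertions rather than a main method.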