Local labs / demos and other public demonstrations¶
This section lists the current demonstrations and labs in this git repository, as well as interesting public repositories about Flink.
Local demonstrations and end to end studies¶
The local demonstrations run on a local Kubernetes cluster, and some on Confluent Cloud. Most of them are still work in progress.
See the e2e-demos folder for the set of available demos, based on a local Flink deployment or on Confluent Cloud for Flink.
- Record deduplication using Flink SQL or the Table API, deployed on Confluent Platform (a minimal SQL sketch follows this list)
- Change Data Capture with PostgreSQL, Debezium CDC, Confluent Platform v8.0+, and the Cloud Native PostgreSQL Kubernetes operator
- E-commerce sales
- Transform JSON records
- Qlik CDC emulation feeding Flink deduplication, filtering, and transformation logic
- Flink to JDBC Sink connector
- Savepoint demonstration
- SQL Gateway demonstration
- Terraform deployment
- GitOps with OpenShift, ArgoCD, and Tekton
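As an illustration of the deduplication demo above, here is a minimal Flink SQL sketch of the standard deduplication pattern; the orders table and its columns are hypothetical placeholders:

SELECT order_id, product_id, order_time
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY order_time ASC) AS row_num
    FROM orders
)
WHERE row_num = 1;

Keeping row_num = 1 with ascending event time retains the first record seen for each order_id; ordering descending would instead keep the latest version of each record.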
Public repositories with valuable demonstrations¶
- Shoes Store Labs to run demonstrations on Confluent Cloud.
- Managing your Confluent Cloud Flink project at scale with a CLI
- Confluent Flink how-to
- Confluent demo scene: many Kafka, Connect, and ksqlDB demos
- Demonstrations for the shift-left project migration and for data-as-a-product management.
Interesting Blogs¶
- Building Streaming Data Pipelines, Part 1: Data Exploration With Tableflow
- Building Streaming Data Pipelines, Part 2: Data Processing and Enrichment With SQL
Quick personal demo for Confluent Cloud for Flink¶
This quick demo uses the Datagen source connector and the confluent flink shell CLI command.
- Log in to Confluent Cloud using the CLI:
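# authenticate to Confluent Cloud (opens a browser or prompts for credentials)
confluent login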
- Be sure to use the environment with the compute pool:
confluent environment list
confluent environment use <env_id>
confluent flink compute-pool list
# note the cloud provider, region, and current max CFU of the pool
confluent flink compute-pool use <pool_id>
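# open the Flink SQL shell bound to the selected environment and compute pool
# (flags assumed from the Confluent CLI; verify with: confluent flink shell --help)
confluent flink shell --environment <env_id> --compute-pool <pool_id>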
- Start one of the Datagen source connectors in the Confluent Cloud Console.
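Once the Datagen connector is producing events, the records can be inspected from the Flink SQL shell. A minimal sketch, assuming a hypothetical shoe_orders topic (in Confluent Cloud for Flink, Kafka topics are exposed as tables):

-- list the tables (topics) visible from the current catalog and database
SHOW TABLES;
-- peek at the generated records; the table name depends on the Datagen template chosen
SELECT * FROM shoe_orders LIMIT 10;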
TBC