Most stream processing use cases can be solved with continuous SQL queries. Common tasks include data transformations, enrichment, joins, and aggregations, as well as moving events from one system to another and continuously updating views with low latency. The benefits of SQL for such use cases are manifold.
SQL provides a large and well-known toolbox for solving many tasks and makes stream processing accessible to a much wider audience. SQL queries can be written and deployed in a fraction of the time required to implement an equivalent stream processing job in Java or Scala. Moreover, query optimizers and highly optimized execution engines ensure that most SQL queries outperform manually implemented stream processing jobs.
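As a concrete illustration of such a continuous query, the following Flink SQL statement computes a windowed aggregation. The `orders` source table, `orders_per_minute` sink table, and their fields are hypothetical placeholders, not part of the platform:

```sql
-- Sketch: continuously count orders and sum prices per currency over
-- 1-minute tumbling windows. Assumes a hypothetical `orders` table
-- (e.g. backed by a Kafka topic) with an event-time attribute `order_time`,
-- and a hypothetical `orders_per_minute` sink table.
INSERT INTO orders_per_minute
SELECT
  currency,
  TUMBLE_START(order_time, INTERVAL '1' MINUTE) AS window_start,
  COUNT(*) AS order_cnt,
  SUM(price) AS total_price
FROM orders
GROUP BY currency, TUMBLE(order_time, INTERVAL '1' MINUTE);
```

Once deployed, this query runs continuously: as new events arrive on the source, completed windows are emitted to the sink with low latency.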
Ververica Platform’s SQL Service allows you to develop and deploy continuous SQL queries on Apache Flink®. Each Namespace manages its own catalog of tables and functions.
Configuring SQL Service
Enabling SQL Service
To enable SQL Service, add the following snippet to your Helm values file:
```yaml
vvp:
  gateway:
    services:
      sql-service-enabled: true
```
SQL Service depends on Universal Blob Storage, which must therefore be properly configured as well. See the Universal Blob Storage documentation for how SQL Service uses blob storage.
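For reference, a minimal blob storage configuration in the Helm values file might look like the sketch below. The S3 URI is a placeholder, and the exact keys should be verified against the Universal Blob Storage documentation for your platform version:

```yaml
# Sketch: Universal Blob Storage backed by an S3 bucket.
# `my-bucket/vvp` is a placeholder; verify key names against the
# Universal Blob Storage documentation.
vvp:
  blobStorage:
    baseUri: s3://my-bucket/vvp
```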
Providing Connector Dependencies
By default, SQL Deployments download the following connector dependencies from Maven Central:
```
https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kafka-base_2.12/1.11.1/flink-connector-kafka-base_2.12-1.11.1.jar
https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kafka_2.12/1.11.1/flink-connector-kafka_2.12-1.11.1.jar
https://repo1.maven.org/maven2/org/apache/flink/flink-json/1.11.1/flink-json-1.11.1.jar
https://repo1.maven.org/maven2/org/apache/flink/flink-avro/1.11.1/flink-avro-1.11.1.jar
https://repo1.maven.org/maven2/org/apache/flink/flink-csv/1.11.1/flink-csv-1.11.1.jar
https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients/2.2.0/kafka-clients-2.2.0.jar
https://repo1.maven.org/maven2/org/apache/avro/avro/1.8.2/avro-1.8.2.jar
https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
https://repo1.maven.org/maven2/joda-time/joda-time/2.7/joda-time-2.7.jar
```
If your environment does not allow deployments to access the Internet, you can disable the dependency downloads with the following configuration in your Helm values file:
```yaml
vvp:
  gateway:
    services:
      sqlService:
        connector:
          connectorUrls:
```
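Alternatively, if your deployments can reach an internal artifact mirror, the same `connectorUrls` key can presumably be pointed at it instead of Maven Central. The host and paths below are placeholders:

```yaml
# Sketch: serving connector JARs from an internal mirror instead of
# Maven Central. `artifacts.example.com` and the paths are placeholders.
vvp:
  gateway:
    services:
      sqlService:
        connector:
          connectorUrls:
            - https://artifacts.example.com/flink/flink-connector-kafka_2.12-1.11.1.jar
            - https://artifacts.example.com/flink/flink-json-1.11.1.jar
```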
If you disable dependency downloads, you need to configure a Flink image for your SQL Deployments that bundles the connector dependencies in /flink/lib, for example by providing the following properties in the Deployment Template:
```yaml
spec:
  template:
    spec:
      artifact:
        flinkImageRegistry: registry.ververica.com/customer-eap/v2.2
        flinkImageRepository: flink
        flinkImageTag: 1.11.1-stream1-scala_2.12-eap1
        flinkVersion: '1.11'
```
It is recommended to configure these properties as Deployment Defaults.
SQL Service’s core features are summarized below.
- UI with SQL Editor and Catalog Manager: SQL Service’s UI is integrated with Ververica Platform’s UI. It features a SQL editor with syntax highlighting, code completion, and validation of SQL queries. The UI also includes a catalog browser and simplifies adding, updating, and removing user-defined functions.
- Deployment of SQL queries: SQL queries are configured and deployed just like regular Deployments, providing the same powerful life-cycle management for SQL queries (recovery, stop-resume via savepoints).
- Built-in Catalog: SQL Service comes with a built-in catalog that stores table and function metadata, so no external catalog service is needed. Currently, only external tables backed by Apache Kafka® topics are supported.
- UDF Management: Apache Flink® requires Java or Scala user-defined functions to be packaged as JAR files. SQL Service simplifies the management (registration, update, deletion) of UDF JARs and the registration of functions in the catalog.
- REST API: The full functionality of SQL Service is exposed via REST endpoints.
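To sketch what deploying a SQL query as a regular Deployment could look like, the fragment below shows a Deployment whose artifact is a SQL script. The `kind: SQLSCRIPT` value, the `sqlScript` field, and the table names are assumptions for illustration; check your platform version’s Deployment API reference for the exact schema:

```yaml
# Sketch: a Deployment running a SQL script. The artifact field names
# (kind, sqlScript) are assumptions; table names are placeholders.
metadata:
  name: high-value-orders
spec:
  template:
    spec:
      artifact:
        kind: SQLSCRIPT
        sqlScript: |
          INSERT INTO high_value_orders
          SELECT * FROM orders WHERE price > 100;
```

Because the query runs as a regular Deployment, stopping it with a savepoint and resuming later works the same way as for JAR-based Deployments.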
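Once a UDF JAR has been registered, a function it contains can be exposed in the catalog with standard Flink DDL. The function name and Java class below are placeholders:

```sql
-- Register a function from a previously uploaded UDF JAR.
-- `com.example.udfs.ToUpperCase` is a placeholder class name.
CREATE FUNCTION to_upper AS 'com.example.udfs.ToUpperCase' LANGUAGE JAVA;

-- The function can then be used like any built-in function, e.g.:
-- SELECT to_upper(user_name) FROM events;
```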
We are continuously adding new features and improvements to SQL Service.