Manage User-Defined Functions (UDFs)
This page explains how to add and use UDFs. You can register JAR/Python artifacts, update them, and delete them. There are two ways to use UDFs in SQL, and they behave differently:
- Catalog-level UDFs — registered once via the UI; available from the `system.default` catalog and callable by class name.
- Deployment-level UDFs — bundled as a dependency of a specific deployment; you expose them in SQL with `CREATE TEMPORARY FUNCTION`.
Before you start
To avoid dependency conflicts when building UDFs:
- Use the same Flink version in your UDF project as the Engine Version you select in Ververica.
- Mark Flink dependencies as `<scope>provided</scope>`.
- Shade any third-party libraries into your JAR (for example, with the Apache Maven Shade Plugin).
- When using your own S3 connector, see Connecting with external file systems for details on the `s3i://` and `s3://` schemes.
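For Maven users, the `provided` scope and shading advice above might look like the following `pom.xml` fragment. This is a sketch: the Flink artifact and the version numbers are illustrative, so match them to the Engine Version you select.

```xml
<!-- Sketch of a UDF project pom.xml fragment; versions are illustrative. -->
<dependencies>
  <!-- Flink APIs come from the runtime: mark them provided, never shade them. -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java</artifactId>
    <version>1.17.2</version>
    <scope>provided</scope>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- Shade third-party libraries into the UDF JAR to avoid class conflicts. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.5.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```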
Python UDFs (deployment-level): require Flink engine VERA 1.0.3 or later. Provide dependencies via `python.files`/`python.archives` in Configuration → Parameters. Draft syntax checks aren't supported for Python UDFs, so skip the check before deploying.
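The Configuration → Parameters entries for a Python UDF might look like the following sketch. The file names and paths here are placeholders; point them at the Python file and archive you actually uploaded.

```yaml
# Hypothetical Configuration → Parameters entries for a Python UDF deployment.
# Replace both values with the locations of your own uploaded artifacts.
python.files: /flink/usrlib/my_udfs.py
python.archives: /flink/usrlib/venv.zip
```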
Option A: Catalog-level UDF
When to use: You want to register UDFs once and call them by class name in any SQL draft in the workspace.
What happens: The platform uploads and parses your UDF artifact and registers discovered functions under `system.default`. You can then call them directly in SQL by class name (no `CREATE FUNCTION` needed).
1) Register via UI
Steps
1. Go to SQL Editor → Drafts.
2. Open the UDFs panel and click Register UDF Artifact.
3. Upload the UDF JAR:
   - Upload the main artifact JAR.
   - (Optional) Upload a separate dependency file (recommended for Python UDFs; Java UDFs typically shade dependencies).
   - Or provide an accessible HTTP URL if the file is large.
4. Click Confirm. Ververica scans the artifact for UDF / UDAF / UDTF classes and lists them.
5. Select the functions to register and click Create functions.
You'll see your artifact and functions listed in SQL Editor → UDFs.
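For orientation, the scan looks for classes shaped like the minimal sketch below: a class with a public `eval` method. In a real artifact the class must also extend Flink's `org.apache.flink.table.functions.ScalarFunction` (a provided dependency); that parent is omitted here so the sketch stands alone, and the class name matches the `TimesTwo` example used later.

```java
// Sketch of the kind of scalar UDF class the artifact scan discovers.
// NOTE: in a real UDF JAR this class extends
// org.apache.flink.table.functions.ScalarFunction (scope: provided);
// the parent class is omitted here so the sketch compiles stand-alone.
public class TimesTwo {
    // Flink resolves a public `eval` method matching the SQL argument types.
    public Integer eval(Integer x) {
        return x == null ? null : x * 2;
    }
}
```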
2) Use in SQL (call by class name)
Because the functions are registered into `system.default`, you can call them by class name:
```sql
-- Example: using class `TimesTwo` directly (no CREATE FUNCTION needed)
CREATE TEMPORARY TABLE s1 (
  a INT,
  b BIGINT,
  c VARCHAR,
  d VARCHAR,
  PRIMARY KEY (a) NOT ENFORCED
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1'
);

CREATE TEMPORARY TABLE sink (
  a INT
) WITH ('connector' = 'print');

INSERT INTO sink
SELECT TimesTwo(a)
FROM s1;
```
Option B: Deployment-level UDF
When to use: You want a UDF to be available only to one deployment (tied to its artifact set and engine version).
What happens: You attach the UDF artifact as an Additional Dependency of the deployment. In your SQL, you create a temporary function alias that points to the class.
1) Attach the artifact to the deployment
Steps
- When creating or editing a deployment, choose the Engine Version that matches your UDF build.
- In Additional Dependencies, upload/select your UDF artifact.
2) Expose the UDF in SQL
Add a `CREATE TEMPORARY FUNCTION` statement that maps an alias to the fully qualified class name:
```sql
-- Java/Scala UDF
CREATE TEMPORARY FUNCTION two AS 'com.ververica.flink.udfs.TimesTwo';

-- Python UDF (deployment-level): reference the function inside an uploaded
-- module (the module path below is a placeholder) and add
-- python.files / python.archives in Configuration → Parameters.
-- CREATE TEMPORARY FUNCTION my_py_udf AS 'my_udfs.my_py_udf' LANGUAGE PYTHON;
```
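A matching Python module might look like the sketch below. The module name `my_udfs.py` and function names are placeholders; the `pyflink` import is only available inside the Flink runtime, so it is kept optional here.

```python
# Sketch of a Python UDF module (hypothetical file my_udfs.py, uploaded
# via python.files). The plain function holds the logic; the pyflink
# wrapper is what SQL references.
def times_two(x):
    """Pure-Python logic for the UDF."""
    return None if x is None else x * 2

try:
    from pyflink.table import DataTypes
    from pyflink.table.udf import udf

    # Wrapped function referenced from SQL, e.g.:
    # CREATE TEMPORARY FUNCTION my_py_udf AS 'my_udfs.my_py_udf' LANGUAGE PYTHON;
    my_py_udf = udf(times_two, result_type=DataTypes.BIGINT())
except ImportError:
    # pyflink only exists inside the deployment's runtime image.
    pass
```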
Then write your tables and queries:
```sql
CREATE TEMPORARY TABLE s1 (
  a INT,
  b BIGINT,
  c VARCHAR,
  d VARCHAR,
  PRIMARY KEY (a) NOT ENFORCED
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1'
);

CREATE TEMPORARY TABLE sink (
  a INT
) WITH ('connector' = 'print');

INSERT INTO sink
SELECT two(a)
FROM s1;
```
Choosing between Catalog-level vs. Deployment-level
| Aspect | Catalog-level | Deployment-level |
|---|---|---|
| Scope | Workspace-wide via `system.default` | Single deployment only |
| How to register | SQL Editor → UDFs → Register UDF Artifact | Add the JAR/Python artifact in the deployment's Additional Dependencies |
| How to use in SQL | Call by class name (no `CREATE FUNCTION`) | Define an alias with `CREATE TEMPORARY FUNCTION … AS 'FQCN'` (or `LANGUAGE PYTHON`) |
| Versioning | Decoupled from any single deployment; managed in the UDFs panel | Tied to the deployment's engine version and dependency set |
| Python support | Catalog-level registration via artifact scan | Supported on VERA 1.0.3+; configure `python.files`/`python.archives`; skip the draft syntax check |
Manage and delete UDFs
- View catalog-level UDFs: SQL Editor → UDFs.
- Delete a catalog-level UDF: Hover the artifact in the UDFs panel and click the trash icon.
- Update a catalog-level UDF: Register a new artifact version and (optionally) remove the old one.
- Remove a deployment-level UDF: Edit the deployment and remove the artifact from Additional Dependencies.
Troubleshooting
- Function not found
  - Catalog-level: Confirm the artifact appears under SQL Editor → UDFs and use the class name exactly as registered.
  - Deployment-level: Ensure the artifact appears under Additional Dependencies and that you declared `CREATE TEMPORARY FUNCTION … AS 'fully.qualified.Class'`.
- ClassNotFound / NoClassDefFoundError
  - Verify that the engine version and your build's Flink version match.
  - Shade third-party libraries; avoid bundling Flink itself (use the `provided` scope).
- Large artifacts
  - Use the External URL option (HTTP only; ensure network connectivity from the service to the host).