[#6218] docs: Adding Documentation for GravitinoPaimonCatalog in Flink Connectors #6297

Merged
docs/flink-connector/flink-catalog-paimon.md (108 additions, 0 deletions)

---
title: "Flink connector paimon catalog"
slug: /flink-connector/flink-catalog-paimon
keyword: flink connector paimon catalog
license: "This software is licensed under the Apache License version 2."
---

This document provides a comprehensive guide on configuring and using the Apache Gravitino Flink connector to access the Paimon catalog managed by the Gravitino server.

## Capabilities

### Supported Paimon Table Types

* AppendOnly Table

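In Paimon, a table created without a `PRIMARY KEY` clause is an append-only table, so definitions like the minimal sketch below fall under this capability (the table and column names are illustrative):

```sql
-- Illustrative append-only definition: no PRIMARY KEY clause, so Paimon
-- treats it as an append-only table rather than a primary-key table.
CREATE TABLE access_log (
    event_time BIGINT,
    user_id    BIGINT,
    url        STRING
);
```
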
### Supported Operation Types

Most DDL and DML operations in Flink SQL are supported, except for the following operations (a sketch of typical supported statements follows this list):

- Function operations
- Partition operations
- View operations
- Querying UDF
- `LOAD` clause
- `UNLOAD` clause
- `CREATE TABLE LIKE` clause
- `TRUNCATE TABLE` clause
- `UPDATE` clause
- `DELETE` clause
- `CALL` clause
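
Conversely, routine database and table DDL together with `INSERT` and `SELECT` statements fall within the supported surface. A minimal sketch, with illustrative database and table names:

```sql
-- Illustrative names; assumes the Paimon catalog managed by Gravitino is
-- already selected (see the SQL example below for switching catalogs).
CREATE DATABASE IF NOT EXISTS sample_db;

CREATE TABLE sample_db.events (
    id  BIGINT,
    msg STRING
);

INSERT INTO sample_db.events VALUES (1, 'hello');
SELECT * FROM sample_db.events;

DROP TABLE sample_db.events;
DROP DATABASE sample_db;
```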

## Requirements

* Paimon 0.8

Higher versions, such as 0.9 or above, may also be supported but have not been fully tested.

## Getting Started

### Prerequisites

Place the following JAR files in the `lib` directory of your Flink installation:

* paimon-flink-1.18-0.8.2.jar

* gravitino-flink-connector-runtime-${flinkMajorVersion}_${scalaVersion}.jar

### SQL Example

```sql

-- Suppose paimon_catalog is the Paimon catalog name managed by Gravitino
use catalog paimon_catalog;
-- Execute statement succeed.

show databases;
-- +---------------------+
-- | database name |
-- +---------------------+
-- | default |
-- | gravitino_paimon_db |
-- +---------------------+

SET 'execution.runtime-mode' = 'batch';
-- [INFO] Execute statement succeed.

SET 'sql-client.execution.result-mode' = 'tableau';
-- [INFO] Execute statement succeed.

CREATE TABLE paimon_table_a (
aa BIGINT,
bb BIGINT
);

show tables;
-- +----------------+
-- | table name |
-- +----------------+
-- | paimon_table_a |
-- +----------------+


select * from paimon_table_a;
-- Empty set

insert into paimon_table_a(aa,bb) values(1,2);
-- [INFO] Submitting SQL update statement to the cluster...
-- [INFO] SQL update statement has been successfully submitted to the cluster:
-- Job ID: 74c0c678124f7b452daf08c399d0fee2

select * from paimon_table_a;
-- +----+----+
-- | aa | bb |
-- +----+----+
-- | 1 | 2 |
-- +----+----+
-- 1 row in set
```

## Catalog properties

The Gravitino Flink connector transforms the following property names, defined in Gravitino catalog properties, into the Flink Paimon connector configuration.

| Gravitino catalog property name | Flink Paimon connector configuration | Description | Since Version |
|---------------------------------|----------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------|
| `catalog-backend` | `metastore` | Catalog backend of Gravitino Paimon catalog. Supports `filesystem`. | 0.8.0-incubating |
| `warehouse`                     | `warehouse`                            | Warehouse directory of the catalog: `file:///user/hive/warehouse-paimon/` for local FS, `hdfs://namespace/hdfs/path` for HDFS, `s3://{bucket-name}/path/` for S3, or `oss://{bucket-name}/path` for Aliyun OSS. | 0.8.0-incubating |

Gravitino catalog property names with the prefix `flink.bypass.` are passed to the Flink Paimon connector. For example, use `flink.bypass.clients` to pass the `clients` option to the Flink Paimon connector.
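
As a hypothetical sketch only: if the catalog is also created from the Flink SQL client (mirroring how other Gravitino Flink catalogs can be created), the properties above would be supplied along these lines. The `gravitino-paimon` type name and the `clients` value are assumptions, not part of this document:

```sql
-- Hypothetical sketch; the 'gravitino-paimon' type name is an assumption.
-- 'catalog-backend' is mapped to the Flink Paimon 'metastore' option,
-- 'warehouse' is passed through, and 'flink.bypass.*' keys are forwarded
-- to the Flink Paimon connector with the prefix stripped.
CREATE CATALOG paimon_catalog WITH (
    'type'                 = 'gravitino-paimon',
    'catalog-backend'      = 'filesystem',
    'warehouse'            = 'file:///user/hive/warehouse-paimon/',
    'flink.bypass.clients' = '2'
);
```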