docs — commit a4ab135 by hdygxsj, Jan 16, 2025
Supports most DDL and DML operations in Flink SQL, with a few exceptions.
## Getting Started

### Prerequisites

Place the following JAR files in the `lib` directory of your Flink installation:

* `paimon-flink-1.18-0.8.2.jar`

* `gravitino-flink-connector-runtime-*.jar`
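
The step above can be sketched as a short script. The Flink home path and the Gravitino connector version below are placeholder assumptions, and the `touch` lines create empty stand-ins for the real release artifacts so the copy commands run end to end:

```shell
# Sketch under assumptions: FLINK_HOME is a placeholder path, and the connector
# version in the jar name is an assumption; adjust both to your installation.
FLINK_HOME="/tmp/flink-demo"
mkdir -p "$FLINK_HOME/lib"

# In a real setup, download these jars from the Paimon and Gravitino releases;
# empty placeholders are created here only so the commands below are runnable.
touch paimon-flink-1.18-0.8.2.jar
touch gravitino-flink-connector-runtime-0.8.0-incubating.jar

# Place the connector jars in Flink's lib directory.
cp paimon-flink-1.18-0.8.2.jar "$FLINK_HOME/lib/"
cp gravitino-flink-connector-runtime-*.jar "$FLINK_HOME/lib/"
```

Flink picks up jars in `lib` at startup, so restart the cluster (or SQL client session) after copying them.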

### SQL Example

```sql

-- Suppose paimon_catalog is the Paimon catalog name managed by Gravitino
use catalog paimon_catalog;
-- Execute statement succeed.

SET 'sql-client.execution.result-mode' = 'tableau';

CREATE TABLE paimon_table_a (
  aa BIGINT,
  bb BIGINT
);

show tables;
select * from paimon_table_a;
-- +----+----+
-- 1 row in set
```

You can refer to the configuration of [PaimonCatalog](../lakehouse-paimon-catalog.md) for guidance.


## Catalog properties

The Gravitino Flink connector transforms the following property names, defined in the Gravitino catalog properties, into the corresponding Flink Paimon connector configuration.

| Gravitino catalog property name | Flink Paimon connector configuration | Description | Since Version |
|---------------------------------|--------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------|
| `catalog-backend` | `metastore` | Catalog backend of the Gravitino Paimon catalog. Supports `filesystem`, `jdbc`, and `hive`. | 0.8.0-incubating |
| `uri` | `uri` | The URI configuration of the Paimon catalog, e.g. `thrift://127.0.0.1:9083`, `jdbc:postgresql://127.0.0.1:5432/db_name`, or `jdbc:mysql://127.0.0.1:3306/metastore_db`. It is optional for `FilesystemCatalog`. | 0.8.0-incubating |
| `warehouse` | `warehouse` | Warehouse directory of the catalog, e.g. `file:///user/hive/warehouse-paimon/` for local filesystem, `hdfs://namespace/hdfs/path` for HDFS, `s3://{bucket-name}/path/` for S3, or `oss://{bucket-name}/path` for Aliyun OSS. | 0.8.0-incubating |
| `oss-endpoint` | `fs.oss.endpoint` | The endpoint of the Aliyun OSS. | 0.8.0-incubating |
| `oss-access-key-id` | `fs.oss.accessKeyId` | The access key of the Aliyun OSS. | 0.8.0-incubating |
| `oss-accesss-key-secret` | `fs.oss.accessKeySecret` | The secret key of the Aliyun OSS. | 0.8.0-incubating |
| `s3-endpoint` | `s3.endpoint` | The endpoint of the AWS S3. | 0.8.0-incubating |
| `s3-access-key-id` | `s3.access-key` | The access key of the AWS S3. | 0.8.0-incubating |
| `s3-secret-access-key` | `s3.secret-key` | The secret key of the AWS S3. | 0.8.0-incubating |
| `jdbc-user` | `jdbc.user` | The username for the JDBC backend. | 0.8.0-incubating |
| `jdbc-password` | `jdbc.password` | The password for the JDBC backend. | 0.8.0-incubating |

Gravitino catalog property names prefixed with `flink.bypass.` are passed through to the Flink Paimon connector. For example, use `flink.bypass.clients` to pass the `clients` option to the Flink Paimon connector.
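
For illustration, a Paimon catalog carrying these properties can be registered through Gravitino's REST API as sketched below. The server address, metalake name, and property values are placeholder assumptions; the Flink connector then maps `catalog-backend` to Flink's `metastore` option as listed in the table above. The snippet only validates the payload locally so it is runnable without a server; the actual `curl` call is shown commented out:

```shell
# Illustrative sketch: the Gravitino endpoint, metalake name, and property
# values below are placeholder assumptions, not values from this document.
GRAVITINO_URI="http://localhost:8090"
METALAKE="metalake_demo"

# Catalog definition: the Flink connector maps catalog-backend -> metastore
# and passes warehouse through unchanged (see the mapping table above).
PAYLOAD='{
  "name": "paimon_catalog",
  "type": "RELATIONAL",
  "provider": "lakehouse-paimon",
  "properties": {
    "catalog-backend": "filesystem",
    "warehouse": "file:///tmp/paimon-warehouse"
  }
}'

# Check the payload is well-formed JSON locally (no server needed).
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"

# With a running Gravitino server, register the catalog:
# curl -X POST -H "Content-Type: application/json" \
#   "$GRAVITINO_URI/api/metalakes/$METALAKE/catalogs" -d "$PAYLOAD"
```

Once registered, the catalog shows up in Flink SQL under its Gravitino name, as in the `use catalog paimon_catalog;` example earlier.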
