diff --git a/.github/workflows/required-check.yml b/.github/workflows/required-check.yml
index 4b73698166313..60d16a00234e9 100644
--- a/.github/workflows/required-check.yml
+++ b/.github/workflows/required-check.yml
@@ -40,7 +40,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Run CheckStyle
- run: ./mvnw checkstyle:check -Dcheckstyle.skip=false -T1C
+ run: ./mvnw checkstyle:check -Pcheck -T1C
check-spotless:
if: ${{ needs.global-environment.outputs.GLOBAL_JOB_ENABLED == 'true' }}
@@ -51,7 +51,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Run Spotless
- run: ./mvnw spotless:check -T1C
+ run: ./mvnw spotless:check -Pcheck -T1C
check-license:
if: ${{ needs.global-environment.outputs.GLOBAL_JOB_ENABLED == 'true' }}
@@ -62,4 +62,4 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Run Apache Rat
- run: ./mvnw apache-rat:check -Drat.skip=false -T1C
+ run: ./mvnw apache-rat:check -Pcheck -T1C
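Note: the three jobs above now activate a shared `check` Maven profile instead of toggling individual `*.skip` properties on the command line. The snippet below is only an illustrative sketch of how such a profile could re-enable the three checks through well-known plugin properties (`checkstyle.skip`, `spotless.check.skip`, `rat.skip`); the authoritative definition is the `check` profile in the project's root `pom.xml`.

```xml
<!-- Illustrative sketch only; the real "check" profile is defined in the root pom.xml. -->
<profile>
    <id>check</id>
    <properties>
        <!-- Re-enable the checks that the default build skips. -->
        <checkstyle.skip>false</checkstyle.skip>
        <spotless.check.skip>false</spotless.check.skip>
        <rat.skip>false</rat.skip>
    </properties>
</profile>
```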
diff --git a/.mvn/extensions.xml b/.mvn/extensions.xml
deleted file mode 100644
index 9a08540f7241c..0000000000000
--- a/.mvn/extensions.xml
+++ /dev/null
@@ -1,24 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements. See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License. You may obtain a copy of the License at
-  ~
-  ~     http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-<extensions>
-    <extension>
-        <groupId>fr.jcgay.maven</groupId>
-        <artifactId>maven-profiler</artifactId>
-        <version>3.2</version>
-    </extension>
-</extensions>
diff --git a/docs/community/content/involved/conduct/code.cn.md b/docs/community/content/involved/conduct/code.cn.md
index a7e198eeb8206..d195dbd5cf462 100644
--- a/docs/community/content/involved/conduct/code.cn.md
+++ b/docs/community/content/involved/conduct/code.cn.md
@@ -19,10 +19,10 @@ chapter = true
## 代码提交行为规范
- 确保遵守编码规范。
- - 确保构建流程中的各个步骤都成功完成,包括:Apache 协议文件头检查、Checkstyle 检查、编译、单元测试等。构建流程启动命令:`./mvnw clean install -B -T1C -Dmaven.javadoc.skip -Dmaven.jacoco.skip -e`。
+ - 确保构建流程中的各个步骤都成功完成,包括:Apache 协议文件头检查、Checkstyle 检查、编译、单元测试等。构建流程启动命令:`./mvnw clean install -B -T1C -Pcheck`。
- 确保覆盖率不低于 master 分支。
- 应尽量将设计精细化拆分;做到小幅度修改,多次数提交,但应保证提交的完整性。
- - 通过 Spotless 统一代码风格,执行 `mvn spotless:apply` 格式化代码。
+ - 通过 Spotless 统一代码风格,执行 `./mvnw spotless:apply -Pcheck` 格式化代码。
- 如果您使用 IDEA,可导入推荐的 `src/resources/code-style-idea.xml`。
## 编码规范
diff --git a/docs/community/content/involved/conduct/code.en.md b/docs/community/content/involved/conduct/code.en.md
index 29cc5e7952d95..546cc97c63079 100644
--- a/docs/community/content/involved/conduct/code.en.md
+++ b/docs/community/content/involved/conduct/code.en.md
@@ -16,10 +16,10 @@ The following code of conduct is based on full compliance with [ASF CODE OF COND
## Contributor Covenant Submitting of Conduct
- Conform to `Contributor Covenant Code of Conduct` below.
- - Make sure Maven build process success. Run `./mvnw clean install -B -T1C -Dmaven.javadoc.skip -Dmaven.jacoco.skip -e` command in shell to start Maven build process.
+ - Make sure Maven build process success. Run `./mvnw clean install -B -T1C -Pcheck` command in shell to start Maven build process.
- Make sure the test coverage rate is not lower than the master branch.
- Careful consideration for each `pull request`; Small and frequent `pull request` with complete unit function is welcomed.
- - Through the uniform code style of spotless, execute the `mvn spotless:apply` formatted code.
+ - Through the uniform code style of spotless, execute the `./mvnw spotless:apply -Pcheck` formatted code.
- If using IDEA, you can import the recommended `src/resources/code-style-idea.xml`.
## Contributor Covenant Code of Conduct
diff --git a/docs/community/content/involved/contribute/contributor.cn.md b/docs/community/content/involved/contribute/contributor.cn.md
index abdf941a9aa0d..64c279781dce2 100644
--- a/docs/community/content/involved/contribute/contributor.cn.md
+++ b/docs/community/content/involved/contribute/contributor.cn.md
@@ -40,7 +40,7 @@ git remote -v
```shell
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+./mvnw clean install -DskipITs -DskipTests -Prelease
```
当你以后从 ShardingSphere 拉取最新代码并新建分支,可能会遇到类似的解析器编译错误,可以重新运行这个命令来解决问题。
diff --git a/docs/community/content/involved/contribute/contributor.en.md b/docs/community/content/involved/contribute/contributor.en.md
index 44ca1777c3768..7c7cad43082b5 100644
--- a/docs/community/content/involved/contribute/contributor.en.md
+++ b/docs/community/content/involved/contribute/contributor.en.md
@@ -40,7 +40,7 @@ Build and install all modules, it'll install modules into Maven local repository
```shell
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+./mvnw clean install -DskipITs -DskipTests -Prelease
```
When you pull the latest code from ShardingSphere and create new branch later, you might get similar compile error of parser again, then you could run this command again.
diff --git a/docs/document/content/dev-manual/encrypt.cn.md b/docs/document/content/dev-manual/encrypt.cn.md
index 1fd403d0f800f..9ffedbd2cfc16 100644
--- a/docs/document/content/dev-manual/encrypt.cn.md
+++ b/docs/document/content/dev-manual/encrypt.cn.md
@@ -20,6 +20,4 @@ chapter = true
| *配置标识* | *详细说明* | *全限定类名* |
|------------------|------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| AES | 基于 AES 的数据加密算法 | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.AESEncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/AESEncryptAlgorithm.java) |
-| RC4 | 基于 RC4 的数据加密算法 | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.RC4EncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithm.java) |
| MD5 | 基于 MD5 的辅助查询加密算法 | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.MD5EncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/assisted/MD5AssistedEncryptAlgorithm.java) |
-| CHAR_DIGEST_LIKE | 用于模糊查询的数据加密算法 | [`org.apache.shardingsphere.encrypt.algorithm.like.CharDigestLikeEncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithm.java) |
diff --git a/docs/document/content/dev-manual/encrypt.en.md b/docs/document/content/dev-manual/encrypt.en.md
index a81d2acd3da77..caeada0038052 100644
--- a/docs/document/content/dev-manual/encrypt.en.md
+++ b/docs/document/content/dev-manual/encrypt.en.md
@@ -20,6 +20,4 @@ Data encrypt algorithm definition
| *Configuration Type* | *Description* | *Fully-qualified class name* |
|----------------------|---------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| AES | AES data encrypt algorithm | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.AESEncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/AESEncryptAlgorithm.java) |
-| RC4 | RC4 data encrypt algorithm | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.RC4EncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithm.java) |
| MD5 | MD5 assisted query encrypt algorithm | [`org.apache.shardingsphere.encrypt.algorithm.encrypt.MD5EncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/assisted/MD5AssistedEncryptAlgorithm.java) |
-| CHAR_DIGEST_LIKE | Data encryption algorithms for like queries | [`org.apache.shardingsphere.encrypt.algorithm.like.CharDigestLikeEncryptAlgorithm`](https://github.com/apache/shardingsphere/blob/master/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithm.java) |
diff --git a/docs/document/content/dev-manual/mode.cn.md b/docs/document/content/dev-manual/mode.cn.md
index 6ce9fb78ac3b4..edcda0e8cbab4 100644
--- a/docs/document/content/dev-manual/mode.cn.md
+++ b/docs/document/content/dev-manual/mode.cn.md
@@ -37,5 +37,4 @@ chapter = true
|-----------|-------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| ZooKeeper | 基于 ZooKeeper 的持久化 | [`org.apache.shardingsphere.mode.repository.cluster.zookeeper.ZookeeperRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/zookeeper/src/main/java/org/apache/shardingsphere/mode/repository/cluster/zookeeper/ZookeeperRepository.java) |
| etcd | 基于 Etcd 的持久化 | [`org.apache.shardingsphere.mode.repository.cluster.etcd.EtcdRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/etcd/src/main/java/org/apache/shardingsphere/mode/repository/cluster/etcd/EtcdRepository.java) |
-| Nacos | 基于 Nacos 的持久化 | [`org.apache.shardingsphere.mode.repository.cluster.nacos.NacosRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepository.java) |
| Consul | 基于 Consul 的持久化 | [`org.apache.shardingsphere.mode.repository.cluster.consul.ConsulRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/consul/src/main/java/org/apache/shardingsphere/mode/repository/cluster/consul/ConsulRepository.java) |
diff --git a/docs/document/content/dev-manual/mode.en.md b/docs/document/content/dev-manual/mode.en.md
index 0f44fd25a76c0..0fab555b840fd 100644
--- a/docs/document/content/dev-manual/mode.en.md
+++ b/docs/document/content/dev-manual/mode.en.md
@@ -37,5 +37,4 @@ Cluster mode configuration information persistence definition
|----------------------|-----------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| ZooKeeper | ZooKeeper based persistence | [`org.apache.shardingsphere.mode.repository.cluster.zookeeper.ZookeeperRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/zookeeper/src/main/java/org/apache/shardingsphere/mode/repository/cluster/zookeeper/ZookeeperRepository.java) |
| etcd | Etcd based persistence | [`org.apache.shardingsphere.mode.repository.cluster.etcd.EtcdRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/etcd/src/main/java/org/apache/shardingsphere/mode/repository/cluster/etcd/EtcdRepository.java) |
-| Nacos | Nacos based persistence | [`org.apache.shardingsphere.mode.repository.cluster.nacos.NacosRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepository.java) |
| Consul | Consul based persistence | [`org.apache.shardingsphere.mode.repository.cluster.consul.ConsulRepository`](https://github.com/apache/shardingsphere/blob/master/mode/type/cluster/repository/provider/consul/src/main/java/org/apache/shardingsphere/mode/repository/cluster/consul/ConsulRepository.java) |
diff --git a/docs/document/content/faq/_index.cn.md b/docs/document/content/faq/_index.cn.md
index 3797a58860e7f..24e87b55f8d94 100644
--- a/docs/document/content/faq/_index.cn.md
+++ b/docs/document/content/faq/_index.cn.md
@@ -222,7 +222,7 @@ ShardingSphere-Proxy 在部署过程中没有添加 jdbc 驱动,需要将 jdbc
ShardingSphere 使用 lombok 实现极简代码。关于更多使用和安装细节,请参考 [lombok官网](https://projectlombok.org/download.html)。
`org.apache.shardingsphere.sql.parser.autogen` 包下的代码由 ANTLR 生成,可以执行以下命令快速生成:
```bash
-./mvnw -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Dmaven.javadoc.skip=true -Djacoco.skip=true -DskipITs -DskipTests install -T1C
+./mvnw -DskipITs -DskipTests install -T1C
```
生成的代码例如 `org.apache.shardingsphere.sql.parser.autogen.PostgreSQLStatementParser` 等 Java 文件由于较大,默认配置的 IDEA 可能不会索引该文件。
可以调整 IDEA 的属性:`idea.max.intellisense.filesize=10000`。
diff --git a/docs/document/content/faq/_index.en.md b/docs/document/content/faq/_index.en.md
index 1bcf0ffbbf32a..ab269dd36fe5f 100644
--- a/docs/document/content/faq/_index.en.md
+++ b/docs/document/content/faq/_index.en.md
@@ -236,7 +236,7 @@ Answer:
ShardingSphere uses lombok to enable minimal coding. For more details about using and installment, please refer to the official website of [lombok](https://projectlombok.org/download.html).
The codes under the package `org.apache.shardingsphere.sql.parser.autogen` are generated by ANTLR. You may execute the following command to generate codes:
```bash
-./mvnw -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Dmaven.javadoc.skip=true -Djacoco.skip=true -DskipITs -DskipTests install -T1C
+./mvnw -DskipITs -DskipTests install -T1C
```
The generated codes such as `org.apache.shardingsphere.sql.parser.autogen.PostgreSQLStatementParser` may be too large to be indexed by the IDEA.
You may configure the IDEA's property `idea.max.intellisense.filesize=10000`.
diff --git a/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.cn.md b/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.cn.md
index 62931407f691c..68b5f19d7df89 100644
--- a/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.cn.md
+++ b/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.cn.md
@@ -22,31 +22,6 @@ weight = 5
| aes-key-value | String | AES 使用的 KEY |
| digest-algorithm-name | String | AES KEY 的摘要算法 (可选,默认值:SHA-1) |
-#### RC4 加密算法
-
-类型:RC4
-
-可配置属性:
-
-| *名称* | *数据类型* | *说明* |
-|---------------|--------|-------------|
-| rc4-key-value | String | RC4 使用的 KEY |
-
-### 模糊加密算法
-
-#### 单字符摘要模糊加密算法
-
-类型:CHAR_DIGEST_LIKE
-
-可配置属性:
-
-| *名称* | *数据类型* | *说明* |
-|-------|--------|--------------------|
-| delta | int | 字符Unicode码偏移量(十进制) |
-| mask | int | 字符加密掩码(十进制) |
-| start | int | 密文Unicode初始码(十进制) |
-| dict | String | 常见字 |
-
### 辅助查询加密算法
#### MD5 辅助查询加密算法
diff --git a/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.en.md b/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.en.md
index 58cad71e6f7ff..ac09a77ccf472 100644
--- a/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.en.md
+++ b/docs/document/content/user-manual/common-config/builtin-algorithm/encrypt.en.md
@@ -22,31 +22,6 @@ Attributes:
| aes-key-value | String | AES KEY |
| digest-algorithm-name | String | AES KEY DIGEST ALGORITHM (optional, default: SHA-1) |
-#### RC4 Encrypt Algorithm
-
-Type: RC4
-
-Attributes:
-
-| *Name* | *DataType* | *Description* |
-|---------------|------------|---------------|
-| rc4-key-value | String | RC4 KEY |
-
-### Like Encrypt Algorithm
-
-#### CharDigestLike Encrypt Algorithm
-
-Type:CHAR_DIGEST_LIKE
-
-Attributes:
-
-| *Name* | *DataType* | *Description* |
-|--------|------------|-------------------------------------------------|
-| delta | int | Character Unicode offset(decimal number) |
-| mask | int | Character encryption mask(decimal number) |
-| start | int | Ciphertext Unicode initial code(decimal number) |
-| dict | String | Common words |
-
### Assisted Encrypt Algorithm
#### MD5 Assisted Encrypt Algorithm
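Note: with RC4 and CHAR_DIGEST_LIKE removed, AES remains the documented standard encrypt algorithm. As a reminder of how the properties in the table above are used, a minimal encryptor definition fragment might look like the following (the key is a placeholder; the surrounding `!ENCRYPT` rule configuration is omitted):

```yaml
encryptors:
  aes_encryptor:
    type: AES
    props:
      aes-key-value: 123456abc          # placeholder key
      digest-algorithm-name: SHA-1      # optional, default: SHA-1
```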
diff --git a/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.cn.md b/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.cn.md
index 40cdac63afecf..5181f984e9c27 100644
--- a/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.cn.md
+++ b/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.cn.md
@@ -54,21 +54,6 @@ Apache ShardingSphere 为不同的运行模式提供了不同的元数据持久
| timeToLiveSeconds | long | 临时数据失效的秒数 | 30 |
| connectionTimeout | long | 连接超时秒数 | 30 |
-### Nacos 持久化
-
-类型:Nacos
-
-适用模式:Cluster
-
-可配置属性:
-
-| *名称* | *数据类型* | *说明* | *默认值* |
-|---------------------------|--------|-------------------|--------|
-| clusterIp | String | 集群中的唯一标识 | 真实主机IP |
-| retryIntervalMilliseconds | long | 重试间隔毫秒数 | 500 |
-| maxRetries | int | 客户端检查数据可用性的最大重试次数 | 3 |
-| timeToLiveSeconds | int | 临时实例失效的秒数 | 30 |
-
### Consul 持久化
类型:Consul
diff --git a/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.en.md b/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.en.md
index 3a39940202906..d2ad3e30abadf 100644
--- a/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.en.md
+++ b/docs/document/content/user-manual/common-config/builtin-algorithm/metadata-repository.en.md
@@ -54,21 +54,6 @@ Attributes:
| timeToLiveSeconds | long | Seconds of ephemeral data live | 30 |
| connectionTimeout | long | Seconds of connection timeout | 30 |
-### Nacos Repository
-
-Type: Nacos
-
-Mode: Cluster
-
-Attributes:
-
-| *Name* | *Type* | *Description* | *Default Value* |
-|---------------------------|--------|---------------------------------------------------|-----------------|
-| clusterIp | String | Unique identifier in cluster | Host IP |
-| retryIntervalMilliseconds | long | Milliseconds of retry interval | 500 |
-| maxRetries | int | Max retries for client to check data availability | 3 |
-| timeToLiveSeconds | int | Seconds of ephemeral instance live | 30 |
-
### Consul Repository
Type: Consul
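Note: with the Nacos repository dropped, cluster-mode persistence is documented for ZooKeeper, Etcd, and Consul only. For reference, a minimal cluster-mode configuration using the ZooKeeper repository (the same shape as the commented-out sample removed from `server.yaml` further down in this diff) looks roughly like:

```yaml
mode:
  type: Cluster
  repository:
    type: ZooKeeper
    props:
      namespace: governance_ds
      server-lists: localhost:2181
```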
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.cn.md b/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.cn.md
index 76c8138e4d451..cb610cce420f9 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.cn.md
@@ -12,7 +12,7 @@ weight = 7
```shell
git clone --depth 1 https://github.com/apache/shardingsphere.git
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+mvn clean install -DskipITs -DskipTests -Prelease
```
agent 包输出目录为 distribution/agent/target/apache-shardingsphere-${latest.release.version}-shardingsphere-agent-bin.tar.gz
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.en.md b/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.en.md
index e132844bb5d00..5032a54cb0a20 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/observability/_index.en.md
@@ -12,7 +12,7 @@ Download Apache ShardingSphere from GitHub,Then compile.
```shell
git clone --depth 1 https://github.com/apache/shardingsphere.git
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+mvn clean install -DskipITs -DskipTests -Prelease
```
Artifact is distribution/agent/target/apache-shardingsphere-${latest.release.version}-shardingsphere-agent-bin.tar.gz
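Note: the agent tarball built above contains a standard Java agent, attached with `-javaagent`. A minimal usage sketch, with the extracted jar path as a placeholder:

```bash
# Sketch only: the path to the extracted agent jar is a placeholder.
java -javaagent:/path/to/shardingsphere-agent.jar -jar your-application.jar
```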
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.cn.md b/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.cn.md
index c71734713decd..be62924a0bd7b 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.cn.md
@@ -46,7 +46,6 @@ ShardingSphere 默认情况下仅包含核心 SPI 的实现,在 Git Source 存
- 集群模式配置信息持久化定义
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-zookeeper`,基于 Zookeeper 的持久化实现
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-etcd`,基于 Etcd 的持久化实现
- - `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-nacos`,基于 Nacos 的持久化实现
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-consul`,基于 Consul 的持久化实现
- XA 分布式事务管理器
- `org.apache.shardingsphere:shardingsphere-transaction-xa-narayana`,基于 Narayana 的 XA 分布式事务管理器
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.en.md b/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.en.md
index f07d87c6b40f8..ddae142a7b5ce 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/optional-plugins/_index.en.md
@@ -46,7 +46,6 @@ All optional plugins are listed below in the form of `groupId:artifactId`.
- Cluster mode configuration information persistence definition
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-zookeeper`, Zookeeper based persistence
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-etcd`, Etcd based persistence
- - `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-nacos`, Nacos based persistence
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-consul`, Consul based persistence
- XA transaction manager provider definition
- `org.apache.shardingsphere:shardingsphere-transaction-xa-narayana`, XA distributed transaction manager based on Narayana
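Note: with the Nacos module no longer listed, pulling in one of the remaining optional plugins is still a plain Maven dependency following the `groupId:artifactId` pairs above, for example (version placeholder assumed):

```xml
<dependency>
    <groupId>org.apache.shardingsphere</groupId>
    <artifactId>shardingsphere-cluster-mode-repository-zookeeper</artifactId>
    <version>${shardingsphere.version}</version>
</dependency>
```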
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.cn.md b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.cn.md
index 71d8a5cc5718c..3f3b58f7a9a6f 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.cn.md
@@ -21,7 +21,6 @@ ShardingSphere-JDBC 提供了 JDBC 驱动,可以仅通过配置变更即可使
- 配置文件加载规则:
- `absolutepath:` 前缀表示从绝对路径中加载配置文件
- `classpath:` 前缀表示从类路径中加载配置文件
- - `apollo:` 前缀表示从 apollo 中加载配置文件
## 操作步骤
@@ -35,16 +34,6 @@ ShardingSphere-JDBC 提供了 JDBC 驱动,可以仅通过配置变更即可使
```
-如果使用 apollo 配置方式,还需要引入 `apollo-client` 依赖:
-
-```xml
-<dependency>
-    <groupId>com.ctrip.framework.apollo</groupId>
-    <artifactId>apollo-client</artifactId>
-    <version>${apollo-client.version}</version>
-</dependency>
-```
-
2. 使用驱动
* 使用原生驱动:
@@ -103,8 +92,3 @@ jdbc:shardingsphere:classpath:config.yaml
```
jdbc:shardingsphere:absolutepath:/path/to/config.yaml
```
-
-加载 apollo 指定 namespace 中的 yaml 配置文件的 JDBC URL:
-```
-jdbc:shardingsphere:apollo:TEST.test_namespace
-```
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.en.md b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.en.md
index 1cf11a2dc5cb2..cc72a35f61d27 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/_index.en.md
@@ -21,7 +21,6 @@ ShardingSphere-JDBC provides a JDBC Driver, which can be used only through confi
- Configuration file loading rule:
- `absolutepath:` prefix means to load the configuration file from the absolute path
- `classpath:` prefix indicates that the configuration file is loaded from the classpath
- - `apollo:` prefix means to load the configuration file from apollo
## Procedure
@@ -35,16 +34,6 @@ ShardingSphere-JDBC provides a JDBC Driver, which can be used only through confi
```
-If you use the apollo configuration method, you also need to introduce the `apollo-client` dependency:
-
-```xml
-<dependency>
-    <groupId>com.ctrip.framework.apollo</groupId>
-    <artifactId>apollo-client</artifactId>
-    <version>${apollo-client.version}</version>
-</dependency>
-```
-
2. Use drive
* Use native drivers:
@@ -103,8 +92,3 @@ Load JDBC URL of config.yaml profile in absolute path
```
jdbc:shardingsphere:absolutepath:/path/to/config.yaml
```
-
-Load JDBC URL of the yaml configuration file in the specified namespace of apollo:
-```
-jdbc:shardingsphere:apollo:TEST.test_namespace
-```
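Note: with the `apollo:` prefix gone, the driver URL accepts only `classpath:` and `absolutepath:`. A minimal, self-contained sketch of using the driver through plain JDBC, assuming `config.yaml` on the classpath defines the data sources:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public final class ShardingSphereJdbcUrlExample {
    
    public static void main(final String[] args) throws Exception {
        // The driver is registered via JDBC SPI; the URL prefix selects the configuration source.
        String url = "jdbc:shardingsphere:classpath:config.yaml";
        try (Connection connection = DriverManager.getConnection(url);
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT 1")) {
            while (resultSet.next()) {
                System.out.println(resultSet.getInt(1));
            }
        }
    }
}
```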
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.cn.md b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.cn.md
index d3e5fd7e12cb2..d88b9e8fc9af9 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.cn.md
@@ -29,7 +29,7 @@ spring.datasource.driver-class-name=org.apache.shardingsphere.driver.ShardingSph
spring.datasource.url=jdbc:shardingsphere:classpath:xxx.yaml
```
-`spring.datasource.url` 中的 YAML 配置文件当前支持通过三种方式获取,绝对路径 `absolutepath:`、Apollo 配置中心 `apollo:` 以及 CLASSPATH `classpath:`,具体可参考 `org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider` 的实现。
+`spring.datasource.url` 中的 YAML 配置文件当前支持通过两种方式获取,绝对路径 `absolutepath:` 以及 CLASSPATH `classpath:`,具体可参考 `org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider` 的实现。
### 使用数据源
diff --git a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.en.md b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.en.md
index 30c796166788e..1e57790429afe 100644
--- a/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-jdbc/yaml-config/jdbc-driver/spring-boot/_index.en.md
@@ -29,7 +29,7 @@ spring.datasource.driver-class-name=org.apache.shardingsphere.driver.ShardingSph
spring.datasource.url=jdbc:shardingsphere:classpath:xxx.yaml
```
-The YAML configuration file in 'spring.datasource.url' currently support in three ways, the absolute path 'absolutepath:', Apollo configuration center 'apollo:', and CLASSPATH 'classpath:', which can be referred to `org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider`'s implementation for details.
+The YAML configuration file referenced by 'spring.datasource.url' can currently be loaded in two ways: from an absolute path with the 'absolutepath:' prefix, or from the CLASSPATH with the 'classpath:' prefix. Refer to the implementation of `org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider` for details.
### Use Data Source
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.cn.md b/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.cn.md
index 3cfee3c2bafd0..5d397782f94a0 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.cn.md
@@ -12,7 +12,7 @@ weight = 5
```shell
git clone --depth 1 https://github.com/apache/shardingsphere.git
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+mvn clean install -DskipITs -DskipTests -Prelease
```
agent 包输出目录为 distribution/agent/target/apache-shardingsphere-${latest.release.version}-shardingsphere-agent-bin.tar.gz
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.en.md b/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.en.md
index dbb46ed2f127c..5463b81faf050 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/observability/_index.en.md
@@ -12,7 +12,7 @@ Download Apache ShardingSphere from GitHub,Then compile.
```shell
git clone --depth 1 https://github.com/apache/shardingsphere.git
cd shardingsphere
-mvn clean install -Dmaven.javadoc.skip=true -Dcheckstyle.skip=true -Dspotbugs.skip=true -Drat.skip=true -Djacoco.skip=true -DskipITs -DskipTests -Prelease
+mvn clean install -DskipITs -DskipTests -Prelease
```
Artifact is distribution/agent/target/apache-shardingsphere-${latest.release.version}-shardingsphere-agent-bin.tar.gz
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.cn.md b/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.cn.md
index ad289661b7a40..7ebe065776752 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.cn.md
@@ -33,7 +33,6 @@ ShardingSphere 默认情况下仅包含核心 SPI 的实现,在 Git Source 存
- 单机模式配置信息持久化定义
- `org.apache.shardingsphere:shardingsphere-standalone-mode-repository-jdbc`,基于 JDBC 的持久化
- 集群模式配置信息持久化定义
- - `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-nacos`,基于 Nacos 的持久化实现
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-consul`,基于 Consul 的持久化实现
- XA 分布式事务管理器
- `org.apache.shardingsphere:shardingsphere-transaction-xa-narayana`,基于 Narayana 的 XA 分布式事务管理器
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.en.md b/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.en.md
index d10f6a21f1701..f2d85e42cdceb 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.en.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/optional-plugins/_index.en.md
@@ -33,7 +33,6 @@ All optional plugins are listed below in the form of `groupId:artifactId`.
- Standalone mode configuration information persistence definition
- `org.apache.shardingsphere:shardingsphere-standalone-mode-repository-jdbc`, JDBC based persistence
- Cluster mode configuration information persistence definition
- - `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-nacos`, Nacos based persistence
- `org.apache.shardingsphere:shardingsphere-cluster-mode-repository-consul`, Consul based persistence
- XA transaction manager provider definition
- `org.apache.shardingsphere:shardingsphere-transaction-xa-narayana`, XA distributed transaction manager based on Narayana
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.cn.md b/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.cn.md
index 874773b82307d..cef07fb99dfe8 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.cn.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.cn.md
@@ -25,9 +25,9 @@ docker pull apache/shardingsphere-proxy
* 方式三:自行构建镜像
```bash
git clone https://github.com/apache/shardingsphere
-mvn clean install
+./mvnw clean install
cd shardingsphere-distribution/shardingsphere-proxy-distribution
-mvn clean package -Prelease,docker
+./mvnw clean package -Prelease,docker
```
如果遇到以下问题,请确保 Docker daemon 进程已经运行。
diff --git a/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.en.md b/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.en.md
index fa6c061061896..01e3060de0799 100644
--- a/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.en.md
+++ b/docs/document/content/user-manual/shardingsphere-proxy/startup/docker.en.md
@@ -25,9 +25,9 @@ docker pull apache/shardingsphere-proxy
* Method 3: Build your own image
```bash
git clone https://github.com/apache/shardingsphere
-mvn clean install
+./mvnw clean install
cd shardingsphere-distribution/shardingsphere-proxy-distribution
-mvn clean package -Prelease,docker
+./mvnw clean package -Prelease,docker
```
If the following problems emerge, please make sure Docker daemon Process is running.
diff --git a/examples/README.md b/examples/README.md
index 6ed4bcd8691d3..443ae116e4bd3 100644
--- a/examples/README.md
+++ b/examples/README.md
@@ -25,7 +25,7 @@ git clone https://github.com/apache/shardingsphere.git
## compile source code
cd shardingsphere
-mvn clean install -Prelease
+./mvnw clean install -Prelease
```
## Module design
diff --git a/examples/docker/docker-compose-zh.md b/examples/docker/docker-compose-zh.md
deleted file mode 100644
index 3ed0ae6d39d90..0000000000000
--- a/examples/docker/docker-compose-zh.md
+++ /dev/null
@@ -1,21 +0,0 @@
-## 使用docker-compose初始化开始环境
-
-在开始使用docker compose之前,根据下述参考网址安装docker和docker-compose:https://docs.docker.com/compose/install/
-
-#### ShardingSphere-JDBC
-
-1. 运行 'cd docker/shardingsphere-jdbc/sharding',进入 docker 文件夹
-2. 运行 'docker-compose up -d',启动 docker compose 环境
-3. 根据需要,开启 mysql/etcd/zookeeper
-4. 如果有端口冲突,在 docker-compose.yml 中修改相应的端口,然后再次使用 'docker-compose up -d' 启动 docker compose
-5. 如果需要关闭程序,请使用命令 'docker-compose down'
-
-#### ShardingSphere-Proxy
-
-1. 运行 'cd docker/shardingsphere-proxy/sharding',进入 docker 文件夹
-2. 运行 'docker-compose up -d',启动 docker compose 环境
-3. 运行 `psql -d sharding_db -h 127.0.0.1 -U root -p 3308 -W` 登录代理, 示例的默认密码:root
-4. 如果有端口冲突,在docker-compose.yml中修改相应的端口,然后再次使用 'docker-compose up -d'启动docker compose
-5. 如果需要关闭程序,请使用命令 'docker-compose down'
-
-需要注意,请谨慎使用 docker 删除指令`docker ps -a -q`去删除docker容器。
diff --git a/examples/docker/docker-compose.md b/examples/docker/docker-compose.md
deleted file mode 100644
index ceda43fabffbe..0000000000000
--- a/examples/docker/docker-compose.md
+++ /dev/null
@@ -1,21 +0,0 @@
-## Using docker-compose to config startup environment
-
-before we use docker compose, please install docker and docker-compose first : https://docs.docker.com/compose/install/
-
-#### ShardingSphere-JDBC
-
-1. access the docker folder (cd docker/shardingsphere-jdbc/sharding)
-2. launch the environment by docker compose (docker-compose up -d)
-3. access mysql / etcd / zookeeper as you want
-4. if there is conflict on port, just modify the corresponding port defined in docker-compose.yml and then launch docker compose again(docker-compose up -d)
-5. if you want to stop the environment, use command docker-compose down
-
-#### ShardingSphere-Proxy
-
-1. access the docker folder (cd docker/shardingsphere-proxy/sharding)
-2. launch the environment by docker compose (docker-compose up -d)
-3. access proxy by `psql -d sharding_db -h 127.0.0.1 -U root -p 3308 -W`, default password for example: root
-4. if there is conflict on port, just modify the corresponding port defined in docker-compose.yml and then launch docker compose again(docker-compose up -d)
-5. if you want to stop the environment, use command docker-compose down
-
-to clean the docker container , you could use docker rm `docker ps -a -q` (be careful)
diff --git a/examples/docker/shardingsphere-proxy/governance/docker-compose.yml b/examples/docker/shardingsphere-proxy/governance/docker-compose.yml
deleted file mode 100644
index 7acfd46ad1bb6..0000000000000
--- a/examples/docker/shardingsphere-proxy/governance/docker-compose.yml
+++ /dev/null
@@ -1,42 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-version: '3'
-services:
- zookeeper:
- ## get more versions of zookeeper here : https://hub.docker.com/_/zookeeper?tab=tags
- image: "zookeeper:3.4"
- ports:
- - "2181:2181"
- container_name: shardingsphere-example-zookeeper
-
- etcd:
- ## get more versions of etcd here : https://quay.io/repository/coreos/etcd?tag=latest&tab=tags
- image: "quay.io/coreos/etcd:v3.3.12"
- ports:
- - "2379:2379"
- - "2380:2380"
- - "4001:4001"
- container_name: shardingsphere-example-etcd
- entrypoint: /usr/local/bin/etcd
- command:
- - '--advertise-client-urls=http://0.0.0.0:2379'
- - '--listen-client-urls=http://0.0.0.0:2379'
- - '--initial-advertise-peer-urls=http://0.0.0.0:2380'
- - '--listen-peer-urls=http://0.0.0.0:2380'
- - '--initial-cluster'
- - 'default=http://0.0.0.0:2380'
diff --git a/examples/docker/shardingsphere-proxy/governance/stop.sh b/examples/docker/shardingsphere-proxy/governance/stop.sh
deleted file mode 100644
index fcf30fdd28f4e..0000000000000
--- a/examples/docker/shardingsphere-proxy/governance/stop.sh
+++ /dev/null
@@ -1,19 +0,0 @@
-#!/usr/bin/env bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-docker-compose down
diff --git a/examples/docker/shardingsphere-proxy/sharding/conf/config-sharding.yaml b/examples/docker/shardingsphere-proxy/sharding/conf/config-sharding.yaml
deleted file mode 100644
index 4a9a1b5531c10..0000000000000
--- a/examples/docker/shardingsphere-proxy/sharding/conf/config-sharding.yaml
+++ /dev/null
@@ -1,94 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-######################################################################################################
-#
-# Here you can configure the rules for the proxy.
-# This example is configuration of sharding rule.
-#
-######################################################################################################
-
-databaseName: sharding_db
-
-dataSources:
- ds_0:
- url: jdbc:postgresql://shardingsphere-example-postgres:5432/demo_ds_0?useUnicode=true&characterEncoding=utf-8&allowEncodingChanges=true
- username: postgres
- password: postgres
- connectionTimeoutMilliseconds: 30000
- idleTimeoutMilliseconds: 60000
- maxLifetimeMilliseconds: 1800000
- maxPoolSize: 50
- ds_1:
- url: jdbc:postgresql://shardingsphere-example-postgres:5432/demo_ds_1?useUnicode=true&characterEncoding=utf-8&allowEncodingChanges=true
- username: postgres
- password: postgres
- connectionTimeoutMilliseconds: 30000
- idleTimeoutMilliseconds: 60000
- maxLifetimeMilliseconds: 1800000
- maxPoolSize: 50
-
-rules:
-- !SHARDING
- tables:
- t_order:
- actualDataNodes: ds_${0..1}.t_order_${0..1}
- tableStrategy:
- standard:
- shardingColumn: order_id
- shardingAlgorithmName: t_order_inline
- keyGenerateStrategy:
- column: order_id
- keyGeneratorName: snowflake
- t_order_item:
- actualDataNodes: ds_${0..1}.t_order_item_${0..1}
- tableStrategy:
- standard:
- shardingColumn: order_id
- shardingAlgorithmName: t_order_item_inline
- keyGenerateStrategy:
- column: order_id
- keyGeneratorName: snowflake
-
- bindingTables:
- - t_order,t_order_item
-
- defaultDatabaseStrategy:
- standard:
- shardingColumn: user_id
- shardingAlgorithmName: database_inline
-
- defaultTableStrategy:
- none:
-
- shardingAlgorithms:
- database_inline:
- type: INLINE
- props:
- algorithm-expression: ds_${user_id % 2}
- t_order_inline:
- type: INLINE
- props:
- algorithm-expression: t_order_${order_id % 2}
- t_order_item_inline:
- type: INLINE
- props:
- algorithm-expression: t_order_item_${order_id % 2}
-
- keyGenerators:
- snowflake:
- type: SNOWFLAKE
\ No newline at end of file
diff --git a/examples/docker/shardingsphere-proxy/sharding/conf/server.yaml b/examples/docker/shardingsphere-proxy/sharding/conf/server.yaml
deleted file mode 100644
index ffd2f1e2c785a..0000000000000
--- a/examples/docker/shardingsphere-proxy/sharding/conf/server.yaml
+++ /dev/null
@@ -1,43 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-######################################################################################################
-#
-# If you want to configure governance, authorization and proxy properties, please refer to this file.
-#
-######################################################################################################
-
-#mode:
-# type: Cluster
-# repository:
-# type: ZooKeeper
-# props:
-# namespace: governance_ds
-# server-lists: localhost:2181
-
-authority:
- users:
- - user: root
- password: root
- privilege:
- type: ALL_PERMITTED
-
-props:
- max-connections-size-per-query: 1
- kernel-executor-size: 16 # Infinite by default.
- proxy-frontend-flush-threshold: 128 # The default value is 128.
- sql-show: true
diff --git a/examples/docker/shardingsphere-proxy/sharding/docker-compose.yml b/examples/docker/shardingsphere-proxy/sharding/docker-compose.yml
deleted file mode 100644
index bb0153ac4e1cd..0000000000000
--- a/examples/docker/shardingsphere-proxy/sharding/docker-compose.yml
+++ /dev/null
@@ -1,57 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-version: '3'
-services:
- postgres:
- ## for different postgres version, you could get more at here : https://hub.docker.com/_/postgres?tab=tags
- image: "postgres:latest"
- ## port binding to 5432, you could change to 15432:5432 or any other available port you want
- ports:
- - "5432:5432"
- container_name: shardingsphere-example-postgres
- ## you could access Postgres like `psql -d your_database_name -h 127.0.0.1 -U root -p 5432 -W`
- environment:
- - TZ=Asia/Shanghai
- - POSTGRES_USER=postgres
- - POSTGRES_PASSWORD=postgres
- ## copy the manual_schema.sql to /docker-entrypoint-initdb.d/ . this will init the sql file when the Postgres in container start up
- volumes:
- - ../../../src/resources/manual_schema.sql:/docker-entrypoint-initdb.d/manual_schema.sql
-
- proxy:
- ## get more versions of proxy here : https://hub.docker.com/r/apache/shardingsphere-proxy/tags
- image: "apache/shardingsphere-proxy:latest"
- container_name: shardingsphere-example-proxy
- depends_on:
- - postgres
- ## wait-for-it.sh will make proxy entry point wait until Postgres container 5432 port open
- entrypoint: >
- /bin/sh -c "/opt/wait-for-it.sh shardingsphere-example-postgres:5432 --timeout=20 --strict --
- && /opt/shardingsphere-proxy/bin/start.sh 3308
- && tail -f /opt/shardingsphere-proxy/logs/stdout.log"
- ports:
- - "3308:3308"
- links:
- - "postgres:postgres"
- volumes:
- - ./conf/:/opt/shardingsphere-proxy/conf
- - ../../tools/wait-for-it.sh:/opt/wait-for-it.sh
- environment:
- - JVM_OPTS="-Djava.awt.headless=true"
-
-
diff --git a/examples/docker/tools/wait-for-it.sh b/examples/docker/tools/wait-for-it.sh
deleted file mode 100755
index b4bf36b306f2b..0000000000000
--- a/examples/docker/tools/wait-for-it.sh
+++ /dev/null
@@ -1,195 +0,0 @@
-#!/usr/bin/env bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Use this script to test if a given TCP host/port are available
-
-WAITFORIT_cmdname=${0##*/}
-
-echoerr() { if [[ $WAITFORIT_QUIET -ne 1 ]]; then echo "$@" 1>&2; fi }
-
-usage()
-{
- cat << USAGE >&2
-Usage:
- $WAITFORIT_cmdname host:port [-s] [-t timeout] [-- command args]
- -h HOST | --host=HOST Host or IP under test
- -p PORT | --port=PORT TCP port under test
- Alternatively, you specify the host and port as host:port
- -s | --strict Only execute subcommand if the test succeeds
- -q | --quiet Don't output any status messages
- -t TIMEOUT | --timeout=TIMEOUT
- Timeout in seconds, zero for no timeout
- -- COMMAND ARGS Execute command with args after the test finishes
-USAGE
- exit 1
-}
-
-wait_for()
-{
- if [[ $WAITFORIT_TIMEOUT -gt 0 ]]; then
- echoerr "$WAITFORIT_cmdname: waiting $WAITFORIT_TIMEOUT seconds for $WAITFORIT_HOST:$WAITFORIT_PORT"
- else
- echoerr "$WAITFORIT_cmdname: waiting for $WAITFORIT_HOST:$WAITFORIT_PORT without a timeout"
- fi
- WAITFORIT_start_ts=$(date +%s)
- while :
- do
- if [[ $WAITFORIT_ISBUSY -eq 1 ]]; then
- nc -z $WAITFORIT_HOST $WAITFORIT_PORT
- WAITFORIT_result=$?
- else
- (echo > /dev/tcp/$WAITFORIT_HOST/$WAITFORIT_PORT) >/dev/null 2>&1
- WAITFORIT_result=$?
- fi
- if [[ $WAITFORIT_result -eq 0 ]]; then
- WAITFORIT_end_ts=$(date +%s)
- echoerr "$WAITFORIT_cmdname: $WAITFORIT_HOST:$WAITFORIT_PORT is available after $((WAITFORIT_end_ts - WAITFORIT_start_ts)) seconds"
- break
- fi
- sleep 1
- done
- return $WAITFORIT_result
-}
-
-wait_for_wrapper()
-{
- # In order to support SIGINT during timeout: http://unix.stackexchange.com/a/57692
- if [[ $WAITFORIT_QUIET -eq 1 ]]; then
- timeout $WAITFORIT_BUSYTIMEFLAG $WAITFORIT_TIMEOUT $0 --quiet --child --host=$WAITFORIT_HOST --port=$WAITFORIT_PORT --timeout=$WAITFORIT_TIMEOUT &
- else
- timeout $WAITFORIT_BUSYTIMEFLAG $WAITFORIT_TIMEOUT $0 --child --host=$WAITFORIT_HOST --port=$WAITFORIT_PORT --timeout=$WAITFORIT_TIMEOUT &
- fi
- WAITFORIT_PID=$!
- trap "kill -INT -$WAITFORIT_PID" INT
- wait $WAITFORIT_PID
- WAITFORIT_RESULT=$?
- if [[ $WAITFORIT_RESULT -ne 0 ]]; then
- echoerr "$WAITFORIT_cmdname: timeout occurred after waiting $WAITFORIT_TIMEOUT seconds for $WAITFORIT_HOST:$WAITFORIT_PORT"
- fi
- return $WAITFORIT_RESULT
-}
-
-# process arguments
-while [[ $# -gt 0 ]]
-do
- case "$1" in
- *:* )
- WAITFORIT_hostport=(${1//:/ })
- WAITFORIT_HOST=${WAITFORIT_hostport[0]}
- WAITFORIT_PORT=${WAITFORIT_hostport[1]}
- shift 1
- ;;
- --child)
- WAITFORIT_CHILD=1
- shift 1
- ;;
- -q | --quiet)
- WAITFORIT_QUIET=1
- shift 1
- ;;
- -s | --strict)
- WAITFORIT_STRICT=1
- shift 1
- ;;
- -h)
- WAITFORIT_HOST="$2"
- if [[ $WAITFORIT_HOST == "" ]]; then break; fi
- shift 2
- ;;
- --host=*)
- WAITFORIT_HOST="${1#*=}"
- shift 1
- ;;
- -p)
- WAITFORIT_PORT="$2"
- if [[ $WAITFORIT_PORT == "" ]]; then break; fi
- shift 2
- ;;
- --port=*)
- WAITFORIT_PORT="${1#*=}"
- shift 1
- ;;
- -t)
- WAITFORIT_TIMEOUT="$2"
- if [[ $WAITFORIT_TIMEOUT == "" ]]; then break; fi
- shift 2
- ;;
- --timeout=*)
- WAITFORIT_TIMEOUT="${1#*=}"
- shift 1
- ;;
- --)
- shift
- WAITFORIT_CLI=("$@")
- break
- ;;
- --help)
- usage
- ;;
- *)
- echoerr "Unknown argument: $1"
- usage
- ;;
- esac
-done
-
-if [[ "$WAITFORIT_HOST" == "" || "$WAITFORIT_PORT" == "" ]]; then
- echoerr "Error: you need to provide a host and port to test."
- usage
-fi
-
-WAITFORIT_TIMEOUT=${WAITFORIT_TIMEOUT:-15}
-WAITFORIT_STRICT=${WAITFORIT_STRICT:-0}
-WAITFORIT_CHILD=${WAITFORIT_CHILD:-0}
-WAITFORIT_QUIET=${WAITFORIT_QUIET:-0}
-
-# check to see if timeout is from busybox?
-WAITFORIT_TIMEOUT_PATH=$(type -p timeout)
-WAITFORIT_TIMEOUT_PATH=$(realpath $WAITFORIT_TIMEOUT_PATH 2>/dev/null || readlink -f $WAITFORIT_TIMEOUT_PATH)
-if [[ $WAITFORIT_TIMEOUT_PATH =~ "busybox" ]]; then
- WAITFORIT_ISBUSY=1
- WAITFORIT_BUSYTIMEFLAG="-t"
-
-else
- WAITFORIT_ISBUSY=0
- WAITFORIT_BUSYTIMEFLAG=""
-fi
-
-if [[ $WAITFORIT_CHILD -gt 0 ]]; then
- wait_for
- WAITFORIT_RESULT=$?
- exit $WAITFORIT_RESULT
-else
- if [[ $WAITFORIT_TIMEOUT -gt 0 ]]; then
- wait_for_wrapper
- WAITFORIT_RESULT=$?
- else
- wait_for
- WAITFORIT_RESULT=$?
- fi
-fi
-
-if [[ $WAITFORIT_CLI != "" ]]; then
- if [[ $WAITFORIT_RESULT -ne 0 && $WAITFORIT_STRICT -eq 1 ]]; then
- echoerr "$WAITFORIT_cmdname: strict mode, refusing to execute subprocess"
- exit $WAITFORIT_RESULT
- fi
- exec "${WAITFORIT_CLI[@]}"
-else
- exit $WAITFORIT_RESULT
-fi
diff --git a/examples/pom.xml b/examples/pom.xml
index add9a8d68b6fc..4c4921f69fb41 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -117,11 +117,6 @@
            <artifactId>shardingsphere-cluster-mode-repository-etcd</artifactId>
            <version>${project.version}</version>
        </dependency>
-        <dependency>
-            <groupId>org.apache.shardingsphere</groupId>
-            <artifactId>shardingsphere-cluster-mode-repository-nacos</artifactId>
-            <version>${project.version}</version>
-        </dependency>
        <dependency>
            <groupId>org.apache.shardingsphere</groupId>
            <artifactId>shardingsphere-infra-common</artifactId>
diff --git a/examples/shardingsphere-jdbc-example-generator/src/main/resources/template/resources/registry.ftl b/examples/shardingsphere-jdbc-example-generator/src/main/resources/template/resources/registry.ftl
index 3be5402d98f94..2ed8a6903a08b 100644
--- a/examples/shardingsphere-jdbc-example-generator/src/main/resources/template/resources/registry.ftl
+++ b/examples/shardingsphere-jdbc-example-generator/src/main/resources/template/resources/registry.ftl
@@ -16,14 +16,9 @@
#
registry {
- # file 、nacos 、eureka、redis、zk
+ # file 、eureka、redis、zk
type = "file"
- nacos {
- serverAddr = "localhost"
- namespace = "public"
- cluster = "default"
- }
eureka {
serviceUrl = "http://localhost:1001/eureka"
application = "default"
@@ -45,18 +40,9 @@ registry {
}
config {
- # file、nacos 、apollo、zk
+ # file、zk
type = "file"
- nacos {
- serverAddr = "localhost"
- namespace = "public"
- cluster = "default"
- }
- apollo {
- app.id = "fescar-server"
- apollo.meta = "http://192.168.1.204:8801"
- }
zk {
serverAddr = "127.0.0.1:2181"
session.timeout = 6000
diff --git a/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithm.java b/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithm.java
deleted file mode 100644
index 1bc3e2169e56f..0000000000000
--- a/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithm.java
+++ /dev/null
@@ -1,110 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.encrypt.algorithm.standard;
-
-import lombok.EqualsAndHashCode;
-import org.apache.commons.codec.binary.Base64;
-import org.apache.shardingsphere.encrypt.api.context.EncryptContext;
-import org.apache.shardingsphere.encrypt.api.encrypt.standard.StandardEncryptAlgorithm;
-import org.apache.shardingsphere.encrypt.exception.algorithm.EncryptAlgorithmInitializationException;
-import org.apache.shardingsphere.infra.exception.core.ShardingSpherePreconditions;
-
-import java.nio.charset.StandardCharsets;
-import java.util.Properties;
-
-/**
- * RC4 encrypt algorithm.
- */
-@EqualsAndHashCode
-public final class RC4EncryptAlgorithm implements StandardEncryptAlgorithm {
-
- private static final String RC4_KEY = "rc4-key-value";
-
- private static final int KEY_MIN_LENGTH = 5;
-
- private static final int SBOX_LENGTH = 256;
-
- private byte[] key;
-
- @Override
- public void init(final Properties props) {
- key = getKey(props);
- }
-
- private byte[] getKey(final Properties props) {
- byte[] result = props.getProperty(RC4_KEY, "").getBytes(StandardCharsets.UTF_8);
- ShardingSpherePreconditions.checkState(KEY_MIN_LENGTH <= result.length && SBOX_LENGTH > result.length,
- () -> new EncryptAlgorithmInitializationException(getType(), "Key length has to be between " + KEY_MIN_LENGTH + " and " + (SBOX_LENGTH - 1)));
- return result;
- }
-
- @Override
- public String encrypt(final Object plainValue, final EncryptContext encryptContext) {
- return null == plainValue ? null : Base64.encodeBase64String(crypt(String.valueOf(plainValue).getBytes(StandardCharsets.UTF_8)));
- }
-
- @Override
- public Object decrypt(final Object cipherValue, final EncryptContext encryptContext) {
- return null == cipherValue ? null : new String(crypt(Base64.decodeBase64(cipherValue.toString())), StandardCharsets.UTF_8);
- }
-
- /*
- * @see Pseudo-random generation algorithm
- */
- private byte[] crypt(final byte[] message) {
- int[] sBox = getSBox();
- byte[] result = new byte[message.length];
- int i = 0;
- int j = 0;
- for (int n = 0; n < message.length; n++) {
- i = (i + 1) % SBOX_LENGTH;
- j = (j + sBox[i]) % SBOX_LENGTH;
- swap(i, j, sBox);
- int rand = sBox[(sBox[i] + sBox[j]) % SBOX_LENGTH];
- result[n] = (byte) (rand ^ message[n]);
- }
- return result;
- }
-
- /*
- * @see Wikipedia. Init sBox
- */
- private int[] getSBox() {
- int[] result = new int[SBOX_LENGTH];
- int j = 0;
- for (int i = 0; i < SBOX_LENGTH; i++) {
- result[i] = i;
- }
- for (int i = 0; i < SBOX_LENGTH; i++) {
- j = (j + result[i] + (key[i % key.length]) & 0xFF) % SBOX_LENGTH;
- swap(i, j, result);
- }
- return result;
- }
-
- private void swap(final int i, final int j, final int[] sBox) {
- int temp = sBox[i];
- sBox[i] = sBox[j];
- sBox[j] = temp;
- }
-
- @Override
- public String getType() {
- return "RC4";
- }
-}
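Note: after this removal, `RC4` is no longer a resolvable `EncryptAlgorithm` type; only the implementations still registered in the SPI file below can be loaded. A fragment mirroring the project's own test utilities (the same pattern is visible in the test file at the end of this diff) that loads the remaining AES algorithm:

```java
import org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm;
import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
import org.apache.shardingsphere.test.util.PropertiesBuilder;
import org.apache.shardingsphere.test.util.PropertiesBuilder.Property;

// Fragment: load an encrypt algorithm by its SPI type name, as the tests do.
// "RC4" would now fail to resolve; "AES" remains and requires aes-key-value (placeholder below).
EncryptAlgorithm aesAlgorithm = TypedSPILoader.getService(
        EncryptAlgorithm.class, "AES", PropertiesBuilder.build(new Property("aes-key-value", "123456abc")));
```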
diff --git a/features/encrypt/core/src/main/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm b/features/encrypt/core/src/main/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
index 9f336b168092f..8cb0319280d90 100644
--- a/features/encrypt/core/src/main/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
+++ b/features/encrypt/core/src/main/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
@@ -16,6 +16,4 @@
#
org.apache.shardingsphere.encrypt.algorithm.standard.AESEncryptAlgorithm
-org.apache.shardingsphere.encrypt.algorithm.standard.RC4EncryptAlgorithm
-org.apache.shardingsphere.encrypt.algorithm.like.CharDigestLikeEncryptAlgorithm
org.apache.shardingsphere.encrypt.algorithm.assisted.MD5AssistedEncryptAlgorithm
diff --git a/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithmTest.java b/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithmTest.java
deleted file mode 100644
index 63e3b8f4b09a1..0000000000000
--- a/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithmTest.java
+++ /dev/null
@@ -1,71 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.encrypt.algorithm.like;
-
-import org.apache.shardingsphere.encrypt.api.context.EncryptContext;
-import org.apache.shardingsphere.encrypt.api.encrypt.like.LikeEncryptAlgorithm;
-import org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm;
-import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
-import org.apache.shardingsphere.test.util.PropertiesBuilder;
-import org.apache.shardingsphere.test.util.PropertiesBuilder.Property;
-import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
-
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.junit.jupiter.api.Assertions.assertNull;
-import static org.mockito.Mockito.mock;
-
-class CharDigestLikeEncryptAlgorithmTest {
-
- private LikeEncryptAlgorithm englishLikeEncryptAlgorithm;
-
- private LikeEncryptAlgorithm chineseLikeEncryptAlgorithm;
-
- private LikeEncryptAlgorithm koreanLikeEncryptAlgorithm;
-
- @BeforeEach
- void setUp() {
- englishLikeEncryptAlgorithm = (LikeEncryptAlgorithm) TypedSPILoader.getService(EncryptAlgorithm.class, "CHAR_DIGEST_LIKE");
- chineseLikeEncryptAlgorithm = (LikeEncryptAlgorithm) TypedSPILoader.getService(EncryptAlgorithm.class, "CHAR_DIGEST_LIKE");
- koreanLikeEncryptAlgorithm = (LikeEncryptAlgorithm) TypedSPILoader.getService(EncryptAlgorithm.class,
- "CHAR_DIGEST_LIKE", PropertiesBuilder.build(new Property("dict", "한국어시험"), new Property("start", "44032")));
- }
-
- @Test
- void assertEncrypt() {
- assertThat(englishLikeEncryptAlgorithm.encrypt("1234567890%abcdefghijklmnopqrstuvwxyz%ABCDEFGHIJKLMNOPQRSTUVWXYZ",
- mock(EncryptContext.class)), is("0145458981%`adedehihilmlmpqpqtutuxyxy%@ADEDEHIHILMLMPQPQTUTUXYXY"));
- assertThat(englishLikeEncryptAlgorithm.encrypt("_1234__5678__", mock(EncryptContext.class)), is("_0145__4589__"));
- }
-
- @Test
- void assertEncryptWithChineseChar() {
- assertThat(chineseLikeEncryptAlgorithm.encrypt("中国", mock(EncryptContext.class)), is("婝估"));
- }
-
- @Test
- void assertEncryptWithKoreanChar() {
- assertThat(koreanLikeEncryptAlgorithm.encrypt("한국", mock(EncryptContext.class)), is("각가"));
- }
-
- @Test
- void assertEncryptWithNullPlaintext() {
- assertNull(englishLikeEncryptAlgorithm.encrypt(null, mock(EncryptContext.class)));
- }
-}
diff --git a/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithmTest.java b/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithmTest.java
deleted file mode 100644
index a87b3591f29a7..0000000000000
--- a/features/encrypt/core/src/test/java/org/apache/shardingsphere/encrypt/algorithm/standard/RC4EncryptAlgorithmTest.java
+++ /dev/null
@@ -1,79 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.encrypt.algorithm.standard;
-
-import org.apache.shardingsphere.encrypt.api.context.EncryptContext;
-import org.apache.shardingsphere.encrypt.api.encrypt.standard.StandardEncryptAlgorithm;
-import org.apache.shardingsphere.encrypt.exception.algorithm.EncryptAlgorithmInitializationException;
-import org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm;
-import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
-import org.apache.shardingsphere.test.util.PropertiesBuilder;
-import org.apache.shardingsphere.test.util.PropertiesBuilder.Property;
-import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
-
-import java.util.stream.Collectors;
-import java.util.stream.IntStream;
-
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.junit.jupiter.api.Assertions.assertNull;
-import static org.junit.jupiter.api.Assertions.assertThrows;
-import static org.mockito.Mockito.mock;
-
-class RC4EncryptAlgorithmTest {
-
- private StandardEncryptAlgorithm encryptAlgorithm;
-
- @BeforeEach
- void setUp() {
- encryptAlgorithm = (StandardEncryptAlgorithm) TypedSPILoader.getService(EncryptAlgorithm.class, "RC4", PropertiesBuilder.build(new Property("rc4-key-value", "test-sharding")));
- }
-
- @Test
- void assertEncode() {
- assertThat(encryptAlgorithm.encrypt("test", mock(EncryptContext.class)), is("4Tn7lQ=="));
- }
-
- @Test
- void assertEncryptNullValue() {
- assertNull(encryptAlgorithm.encrypt(null, mock(EncryptContext.class)));
- }
-
- @Test
- void assertKeyIsTooLong() {
- assertThrows(EncryptAlgorithmInitializationException.class,
- () -> encryptAlgorithm.init(PropertiesBuilder.build(new Property("rc4-key-value", IntStream.range(0, 100).mapToObj(each -> "test").collect(Collectors.joining())))));
- }
-
- @Test
- void assertKeyIsTooShort() {
- assertThrows(EncryptAlgorithmInitializationException.class,
- () -> encryptAlgorithm.init(PropertiesBuilder.build(new Property("rc4-key-value", "test"))));
- }
-
- @Test
- void assertDecode() {
- assertThat(encryptAlgorithm.decrypt("4Tn7lQ==", mock(EncryptContext.class)).toString(), is("test"));
- }
-
- @Test
- void assertDecryptNullValue() {
- assertNull(encryptAlgorithm.decrypt(null, mock(EncryptContext.class)));
- }
-}
diff --git a/features/encrypt/core/src/test/resources/yaml/encrypt-rule.yaml b/features/encrypt/core/src/test/resources/yaml/encrypt-rule.yaml
index 27939450a09b3..e8b9f209e6ad6 100644
--- a/features/encrypt/core/src/test/resources/yaml/encrypt-rule.yaml
+++ b/features/encrypt/core/src/test/resources/yaml/encrypt-rule.yaml
@@ -27,9 +27,6 @@ rules:
assistedQuery:
name: assisted_query_username
encryptorName: assisted_encryptor
- likeQuery:
- name: like_query_username
- encryptorName: like_encryptor
encryptors:
aes_encryptor:
type: AES
@@ -39,5 +36,3 @@ rules:
type: AES
props:
aes-key-value: 123456abc
- like_encryptor:
- type: CHAR_DIGEST_LIKE
diff --git a/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/BaseRule.g4 b/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/BaseRule.g4
index a8e8c47f9cb86..6cb605cd6b51a 100644
--- a/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/BaseRule.g4
+++ b/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/BaseRule.g4
@@ -34,25 +34,16 @@ algorithmTypeName
buildInEncryptAlgorithmType
: standardEncryptAlgorithm
| assistedEncryptAlgorithm
- | likeEncryptAlgorithm
;
standardEncryptAlgorithm
- : MD5
- | AES
- | RC4
- | SM3
- | SM4
+ : AES
;
assistedEncryptAlgorithm
: MD5
;
-likeEncryptAlgorithm
- : CHAR_DIGEST_LIKE
- ;
-
propertiesDefinition
: PROPERTIES LP_ properties? RP_
;
diff --git a/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/Keyword.g4 b/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/Keyword.g4
index be4b1ca11cbd2..8137363e58740 100644
--- a/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/Keyword.g4
+++ b/features/encrypt/distsql/parser/src/main/antlr4/imports/encrypt/Keyword.g4
@@ -143,22 +143,6 @@ AES
: A E S
;
-RC4
- : R C [4]
- ;
-
-SM3
- : S M [3]
- ;
-
-SM4
- : S M [4]
- ;
-
-CHAR_DIGEST_LIKE
- : C H A R UL_ D I G E S T UL_ L I K E
- ;
-
NOT
: N O T
;
diff --git a/infra/common/pom.xml b/infra/common/pom.xml
index 95e8d9bc3811c..f8c529333b49d 100644
--- a/infra/common/pom.xml
+++ b/infra/common/pom.xml
@@ -83,16 +83,6 @@
<artifactId>shardingsphere-infra-data-source-pool-hikari</artifactId>
<version>${project.version}</version>
</dependency>
- <dependency>
- <groupId>org.apache.shardingsphere</groupId>
- <artifactId>shardingsphere-infra-data-source-pool-dbcp</artifactId>
- <version>${project.version}</version>
- </dependency>
- <dependency>
- <groupId>org.apache.shardingsphere</groupId>
- <artifactId>shardingsphere-infra-data-source-pool-c3p0</artifactId>
- <version>${project.version}</version>
- </dependency>
<dependency>
<groupId>org.apache.shardingsphere</groupId>
<artifactId>shardingsphere-parser-sql-engine</artifactId>
diff --git a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResource.java b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResource.java
index d5a41e404848c..593f9f80e570e 100644
--- a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResource.java
+++ b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResource.java
@@ -19,10 +19,12 @@
import lombok.Getter;
import org.apache.shardingsphere.infra.datasource.pool.CatalogSwitchableDataSource;
+import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;
import javax.sql.DataSource;
+import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Map.Entry;
@@ -31,7 +33,7 @@
* Storage resource.
*/
@Getter
-public class StorageResource {
+public final class StorageResource {
private final Map<StorageNode, DataSource> dataSourceMap;
@@ -39,10 +41,18 @@ public class StorageResource {
private final Map<String, DataSource> wrappedDataSources;
+ private final Map<String, DataSourcePoolProperties> dataSourcePoolPropertiesMap;
+
public StorageResource(final Map<StorageNode, DataSource> dataSourceMap, final Map<String, StorageUnitNodeMapper> storageUnitNodeMappers) {
+ this(dataSourceMap, storageUnitNodeMappers, Collections.emptyMap());
+ }
+
+ public StorageResource(final Map<StorageNode, DataSource> dataSourceMap,
+ final Map<String, StorageUnitNodeMapper> storageUnitNodeMappers, final Map<String, DataSourcePoolProperties> dataSourcePoolPropertiesMap) {
this.dataSourceMap = dataSourceMap;
this.storageUnitNodeMappers = storageUnitNodeMappers;
wrappedDataSources = createWrappedDataSources();
+ this.dataSourcePoolPropertiesMap = dataSourcePoolPropertiesMap;
}
private Map<String, DataSource> createWrappedDataSources() {
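
Illustrative sketch (not part of the patch): callers that previously built a StorageResourceWithProperties (deleted further down in this diff) now pass the pool-properties map through the new three-argument constructor. The generic parameters below are assumptions, since the extracted diff drops them.

    import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
    import org.apache.shardingsphere.infra.metadata.database.resource.StorageResource;
    import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
    import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;

    import javax.sql.DataSource;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public final class StorageResourceUsageSketch {

        public static void main(final String[] args) {
            // Empty maps keep the sketch runnable without real data sources.
            Map<StorageNode, DataSource> storageNodes = new LinkedHashMap<>();
            Map<String, StorageUnitNodeMapper> mappers = new LinkedHashMap<>();
            Map<String, DataSourcePoolProperties> propsByNodeName = new LinkedHashMap<>();
            // Previously: new StorageResourceWithProperties(storageNodes, mappers, propsByNodeName)
            StorageResource resource = new StorageResource(storageNodes, mappers, propsByNodeName);
            System.out.println(resource.getDataSourcePoolPropertiesMap().isEmpty());
        }
    }
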
diff --git a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceCreator.java b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceCreator.java
index 703ac4677f09d..8f1bf5c93a071 100644
--- a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceCreator.java
+++ b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceCreator.java
@@ -22,14 +22,11 @@
import org.apache.shardingsphere.infra.database.core.connector.url.JdbcUrl;
import org.apache.shardingsphere.infra.database.core.connector.url.StandardJdbcUrlParser;
import org.apache.shardingsphere.infra.database.core.connector.url.UnrecognizedDatabaseURLException;
-import org.apache.shardingsphere.infra.database.core.metadata.database.DialectDatabaseMetaData;
-import org.apache.shardingsphere.infra.database.core.type.DatabaseType;
import org.apache.shardingsphere.infra.database.core.type.DatabaseTypeFactory;
import org.apache.shardingsphere.infra.database.core.type.DatabaseTypeRegistry;
import org.apache.shardingsphere.infra.datasource.pool.creator.DataSourcePoolCreator;
import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
-import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNodeProperties;
import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;
import javax.sql.DataSource;
@@ -51,16 +48,38 @@ public final class StorageResourceCreator {
*/
public static StorageResource createStorageResource(final Map propsMap) {
Map storageNodes = new LinkedHashMap<>();
- Map storageUnitNodeMappers = new LinkedHashMap<>();
+ Map mappers = new LinkedHashMap<>();
for (Entry entry : propsMap.entrySet()) {
- StorageNodeProperties storageNodeProps = getStorageNodeProperties(entry.getKey(), entry.getValue());
- StorageNode storageNode = new StorageNode(storageNodeProps.getName());
+ String storageUnitName = entry.getKey();
+ Map standardProps = entry.getValue().getConnectionPropertySynonyms().getStandardProperties();
+ String url = standardProps.get("url").toString();
+ boolean isInstanceConnectionAvailable = new DatabaseTypeRegistry(DatabaseTypeFactory.get(url)).getDialectDatabaseMetaData().isInstanceConnectionAvailable();
+ StorageNode storageNode = new StorageNode(getStorageNodeName(storageUnitName, url, standardProps.get("username").toString(), isInstanceConnectionAvailable));
if (!storageNodes.containsKey(storageNode)) {
- storageNodes.put(storageNode, DataSourcePoolCreator.create(entry.getKey(), entry.getValue(), true, storageNodes.values()));
+ storageNodes.put(storageNode, DataSourcePoolCreator.create(storageUnitName, entry.getValue(), true, storageNodes.values()));
}
- appendStorageUnitNodeMapper(storageUnitNodeMappers, storageNodeProps, entry.getKey(), entry.getValue());
+ mappers.put(storageUnitName, getStorageUnitNodeMapper(storageNode, storageUnitName, url, isInstanceConnectionAvailable));
}
- return new StorageResource(storageNodes, storageUnitNodeMappers);
+ return new StorageResource(storageNodes, mappers);
+ }
+
+ private static String getStorageNodeName(final String dataSourceName, final String url, final String username, final boolean isInstanceConnectionAvailable) {
+ try {
+ JdbcUrl jdbcUrl = new StandardJdbcUrlParser().parse(url);
+ return isInstanceConnectionAvailable ? generateStorageNodeName(jdbcUrl.getHostname(), jdbcUrl.getPort(), username) : dataSourceName;
+ } catch (final UnrecognizedDatabaseURLException ex) {
+ return dataSourceName;
+ }
+ }
+
+ private static String generateStorageNodeName(final String hostname, final int port, final String username) {
+ return String.format("%s_%s_%s", hostname, port, username);
+ }
+
+ private static StorageUnitNodeMapper getStorageUnitNodeMapper(final StorageNode storageNode, final String storageUnitName, final String url, final boolean isInstanceConnectionAvailable) {
+ return isInstanceConnectionAvailable
+ ? new StorageUnitNodeMapper(storageUnitName, storageNode, new StandardJdbcUrlParser().parse(url).getDatabase(), url)
+ : new StorageUnitNodeMapper(storageUnitName, storageNode, url);
}
/**
@@ -69,57 +88,22 @@ public static StorageResource createStorageResource(final Map propsMap) {
+ public static StorageResource createStorageResourceWithoutDataSource(final Map propsMap) {
Map storageNodes = new LinkedHashMap<>();
- Map storageUnitNodeMappers = new LinkedHashMap<>();
+ Map mappers = new LinkedHashMap<>();
Map newPropsMap = new LinkedHashMap<>();
for (Entry entry : propsMap.entrySet()) {
- StorageNodeProperties storageNodeProps = getStorageNodeProperties(entry.getKey(), entry.getValue());
- StorageNode storageNode = new StorageNode(storageNodeProps.getName());
- if (storageNodes.containsKey(storageNode)) {
- appendStorageUnitNodeMapper(storageUnitNodeMappers, storageNodeProps, entry.getKey(), entry.getValue());
- continue;
+ String storageUnitName = entry.getKey();
+ Map standardProps = entry.getValue().getConnectionPropertySynonyms().getStandardProperties();
+ String url = standardProps.get("url").toString();
+ boolean isInstanceConnectionAvailable = new DatabaseTypeRegistry(DatabaseTypeFactory.get(url)).getDialectDatabaseMetaData().isInstanceConnectionAvailable();
+ StorageNode storageNode = new StorageNode(getStorageNodeName(storageUnitName, url, standardProps.get("username").toString(), isInstanceConnectionAvailable));
+ if (!storageNodes.containsKey(storageNode)) {
+ storageNodes.put(storageNode, null);
+ newPropsMap.put(storageNode.getName(), entry.getValue());
}
- storageNodes.put(storageNode, null);
- appendStorageUnitNodeMapper(storageUnitNodeMappers, storageNodeProps, entry.getKey(), entry.getValue());
- newPropsMap.put(storageNodeProps.getName(), entry.getValue());
+ mappers.put(storageUnitName, getStorageUnitNodeMapper(storageNode, storageUnitName, url, isInstanceConnectionAvailable));
}
- return new StorageResourceWithProperties(storageNodes, storageUnitNodeMappers, newPropsMap);
- }
-
- private static void appendStorageUnitNodeMapper(final Map storageUnitNodeMappers, final StorageNodeProperties storageNodeProps,
- final String unitName, final DataSourcePoolProperties props) {
- String url = props.getConnectionPropertySynonyms().getStandardProperties().get("url").toString();
- storageUnitNodeMappers.put(unitName, getStorageUnitNodeMapper(storageNodeProps, unitName, url));
- }
-
- private static StorageUnitNodeMapper getStorageUnitNodeMapper(final StorageNodeProperties storageNodeProps, final String unitName, final String url) {
- DialectDatabaseMetaData dialectDatabaseMetaData = new DatabaseTypeRegistry(storageNodeProps.getDatabaseType()).getDialectDatabaseMetaData();
- return dialectDatabaseMetaData.isInstanceConnectionAvailable()
- ? new StorageUnitNodeMapper(unitName, new StorageNode(storageNodeProps.getName()), storageNodeProps.getCatalog(), url)
- : new StorageUnitNodeMapper(unitName, new StorageNode(storageNodeProps.getName()), url);
- }
-
- private static StorageNodeProperties getStorageNodeProperties(final String dataSourceName, final DataSourcePoolProperties storageNodeProps) {
- Map standardProps = storageNodeProps.getConnectionPropertySynonyms().getStandardProperties();
- String url = standardProps.get("url").toString();
- String username = standardProps.get("username").toString();
- DatabaseType databaseType = DatabaseTypeFactory.get(url);
- return getStorageNodeProperties(dataSourceName, url, username, databaseType);
- }
-
- private static StorageNodeProperties getStorageNodeProperties(final String dataSourceName, final String url, final String username, final DatabaseType databaseType) {
- try {
- JdbcUrl jdbcUrl = new StandardJdbcUrlParser().parse(url);
- DialectDatabaseMetaData dialectDatabaseMetaData = new DatabaseTypeRegistry(databaseType).getDialectDatabaseMetaData();
- String nodeName = dialectDatabaseMetaData.isInstanceConnectionAvailable() ? generateStorageNodeName(jdbcUrl.getHostname(), jdbcUrl.getPort(), username) : dataSourceName;
- return new StorageNodeProperties(nodeName, databaseType, jdbcUrl.getDatabase());
- } catch (final UnrecognizedDatabaseURLException ex) {
- return new StorageNodeProperties(dataSourceName, databaseType, null);
- }
- }
-
- private static String generateStorageNodeName(final String hostname, final int port, final String username) {
- return String.format("%s_%s_%s", hostname, port, username);
+ return new StorageResource(storageNodes, mappers, newPropsMap);
}
}
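
Illustrative sketch (not part of the patch): the inlined naming rule above produces hostname_port_username for dialects that support instance-level connections, and falls back to the storage unit name otherwise. A minimal standalone example, assuming a MySQL-style URL the standard parser understands:

    import org.apache.shardingsphere.infra.database.core.connector.url.JdbcUrl;
    import org.apache.shardingsphere.infra.database.core.connector.url.StandardJdbcUrlParser;

    public final class StorageNodeNameSketch {

        public static void main(final String[] args) {
            // Mirrors getStorageNodeName(...) for an instance-connection-capable dialect.
            JdbcUrl jdbcUrl = new StandardJdbcUrlParser().parse("jdbc:mysql://127.0.0.1:3306/foo_db");
            System.out.println(String.format("%s_%s_%s", jdbcUrl.getHostname(), jdbcUrl.getPort(), "root"));
            // Prints: 127.0.0.1_3306_root
        }
    }
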
diff --git a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceWithProperties.java b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceWithProperties.java
deleted file mode 100644
index 6496f07e74490..0000000000000
--- a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/StorageResourceWithProperties.java
+++ /dev/null
@@ -1,41 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.metadata.database.resource;
-
-import lombok.Getter;
-import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
-import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
-import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;
-
-import javax.sql.DataSource;
-import java.util.Map;
-
-/**
- * Storage resource with data source properties.
- */
-@Getter
-public final class StorageResourceWithProperties extends StorageResource {
-
- private final Map dataSourcePoolPropertiesMap;
-
- public StorageResourceWithProperties(final Map storageNodes,
- final Map storageUnitNodeMappers, final Map dataSourcePoolPropertiesMap) {
- super(storageNodes, storageUnitNodeMappers);
- this.dataSourcePoolPropertiesMap = dataSourcePoolPropertiesMap;
- }
-}
diff --git a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/node/StorageNodeProperties.java b/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/node/StorageNodeProperties.java
deleted file mode 100644
index 93aa06f6ca771..0000000000000
--- a/infra/common/src/main/java/org/apache/shardingsphere/infra/metadata/database/resource/node/StorageNodeProperties.java
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.metadata.database.resource.node;
-
-import com.google.common.base.Objects;
-import lombok.Getter;
-import lombok.RequiredArgsConstructor;
-import org.apache.shardingsphere.infra.database.core.type.DatabaseType;
-
-/**
- * Storage node properties.
- */
-@RequiredArgsConstructor
-@Getter
-public final class StorageNodeProperties {
-
- private final String name;
-
- private final DatabaseType databaseType;
-
- private final String catalog;
-
- @Override
- public boolean equals(final Object obj) {
- if (obj instanceof StorageNodeProperties) {
- StorageNodeProperties storageNodeProps = (StorageNodeProperties) obj;
- return storageNodeProps.name.equals(name);
- }
- return false;
- }
-
- @Override
- public int hashCode() {
- return Objects.hashCode(name.toUpperCase());
- }
-}
diff --git a/infra/data-source-pool/type/c3p0/pom.xml b/infra/data-source-pool/type/c3p0/pom.xml
deleted file mode 100644
index 5b074c97578df..0000000000000
--- a/infra/data-source-pool/type/c3p0/pom.xml
+++ /dev/null
@@ -1,54 +0,0 @@
-
-
-
-
- 4.0.0
-
- org.apache.shardingsphere
- shardingsphere-infra-data-source-pool-type
- 5.4.1-SNAPSHOT
-
- shardingsphere-infra-data-source-pool-c3p0
- ${project.artifactId}
-
-
-
- org.apache.shardingsphere
- shardingsphere-infra-data-source-pool-core
- ${project.version}
-
-
-
- com.mchange
- c3p0
-
-
-
- org.apache.shardingsphere
- shardingsphere-test-fixture-database
- ${project.version}
- test
-
-
- org.apache.shardingsphere
- shardingsphere-test-util
- ${project.version}
- test
-
-
-
diff --git a/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolFieldMetaData.java b/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolFieldMetaData.java
deleted file mode 100644
index 959e61c488a8d..0000000000000
--- a/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolFieldMetaData.java
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.datasource.pool.c3p0.metadata;
-
-import org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolFieldMetaData;
-
-/**
- * C3P0 data source pool field meta data.
- */
-public final class C3P0DataSourcePoolFieldMetaData implements DataSourcePoolFieldMetaData {
-
- @Override
- public String getJdbcUrlFieldName() {
- return "jdbcUrl";
- }
-
- @Override
- public String getJdbcUrlPropertiesFieldName() {
- return "properties";
- }
-}
diff --git a/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolMetaData.java b/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolMetaData.java
deleted file mode 100644
index 66027a8688ae5..0000000000000
--- a/infra/data-source-pool/type/c3p0/src/main/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/metadata/C3P0DataSourcePoolMetaData.java
+++ /dev/null
@@ -1,103 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.datasource.pool.c3p0.metadata;
-
-import org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolMetaData;
-
-import java.util.Collection;
-import java.util.HashMap;
-import java.util.LinkedList;
-import java.util.Map;
-
-/**
- * C3P0 data source pool meta data.
- */
-public final class C3P0DataSourcePoolMetaData implements DataSourcePoolMetaData {
-
- private static final Map DEFAULT_PROPS = new HashMap<>(6, 1F);
-
- private static final Map SKIPPED_PROPS = new HashMap<>(2, 1F);
-
- private static final Map PROP_SYNONYMS = new HashMap<>(5, 1F);
-
- private static final Collection TRANSIENT_FIELD_NAMES = new LinkedList<>();
-
- static {
- buildDefaultProperties();
- buildInvalidProperties();
- buildPropertySynonyms();
- buildTransientFieldNames();
- }
-
- private static void buildDefaultProperties() {
- DEFAULT_PROPS.put("checkoutTimeout", 20 * 1000L);
- DEFAULT_PROPS.put("maxIdleTime", 60 * 1000L);
- DEFAULT_PROPS.put("maxIdleTimeExcessConnections", 30 * 70 * 1000L);
- DEFAULT_PROPS.put("maxPoolSize", 15);
- DEFAULT_PROPS.put("minPoolSize", 3);
- DEFAULT_PROPS.put("readOnly", false);
- }
-
- private static void buildInvalidProperties() {
- SKIPPED_PROPS.put("minPoolSize", -1);
- SKIPPED_PROPS.put("maxPoolSize", -1);
- }
-
- private static void buildPropertySynonyms() {
- PROP_SYNONYMS.put("username", "user");
- PROP_SYNONYMS.put("url", "jdbcUrl");
- PROP_SYNONYMS.put("connectionTimeoutMilliseconds", "checkoutTimeout");
- PROP_SYNONYMS.put("idleTimeoutMilliseconds", "maxIdleTime");
- PROP_SYNONYMS.put("maxLifetimeMilliseconds", "maxIdleTimeExcessConnections");
- }
-
- private static void buildTransientFieldNames() {
- TRANSIENT_FIELD_NAMES.add("running");
- TRANSIENT_FIELD_NAMES.add("closed");
- }
-
- @Override
- public Map getDefaultProperties() {
- return DEFAULT_PROPS;
- }
-
- @Override
- public Map getSkippedProperties() {
- return SKIPPED_PROPS;
- }
-
- @Override
- public Map getPropertySynonyms() {
- return PROP_SYNONYMS;
- }
-
- @Override
- public Collection getTransientFieldNames() {
- return TRANSIENT_FIELD_NAMES;
- }
-
- @Override
- public C3P0DataSourcePoolFieldMetaData getFieldMetaData() {
- return new C3P0DataSourcePoolFieldMetaData();
- }
-
- @Override
- public String getType() {
- return "com.mchange.v2.c3p0.ComboPooledDataSource";
- }
-}
diff --git a/infra/data-source-pool/type/c3p0/src/main/resources/META-INF/services/org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolMetaData b/infra/data-source-pool/type/c3p0/src/main/resources/META-INF/services/org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolMetaData
deleted file mode 100644
index 7aaac6f4b147e..0000000000000
--- a/infra/data-source-pool/type/c3p0/src/main/resources/META-INF/services/org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolMetaData
+++ /dev/null
@@ -1,18 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-org.apache.shardingsphere.infra.datasource.pool.c3p0.metadata.C3P0DataSourcePoolMetaData
diff --git a/infra/data-source-pool/type/c3p0/src/test/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/creator/C3P0DataSourcePoolCreatorTest.java b/infra/data-source-pool/type/c3p0/src/test/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/creator/C3P0DataSourcePoolCreatorTest.java
deleted file mode 100644
index 1fb156bd1e973..0000000000000
--- a/infra/data-source-pool/type/c3p0/src/test/java/org/apache/shardingsphere/infra/datasource/pool/c3p0/creator/C3P0DataSourcePoolCreatorTest.java
+++ /dev/null
@@ -1,54 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.datasource.pool.c3p0.creator;
-
-import com.mchange.v2.c3p0.ComboPooledDataSource;
-import org.apache.shardingsphere.infra.datasource.pool.creator.DataSourcePoolCreator;
-import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
-import org.apache.shardingsphere.test.fixture.jdbc.MockedDataSource;
-import org.apache.shardingsphere.test.util.PropertiesBuilder;
-import org.junit.jupiter.api.Test;
-
-import java.util.HashMap;
-import java.util.Map;
-
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
-
-class C3P0DataSourcePoolCreatorTest {
-
- @Test
- void assertCreateDataSource() {
- ComboPooledDataSource actual = (ComboPooledDataSource) DataSourcePoolCreator.create(new DataSourcePoolProperties(ComboPooledDataSource.class.getName(), createDataSourcePoolProperties()));
- assertThat(actual.getJdbcUrl(), is("jdbc:mock://127.0.0.1/foo_ds"));
- assertThat(actual.getUser(), is("root"));
- assertThat(actual.getPassword(), is("root"));
- assertThat(actual.getProperties(), is(PropertiesBuilder.build(new PropertiesBuilder.Property("foo", "foo_value"), new PropertiesBuilder.Property("bar", "bar_value"),
- new PropertiesBuilder.Property("password", "root"), new PropertiesBuilder.Property("user", "root"))));
- }
-
- private Map createDataSourcePoolProperties() {
- Map result = new HashMap<>();
- result.put("url", "jdbc:mock://127.0.0.1/foo_ds");
- result.put("driverClassName", MockedDataSource.class.getName());
- result.put("username", "root");
- result.put("password", "root");
- result.put("properties", PropertiesBuilder.build(new PropertiesBuilder.Property("foo", "foo_value"), new PropertiesBuilder.Property("bar", "bar_value")));
- return result;
- }
-}
diff --git a/infra/data-source-pool/type/dbcp/pom.xml b/infra/data-source-pool/type/dbcp/pom.xml
deleted file mode 100644
index b3e5a8f86d872..0000000000000
--- a/infra/data-source-pool/type/dbcp/pom.xml
+++ /dev/null
@@ -1,54 +0,0 @@
-
-
-
-
- 4.0.0
-
- org.apache.shardingsphere
- shardingsphere-infra-data-source-pool-type
- 5.4.1-SNAPSHOT
-
- shardingsphere-infra-data-source-pool-dbcp
- ${project.artifactId}
-
-
-
- org.apache.shardingsphere
- shardingsphere-infra-data-source-pool-core
- ${project.version}
-
-
-
- org.apache.commons
- commons-dbcp2
-
-
-
- org.apache.shardingsphere
- shardingsphere-test-fixture-database
- ${project.version}
- test
-
-
- org.apache.shardingsphere
- shardingsphere-test-util
- ${project.version}
- test
-
-
-
diff --git a/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolFieldMetaData.java b/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolFieldMetaData.java
deleted file mode 100644
index cec9416df9f1e..0000000000000
--- a/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolFieldMetaData.java
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.datasource.pool.dbcp.metadata;
-
-import org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolFieldMetaData;
-
-/**
- * DBCP data source pool field meta data.
- */
-public final class DBCPDataSourcePoolFieldMetaData implements DataSourcePoolFieldMetaData {
-
- @Override
- public String getJdbcUrlFieldName() {
- return "url";
- }
-
- @Override
- public String getJdbcUrlPropertiesFieldName() {
- return "connectionProperties";
- }
-}
diff --git a/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolMetaData.java b/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolMetaData.java
deleted file mode 100644
index d0dd5581ce3df..0000000000000
--- a/infra/data-source-pool/type/dbcp/src/main/java/org/apache/shardingsphere/infra/datasource/pool/dbcp/metadata/DBCPDataSourcePoolMetaData.java
+++ /dev/null
@@ -1,77 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.infra.datasource.pool.dbcp.metadata;
-
-import org.apache.shardingsphere.infra.datasource.pool.metadata.DataSourcePoolMetaData;
-
-import java.util.Arrays;
-import java.util.Collection;
-import java.util.Collections;
-import java.util.LinkedList;
-import java.util.Map;
-
-/**
- * DBCP data source pool meta data.
- */
-public final class DBCPDataSourcePoolMetaData implements DataSourcePoolMetaData {
-
- private static final Collection TRANSIENT_FIELD_NAMES = new LinkedList<>();
-
- static {
- buildTransientFieldNames();
- }
-
- private static void buildTransientFieldNames() {
- TRANSIENT_FIELD_NAMES.add("closed");
- }
-
- @Override
- public Map getDefaultProperties() {
- return Collections.emptyMap();
- }
-
- @Override
- public Map getSkippedProperties() {
- return Collections.emptyMap();
- }
-
- @Override
- public Map getPropertySynonyms() {
- return Collections.emptyMap();
- }
-
- @Override
- public Collection getTransientFieldNames() {
- return TRANSIENT_FIELD_NAMES;
- }
-
- @Override
- public DBCPDataSourcePoolFieldMetaData getFieldMetaData() {
- return new DBCPDataSourcePoolFieldMetaData();
- }
-
- @Override
- public String getType() {
- return "org.apache.commons.dbcp2.BasicDataSource";
- }
-
- @Override
- public Collection
-
- <dependency>
- <groupId>com.ctrip.framework.apollo</groupId>
- <artifactId>apollo-client</artifactId>
- </dependency>
-
diff --git a/jdbc/core/src/main/java/org/apache/shardingsphere/driver/jdbc/core/driver/spi/ApolloURLProvider.java b/jdbc/core/src/main/java/org/apache/shardingsphere/driver/jdbc/core/driver/spi/ApolloURLProvider.java
deleted file mode 100644
index f267cf12f72a7..0000000000000
--- a/jdbc/core/src/main/java/org/apache/shardingsphere/driver/jdbc/core/driver/spi/ApolloURLProvider.java
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.driver.jdbc.core.driver.spi;
-
-import com.ctrip.framework.apollo.ConfigFile;
-import com.ctrip.framework.apollo.ConfigService;
-import com.ctrip.framework.apollo.core.enums.ConfigFileFormat;
-import com.google.common.base.Preconditions;
-import com.google.common.base.Strings;
-import org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider;
-
-import java.nio.charset.StandardCharsets;
-
-/**
- * Apollo URL provider.
- */
-public final class ApolloURLProvider implements ShardingSphereURLProvider {
-
- private static final String APOLLO_TYPE = "apollo:";
-
- @Override
- public boolean accept(final String url) {
- return !Strings.isNullOrEmpty(url) && url.contains(APOLLO_TYPE);
- }
-
- @Override
- public byte[] getContent(final String url, final String urlPrefix) {
- String configPath = url.substring(urlPrefix.length(), url.contains("?") ? url.indexOf('?') : url.length());
- String namespace = configPath.substring(APOLLO_TYPE.length());
- Preconditions.checkArgument(!namespace.isEmpty(), "Apollo namespace is required in ShardingSphere URL.");
- ConfigFile configFile = ConfigService.getConfigFile(namespace, ConfigFileFormat.YAML);
- return configFile.getContent().getBytes(StandardCharsets.UTF_8);
- }
-}
diff --git a/jdbc/core/src/main/resources/META-INF/services/org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider b/jdbc/core/src/main/resources/META-INF/services/org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider
index 5ccd96e77e8cd..df99d5c8e8e50 100644
--- a/jdbc/core/src/main/resources/META-INF/services/org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider
+++ b/jdbc/core/src/main/resources/META-INF/services/org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLProvider
@@ -17,4 +17,3 @@
org.apache.shardingsphere.driver.jdbc.core.driver.spi.AbsolutePathURLProvider
org.apache.shardingsphere.driver.jdbc.core.driver.spi.ClasspathURLProvider
-org.apache.shardingsphere.driver.jdbc.core.driver.spi.ApolloURLProvider
diff --git a/jdbc/core/src/test/java/org/apache/shardingsphere/driver/jdbc/core/driver/ShardingSphereURLManagerTest.java b/jdbc/core/src/test/java/org/apache/shardingsphere/driver/jdbc/core/driver/ShardingSphereURLManagerTest.java
index ce12dec26a0b5..6e27d45c13007 100644
--- a/jdbc/core/src/test/java/org/apache/shardingsphere/driver/jdbc/core/driver/ShardingSphereURLManagerTest.java
+++ b/jdbc/core/src/test/java/org/apache/shardingsphere/driver/jdbc/core/driver/ShardingSphereURLManagerTest.java
@@ -17,28 +17,18 @@
package org.apache.shardingsphere.driver.jdbc.core.driver;
-import com.ctrip.framework.apollo.ConfigFile;
-import com.ctrip.framework.apollo.ConfigService;
-import com.ctrip.framework.apollo.core.enums.ConfigFileFormat;
import org.apache.shardingsphere.driver.jdbc.exception.syntax.URLProviderNotFoundException;
import org.apache.shardingsphere.test.mock.AutoMockExtension;
-import org.apache.shardingsphere.test.mock.StaticMockSettings;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
-import java.nio.charset.StandardCharsets;
import java.util.Objects;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.junit.jupiter.api.Assertions.assertThrows;
-import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.ArgumentMatchers.anyString;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
@ExtendWith(AutoMockExtension.class)
-@StaticMockSettings(ConfigService.class)
class ShardingSphereURLManagerTest {
private final int fooDriverConfigLength = 999;
@@ -62,14 +52,4 @@ void assertToAbsolutePathConfigurationFile() {
byte[] actual = ShardingSphereURLManager.getContent("jdbc:shardingsphere:absolutepath:" + absolutePath, urlPrefix);
assertThat(actual.length, is(fooDriverConfigLength));
}
-
- @Test
- void assertToApolloConfigurationFile() {
- ConfigFile configFile = mock(ConfigFile.class);
- when(configFile.getContent()).thenReturn("config content");
- when(ConfigService.getConfigFile(anyString(), any(ConfigFileFormat.class))).thenReturn(configFile);
- String url = "jdbc:shardingsphere:apollo:namespace";
- byte[] content = ShardingSphereURLManager.getContent(url, urlPrefix);
- assertThat("config content".getBytes(StandardCharsets.UTF_8), is(content));
- }
}
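
Illustrative sketch (not part of the patch): after removing the Apollo provider, only the classpath and absolute-path providers remain registered. The URL prefix value and the classpath resource name below are assumptions for illustration.

    import org.apache.shardingsphere.driver.jdbc.core.driver.ShardingSphereURLManager;

    public final class UrlProviderSketch {

        public static void main(final String[] args) {
            // Resolves configuration bytes through the surviving classpath provider;
            // assumes a config.yaml resource exists on the classpath.
            byte[] content = ShardingSphereURLManager.getContent("jdbc:shardingsphere:classpath:config.yaml", "jdbc:shardingsphere:");
            System.out.println(content.length);
        }
    }
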
diff --git a/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/segment/with/WithConverter.java b/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/segment/with/WithConverter.java
new file mode 100644
index 0000000000000..326ef647bbb18
--- /dev/null
+++ b/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/segment/with/WithConverter.java
@@ -0,0 +1,66 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.shardingsphere.sqlfederation.compiler.converter.segment.with;
+
+import org.apache.calcite.sql.SqlIdentifier;
+import org.apache.calcite.sql.SqlNode;
+import org.apache.calcite.sql.SqlNodeList;
+import org.apache.calcite.sql.SqlWithItem;
+import org.apache.calcite.sql.SqlWith;
+import org.apache.calcite.sql.parser.SqlParserPos;
+import org.apache.shardingsphere.sql.parser.sql.common.segment.dml.column.ColumnSegment;
+import org.apache.shardingsphere.sql.parser.sql.common.segment.generic.WithSegment;
+import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.expression.ExpressionConverter;
+import org.apache.shardingsphere.sqlfederation.compiler.converter.statement.select.SelectStatementConverter;
+
+import java.util.Collection;
+import java.util.Optional;
+import java.util.stream.Collectors;
+
+/**
+ * With converter.
+ */
+public final class WithConverter {
+
+ /**
+ * Convert the given WithSegment and query into an SqlNodeList.
+ *
+ * @param withSegment with segment
+ * @param query SqlNode
+ * @return SqlNodeList
+ */
+ public Optional<SqlNodeList> convert(final WithSegment withSegment, final SqlNode query) {
+ SqlIdentifier name = new SqlIdentifier(withSegment.getCommonTableExpressions().iterator().next().getIdentifier().getValue(), SqlParserPos.ZERO);
+ SqlNode selectSubquery = new SelectStatementConverter().convert(withSegment.getCommonTableExpressions().iterator().next().getSubquery().getSelect());
+ ExpressionConverter converter = new ExpressionConverter();
+ Collection<ColumnSegment> collectionColumns = withSegment.getCommonTableExpressions().iterator().next().getColumns();
+ Collection<SqlNode> convertedColumns;
+ SqlNodeList columns = null;
+ if (!collectionColumns.isEmpty()) {
+ convertedColumns = collectionColumns.stream().map(converter::convert).filter(Optional::isPresent).map(Optional::get).collect(Collectors.toList());
+ columns = new SqlNodeList(convertedColumns, SqlParserPos.ZERO);
+ }
+ SqlWithItem sqlWithItem = new SqlWithItem(SqlParserPos.ZERO, name, columns, selectSubquery);
+ SqlNodeList sqlWithItems = new SqlNodeList(SqlParserPos.ZERO);
+ sqlWithItems.add(sqlWithItem);
+ SqlWith sqlWith = new SqlWith(SqlParserPos.ZERO, sqlWithItems, query);
+ SqlNodeList result = new SqlNodeList(SqlParserPos.ZERO);
+ result.add(sqlWith);
+ return Optional.of(result);
+ }
+}
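
Illustrative sketch (not part of the patch): the converter above assembles a Calcite SqlWith from one SqlWithItem per common table expression. A standalone example of the same assembly, using hand-written queries instead of converted ShardingSphere segments:

    import org.apache.calcite.sql.SqlIdentifier;
    import org.apache.calcite.sql.SqlNode;
    import org.apache.calcite.sql.SqlNodeList;
    import org.apache.calcite.sql.SqlWith;
    import org.apache.calcite.sql.SqlWithItem;
    import org.apache.calcite.sql.parser.SqlParser;
    import org.apache.calcite.sql.parser.SqlParserPos;

    public final class WithAssemblySketch {

        public static void main(final String[] args) throws Exception {
            SqlNode cteBody = SqlParser.create("SELECT order_id FROM t_order").parseQuery();
            SqlNode outerQuery = SqlParser.create("SELECT * FROM cte").parseQuery();
            // One with-item per CTE, then the whole list wraps the outer statement.
            SqlWithItem item = new SqlWithItem(SqlParserPos.ZERO, new SqlIdentifier("cte", SqlParserPos.ZERO), null, cteBody);
            SqlNodeList withItems = new SqlNodeList(SqlParserPos.ZERO);
            withItems.add(item);
            System.out.println(new SqlWith(SqlParserPos.ZERO, withItems, outerQuery));
        }
    }
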
diff --git a/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/statement/delete/DeleteStatementConverter.java b/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/statement/delete/DeleteStatementConverter.java
index 154bb26f6db22..a60ed85ce220a 100644
--- a/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/statement/delete/DeleteStatementConverter.java
+++ b/kernel/sql-federation/core/src/main/java/org/apache/shardingsphere/sqlfederation/compiler/converter/statement/delete/DeleteStatementConverter.java
@@ -24,12 +24,14 @@
import org.apache.calcite.sql.SqlOrderBy;
import org.apache.calcite.sql.parser.SqlParserPos;
import org.apache.shardingsphere.sql.parser.sql.common.segment.dml.pagination.limit.LimitSegment;
+import org.apache.shardingsphere.sql.parser.sql.common.segment.generic.WithSegment;
import org.apache.shardingsphere.sql.parser.sql.common.statement.dml.DeleteStatement;
import org.apache.shardingsphere.sql.parser.sql.dialect.handler.dml.DeleteStatementHandler;
import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.from.TableConverter;
import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.limit.PaginationValueSQLConverter;
import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.orderby.OrderByConverter;
import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.where.WhereConverter;
+import org.apache.shardingsphere.sqlfederation.compiler.converter.segment.with.WithConverter;
import org.apache.shardingsphere.sqlfederation.compiler.converter.statement.SQLStatementConverter;
import java.util.Optional;
@@ -41,7 +43,7 @@ public final class DeleteStatementConverter implements SQLStatementConverter new OrderByConverter().convert(optional)).orElse(SqlNodeList.EMPTY);
Optional limit = DeleteStatementHandler.getLimitSegment(deleteStatement);
if (limit.isPresent()) {
@@ -52,10 +54,15 @@ public SqlNode convert(final DeleteStatement deleteStatement) {
return orderBy.isEmpty() ? sqlDelete : new SqlOrderBy(SqlParserPos.ZERO, sqlDelete, orderBy, null, null);
}
- private SqlDelete convertDelete(final DeleteStatement deleteStatement) {
+ private SqlNode convertDelete(final DeleteStatement deleteStatement) {
SqlNode deleteTable = new TableConverter().convert(deleteStatement.getTable()).orElseThrow(IllegalStateException::new);
SqlNode condition = deleteStatement.getWhere().flatMap(optional -> new WhereConverter().convert(optional)).orElse(null);
SqlIdentifier alias = deleteStatement.getTable().getAliasName().map(optional -> new SqlIdentifier(optional, SqlParserPos.ZERO)).orElse(null);
- return new SqlDelete(SqlParserPos.ZERO, deleteTable, condition, null, alias);
+ SqlDelete sqlDelete = new SqlDelete(SqlParserPos.ZERO, deleteTable, condition, null, alias);
+ Optional with = DeleteStatementHandler.getWithSegment(deleteStatement);
+ if (with.isPresent()) {
+ return new WithConverter().convert(DeleteStatementHandler.getWithSegment(deleteStatement).get(), sqlDelete).get();
+ }
+ return sqlDelete;
}
}
diff --git a/kernel/transaction/type/base/seata-at/src/test/resources/registry.conf b/kernel/transaction/type/base/seata-at/src/test/resources/registry.conf
index 3be5402d98f94..2ed8a6903a08b 100644
--- a/kernel/transaction/type/base/seata-at/src/test/resources/registry.conf
+++ b/kernel/transaction/type/base/seata-at/src/test/resources/registry.conf
@@ -16,14 +16,9 @@
#
registry {
- # file 、nacos 、eureka、redis、zk
+ # file 、eureka、redis、zk
type = "file"
- nacos {
- serverAddr = "localhost"
- namespace = "public"
- cluster = "default"
- }
eureka {
serviceUrl = "http://localhost:1001/eureka"
application = "default"
@@ -45,18 +40,9 @@ registry {
}
config {
- # file、nacos 、apollo、zk
+ # file、zk
type = "file"
- nacos {
- serverAddr = "localhost"
- namespace = "public"
- cluster = "default"
- }
- apollo {
- app.id = "fescar-server"
- apollo.meta = "http://192.168.1.204:8801"
- }
zk {
serverAddr = "127.0.0.1:2181"
session.timeout = 6000
diff --git a/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/NewResourceSwitchManager.java b/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/NewResourceSwitchManager.java
index 61a0a09eb6e2f..96e8310f3a4e2 100644
--- a/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/NewResourceSwitchManager.java
+++ b/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/NewResourceSwitchManager.java
@@ -19,12 +19,11 @@
import org.apache.shardingsphere.infra.datasource.pool.creator.DataSourcePoolCreator;
import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
-import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
+import org.apache.shardingsphere.infra.metadata.database.resource.ResourceMetaData;
import org.apache.shardingsphere.infra.metadata.database.resource.StorageResource;
import org.apache.shardingsphere.infra.metadata.database.resource.StorageResourceCreator;
-import org.apache.shardingsphere.infra.metadata.database.resource.StorageResourceWithProperties;
+import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;
-import org.apache.shardingsphere.infra.metadata.database.resource.ResourceMetaData;
import javax.sql.DataSource;
import java.util.Collections;
@@ -49,12 +48,12 @@ public final class NewResourceSwitchManager {
public SwitchingResource registerStorageUnit(final ResourceMetaData resourceMetaData, final Map propsMap) {
Map mergedPropsMap = new HashMap<>(resourceMetaData.getStorageUnitMetaData().getDataSourcePoolPropertiesMap());
mergedPropsMap.putAll(propsMap);
- StorageResourceWithProperties toBeCreatedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(propsMap);
+ StorageResource toBeCreatedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(propsMap);
return new SwitchingResource(resourceMetaData, getRegisterNewStorageResource(resourceMetaData, toBeCreatedStorageResource),
new StorageResource(Collections.emptyMap(), Collections.emptyMap()), mergedPropsMap);
}
- private StorageResource getRegisterNewStorageResource(final ResourceMetaData resourceMetaData, final StorageResourceWithProperties toBeCreatedStorageResource) {
+ private StorageResource getRegisterNewStorageResource(final ResourceMetaData resourceMetaData, final StorageResource toBeCreatedStorageResource) {
Map<StorageNode, DataSource> storageNodes = new LinkedHashMap<>(toBeCreatedStorageResource.getDataSourceMap().size(), 1F);
for (StorageNode each : toBeCreatedStorageResource.getDataSourceMap().keySet()) {
if (!resourceMetaData.getDataSourceMap().containsKey(each)) {
@@ -74,12 +73,12 @@ private StorageResource getRegisterNewStorageResource(final ResourceMetaData res
public SwitchingResource alterStorageUnit(final ResourceMetaData resourceMetaData, final Map<String, DataSourcePoolProperties> propsMap) {
Map<String, DataSourcePoolProperties> mergedDataSourcePoolPropertiesMap = new HashMap<>(resourceMetaData.getStorageUnitMetaData().getDataSourcePoolPropertiesMap());
mergedDataSourcePoolPropertiesMap.putAll(propsMap);
- StorageResourceWithProperties toBeAlteredStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(mergedDataSourcePoolPropertiesMap);
+ StorageResource toBeAlteredStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(mergedDataSourcePoolPropertiesMap);
return new SwitchingResource(resourceMetaData, getAlterNewStorageResource(toBeAlteredStorageResource),
getStaleStorageResource(resourceMetaData, toBeAlteredStorageResource), mergedDataSourcePoolPropertiesMap);
}
- private StorageResource getAlterNewStorageResource(final StorageResourceWithProperties toBeAlteredStorageResource) {
+ private StorageResource getAlterNewStorageResource(final StorageResource toBeAlteredStorageResource) {
Map<StorageNode, DataSource> storageNodes = new LinkedHashMap<>(toBeAlteredStorageResource.getDataSourceMap().size(), 1F);
for (StorageNode each : toBeAlteredStorageResource.getDataSourceMap().keySet()) {
storageNodes.put(each, DataSourcePoolCreator.create(toBeAlteredStorageResource.getDataSourcePoolPropertiesMap().get(each.getName())));
@@ -87,7 +86,7 @@ private StorageResource getAlterNewStorageResource(final StorageResourceWithProp
return new StorageResource(storageNodes, toBeAlteredStorageResource.getStorageUnitNodeMappers());
}
- private StorageResource getStaleStorageResource(final ResourceMetaData resourceMetaData, final StorageResourceWithProperties toBeAlteredStorageResource) {
+ private StorageResource getStaleStorageResource(final ResourceMetaData resourceMetaData, final StorageResource toBeAlteredStorageResource) {
Map<StorageNode, DataSource> storageNodes = new LinkedHashMap<>(toBeAlteredStorageResource.getDataSourceMap().size(), 1F);
for (Entry<StorageNode, DataSource> entry : resourceMetaData.getDataSourceMap().entrySet()) {
if (toBeAlteredStorageResource.getDataSourceMap().containsKey(entry.getKey())) {
diff --git a/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/ResourceSwitchManager.java b/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/ResourceSwitchManager.java
index 209dc2d7dc6ea..65e5fbdf8a77a 100644
--- a/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/ResourceSwitchManager.java
+++ b/mode/core/src/main/java/org/apache/shardingsphere/mode/manager/switcher/ResourceSwitchManager.java
@@ -20,13 +20,12 @@
import org.apache.shardingsphere.infra.datasource.pool.creator.DataSourcePoolCreator;
import org.apache.shardingsphere.infra.datasource.pool.props.creator.DataSourcePoolPropertiesCreator;
import org.apache.shardingsphere.infra.datasource.pool.props.domain.DataSourcePoolProperties;
-import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
+import org.apache.shardingsphere.infra.metadata.database.resource.ResourceMetaData;
import org.apache.shardingsphere.infra.metadata.database.resource.StorageResource;
import org.apache.shardingsphere.infra.metadata.database.resource.StorageResourceCreator;
-import org.apache.shardingsphere.infra.metadata.database.resource.StorageResourceWithProperties;
+import org.apache.shardingsphere.infra.metadata.database.resource.node.StorageNode;
import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnit;
import org.apache.shardingsphere.infra.metadata.database.resource.unit.StorageUnitNodeMapper;
-import org.apache.shardingsphere.infra.metadata.database.resource.ResourceMetaData;
import javax.sql.DataSource;
import java.util.Collection;
@@ -52,7 +51,7 @@ public final class ResourceSwitchManager {
public SwitchingResource create(final ResourceMetaData resourceMetaData, final Map<String, DataSourcePoolProperties> toBeChangedPropsMap) {
Map<String, DataSourcePoolProperties> mergedPropsMap = new HashMap<>(resourceMetaData.getStorageUnitMetaData().getDataSourcePoolPropertiesMap());
mergedPropsMap.putAll(toBeChangedPropsMap);
- StorageResourceWithProperties toBeChangedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeChangedPropsMap);
+ StorageResource toBeChangedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeChangedPropsMap);
return new SwitchingResource(resourceMetaData, createNewStorageResource(resourceMetaData, toBeChangedStorageResource),
getStaleDataSources(resourceMetaData, toBeChangedStorageResource), mergedPropsMap);
}
@@ -67,7 +66,7 @@ public SwitchingResource create(final ResourceMetaData resourceMetaData, final M
public SwitchingResource createByDropResource(final ResourceMetaData resourceMetaData, final Map<String, DataSourcePoolProperties> toBeDeletedPropsMap) {
Map<String, DataSourcePoolProperties> mergedDataSourcePoolPropertiesMap = new HashMap<>(resourceMetaData.getStorageUnitMetaData().getDataSourcePoolPropertiesMap());
mergedDataSourcePoolPropertiesMap.keySet().removeIf(toBeDeletedPropsMap::containsKey);
- StorageResourceWithProperties toToBeRemovedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeDeletedPropsMap);
+ StorageResource toToBeRemovedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeDeletedPropsMap);
return new SwitchingResource(resourceMetaData, new StorageResource(Collections.emptyMap(), Collections.emptyMap()),
getToBeRemovedStaleDataSources(resourceMetaData, toToBeRemovedStorageResource), mergedDataSourcePoolPropertiesMap);
}
@@ -83,7 +82,7 @@ public SwitchingResource createByAlterDataSourcePoolProperties(final ResourceMet
Map<String, DataSourcePoolProperties> mergedDataSourcePoolPropertiesMap = new HashMap<>(resourceMetaData.getStorageUnitMetaData().getDataSourcePoolPropertiesMap());
mergedDataSourcePoolPropertiesMap.keySet().removeIf(each -> !toBeChangedPropsMap.containsKey(each));
mergedDataSourcePoolPropertiesMap.putAll(toBeChangedPropsMap);
- StorageResourceWithProperties toBeChangedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeChangedPropsMap);
+ StorageResource toBeChangedStorageResource = StorageResourceCreator.createStorageResourceWithoutDataSource(toBeChangedPropsMap);
StorageResource staleStorageResource = getStaleDataSources(resourceMetaData, toBeChangedStorageResource);
staleStorageResource.getDataSourceMap()
.putAll(getToBeDeletedDataSources(resourceMetaData.getDataSourceMap(), toBeChangedStorageResource.getDataSourceMap().keySet()));
@@ -92,7 +91,7 @@ public SwitchingResource createByAlterDataSourcePoolProperties(final ResourceMet
return new SwitchingResource(resourceMetaData, createNewStorageResource(resourceMetaData, toBeChangedStorageResource), staleStorageResource, mergedDataSourcePoolPropertiesMap);
}
- private StorageResource createNewStorageResource(final ResourceMetaData resourceMetaData, final StorageResourceWithProperties toBeChangedStorageResource) {
+ private StorageResource createNewStorageResource(final ResourceMetaData resourceMetaData, final StorageResource toBeChangedStorageResource) {
Map<StorageNode, DataSource> storageNodes =
getNewStorageNodes(resourceMetaData, toBeChangedStorageResource.getDataSourceMap(), toBeChangedStorageResource.getDataSourcePoolPropertiesMap());
Map<String, StorageUnitNodeMapper> storageUnitNodeMappers = getNewStorageUnitNodeMappers(resourceMetaData, toBeChangedStorageResource.getStorageUnitNodeMappers());
@@ -141,7 +140,7 @@ private Map getToBeAddedDataSources(final Map reservedStorageUnitNodeMappers = resourceMetaData.getStorageUnitMetaData().getStorageUnits().entrySet().stream()
.filter(entry -> !toBeRemovedStorageResource.getStorageUnitNodeMappers().containsKey(entry.getKey()))
.collect(Collectors.toMap(Entry::getKey, entry -> entry.getValue().getUnitNodeMapper()));
@@ -154,7 +153,7 @@ private StorageResource getToBeRemovedStaleDataSources(final ResourceMetaData re
return new StorageResource(staleStorageNodes, staleStorageUnitNodeMappers);
}
- private StorageResource getStaleDataSources(final ResourceMetaData resourceMetaData, final StorageResourceWithProperties toBeChangedStorageResource) {
+ private StorageResource getStaleDataSources(final ResourceMetaData resourceMetaData, final StorageResource toBeChangedStorageResource) {
Map<StorageNode, DataSource> storageNodes = new LinkedHashMap<>(resourceMetaData.getDataSourceMap().size(), 1F);
Map<String, StorageUnitNodeMapper> storageUnitNodeMappers = new LinkedHashMap<>(resourceMetaData.getStorageUnitMetaData().getUnitNodeMappers().size(), 1F);
storageNodes.putAll(getToBeChangedDataSources(resourceMetaData.getDataSourceMap(), toBeChangedStorageResource.getDataSourcePoolPropertiesMap()));
diff --git a/mode/type/cluster/repository/provider/nacos/pom.xml b/mode/type/cluster/repository/provider/nacos/pom.xml
deleted file mode 100644
index 15f90bd3c676a..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/pom.xml
+++ /dev/null
@@ -1,53 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements. See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License. You may obtain a copy of the License at
-  ~
-  ~     http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-    <parent>
-        <groupId>org.apache.shardingsphere</groupId>
-        <artifactId>shardingsphere-cluster-mode-repository-provider</artifactId>
-        <version>5.4.1-SNAPSHOT</version>
-    </parent>
-    <artifactId>shardingsphere-cluster-mode-repository-nacos</artifactId>
-    <name>${project.artifactId}</name>
-
-    <properties>
-        <nacos.version>1.4.2</nacos.version>
-    </properties>
-
-    <dependencies>
-        <dependency>
-            <groupId>org.apache.shardingsphere</groupId>
-            <artifactId>shardingsphere-cluster-mode-repository-api</artifactId>
-            <version>${project.version}</version>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.shardingsphere</groupId>
-            <artifactId>shardingsphere-test-util</artifactId>
-            <version>${project.version}</version>
-            <scope>test</scope>
-        </dependency>
-        <dependency>
-            <groupId>com.alibaba.nacos</groupId>
-            <artifactId>nacos-client</artifactId>
-            <version>${nacos.version}</version>
-        </dependency>
-    </dependencies>
-</project>
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepository.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepository.java
deleted file mode 100644
index 0a076804be0f7..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepository.java
+++ /dev/null
@@ -1,363 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos;
-
-import com.alibaba.nacos.api.exception.NacosException;
-import com.alibaba.nacos.api.naming.NamingFactory;
-import com.alibaba.nacos.api.naming.NamingService;
-import com.alibaba.nacos.api.naming.PreservedMetadataKeys;
-import com.alibaba.nacos.api.naming.pojo.Instance;
-import com.google.common.base.Preconditions;
-import com.google.common.base.Strings;
-import lombok.SneakyThrows;
-import org.apache.shardingsphere.infra.instance.util.IpUtils;
-import org.apache.shardingsphere.mode.repository.cluster.ClusterPersistRepository;
-import org.apache.shardingsphere.mode.repository.cluster.ClusterPersistRepositoryConfiguration;
-import org.apache.shardingsphere.mode.repository.cluster.exception.ClusterPersistRepositoryException;
-import org.apache.shardingsphere.mode.repository.cluster.listener.DataChangedEventListener;
-import org.apache.shardingsphere.mode.repository.cluster.lock.holder.DistributedLockHolder;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.entity.KeyValue;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.entity.ServiceController;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.entity.ServiceMetaData;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.listener.NamingEventListener;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.props.NacosProperties;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.props.NacosPropertyKey;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.util.NacosMetaDataUtils;
-
-import java.security.SecureRandom;
-import java.util.Collection;
-import java.util.Comparator;
-import java.util.HashMap;
-import java.util.LinkedList;
-import java.util.List;
-import java.util.Map;
-import java.util.Map.Entry;
-import java.util.Objects;
-import java.util.Optional;
-import java.util.Properties;
-import java.util.Random;
-import java.util.concurrent.atomic.AtomicInteger;
-import java.util.stream.Collectors;
-import java.util.stream.Stream;
-
-/**
- * Registry repository of Nacos.
- */
-public final class NacosRepository implements ClusterPersistRepository {
-
- private final Random random = new SecureRandom();
-
- private NamingService client;
-
- private NacosProperties nacosProps;
-
- private ServiceController serviceController;
-
- @Override
- public void init(final ClusterPersistRepositoryConfiguration config) {
- nacosProps = new NacosProperties(config.getProps());
- client = createClient(config);
- initServiceMetaData();
- }
-
- private NamingService createClient(final ClusterPersistRepositoryConfiguration config) {
- Properties props = new Properties();
- props.setProperty("serverAddr", config.getServerLists());
- props.setProperty("namespace", config.getNamespace());
- props.setProperty("username", nacosProps.getValue(NacosPropertyKey.USERNAME));
- props.setProperty("password", nacosProps.getValue(NacosPropertyKey.PASSWORD));
- try {
- return NamingFactory.createNamingService(props);
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- private void initServiceMetaData() {
- try {
- String clusterIp = nacosProps.getValue(NacosPropertyKey.CLUSTER_IP);
- String ip = Strings.isNullOrEmpty(clusterIp) ? IpUtils.getIp() : clusterIp;
- serviceController = new ServiceController();
- for (ServiceMetaData each : serviceController.getAllServices()) {
- Integer port = client.getAllInstances(each.getServiceName(), false).stream()
- .filter(instance -> ip.equals(instance.getIp())).map(Instance::getPort).max(Comparator.naturalOrder()).orElse(Integer.MIN_VALUE);
- each.setIp(ip);
- each.setPort(new AtomicInteger(port));
- }
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public void persistEphemeral(final String key, final String value) {
- try {
- Preconditions.checkNotNull(value, "Value can not be null");
- if (!findExistedInstance(key, true).isEmpty()) {
- delete(key);
- }
- put(key, value, true);
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public void persistExclusiveEphemeral(final String key, final String value) {
- try {
- Preconditions.checkState(findExistedInstance(key, true).isEmpty(), "Key `%s` already exists", key);
- put(key, value, true);
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public DistributedLockHolder getDistributedLockHolder() {
- return null;
- }
-
- @Override
- public void watch(final String key, final DataChangedEventListener listener) {
- try {
- for (ServiceMetaData each : serviceController.getAllServices()) {
- NamingEventListener eventListener = each.getListener();
- if (null != eventListener) {
- eventListener.put(key, listener);
- return;
- }
- eventListener = new NamingEventListener();
- eventListener.put(key, listener);
- each.setListener(eventListener);
- client.subscribe(each.getServiceName(), eventListener);
- }
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public String getDirectly(final String key) {
- try {
- for (ServiceMetaData each : serviceController.getAllServices()) {
- Optional<Instance> instance = findExistedInstance(key, each.isEphemeral()).stream().max(Comparator.comparing(NacosMetaDataUtils::getTimestamp));
- if (instance.isPresent()) {
- return NacosMetaDataUtils.getValue(instance.get());
- }
- }
- return null;
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public List<String> getChildrenKeys(final String key) {
- try {
- Stream<String> concatKeys = Stream.empty();
- for (ServiceMetaData each : serviceController.getAllServices()) {
- Stream<String> keys = findExistedInstance(each.isEphemeral()).stream()
- .map(instance -> {
- String fullPath = NacosMetaDataUtils.getKey(instance);
- if (fullPath.startsWith(key + PATH_SEPARATOR)) {
- String pathWithoutPrefix = fullPath.substring((key + PATH_SEPARATOR).length());
- return pathWithoutPrefix.contains(PATH_SEPARATOR) ? pathWithoutPrefix.substring(0, pathWithoutPrefix.indexOf(PATH_SEPARATOR)) : pathWithoutPrefix;
- }
- return null;
- }).filter(Objects::nonNull);
- concatKeys = Stream.concat(concatKeys, keys);
- }
- return concatKeys.distinct().sorted(Comparator.reverseOrder()).collect(Collectors.toList());
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public boolean isExisted(final String key) {
- return false;
- }
-
- @Override
- public void persist(final String key, final String value) {
- try {
- Preconditions.checkNotNull(value, "Value can not be null");
- Optional<Instance> instance = findExistedInstance(key, false).stream().max(Comparator.comparing(NacosMetaDataUtils::getTimestamp));
- if (instance.isPresent()) {
- update(instance.get(), value);
- } else {
- put(key, value, false);
- }
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public void update(final String key, final String value) {
- // TODO
- }
-
- private void update(final Instance instance, final String value) throws NacosException {
- Map<String, String> metaDataMap = instance.getMetadata();
- String key = NacosMetaDataUtils.getKey(instance);
- metaDataMap.put(key, value);
- metaDataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(NacosMetaDataUtils.getTimestamp()));
- instance.setMetadata(metaDataMap);
- ServiceMetaData persistentService = serviceController.getPersistentService();
- client.registerInstance(persistentService.getServiceName(), instance);
- Collection<KeyValue> keyValues = new LinkedList<>();
- keyValues.add(new KeyValue(key, value, instance.isEphemeral()));
- waitValue(keyValues);
- }
-
- private void put(final String key, final String value, final boolean ephemeral) throws NacosException {
- final Collection<KeyValue> keyValues = buildParentPath(key);
- ServiceMetaData serviceMetaData = serviceController.getService(ephemeral);
- Instance instance = new Instance();
- instance.setIp(serviceMetaData.getIp());
- instance.setPort(serviceMetaData.getPort());
- instance.setEphemeral(ephemeral);
- Map<String, String> metadataMap = new HashMap<>(5, 1F);
- if (ephemeral) {
- fillEphemeralMetaData(metadataMap);
- }
- metadataMap.put(key, value);
- metadataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(NacosMetaDataUtils.getTimestamp()));
- instance.setMetadata(metadataMap);
- client.registerInstance(serviceMetaData.getServiceName(), instance);
- keyValues.add(new KeyValue(key, value, ephemeral));
- waitValue(keyValues);
- }
-
- private Collection<KeyValue> buildParentPath(final String key) throws NacosException {
- Collection<KeyValue> result = new LinkedList<>();
- StringBuilder parentPath = new StringBuilder();
- String[] partPath = key.split(PATH_SEPARATOR);
- for (int index = 1; index < partPath.length - 1; index++) {
- String path = parentPath.append(PATH_SEPARATOR).append(partPath[index]).toString();
- if (findExistedInstance(path, false).isEmpty()) {
- result.addAll(build(path));
- }
- }
- return result;
- }
-
- private Collection<KeyValue> build(final String key) throws NacosException {
- Collection<KeyValue> result = new LinkedList<>();
- if (findExistedInstance(key, false).isEmpty()) {
- Instance instance = new Instance();
- ServiceMetaData persistentService = serviceController.getPersistentService();
- instance.setIp(persistentService.getIp());
- instance.setPort(persistentService.getPort());
- instance.setEphemeral(false);
- Map<String, String> metaDataMap = new HashMap<>(2, 1F);
- metaDataMap.put(key, "");
- metaDataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(NacosMetaDataUtils.getTimestamp()));
- instance.setMetadata(metaDataMap);
- client.registerInstance(persistentService.getServiceName(), instance);
- result.add(new KeyValue(key, "", false));
- }
- return result;
- }
-
- private void fillEphemeralMetaData(final Map<String, String> metaDataMap) {
- int timeToLiveSeconds = nacosProps.getValue(NacosPropertyKey.TIME_TO_LIVE_SECONDS);
- metaDataMap.put(PreservedMetadataKeys.HEART_BEAT_INTERVAL, String.valueOf(timeToLiveSeconds * 1000 / 3));
- metaDataMap.put(PreservedMetadataKeys.HEART_BEAT_TIMEOUT, String.valueOf(timeToLiveSeconds * 1000 * 2 / 3));
- metaDataMap.put(PreservedMetadataKeys.IP_DELETE_TIMEOUT, String.valueOf(timeToLiveSeconds * 1000));
- }
-
- @Override
- public void delete(final String key) {
- try {
- for (ServiceMetaData each : serviceController.getAllServices()) {
- Collection<Instance> instances = findExistedInstance(each.isEphemeral()).stream()
- .filter(instance -> {
- String fullPath = NacosMetaDataUtils.getKey(instance);
- return fullPath.startsWith(key + PATH_SEPARATOR) || key.equals(fullPath);
- })
- .sorted(Comparator.comparing(NacosMetaDataUtils::getKey).reversed()).collect(Collectors.toList());
- Collection<KeyValue> keyValues = new LinkedList<>();
- for (Instance instance : instances) {
- client.deregisterInstance(each.getServiceName(), instance);
- keyValues.add(new KeyValue(NacosMetaDataUtils.getKey(instance), null, each.isEphemeral()));
- }
- waitValue(keyValues);
- }
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- private Collection<Instance> findExistedInstance(final String key, final boolean ephemeral) throws NacosException {
- return client.getAllInstances(serviceController.getService(ephemeral).getServiceName(), false).stream()
- .filter(each -> Objects.equals(key, NacosMetaDataUtils.getKey(each))).collect(Collectors.toList());
- }
-
- private Collection<Instance> findExistedInstance(final boolean ephemeral) throws NacosException {
- return client.getAllInstances(serviceController.getService(ephemeral).getServiceName(), false);
- }
-
- @SneakyThrows(InterruptedException.class)
- private void waitValue(final Collection<KeyValue> keyValues) throws NacosException {
- if (!isAvailable(keyValues)) {
- long retryIntervalMilliseconds = nacosProps.getValue(NacosPropertyKey.RETRY_INTERVAL_MILLISECONDS);
- int maxRetries = nacosProps.getValue(NacosPropertyKey.MAX_RETRIES);
- for (int retry = 0; retry < maxRetries; retry++) {
- Thread.sleep(getSleepTimeMs(retry, retryIntervalMilliseconds));
- if (isAvailable(keyValues)) {
- return;
- }
- }
- throw new NacosException(NacosException.RESOURCE_NOT_FOUND, "Wait value availability timeout exceeded");
- }
- }
-
- private boolean isAvailable(final Collection<KeyValue> keyValues) throws NacosException {
- Map<Boolean, List<KeyValue>> keyValueMap = keyValues.stream().collect(Collectors.groupingBy(KeyValue::isEphemeral));
- for (Entry<Boolean, List<KeyValue>> entry : keyValueMap.entrySet()) {
- ServiceMetaData service = serviceController.getService(entry.getKey());
- Map<String, List<Instance>> instanceMap = client.getAllInstances(service.getServiceName(), false).stream().collect(Collectors.groupingBy(NacosMetaDataUtils::getKey));
- keyValues.removeIf(keyValue -> {
- String key = keyValue.getKey();
- String value = keyValue.getValue();
- return instanceMap.containsKey(key) ? instanceMap.get(key).stream().anyMatch(each -> Objects.equals(NacosMetaDataUtils.getValue(each), value)) : null == value;
- });
- }
- return keyValues.isEmpty();
- }
-
- private long getSleepTimeMs(final int retryCount, final long baseSleepTimeMs) {
- return baseSleepTimeMs * Math.max(1, random.nextInt(1 << (retryCount + 1)));
- }
-
- @Override
- public void close() {
- try {
- client.shutDown();
- } catch (final NacosException ex) {
- throw new ClusterPersistRepositoryException(ex);
- }
- }
-
- @Override
- public String getType() {
- return "Nacos";
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/KeyValue.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/KeyValue.java
deleted file mode 100644
index a11b5cbc526b9..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/KeyValue.java
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.entity;
-
-import lombok.Getter;
-import lombok.RequiredArgsConstructor;
-
-/**
- * Key value.
- */
-@RequiredArgsConstructor
-@Getter
-public final class KeyValue {
-
- private final String key;
-
- private final String value;
-
- private final boolean ephemeral;
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceController.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceController.java
deleted file mode 100644
index 5b613961ecfbf..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceController.java
+++ /dev/null
@@ -1,63 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.entity;
-
-import lombok.Getter;
-
-import java.util.Collection;
-import java.util.Map;
-import java.util.function.Function;
-import java.util.stream.Collectors;
-import java.util.stream.Stream;
-
-/**
- * Service controller.
- */
-public final class ServiceController {
-
- private static final String PERSISTENT_SERVICE_NAME = "PERSISTENT";
-
- private static final String EPHEMERAL_SERVICE_NAME = "EPHEMERAL";
-
- @Getter
- private final ServiceMetaData persistentService = new ServiceMetaData(PERSISTENT_SERVICE_NAME, false);
-
- @Getter
- private final ServiceMetaData ephemeralService = new ServiceMetaData(EPHEMERAL_SERVICE_NAME, true);
-
- private final Map<Boolean, ServiceMetaData> serviceMap = Stream.of(persistentService, ephemeralService).collect(Collectors.toMap(ServiceMetaData::isEphemeral, Function.identity()));
-
- /**
- * Get all services.
- *
- * @return all services
- */
- public Collection<ServiceMetaData> getAllServices() {
- return serviceMap.values();
- }
-
- /**
- * Get service.
- *
- * @param ephemeral is ephemeral service
- * @return ephemeral service or persistent service
- */
- public ServiceMetaData getService(final boolean ephemeral) {
- return serviceMap.get(ephemeral);
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceMetaData.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceMetaData.java
deleted file mode 100644
index 75314a59b292e..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/entity/ServiceMetaData.java
+++ /dev/null
@@ -1,56 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.entity;
-
-import com.google.common.base.Preconditions;
-import lombok.Getter;
-import lombok.RequiredArgsConstructor;
-import lombok.Setter;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.listener.NamingEventListener;
-
-import java.util.concurrent.atomic.AtomicInteger;
-
-/**
- * Service meta data.
- */
-@RequiredArgsConstructor
-@Getter
-@Setter
-public final class ServiceMetaData {
-
- private final String serviceName;
-
- private String ip;
-
- private AtomicInteger port;
-
- private NamingEventListener listener;
-
- private final boolean ephemeral;
-
- /**
- * Get incremental port.
- *
- * @return incremental port
- */
- public int getPort() {
- int result = port.incrementAndGet();
- Preconditions.checkState(Integer.MIN_VALUE != result, "Specified cluster ip exceeded the maximum number of persisting");
- return result;
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/NamingEventListener.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/NamingEventListener.java
deleted file mode 100644
index 09516f5bce07c..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/NamingEventListener.java
+++ /dev/null
@@ -1,132 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.listener;
-
-import com.alibaba.nacos.api.naming.listener.Event;
-import com.alibaba.nacos.api.naming.listener.EventListener;
-import com.alibaba.nacos.api.naming.listener.NamingEvent;
-import com.alibaba.nacos.api.naming.pojo.Instance;
-import org.apache.shardingsphere.mode.event.DataChangedEvent;
-import org.apache.shardingsphere.mode.event.DataChangedEvent.Type;
-import org.apache.shardingsphere.mode.repository.cluster.listener.DataChangedEventListener;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.util.NacosMetaDataUtils;
-
-import java.util.Collection;
-import java.util.Comparator;
-import java.util.HashMap;
-import java.util.LinkedList;
-import java.util.Map;
-import java.util.function.Function;
-import java.util.stream.Collectors;
-
-/**
- * Naming event listener.
- */
-public final class NamingEventListener implements EventListener {
-
- private Map<String, Instance> preInstances = new HashMap<>();
-
- private final Map<String, DataChangedEventListener> prefixListenerMap = new HashMap<>();
-
- @Override
- public void onEvent(final Event event) {
- if (!(event instanceof NamingEvent)) {
- return;
- }
- NamingEvent namingEvent = (NamingEvent) event;
- Collection<Instance> instances = namingEvent.getInstances().stream().sorted(Comparator.comparing(NacosMetaDataUtils::getKey)).collect(Collectors.toList());
- Collection<WatchData> watchDataList = new LinkedList<>();
- synchronized (this) {
- instances.forEach(instance -> prefixListenerMap.forEach((prefixPath, listener) -> {
- String key = NacosMetaDataUtils.getKey(instance);
- if (key.startsWith(prefixPath)) {
- Instance preInstance = preInstances.remove(key);
- WatchData watchData = new WatchData(key, preInstance, instance, listener);
- watchDataList.add(watchData);
- }
- }));
- preInstances.values().stream().sorted(Comparator.comparing(NacosMetaDataUtils::getKey).reversed()).forEach(instance -> prefixListenerMap.forEach((prefixPath, listener) -> {
- String key = NacosMetaDataUtils.getKey(instance);
- if (key.startsWith(prefixPath)) {
- WatchData watchData = new WatchData(key, instance, null, listener);
- watchDataList.add(watchData);
- }
- }));
- watchDataList.forEach(this::watch);
- setPreInstances(instances);
- }
- }
-
- private void watch(final WatchData watchData) {
- String key = watchData.getKey();
- Instance preInstance = watchData.getPreInstance();
- Instance instance = watchData.getInstance();
- DataChangedEventListener listener = watchData.getListener();
- Type changedType = getEventChangedType(preInstance, instance);
- switch (changedType) {
- case ADDED:
- case UPDATED:
- listener.onChange(new DataChangedEvent(key, NacosMetaDataUtils.getValue(instance), changedType));
- break;
- case DELETED:
- listener.onChange(new DataChangedEvent(key, NacosMetaDataUtils.getValue(preInstance), changedType));
- break;
- default:
- }
- }
-
- private Type getEventChangedType(final Instance preInstance, final Instance instance) {
- if (null == preInstance && null != instance) {
- return DataChangedEvent.Type.ADDED;
- }
- if (null != preInstance && null != instance && NacosMetaDataUtils.getTimestamp(preInstance) != NacosMetaDataUtils.getTimestamp(instance)) {
- return DataChangedEvent.Type.UPDATED;
- }
- if (null != preInstance && null == instance) {
- return DataChangedEvent.Type.DELETED;
- }
- return DataChangedEvent.Type.IGNORED;
- }
-
- /**
- * Update pre instances.
- *
- * @param instances instances
- */
- public void setPreInstances(final Collection<Instance> instances) {
- preInstances = instances.stream().filter(instance -> {
- for (String each : prefixListenerMap.keySet()) {
- if (NacosMetaDataUtils.getKey(instance).startsWith(each)) {
- return true;
- }
- }
- return false;
- }).collect(Collectors.toMap(NacosMetaDataUtils::getKey, Function.identity(),
- (oldValue, currentValue) -> NacosMetaDataUtils.getTimestamp(oldValue) > NacosMetaDataUtils.getTimestamp(currentValue) ? oldValue : currentValue));
- }
-
- /**
- * Put prefix path and listener.
- *
- * @param prefixPath prefix path
- * @param listener listener
- */
- public synchronized void put(final String prefixPath, final DataChangedEventListener listener) {
- prefixListenerMap.put(prefixPath, listener);
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/WatchData.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/WatchData.java
deleted file mode 100644
index 939bb037a9c3e..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/listener/WatchData.java
+++ /dev/null
@@ -1,39 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.listener;
-
-import com.alibaba.nacos.api.naming.pojo.Instance;
-import lombok.Getter;
-import lombok.RequiredArgsConstructor;
-import org.apache.shardingsphere.mode.repository.cluster.listener.DataChangedEventListener;
-
-/**
- * Watch data.
- */
-@Getter
-@RequiredArgsConstructor
-public final class WatchData {
-
- private final String key;
-
- private final Instance preInstance;
-
- private final Instance instance;
-
- private final DataChangedEventListener listener;
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosProperties.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosProperties.java
deleted file mode 100644
index 5b4e3a5b7e64c..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosProperties.java
+++ /dev/null
@@ -1,32 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.props;
-
-import org.apache.shardingsphere.infra.props.TypedProperties;
-
-import java.util.Properties;
-
-/**
- * Typed properties of Nacos.
- */
- public final class NacosProperties extends TypedProperties<NacosPropertyKey> {
-
- public NacosProperties(final Properties props) {
- super(NacosPropertyKey.class, props);
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertyKey.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertyKey.java
deleted file mode 100644
index fbec54ac9eed5..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertyKey.java
+++ /dev/null
@@ -1,66 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.props;
-
-import lombok.Getter;
-import lombok.RequiredArgsConstructor;
-import org.apache.shardingsphere.infra.props.TypedPropertyKey;
-
-/**
- * Typed property key of Nacos.
- */
-@RequiredArgsConstructor
-@Getter
-public enum NacosPropertyKey implements TypedPropertyKey {
-
- /**
- * Cluster ip.
- */
- CLUSTER_IP("clusterIp", "", String.class),
-
- /**
- * Retry interval milliseconds when checking whether value is available.
- */
- RETRY_INTERVAL_MILLISECONDS("retryIntervalMilliseconds", String.valueOf(500), long.class),
-
- /**
- * Max Retry times when checking whether value is available.
- */
- MAX_RETRIES("maxRetries", String.valueOf(3), int.class),
-
- /**
- * Time to live seconds.
- */
- TIME_TO_LIVE_SECONDS("timeToLiveSeconds", String.valueOf(30), int.class),
-
- /**
- * Username.
- */
- USERNAME("username", "", String.class),
-
- /**
- * Password.
- */
- PASSWORD("password", "", String.class);
-
- private final String key;
-
- private final String defaultValue;
-
- private final Class<?> type;
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/util/NacosMetaDataUtils.java b/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/util/NacosMetaDataUtils.java
deleted file mode 100644
index c56447c8427e6..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/java/org/apache/shardingsphere/mode/repository/cluster/nacos/util/NacosMetaDataUtils.java
+++ /dev/null
@@ -1,82 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.util;
-
-import com.alibaba.nacos.api.exception.NacosException;
-import com.alibaba.nacos.api.naming.PreservedMetadataKeys;
-import com.alibaba.nacos.api.naming.pojo.Instance;
-import lombok.AccessLevel;
-import lombok.NoArgsConstructor;
-import lombok.SneakyThrows;
-
-import java.time.LocalDateTime;
-import java.time.ZoneOffset;
-
-/**
- * Nacos meta data utility class.
- */
-@NoArgsConstructor(access = AccessLevel.PRIVATE)
-public final class NacosMetaDataUtils {
-
- public static final ZoneOffset UTC_ZONE_OFFSET = ZoneOffset.of("+8");
-
- /**
- * Get timestamp.
- *
- * @param instance instance
- * @return timestamp
- */
- public static long getTimestamp(final Instance instance) {
- return Long.parseLong(instance.getMetadata().get(UTC_ZONE_OFFSET.toString()));
- }
-
- /**
- * Get timestamp.
- *
- * @return timeStamp
- */
- public static long getTimestamp() {
- return LocalDateTime.now().toInstant(UTC_ZONE_OFFSET).toEpochMilli();
- }
-
- /**
- * Get value.
- *
- * @param instance instance
- * @return value
- */
- public static String getValue(final Instance instance) {
- return instance.getMetadata().get(getKey(instance));
- }
-
- /**
- * Get key.
- *
- * @param instance instance
- * @return key
- */
- @SneakyThrows(NacosException.class)
- public static String getKey(final Instance instance) {
- return instance.getMetadata().keySet().stream()
- .filter(entryKey -> !PreservedMetadataKeys.HEART_BEAT_INTERVAL.equals(entryKey)
- && !PreservedMetadataKeys.HEART_BEAT_TIMEOUT.equals(entryKey)
- && !PreservedMetadataKeys.IP_DELETE_TIMEOUT.equals(entryKey)
- && !UTC_ZONE_OFFSET.toString().equals(entryKey))
- .findFirst().orElseThrow(() -> new NacosException(NacosException.RESOURCE_NOT_FOUND, "Failed to find key "));
- }
-}
diff --git a/mode/type/cluster/repository/provider/nacos/src/main/resources/META-INF/services/org.apache.shardingsphere.mode.repository.cluster.ClusterPersistRepository b/mode/type/cluster/repository/provider/nacos/src/main/resources/META-INF/services/org.apache.shardingsphere.mode.repository.cluster.ClusterPersistRepository
deleted file mode 100644
index 7ab9d965fcb9e..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/main/resources/META-INF/services/org.apache.shardingsphere.mode.repository.cluster.ClusterPersistRepository
+++ /dev/null
@@ -1,18 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-org.apache.shardingsphere.mode.repository.cluster.nacos.NacosRepository
diff --git a/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepositoryTest.java b/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepositoryTest.java
deleted file mode 100644
index efccc41fb4e2f..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/NacosRepositoryTest.java
+++ /dev/null
@@ -1,377 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos;
-
-import com.alibaba.nacos.api.exception.NacosException;
-import com.alibaba.nacos.api.naming.NamingService;
-import com.alibaba.nacos.api.naming.PreservedMetadataKeys;
-import com.alibaba.nacos.api.naming.listener.Event;
-import com.alibaba.nacos.api.naming.listener.EventListener;
-import com.alibaba.nacos.api.naming.listener.NamingEvent;
-import com.alibaba.nacos.api.naming.pojo.Instance;
-import com.google.common.util.concurrent.SettableFuture;
-import org.apache.shardingsphere.mode.event.DataChangedEvent;
-import org.apache.shardingsphere.mode.repository.cluster.exception.ClusterPersistRepositoryException;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.entity.ServiceController;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.entity.ServiceMetaData;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.props.NacosProperties;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.props.NacosPropertyKey;
-import org.apache.shardingsphere.mode.repository.cluster.nacos.util.NacosMetaDataUtils;
-import org.apache.shardingsphere.mode.spi.PersistRepository;
-import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
-import org.junit.jupiter.api.extension.ExtendWith;
-import org.mockito.AdditionalAnswers;
-import org.mockito.ArgumentCaptor;
-import org.mockito.Mock;
-import org.mockito.internal.configuration.plugins.Plugins;
-import org.mockito.junit.jupiter.MockitoExtension;
-import org.mockito.plugins.MemberAccessor;
-import org.mockito.stubbing.VoidAnswer2;
-
-import java.util.Collections;
-import java.util.HashMap;
-import java.util.LinkedList;
-import java.util.List;
-import java.util.Map;
-import java.util.Objects;
-import java.util.Properties;
-import java.util.concurrent.ExecutionException;
-import java.util.concurrent.atomic.AtomicInteger;
-
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
-import static org.junit.jupiter.api.Assertions.assertThrows;
-import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.ArgumentMatchers.anyString;
-import static org.mockito.Mockito.doAnswer;
-import static org.mockito.Mockito.times;
-import static org.mockito.Mockito.verify;
-import static org.mockito.Mockito.when;
-
-@ExtendWith(MockitoExtension.class)
-class NacosRepositoryTest {
-
- private static final NacosRepository REPOSITORY = new NacosRepository();
-
- @Mock
- private NamingService client;
-
- private ServiceController serviceController;
-
- @BeforeEach
- void initClient() throws ReflectiveOperationException {
- MemberAccessor accessor = Plugins.getMemberAccessor();
- accessor.set(REPOSITORY.getClass().getDeclaredField("nacosProps"), REPOSITORY, new NacosProperties(new Properties()));
- accessor.set(REPOSITORY.getClass().getDeclaredField("client"), REPOSITORY, client);
- accessor.invoke(REPOSITORY.getClass().getDeclaredMethod("initServiceMetaData"), REPOSITORY);
- serviceController = (ServiceController) accessor.get(REPOSITORY.getClass().getDeclaredField("serviceController"), REPOSITORY);
- }
-
- @Test
- void assertGetLatestKey() throws NacosException {
- int total = 2;
- String key = "/test/children/keys/persistent/1";
- List instances = new LinkedList<>();
- for (int count = 1; count <= total; count++) {
- Instance instance = new Instance();
- Map metaDataMap = new HashMap<>(2, 1F);
- metaDataMap.put(key, "value" + count);
- metaDataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(count));
- instance.setMetadata(metaDataMap);
- instances.add(instance);
- }
- ServiceMetaData persistentService = serviceController.getPersistentService();
- when(client.getAllInstances(persistentService.getServiceName(), false)).thenReturn(instances);
- String value = REPOSITORY.getDirectly(key);
- assertThat(value, is("value2"));
- }
-
- @Test
- void assertGetChildrenKeys() throws NacosException {
- Instance instance = new Instance();
- String key = "/test/children/keys/persistent/0";
- instance.setMetadata(Collections.singletonMap(key, "value0"));
- ServiceMetaData persistentService = serviceController.getPersistentService();
- when(client.getAllInstances(persistentService.getServiceName(), false)).thenReturn(Collections.singletonList(instance));
- instance = new Instance();
- key = "/test/children/keys/ephemeral/0";
- instance.setMetadata(Collections.singletonMap(key, "value0"));
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- when(client.getAllInstances(ephemeralService.getServiceName(), false)).thenReturn(Collections.singletonList(instance));
- List childrenKeys = REPOSITORY.getChildrenKeys("/test/children/keys");
- assertThat(childrenKeys.size(), is(2));
- assertThat(childrenKeys.get(0), is("persistent"));
- assertThat(childrenKeys.get(1), is("ephemeral"));
- }
-
- @Test
- void assertPersistNotExistKey() throws NacosException {
- String key = "/test/children/keys/persistent/1";
- doAnswer(AdditionalAnswers.answerVoid(getRegisterInstanceAnswer())).when(client).registerInstance(anyString(), any(Instance.class));
- REPOSITORY.persist(key, "value4");
- ArgumentCaptor instanceArgumentCaptor = ArgumentCaptor.forClass(Instance.class);
- ArgumentCaptor stringArgumentCaptor = ArgumentCaptor.forClass(String.class);
- verify(client, times(5)).registerInstance(stringArgumentCaptor.capture(), instanceArgumentCaptor.capture());
- Instance registerInstance = instanceArgumentCaptor.getValue();
- String registerType = stringArgumentCaptor.getValue();
- ServiceMetaData persistentService = serviceController.getPersistentService();
- assertThat(registerType, is(persistentService.getServiceName()));
- assertThat(registerInstance.isEphemeral(), is(false));
- assertThat(NacosMetaDataUtils.getValue(registerInstance), is("value4"));
- }
-
- @Test
- void assertPersistExistKey() throws NacosException {
- String ip = "127.0.0.1";
- Instance instance = new Instance();
- instance.setIp(ip);
- instance.setEphemeral(false);
- String key = "/test/children/keys/persistent/0";
- instance.setMetadata(new HashMap<>(Collections.singletonMap(key, "value0")));
- List instances = new LinkedList<>();
- buildParentPath(key, instances);
- instances.add(instance);
- ServiceMetaData persistentService = serviceController.getPersistentService();
- when(client.getAllInstances(persistentService.getServiceName(), false)).thenReturn(instances);
- doAnswer(AdditionalAnswers.answerVoid(getRegisterInstanceAnswer())).when(client).registerInstance(anyString(), any(Instance.class));
- REPOSITORY.persist(key, "value4");
- ArgumentCaptor instanceArgumentCaptor = ArgumentCaptor.forClass(Instance.class);
- ArgumentCaptor stringArgumentCaptor = ArgumentCaptor.forClass(String.class);
- verify(client).registerInstance(stringArgumentCaptor.capture(), instanceArgumentCaptor.capture());
- Instance registerInstance = instanceArgumentCaptor.getValue();
- String registerType = stringArgumentCaptor.getValue();
- assertThat(registerType, is(persistentService.getServiceName()));
- assertThat(registerInstance.getIp(), is(ip));
- assertThat(registerInstance.isEphemeral(), is(false));
- assertThat(NacosMetaDataUtils.getValue(registerInstance), is("value4"));
- }
-
- @Test
- void assertPersistEphemeralExistKey() throws NacosException {
- final String key = "/test/children/keys/ephemeral/1";
- final Instance instance = new Instance();
- instance.setEphemeral(true);
- Map metaDataMap = new HashMap<>(4, 1F);
- metaDataMap.put(PreservedMetadataKeys.HEART_BEAT_INTERVAL, String.valueOf(2000));
- metaDataMap.put(PreservedMetadataKeys.HEART_BEAT_TIMEOUT, String.valueOf(4000));
- metaDataMap.put(PreservedMetadataKeys.IP_DELETE_TIMEOUT, String.valueOf(6000));
- metaDataMap.put(key, "value0");
- instance.setMetadata(metaDataMap);
- List instances = new LinkedList<>();
- buildParentPath(key, instances);
- ServiceMetaData persistentService = serviceController.getPersistentService();
- when(client.getAllInstances(persistentService.getServiceName(), false)).thenReturn(instances);
- instances = new LinkedList<>();
- instances.add(instance);
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- when(client.getAllInstances(ephemeralService.getServiceName(), false)).thenReturn(instances);
- doAnswer(AdditionalAnswers.answerVoid(getDeregisterInstanceAnswer())).when(client).deregisterInstance(anyString(), any(Instance.class));
- doAnswer(AdditionalAnswers.answerVoid(getRegisterInstanceAnswer())).when(client).registerInstance(anyString(), any(Instance.class));
- REPOSITORY.persistEphemeral(key, "value4");
- ArgumentCaptor<Instance> instanceArgumentCaptor = ArgumentCaptor.forClass(Instance.class);
- ArgumentCaptor<String> stringArgumentCaptor = ArgumentCaptor.forClass(String.class);
- verify(client).deregisterInstance(anyString(), any(Instance.class));
- verify(client).registerInstance(stringArgumentCaptor.capture(), instanceArgumentCaptor.capture());
- Instance registerInstance = instanceArgumentCaptor.getValue();
- String registerType = stringArgumentCaptor.getValue();
- assertThat(registerType, is(ephemeralService.getServiceName()));
- assertThat(registerInstance.isEphemeral(), is(true));
- assertThat(NacosMetaDataUtils.getValue(registerInstance), is("value4"));
- Map<String, String> metaData = registerInstance.getMetadata();
- long timeToLiveSeconds = Long.parseLong(NacosPropertyKey.TIME_TO_LIVE_SECONDS.getDefaultValue());
- assertThat(metaData.get(PreservedMetadataKeys.HEART_BEAT_INTERVAL), is(String.valueOf(timeToLiveSeconds * 1000 / 3)));
- assertThat(metaData.get(PreservedMetadataKeys.HEART_BEAT_TIMEOUT), is(String.valueOf(timeToLiveSeconds * 1000 * 2 / 3)));
- assertThat(metaData.get(PreservedMetadataKeys.IP_DELETE_TIMEOUT), is(String.valueOf(timeToLiveSeconds * 1000)));
- }
-
- private void buildParentPath(final String key, final List<Instance> instances) {
- StringBuilder parentPath = new StringBuilder();
- final String[] partPath = key.split(PersistRepository.PATH_SEPARATOR);
- for (int index = 1; index < partPath.length - 1; index++) {
- parentPath.append(PersistRepository.PATH_SEPARATOR);
- parentPath.append(partPath[index]);
- String path = parentPath.toString();
- Instance instance = new Instance();
- instance.setEphemeral(false);
- instance.setMetadata(Collections.singletonMap(path, ""));
- instances.add(instance);
- }
- }
-
- @Test
- void assertPersistEphemeralNotExistKey() throws NacosException {
- String key = "/test/children/keys/ephemeral/0";
- doAnswer(AdditionalAnswers.answerVoid(getRegisterInstanceAnswer())).when(client).registerInstance(anyString(), any(Instance.class));
- REPOSITORY.persistEphemeral(key, "value0");
- ArgumentCaptor<Instance> instanceArgumentCaptor = ArgumentCaptor.forClass(Instance.class);
- ArgumentCaptor<String> stringArgumentCaptor = ArgumentCaptor.forClass(String.class);
- verify(client, times(5)).registerInstance(stringArgumentCaptor.capture(), instanceArgumentCaptor.capture());
- Instance registerInstance = instanceArgumentCaptor.getValue();
- String registerType = stringArgumentCaptor.getValue();
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- assertThat(registerType, is(ephemeralService.getServiceName()));
- assertThat(registerInstance.isEphemeral(), is(true));
- assertThat(NacosMetaDataUtils.getValue(registerInstance), is("value0"));
- Map<String, String> metaData = registerInstance.getMetadata();
- long timeToLiveSeconds = Long.parseLong(NacosPropertyKey.TIME_TO_LIVE_SECONDS.getDefaultValue());
- assertThat(metaData.get(PreservedMetadataKeys.HEART_BEAT_INTERVAL), is(String.valueOf(timeToLiveSeconds * 1000 / 3)));
- assertThat(metaData.get(PreservedMetadataKeys.HEART_BEAT_TIMEOUT), is(String.valueOf(timeToLiveSeconds * 1000 * 2 / 3)));
- assertThat(metaData.get(PreservedMetadataKeys.IP_DELETE_TIMEOUT), is(String.valueOf(timeToLiveSeconds * 1000)));
- }
-
- @Test
- void assertDeleteExistKey() throws NacosException {
- int total = 3;
- List<Instance> instances = new LinkedList<>();
- for (int count = 1; count <= total; count++) {
- String key = "/test/children/keys/ephemeral/" + count;
- Instance instance = new Instance();
- instance.setEphemeral(true);
- instance.setMetadata(Collections.singletonMap(key, "value" + count));
- instances.add(instance);
- }
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- when(client.getAllInstances(ephemeralService.getServiceName(), false)).thenReturn(instances);
- instances = new LinkedList<>();
- String key = "/test/children/keys/persistent/0";
- Instance instance = new Instance();
- instance.setEphemeral(false);
- instance.setMetadata(Collections.singletonMap(key, "value0"));
- instances.add(instance);
- ServiceMetaData persistentService = serviceController.getPersistentService();
- when(client.getAllInstances(persistentService.getServiceName(), false)).thenReturn(instances);
- doAnswer(AdditionalAnswers.answerVoid(getDeregisterInstanceAnswer())).when(client).deregisterInstance(anyString(), any(Instance.class));
- REPOSITORY.delete("/test/children/keys");
- verify(client, times(4)).deregisterInstance(anyString(), any(Instance.class));
- }
-
- @Test
- void assertDeleteNotExistKey() throws NacosException {
- REPOSITORY.delete("/test/children/keys/persistent/1");
- verify(client, times(0)).deregisterInstance(anyString(), any(Instance.class));
- }
-
- @Test
- void assertWatchAdded() throws NacosException, ExecutionException, InterruptedException {
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- ephemeralService.setListener(null);
- String key = "key/key";
- String value = "value2";
- Instance instance = new Instance();
- instance.setMetadata(Collections.singletonMap(key, value));
- Event event = new NamingEvent(ephemeralService.getServiceName(), Collections.singletonList(instance));
- doAnswer(AdditionalAnswers.answerVoid(getListenerAnswer(null, event))).when(client).subscribe(anyString(), any(EventListener.class));
- SettableFuture<DataChangedEvent> settableFuture = SettableFuture.create();
- REPOSITORY.watch(key, settableFuture::set);
- DataChangedEvent dataChangedEvent = settableFuture.get();
- assertThat(dataChangedEvent.getType(), is(DataChangedEvent.Type.ADDED));
- assertThat(dataChangedEvent.getKey(), is(key));
- assertThat(dataChangedEvent.getValue(), is(value));
- }
-
- @Test
- void assertWatchUpdate() throws NacosException, ExecutionException, InterruptedException {
- ServiceMetaData persistentService = serviceController.getPersistentService();
- persistentService.setListener(null);
- String key = "key/key";
- long epochMilliseconds = NacosMetaDataUtils.getTimestamp();
- Instance preInstance = new Instance();
- Map<String, String> metaDataMap = new HashMap<>();
- metaDataMap.put(key, "value1");
- metaDataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(epochMilliseconds));
- preInstance.setMetadata(metaDataMap);
- final Instance instance = new Instance();
- metaDataMap = new HashMap<>();
- metaDataMap.put(key, "value2");
- metaDataMap.put(NacosMetaDataUtils.UTC_ZONE_OFFSET.toString(), String.valueOf(epochMilliseconds + 1));
- instance.setMetadata(metaDataMap);
- Event event = new NamingEvent(persistentService.getServiceName(), Collections.singletonList(instance));
- doAnswer(AdditionalAnswers.answerVoid(getListenerAnswer(preInstance, event))).when(client).subscribe(anyString(), any(EventListener.class));
- SettableFuture<DataChangedEvent> settableFuture = SettableFuture.create();
- REPOSITORY.watch(key, settableFuture::set);
- DataChangedEvent dataChangedEvent = settableFuture.get();
- assertThat(dataChangedEvent.getType(), is(DataChangedEvent.Type.UPDATED));
- assertThat(dataChangedEvent.getKey(), is(key));
- assertThat(dataChangedEvent.getValue(), is("value2"));
- }
-
- @Test
- void assertWatchDelete() throws NacosException, ExecutionException, InterruptedException {
- ServiceMetaData persistentService = serviceController.getPersistentService();
- persistentService.setListener(null);
- String key = "key/key";
- Instance preInstance = new Instance();
- preInstance.setMetadata(Collections.singletonMap(key, "value1"));
- Event event = new NamingEvent(persistentService.getServiceName(), Collections.emptyList());
- doAnswer(AdditionalAnswers.answerVoid(getListenerAnswer(preInstance, event))).when(client).subscribe(anyString(), any(EventListener.class));
- SettableFuture<DataChangedEvent> settableFuture = SettableFuture.create();
- REPOSITORY.watch(key, settableFuture::set);
- DataChangedEvent dataChangedEvent = settableFuture.get();
- assertThat(dataChangedEvent.getType(), is(DataChangedEvent.Type.DELETED));
- assertThat(dataChangedEvent.getKey(), is(key));
- assertThat(dataChangedEvent.getValue(), is("value1"));
- }
-
- @Test
- void assertClose() throws NacosException {
- REPOSITORY.close();
- verify(client).shutDown();
- }
-
- @Test
- void assertPersistNotAvailable() {
- assertThrows(ClusterPersistRepositoryException.class, () -> REPOSITORY.persist("/test/children/keys/persistent/1", "value4"));
- }
-
- @Test
- void assertExceededMaximum() {
- ServiceMetaData ephemeralService = serviceController.getEphemeralService();
- ephemeralService.setPort(new AtomicInteger(Integer.MAX_VALUE));
- assertThrows(IllegalStateException.class, () -> REPOSITORY.persistEphemeral("/key2", "value"));
- }
-
- private VoidAnswer2<String, EventListener> getListenerAnswer(final Instance preInstance, final Event event) {
- return (serviceName, listener) -> {
- MemberAccessor accessor = Plugins.getMemberAccessor();
- if (null != preInstance) {
- Map<String, Instance> preInstances = new HashMap<>();
- preInstances.put(NacosMetaDataUtils.getKey(preInstance), preInstance);
- accessor.set(listener.getClass().getDeclaredField("preInstances"), listener, preInstances);
- }
- listener.onEvent(event);
- };
- }
-
- private VoidAnswer2<String, Instance> getRegisterInstanceAnswer() {
- return (serviceName, instance) -> {
- List<Instance> instances = client.getAllInstances(serviceName, false);
- instances.removeIf(each -> Objects.equals(each.getIp(), instance.getIp()) && each.getPort() == instance.getPort());
- instances.add(instance);
- when(client.getAllInstances(serviceName, false)).thenReturn(instances);
- };
- }
-
- private VoidAnswer2<String, Instance> getDeregisterInstanceAnswer() {
- return (serviceName, instance) -> {
- List<Instance> instances = client.getAllInstances(serviceName, false);
- instances.remove(instance);
- when(client.getAllInstances(serviceName, false)).thenReturn(instances);
- };
- }
-}
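The deleted test kept its mocked NamingService self-consistent by driving registerInstance and deregisterInstance through AdditionalAnswers.answerVoid and refreshing what getAllInstances returns. A minimal, self-contained sketch of that answerVoid idiom, assuming only Mockito on the classpath and using a hypothetical Registry interface plus a shared backing list in place of re-stubbing:

    import static org.mockito.ArgumentMatchers.anyString;
    import static org.mockito.Mockito.doAnswer;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import java.util.ArrayList;
    import java.util.List;

    import org.mockito.AdditionalAnswers;
    import org.mockito.stubbing.VoidAnswer2;

    public final class SelfUpdatingStubSketch {

        // Hypothetical stand-in for the real NamingService, for illustration only.
        interface Registry {

            void register(String serviceName, String instance);

            List<String> getAll(String serviceName);
        }

        public static void main(final String[] args) {
            Registry registry = mock(Registry.class);
            List<String> instances = new ArrayList<>();
            // The same mutable list backs every getAll(...) call, so registrations become visible immediately.
            when(registry.getAll(anyString())).thenReturn(instances);
            VoidAnswer2<String, String> onRegister = (serviceName, instance) -> instances.add(instance);
            doAnswer(AdditionalAnswers.answerVoid(onRegister)).when(registry).register(anyString(), anyString());
            registry.register("demo", "127.0.0.1:8848");
            System.out.println(registry.getAll("demo"));
        }
    }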
diff --git a/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertiesTest.java b/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertiesTest.java
deleted file mode 100644
index 7e04a0810d333..0000000000000
--- a/mode/type/cluster/repository/provider/nacos/src/test/java/org/apache/shardingsphere/mode/repository/cluster/nacos/props/NacosPropertiesTest.java
+++ /dev/null
@@ -1,62 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.shardingsphere.mode.repository.cluster.nacos.props;
-
-import org.apache.shardingsphere.test.util.PropertiesBuilder;
-import org.apache.shardingsphere.test.util.PropertiesBuilder.Property;
-import org.junit.jupiter.api.Test;
-
-import java.util.Properties;
-
-import static org.hamcrest.CoreMatchers.is;
-import static org.hamcrest.MatcherAssert.assertThat;
-
-class NacosPropertiesTest {
-
- @Test
- void assertGetValue() {
- NacosProperties actual = new NacosProperties(createProperties());
- assertThat(actual.getValue(NacosPropertyKey.CLUSTER_IP), is("127.0.0.1"));
- assertThat(actual.getValue(NacosPropertyKey.RETRY_INTERVAL_MILLISECONDS), is(1000L));
- assertThat(actual.getValue(NacosPropertyKey.MAX_RETRIES), is(5));
- assertThat(actual.getValue(NacosPropertyKey.TIME_TO_LIVE_SECONDS), is(60));
- assertThat(actual.getValue(NacosPropertyKey.USERNAME), is("nacos"));
- assertThat(actual.getValue(NacosPropertyKey.PASSWORD), is("nacos"));
- }
-
- private Properties createProperties() {
- return PropertiesBuilder.build(
- new Property(NacosPropertyKey.CLUSTER_IP.getKey(), "127.0.0.1"),
- new Property(NacosPropertyKey.RETRY_INTERVAL_MILLISECONDS.getKey(), "1000"),
- new Property(NacosPropertyKey.MAX_RETRIES.getKey(), "5"),
- new Property(NacosPropertyKey.TIME_TO_LIVE_SECONDS.getKey(), "60"),
- new Property(NacosPropertyKey.USERNAME.getKey(), "nacos"),
- new Property(NacosPropertyKey.PASSWORD.getKey(), "nacos"));
- }
-
- @Test
- void assertGetDefaultValue() {
- NacosProperties actual = new NacosProperties(new Properties());
- assertThat(actual.getValue(NacosPropertyKey.CLUSTER_IP), is(""));
- assertThat(actual.getValue(NacosPropertyKey.RETRY_INTERVAL_MILLISECONDS), is(500L));
- assertThat(actual.getValue(NacosPropertyKey.MAX_RETRIES), is(3));
- assertThat(actual.getValue(NacosPropertyKey.TIME_TO_LIVE_SECONDS), is(30));
- assertThat(actual.getValue(NacosPropertyKey.USERNAME), is(""));
- assertThat(actual.getValue(NacosPropertyKey.PASSWORD), is(""));
- }
-}
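The assertions above amount to a typed lookup with per-key defaults: a value present in the supplied Properties is parsed, an absent key falls back to the key's declared default. A simplified, self-contained sketch of that behaviour; the key strings and the helper method are illustrative, not the repository's real property classes:

    import java.util.Properties;

    public final class TypedPropertySketch {

        public static void main(final String[] args) {
            Properties props = new Properties();
            props.setProperty("retryIntervalMilliseconds", "1000");
            // Present key: parsed from the supplied Properties.
            System.out.println(longValue(props, "retryIntervalMilliseconds", 500L));
            // Absent key: falls back to the declared default, as in assertGetDefaultValue().
            System.out.println(longValue(props, "timeToLiveSeconds", 30L));
        }

        private static long longValue(final Properties props, final String key, final long defaultValue) {
            String value = props.getProperty(key);
            return null == value ? defaultValue : Long.parseLong(value);
        }
    }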
diff --git a/mode/type/cluster/repository/provider/pom.xml b/mode/type/cluster/repository/provider/pom.xml
index f6c0480c8482e..c51a3ef34b6e5 100644
--- a/mode/type/cluster/repository/provider/pom.xml
+++ b/mode/type/cluster/repository/provider/pom.xml
@@ -30,7 +30,6 @@
zookeeper
etcd
- nacos
consul
diff --git a/parser/sql/dialect/oracle/src/main/java/org/apache/shardingsphere/sql/parser/oracle/visitor/statement/OracleStatementVisitor.java b/parser/sql/dialect/oracle/src/main/java/org/apache/shardingsphere/sql/parser/oracle/visitor/statement/OracleStatementVisitor.java
index ca737d7d20cf2..5480eb38c10df 100644
--- a/parser/sql/dialect/oracle/src/main/java/org/apache/shardingsphere/sql/parser/oracle/visitor/statement/OracleStatementVisitor.java
+++ b/parser/sql/dialect/oracle/src/main/java/org/apache/shardingsphere/sql/parser/oracle/visitor/statement/OracleStatementVisitor.java
@@ -1000,7 +1000,11 @@ public ASTNode visitCursorFunction(final CursorFunctionContext ctx) {
@Override
public ASTNode visitToDateFunction(final ToDateFunctionContext ctx) {
- return new FunctionSegment(ctx.getStart().getStartIndex(), ctx.getStop().getStopIndex(), ctx.TO_DATE().getText(), getOriginalText(ctx));
+ FunctionSegment result = new FunctionSegment(ctx.getStart().getStartIndex(), ctx.getStop().getStopIndex(), ctx.TO_DATE().getText(), getOriginalText(ctx));
+ if (null != ctx.STRING_()) {
+ ctx.STRING_().forEach(each -> result.getParameters().add(new LiteralExpressionSegment(each.getSymbol().getStartIndex(), each.getSymbol().getStopIndex(), each.getSymbol().getText())));
+ }
+ return result;
}
@Override
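The net effect of this change is that TO_DATE is no longer an opaque function segment: every quoted argument is also recorded as a literal parameter with its start index, stop index and original text. A rough, self-contained sketch of that extraction over a plain string, standing in for the ANTLR token walk in visitToDateFunction:

    import java.util.ArrayList;
    import java.util.List;

    public final class ToDateParameterSketch {

        public static void main(final String[] args) {
            String expression = "TO_DATE('2009', 'YYYY')";
            // Mirrors ctx.STRING_().forEach(...): every quoted token becomes a parameter
            // described by its start index, stop index and original text (quotes included).
            List<String> parameters = new ArrayList<>();
            int from = 0;
            while (true) {
                int start = expression.indexOf('\'', from);
                if (start < 0) {
                    break;
                }
                int stop = expression.indexOf('\'', start + 1);
                parameters.add(start + "-" + stop + ": " + expression.substring(start, stop + 1));
                from = stop + 1;
            }
            System.out.println("TO_DATE parameters: " + parameters);
        }
    }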
diff --git a/pom.xml b/pom.xml
index bc500f923ff96..88ae4008f759f 100644
--- a/pom.xml
+++ b/pom.xml
@@ -55,8 +55,6 @@
UTF-8
false
true
- true
- true
org.apache.shardingsphere.shade
${basedir}/target/generated-sources/antlr4
@@ -95,7 +93,6 @@
5.4.0
0.7.5
1.51.0
- 1.9.0
3.0.3
@@ -113,8 +110,6 @@
1.7.1
4.0.3
- 2.9.0
- 0.9.5.5
5.9.2
2.2
@@ -415,12 +410,6 @@
grpc-all
${grpc.version}
-
- com.ctrip.framework.apollo
- apollo-client
- ${apollo-client.version}
- provided
-
org.apache.shardingsphere.elasticjob
@@ -533,18 +522,6 @@
${hikari-cp.version}
test
-
- org.apache.commons
- commons-dbcp2
- ${commons-dbcp2.version}
- test
-
-
- com.mchange
- c3p0
- ${c3p0.version}
- test
-
org.junit.jupiter
@@ -874,119 +851,6 @@
-
-
- org.apache.rat
- apache-rat-plugin
- ${apache-rat-plugin.version}
-
-
- ${maven.multiModuleProjectDirectory}/src/resources/rat.txt
-
-
-
-
- check
-
- verify
-
-
-
-
- com.diffplug.spotless
- spotless-maven-plugin
- ${spotless-maven-plugin.version}
-
-
-
-
- ${maven.multiModuleProjectDirectory}/src/resources/spotless/java.xml
-
-
-
- ${maven.multiModuleProjectDirectory}/src/resources/spotless/copyright.txt
-
-
-
-
- UTF-8
- 4
- true
- true
- false
- true
- false
- false
- custom_1
- false
- false
-
-
- Leading blank line
- -->
-<project
- -->
-
-<project
-
-
-
-
-
- maven-checkstyle-plugin
- ${maven-checkstyle-plugin.version}
-
- true
- true
- true
- error
-
- ${maven.multiModuleProjectDirectory}/src/resources/checkstyle.xml
- true
- **/autogen/**/*
-
-
-
- maven-pmd-plugin
- ${maven-pmd-plugin.version}
-
- true
- ${java.version}
-
-
- ${maven.multiModuleProjectDirectory}/src/resources/pmd.xml
-
-
-
-
- com.github.spotbugs
- spotbugs-maven-plugin
- ${spotbugs-maven-plugin.version}
-
- false
- false
-
- ${maven.multiModuleProjectDirectory}/src/resources/spotbugs.xml
-
-
- com.mebigfatguy.fb-contrib
- fb-contrib
- ${fb-contrib.version}
-
-
- com.h3xstream.findsecbugs
- findsecbugs-plugin
- ${findsecbugs.version}
-
-
-
-
-
- org.sonarsource.scanner.maven
- sonar-maven-plugin
- ${sonar-maven-plugin.version}
-
-
maven-javadoc-plugin
@@ -1044,43 +908,6 @@
-
-
-
- org.apache.rat
- apache-rat-plugin
-
-
- com.diffplug.spotless
- spotless-maven-plugin
-
-
-
- apply
-
- compile
-
-
-
-
- maven-checkstyle-plugin
-
-
- validate
-
- check
-
- validate
-
-
-
-
- maven-pmd-plugin
-
-
- com.github.spotbugs
- spotbugs-maven-plugin
-
@@ -1292,5 +1119,164 @@
+
+ check
+
+
+
+
+
+ org.apache.rat
+ apache-rat-plugin
+ ${apache-rat-plugin.version}
+
+
+ ${maven.multiModuleProjectDirectory}/src/resources/rat.txt
+
+
+
+
+ check
+
+ verify
+
+
+
+
+ com.diffplug.spotless
+ spotless-maven-plugin
+ ${spotless-maven-plugin.version}
+
+
+
+
+ ${maven.multiModuleProjectDirectory}/src/resources/spotless/java.xml
+
+
+
+ ${maven.multiModuleProjectDirectory}/src/resources/spotless/copyright.txt
+
+
+
+
+ UTF-8
+ 4
+ true
+ true
+ false
+ true
+ false
+ false
+ custom_1
+ false
+ false
+
+
+ Leading blank line
+ -->
+<project
+ -->
+
+<project
+
+
+
+
+
+ maven-checkstyle-plugin
+ ${maven-checkstyle-plugin.version}
+
+ true
+ true
+ true
+ error
+
+ ${maven.multiModuleProjectDirectory}/src/resources/checkstyle.xml
+ true
+ **/autogen/**/*
+
+
+
+ maven-pmd-plugin
+ ${maven-pmd-plugin.version}
+
+ true
+ ${java.version}
+
+
+ ${maven.multiModuleProjectDirectory}/src/resources/pmd.xml
+
+
+
+
+ com.github.spotbugs
+ spotbugs-maven-plugin
+ ${spotbugs-maven-plugin.version}
+
+ false
+ false
+
+ ${maven.multiModuleProjectDirectory}/src/resources/spotbugs.xml
+
+
+ com.mebigfatguy.fb-contrib
+ fb-contrib
+ ${fb-contrib.version}
+
+
+ com.h3xstream.findsecbugs
+ findsecbugs-plugin
+ ${findsecbugs.version}
+
+
+
+
+
+ org.sonarsource.scanner.maven
+ sonar-maven-plugin
+ ${sonar-maven-plugin.version}
+
+
+
+
+
+
+ org.apache.rat
+ apache-rat-plugin
+
+
+ com.diffplug.spotless
+ spotless-maven-plugin
+
+
+
+ apply
+
+ compile
+
+
+
+
+ maven-checkstyle-plugin
+
+
+ validate
+
+ check
+
+ validate
+
+
+
+
+ maven-pmd-plugin
+
+
+ com.github.spotbugs
+ spotbugs-maven-plugin
+
+
+
+
diff --git a/proxy/backend/core/src/test/resources/conf/convert/config-encrypt.yaml b/proxy/backend/core/src/test/resources/conf/convert/config-encrypt.yaml
index 6fa6138f3b026..ab59b37693a56 100644
--- a/proxy/backend/core/src/test/resources/conf/convert/config-encrypt.yaml
+++ b/proxy/backend/core/src/test/resources/conf/convert/config-encrypt.yaml
@@ -49,7 +49,7 @@ rules:
props:
rc4-key-value: 123456abc
like_encryptor:
- type: CHAR_DIGEST_LIKE
+ type: CORE.QUERY_LIKE.FIXTURE
tables:
t_encrypt:
columns:
diff --git a/proxy/backend/core/src/test/resources/conf/import/config-encrypt.yaml b/proxy/backend/core/src/test/resources/conf/import/config-encrypt.yaml
index aef5444ece54d..167fea1656858 100644
--- a/proxy/backend/core/src/test/resources/conf/import/config-encrypt.yaml
+++ b/proxy/backend/core/src/test/resources/conf/import/config-encrypt.yaml
@@ -44,10 +44,6 @@ rules:
type: AES
props:
aes-key-value: 123456abc
- rc4_encryptor:
- type: RC4
- props:
- rc4-key-value: 123456abc
tables:
t_encrypt:
columns:
@@ -58,4 +54,4 @@ rules:
order_id:
cipher:
name: order_cipher
- encryptorName: rc4_encryptor
+ encryptorName: aes_encryptor
diff --git a/proxy/backend/core/src/test/resources/expected/convert-encrypt.yaml b/proxy/backend/core/src/test/resources/expected/convert-encrypt.yaml
index 8882645ae9399..54ea9b0474e81 100644
--- a/proxy/backend/core/src/test/resources/expected/convert-encrypt.yaml
+++ b/proxy/backend/core/src/test/resources/expected/convert-encrypt.yaml
@@ -31,6 +31,6 @@ PROPERTIES('minPoolSize'='1', 'connectionTimeoutMilliseconds'='30000', 'maxLifet
CREATE ENCRYPT RULE t_encrypt (
COLUMNS(
-(NAME=user_id, CIPHER=user_cipher, ASSISTED_QUERY_COLUMN=user_assisted, LIKE_QUERY_COLUMN=user_like, ENCRYPT_ALGORITHM(TYPE(NAME='aes', PROPERTIES('aes-key-value'='123456abc'))), ASSISTED_QUERY_ALGORITHM(TYPE(NAME='rc4', PROPERTIES('rc4-key-value'='123456abc'))), LIKE_QUERY_ALGORITHM(TYPE(NAME='char_digest_like'))),
+(NAME=user_id, CIPHER=user_cipher, ASSISTED_QUERY_COLUMN=user_assisted, LIKE_QUERY_COLUMN=user_like, ENCRYPT_ALGORITHM(TYPE(NAME='aes', PROPERTIES('aes-key-value'='123456abc'))), ASSISTED_QUERY_ALGORITHM(TYPE(NAME='rc4', PROPERTIES('rc4-key-value'='123456abc'))), LIKE_QUERY_ALGORITHM(TYPE(NAME='core.query_like.fixture'))),
(NAME=order_id, CIPHER=order_cipher, ENCRYPT_ALGORITHM(TYPE(NAME='rc4', PROPERTIES('rc4-key-value'='123456abc'))))
));
diff --git a/proxy/bootstrap/pom.xml b/proxy/bootstrap/pom.xml
index 7eb7e0d1c2584..1f30c4d35b5b2 100644
--- a/proxy/bootstrap/pom.xml
+++ b/proxy/bootstrap/pom.xml
@@ -142,16 +142,6 @@
HikariCP
runtime
-
- org.apache.commons
- commons-dbcp2
- runtime
-
-
- com.mchange
- c3p0
- runtime
-
ch.qos.logback
diff --git a/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithm.java b/test/e2e/fixture/src/test/java/org/apache/shardingsphere/test/e2e/fixture/ITEncryptLikeAlgorithmFixture.java
similarity index 95%
rename from features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithm.java
rename to test/e2e/fixture/src/test/java/org/apache/shardingsphere/test/e2e/fixture/ITEncryptLikeAlgorithmFixture.java
index 4b64d85569554..42acf4f43ee4e 100644
--- a/features/encrypt/core/src/main/java/org/apache/shardingsphere/encrypt/algorithm/like/CharDigestLikeEncryptAlgorithm.java
+++ b/test/e2e/fixture/src/test/java/org/apache/shardingsphere/test/e2e/fixture/ITEncryptLikeAlgorithmFixture.java
@@ -15,10 +15,9 @@
* limitations under the License.
*/
-package org.apache.shardingsphere.encrypt.algorithm.like;
+package org.apache.shardingsphere.test.e2e.fixture;
import com.google.common.base.Strings;
-import lombok.EqualsAndHashCode;
import lombok.SneakyThrows;
import org.apache.shardingsphere.encrypt.api.context.EncryptContext;
import org.apache.shardingsphere.encrypt.api.encrypt.like.LikeEncryptAlgorithm;
@@ -33,11 +32,7 @@
import java.util.stream.Collectors;
import java.util.stream.IntStream;
-/**
- * Char digest like encrypt algorithm.
- */
-@EqualsAndHashCode
-public final class CharDigestLikeEncryptAlgorithm implements LikeEncryptAlgorithm {
+public final class ITEncryptLikeAlgorithmFixture implements LikeEncryptAlgorithm {
private static final String DELTA_KEY = "delta";
@@ -158,6 +153,6 @@ private char getMaskedChar(final char originalChar) {
@Override
public String getType() {
- return "CHAR_DIGEST_LIKE";
+ return "IT.ENCRYPT.LIKE.FIXTURE";
}
}
diff --git a/examples/docker/shardingsphere-proxy/governance/run.sh b/test/e2e/fixture/src/test/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
similarity index 91%
rename from examples/docker/shardingsphere-proxy/governance/run.sh
rename to test/e2e/fixture/src/test/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
index 235d08e27b914..f6c7dada3ea88 100644
--- a/examples/docker/shardingsphere-proxy/governance/run.sh
+++ b/test/e2e/fixture/src/test/resources/META-INF/services/org.apache.shardingsphere.encrypt.spi.EncryptAlgorithm
@@ -1,4 +1,3 @@
-#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
@@ -16,4 +15,4 @@
# limitations under the License.
#
-docker-compose up -d
+org.apache.shardingsphere.test.e2e.fixture.ITEncryptLikeAlgorithmFixture
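The renamed resource is a standard java.util.ServiceLoader registration file: its single content line names the provider class, which is then selected at runtime by the type string it reports. A self-contained sketch of that lookup using a hypothetical Algorithm SPI rather than the real EncryptAlgorithm interface:

    import java.util.Optional;
    import java.util.ServiceLoader;
    import java.util.stream.StreamSupport;

    public final class SpiLookupSketch {

        // Hypothetical SPI; providers would be listed in META-INF/services/<fully qualified interface name>.
        public interface Algorithm {

            String getType();
        }

        public static void main(final String[] args) {
            Optional<Algorithm> matched = StreamSupport.stream(ServiceLoader.load(Algorithm.class).spliterator(), false)
                    .filter(each -> "IT.ENCRYPT.LIKE.FIXTURE".equalsIgnoreCase(each.getType()))
                    .findFirst();
            System.out.println(matched.isPresent() ? matched.get().getType() : "no provider registered");
        }
    }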
diff --git a/features/encrypt/core/src/main/resources/algorithm/like/common_chinese_character.dict b/test/e2e/fixture/src/test/resources/algorithm/like/common_chinese_character.dict
similarity index 100%
rename from features/encrypt/core/src/main/resources/algorithm/like/common_chinese_character.dict
rename to test/e2e/fixture/src/test/resources/algorithm/like/common_chinese_character.dict
diff --git a/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rule.xml b/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rule.xml
index b1b7460494b57..d94f7c2cc56dd 100644
--- a/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rule.xml
+++ b/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rule.xml
@@ -29,8 +29,8 @@
-
+
-
+
diff --git a/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rules.xml b/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rules.xml
index 488e8d0e859ef..1bbed24a732b1 100644
--- a/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rules.xml
+++ b/test/e2e/sql/src/test/resources/cases/rql/dataset/encrypt/show_encrypt_rules.xml
@@ -29,12 +29,12 @@
-
+
-
+
-
-
+
+
diff --git a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/mysql/config-encrypt.yaml b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/mysql/config-encrypt.yaml
index 1a6b944972f75..52e62adb6160e 100644
--- a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/mysql/config-encrypt.yaml
+++ b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/mysql/config-encrypt.yaml
@@ -39,7 +39,7 @@ rules:
props:
aes-key-value: 123456abc
like_encryptor:
- type: CHAR_DIGEST_LIKE
+ type: IT.ENCRYPT.LIKE.FIXTURE
props:
mask: 4093
tables:
diff --git a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/opengauss/config-encrypt.yaml b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/opengauss/config-encrypt.yaml
index 89a633b9d4e9b..ed3a7d99c7c90 100644
--- a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/opengauss/config-encrypt.yaml
+++ b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/opengauss/config-encrypt.yaml
@@ -39,7 +39,7 @@ rules:
props:
aes-key-value: 123456abc
like_encryptor:
- type: CHAR_DIGEST_LIKE
+ type: IT.ENCRYPT.LIKE.FIXTURE
props:
mask: 4093
tables:
diff --git a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/postgresql/config-encrypt.yaml b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/postgresql/config-encrypt.yaml
index 9cc2b3d538277..a5030de3a717f 100644
--- a/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/postgresql/config-encrypt.yaml
+++ b/test/e2e/sql/src/test/resources/env/scenario/encrypt/proxy/conf/postgresql/config-encrypt.yaml
@@ -39,7 +39,7 @@ rules:
props:
aes-key-value: 123456abc
like_encryptor:
- type: CHAR_DIGEST_LIKE
+ type: IT.ENCRYPT.LIKE.FIXTURE
props:
mask: 4093
tables:
diff --git a/test/e2e/sql/src/test/resources/env/scenario/encrypt/rules.yaml b/test/e2e/sql/src/test/resources/env/scenario/encrypt/rules.yaml
index 69a2ac6a6d94c..a4c0b5294e469 100644
--- a/test/e2e/sql/src/test/resources/env/scenario/encrypt/rules.yaml
+++ b/test/e2e/sql/src/test/resources/env/scenario/encrypt/rules.yaml
@@ -26,7 +26,7 @@ rules:
props:
aes-key-value: 123456abc
like_encryptor:
- type: CHAR_DIGEST_LIKE
+ type: IT.ENCRYPT.LIKE.FIXTURE
props:
mask: 4093
tables:
diff --git a/test/it/optimizer/src/test/java/org/apache/shardingsphere/test/it/optimizer/converter/SQLNodeConverterEngineIT.java b/test/it/optimizer/src/test/java/org/apache/shardingsphere/test/it/optimizer/converter/SQLNodeConverterEngineIT.java
index fc1233355d9a9..708fb470cc222 100644
--- a/test/it/optimizer/src/test/java/org/apache/shardingsphere/test/it/optimizer/converter/SQLNodeConverterEngineIT.java
+++ b/test/it/optimizer/src/test/java/org/apache/shardingsphere/test/it/optimizer/converter/SQLNodeConverterEngineIT.java
@@ -89,7 +89,7 @@ private static class TestCaseArgumentsProvider implements ArgumentsProvider {
@Override
 public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) {
- return getTestParameters("MySQL", "PostgreSQL", "openGauss", "Oracle").stream();
+ return getTestParameters("MySQL", "PostgreSQL", "openGauss", "Oracle", "SQLServer").stream();
}
 private Collection<Arguments> getTestParameters(final String... databaseTypes) {
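For context, TestCaseArgumentsProvider follows the standard JUnit 5 ArgumentsProvider contract, and this change only adds SQLServer to the enumerated dialects. A minimal, self-contained example of the same contract; the inline string values are illustrative, whereas the real provider loads its cases from XML:

    import java.util.stream.Stream;

    import org.junit.jupiter.api.extension.ExtensionContext;
    import org.junit.jupiter.params.provider.Arguments;
    import org.junit.jupiter.params.provider.ArgumentsProvider;

    public final class DatabaseTypeArgumentsProvider implements ArgumentsProvider {

        @Override
        public Stream<? extends Arguments> provideArguments(final ExtensionContext extensionContext) {
            // Each Arguments instance becomes one invocation of the parameterized test.
            return Stream.of("MySQL", "PostgreSQL", "openGauss", "Oracle", "SQLServer").map(Arguments::of);
        }
    }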
diff --git a/test/it/optimizer/src/test/resources/converter/delete.xml b/test/it/optimizer/src/test/resources/converter/delete.xml
index 7145066efdb6b..1f320631d0bed 100644
--- a/test/it/optimizer/src/test/resources/converter/delete.xml
+++ b/test/it/optimizer/src/test/resources/converter/delete.xml
@@ -36,6 +36,10 @@
-
-
+
+
+
+
+
+
diff --git a/test/it/parser/src/main/resources/case/dml/insert.xml b/test/it/parser/src/main/resources/case/dml/insert.xml
index 197fccd0f65bb..441c2d12fae0b 100644
--- a/test/it/parser/src/main/resources/case/dml/insert.xml
+++ b/test/it/parser/src/main/resources/case/dml/insert.xml
@@ -2575,41 +2575,77 @@
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
-
+
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
- -
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
-
- year
- to
- MONTH
-
-
+
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
- -
-
+
+
+
+
+
+
+
+ TO_DATE('2009', 'YYYY')
+
-
- DAY
- TO
- SECOND
-
diff --git a/test/it/parser/src/main/resources/case/dml/select.xml b/test/it/parser/src/main/resources/case/dml/select.xml
index 38059feba3ec8..9dd07a52faf19 100644
--- a/test/it/parser/src/main/resources/case/dml/select.xml
+++ b/test/it/parser/src/main/resources/case/dml/select.xml
@@ -6839,7 +6839,21 @@
TO_DATE('Febuary 15, 2016, 11:00 A.M.' DEFAULT 'January 01, 2016 12:00 A.M.' ON CONVERSION ERROR, 'Month dd, YYYY, HH:MI A.M.')
-
+
+
+
+
+
+
+
+
+
+
TO_DATE('Febuary 15, 2016, 11:00 A.M.' DEFAULT 'January 01, 2016 12:00 A.M.' ON CONVERSION ERROR, 'Month dd, YYYY, HH:MI A.M.')
@@ -6851,7 +6865,7 @@