Merged

merge #215

145 commits
98aa150
Merge remote-tracking branch 'origin/1.5_v3.7.2' into 1.5_v3.8.0_beta…
todd5167 Aug 26, 2019
78f5131
[Disable the vertx cache][flinkSQL task: set a result table, submit and run, stop the task and change the result table, submit again; the data curve becomes two lines, and the new result table…
todd5167 Sep 19, 2019
f6ff456
[Keep the rdbsink table name consistent with the registered table name][flinkSQL task: set a result table, submit and run, stop the task and change the result table, submit again; the data curve becomes two…
todd5167 Sep 19, 2019
2969771
flinksql: add UDAF function support
simenliuxing Sep 20, 2019
acd9335
kafka parallelism
todd5167 Sep 24, 2019
72ba34e
kafka11 parallelism
todd5167 Sep 24, 2019
925718e
oracle sink code opt
todd5167 Sep 24, 2019
6f94aa2
Merge branch '1.5_param' into 'v1.5.0_dev'
zoudaokoulife Sep 25, 2019
b8513bb
Merge branch '1.5_3.6.1_vertxcache' into '1.5_v3.6.1'
zoudaokoulife Sep 25, 2019
f1fbc11
change UDAF functions to coexist alongside scala and table functions
simenliuxing Sep 25, 2019
96ac939
Merge remote-tracking branch 'origin/1.5_v3.8.0_beta_1.0' into 1.5_v3…
todd5167 Sep 25, 2019
4f1c243
import class
todd5167 Sep 25, 2019
4b239c6
rdb side sql error
todd5167 Sep 25, 2019
a876b55
Merge branch '1.8_v3.8.1_mysqlasync_error' into '1.8_v3.8.1'
zoudaokoulife Sep 25, 2019
0b5c515
Merge branch '1.8_v3.8.1' into v1.8.0_dev
todd5167 Sep 25, 2019
736c42d
Merge branch '1.5_v3.6.1' into v1.5.0_dev
todd5167 Sep 25, 2019
5f25a85
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
todd5167 Sep 25, 2019
4b11bea
properties field trim
todd5167 Sep 26, 2019
409723b
modify oracle mergeinto sql
todd5167 Sep 27, 2019
747cebd
Merge branch 'v1.8.0_dev_feature_udaf' into 'v1.8.0_dev'
zoudaokoulife Sep 27, 2019
64ba57e
add PropertiesUtils
todd5167 Sep 27, 2019
57f2d67
Merge branch '1.8_3.8.1_fieldTrim' into '1.8_v3.8.1'
zoudaokoulife Sep 27, 2019
7044b48
oracle merge into modify
todd5167 Sep 27, 2019
37c9487
Merge branch '1.5.0_dev_oracle_sink' into 'v1.5.0_dev'
zoudaokoulife Sep 27, 2019
6cde615
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
todd5167 Sep 27, 2019
3ba6e49
Merge remote-tracking branch 'origin/1.8_v3.8.1' into v1.8.0_dev
todd5167 Sep 27, 2019
bbf1e99
Merge remote-tracking branch 'origin/1.5_v3.6.1' into 1.5_v3.7.2
todd5167 Sep 27, 2019
fa8ba66
Merge remote-tracking branch 'origin/1.5_v3.7.2' into 1.5_v3.8.0
todd5167 Sep 27, 2019
3ef6a95
deal oracle table name
todd5167 Sep 29, 2019
48e5773
modify method name
todd5167 Sep 29, 2019
6e2b087
Merge branch '1.8_dev_oracle_tablename' into 'v1.8.0_dev'
zoudaokoulife Sep 30, 2019
8b8c54c
[mysql side table full cache throws a null pointer exception][19145]
todd5167 Sep 30, 2019
227c8f0
Merge branch 'master' of github.com:DTStack/flinkStreamSQL into v1.8.…
todd5167 Oct 8, 2019
e90d95e
remove timestampudf
Oct 8, 2019
07c7b09
remove timestampudf
Oct 8, 2019
349fa5a
Merge branch 'v1.8.0_dev_deleteudf' into 'v1.8.0_dev'
Oct 8, 2019
0655426
Merge branch 'v1.5.0_dev_deleteudf' into 'v1.5.0_dev'
Oct 9, 2019
0282bba
Merge branch '1.8_v3.8.1_mysql_allcache' into '1.8_v3.8.1'
zoudaokoulife Oct 9, 2019
fd20156
Merge remote-tracking branch 'origin/1.8_v3.8.1' into v1.8.0_dev
todd5167 Oct 9, 2019
b8ee11f
oracle schema
todd5167 Oct 9, 2019
cb25fb3
rdb schema
todd5167 Oct 10, 2019
4e61d02
Merge branch '1.8.0_oracle_schema' into 'v1.8.0_dev'
zoudaokoulife Oct 10, 2019
76f65f5
stream/side-table join: when the source table's join field has no input, data cannot be written to the result table; fixed for relational databases
simenliuxing Oct 11, 2019
df2d178
Merge branch 'v1.8.0_dev_bugfix_leftjoinnull' into 'v1.8.0_dev'
zoudaokoulife Oct 11, 2019
86a0e2a
add timestamp
todd5167 Oct 11, 2019
09d5f92
kafka parse timestamp
todd5167 Oct 11, 2019
56f9eda
Merge branch '1.5_v3.8.0_timestamp' into '1.5_v3.8.0'
zoudaokoulife Oct 11, 2019
3ae5e2a
flinksql 150 classloader
jiemotongxue Oct 14, 2019
e8d915e
[flinksql][side table join NPE, unfriendly error message][19240]
simenliuxing Oct 15, 2019
ecba3eb
udf classloader
jiemotongxue Oct 15, 2019
461b876
rename
jiemotongxue Oct 15, 2019
7b068c1
Merge branch '1.5_v3.8.0_bugfix_joinnpe' into '1.5_v3.8.0'
zoudaokoulife Oct 16, 2019
6a5f75c
merge 1.5.0_dev_classloader
jiemotongxue Oct 16, 2019
b683034
multiple join
todd5167 Oct 16, 2019
9c50e44
side table join NPE issue
simenliuxing Oct 17, 2019
4700955
Merge branch 'v1.8.0_dev_bugfix_joinnpe' into 'v1.8.0_dev'
zoudaokoulife Oct 17, 2019
dab993c
Merge branch 'v1.5.0_dev_classloader' into 'v1.5.0_dev'
Oct 17, 2019
496e9ad
Merge branch 'v1.5.0_dev' into v1.8.0_dev_classloader
jiemotongxue Oct 17, 2019
ccc6e1a
real-time predicate pushdown, async mode for relational databases
simenliuxing Oct 17, 2019
6cb8bc4
stream join convert
todd5167 Oct 17, 2019
968fc95
add calcite config
todd5167 Oct 17, 2019
c91046b
Merge branch 'v1.8.0_dev_classloader' into 'v1.8.0_dev'
Oct 18, 2019
9f48daf
Merge branch '1.5_v3.8.0' into v1.5.0_dev
zoudaokoulife Oct 18, 2019
71b0dd6
remove unused files
zoudaokoulife Oct 18, 2019
9468eb4
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
todd5167 Oct 18, 2019
88da30f
modify class pre
todd5167 Oct 18, 2019
88cb091
CEP feature
simenliuxing Oct 20, 2019
f5c3c7f
fix conflicts
todd5167 Oct 21, 2019
3fc3adf
Merge branch 'v1.8.0_dev_feature_cep' into 'v1.8.0_dev'
zoudaokoulife Oct 21, 2019
49bc561
extract replaceWhenOrThenSelectFieldTabName method
todd5167 Oct 21, 2019
b3037d4
Merge branch '1.8_dev_multi_join' into 'v1.8.0_dev'
zoudaokoulife Oct 21, 2019
0f619d8
modify DtStringUtil splitIgnoreQuota
todd5167 Oct 21, 2019
df1d888
[flinksql][udf class cannot be loaded][19234]
simenliuxing Oct 22, 2019
a49da13
Merge branch '1.5_v3.8.1_bigfix_udfclassloader' into '1.5_v3.8.1'
zoudaokoulife Oct 22, 2019
beb9b3a
[flinksql][udf classloader is not the same as tableEnv's][19234]
simenliuxing Oct 24, 2019
b5cdc9b
Merge branch '1.5_v3.8.1_bugfix_classloader' into '1.5_v3.8.1'
zoudaokoulife Oct 24, 2019
d03368a
Merge remote-tracking branch 'origin/1.5_v3.8.1' into v1.5.0_dev
todd5167 Oct 28, 2019
d60ac39
fix conflict
todd5167 Oct 28, 2019
bd847ae
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
todd5167 Oct 28, 2019
561474b
udf classloader issue
simenliuxing Oct 28, 2019
9f42ef3
Merge branch 'v1.5.0_dev_udfclassloader' into 'v1.5.0_dev'
simenliuxing Oct 28, 2019
3ebdaea
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Oct 28, 2019
14977bc
udf class loading issue
simenliuxing Oct 28, 2019
7c3d531
fix conflict
simenliuxing Oct 28, 2019
bc39433
add plugin load mode classpath or shipfile
todd5167 Oct 31, 2019
e55192e
Merge branch '1.5_dev_split_str' into 'v1.5.0_dev'
zoudaokoulife Oct 31, 2019
d4f0686
[flinksql][sql parsing failure][19739]
simenliuxing Nov 1, 2019
758d2e7
Merge branch 'v1.5.0_dev_bugfix_sqlparse' into 'v1.5.0_dev'
zoudaokoulife Nov 1, 2019
b59af8e
Merge branch 'v1.5.0_dev' into 'v1.8.0_dev'
simenliuxing Nov 1, 2019
c7f2f10
predicate pushdown for relational databases
simenliuxing Nov 4, 2019
416e766
modify method params
todd5167 Nov 4, 2019
85f078d
kudu side table and result table
Nov 4, 2019
1977edb
Merge branch 'v1.5.0_dev_feature_kudu' into 'v1.5.0_dev'
simenliuxing Nov 4, 2019
22a59be
clickhouse side table
todd5167 Nov 4, 2019
7479059
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Nov 4, 2019
ecf0bca
fix splitIgnoreQuota method
todd5167 Nov 4, 2019
60fb28b
Merge branch '1.5_dev_split_str' into 'v1.5.0_dev'
simenliuxing Nov 5, 2019
4f38b7f
Merge branch 'v1.5.0_dev' into 'v1.8.0_dev'
simenliuxing Nov 5, 2019
1366c24
support nested json field extraction and array type parsing
Nov 5, 2019
73754ed
add notes to kafkaSource.md on parsing nested json and array-type fields
Nov 5, 2019
4f2e9a9
clickhouse sink and field parse
todd5167 Nov 5, 2019
d7ac275
add kuduSink.md and kuduSide.md docs
Nov 5, 2019
24f392d
kafka1.0 connector
simenliuxing Nov 5, 2019
d81c58d
Merge branch 'v1.5.0_dev_feature_parseJson' into 'v1.5.0_dev'
simenliuxing Nov 5, 2019
7a2ad4e
Merge branch 'v1.8.0_dev_feature_kafka1' into 'v1.8.0_dev'
simenliuxing Nov 5, 2019
11a840c
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Nov 5, 2019
b4382bd
Merge remote-tracking branch 'origin/v1.8.0_dev' into v1.8.0_dev
simenliuxing Nov 5, 2019
a93d7e0
do not handle brackets in quotes
todd5167 Nov 5, 2019
bb1aa1b
Merge branch 'v1.5.0_dev' of ssh://git.dtstack.cn:10022/dtstack/dt-ce…
todd5167 Nov 5, 2019
3ee2a1d
Merge branch '1.5_dev_split_str' into 'v1.5.0_dev'
simenliuxing Nov 5, 2019
21b0cf6
support nested json and array-type field parsing for the latest kafka version
Nov 5, 2019
79cd993
Merge branch 'v1.5.0_dev' into 'v1.8.0_dev'
simenliuxing Nov 5, 2019
138a873
Merge branch 'v1.8.0_dev_kafkaNew' into 'v1.8.0_dev'
simenliuxing Nov 5, 2019
7df8f46
plugin load mode code opt
todd5167 Nov 6, 2019
25e07ea
add license
todd5167 Nov 6, 2019
236312b
test code
todd5167 Nov 6, 2019
de9e241
clean up postgresql side table and result table code, merge to v1.5.0_dev
Nov 6, 2019
6e52362
add timestampadd and timestampdiff
todd5167 Nov 6, 2019
bfa4d45
Merge branch 'v1.8.0_dev' of ssh://git.dtstack.cn:10022/dtstack/dt-ce…
todd5167 Nov 7, 2019
7ad245c
Merge branch 'v1.5.0_dev_feature_kudu' into 'v1.5.0_dev'
simenliuxing Nov 7, 2019
6e1e73b
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Nov 7, 2019
a82f900
pgsql
simenliuxing Nov 7, 2019
5e2d906
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Nov 7, 2019
8209255
pgsql
simenliuxing Nov 7, 2019
e0e45e4
readme
simenliuxing Nov 7, 2019
9d60249
Merge branch '1.8_timestamp_add' into 'v1.8.0_dev'
simenliuxing Nov 7, 2019
8649b8b
Merge branch '1.8.0_dev_clickhouse' into 'v1.8.0_dev'
simenliuxing Nov 8, 2019
3fe1eb7
when joining an hbase side table via prefix scan from the stream table, only one record is emitted even if multiple records are scanned
simenliuxing Nov 8, 2019
56165d8
Merge branch 'v1.8.0_dev_bugfix_hbase' into 'v1.8.0_dev'
simenliuxing Nov 8, 2019
eb6e7e2
fix redis result table and side table bugs
Nov 8, 2019
99fca76
Merge branch 'v1.5.0_dev_feature_redis' into 'v1.5.0_dev'
simenliuxing Nov 8, 2019
7937ecb
Merge remote-tracking branch 'origin/v1.8.0_dev_feature_predicate-pus…
simenliuxing Nov 9, 2019
cd0e854
Merge branch '1.8.0_dev_feature_pluginjar' into 'v1.8.0_dev'
zoudaokoulife Nov 11, 2019
1703b02
fix redis side table inner join bug
Nov 11, 2019
4d8dbb7
fix redis async side inner join issue
Nov 11, 2019
d2c2ef3
Merge branch 'v1.5.0_dev_feature_redis' into 'v1.5.0_dev'
simenliuxing Nov 11, 2019
faffa35
readme
simenliuxing Nov 11, 2019
faea88d
Merge remote-tracking branch 'origin/v1.5.0_dev' into v1.8.0_dev
simenliuxing Nov 11, 2019
cf3d682
readme format
simenliuxing Nov 11, 2019
e8b8c01
kudusink readme
simenliuxing Nov 11, 2019
7ffe70b
remove flink shade class
todd5167 Nov 11, 2019
3b264e0
readme: add the new features in the new version
simenliuxing Nov 11, 2019
3361be9
fix conflicts
todd5167 Nov 11, 2019
4fe99de
Merge branch '1.8.0_dev_feature_pluginjar' into 'v1.8.0_dev'
simenliuxing Nov 11, 2019
f9378f4
readme oracle side, sink doc
simenliuxing Nov 11, 2019
44 changes: 31 additions & 13 deletions README.md
Original file line number Diff line number Diff line change
@@ -8,27 +8,32 @@
> > * extended input and output performance metrics to prometheus

## New features:
1. kafka source tables support the not null syntax and conversion of string-typed times.
2. rdb side tables periodically re-establish the DB connection to prevent it from dropping; rdbsink checks the connection before writing.
3. async side tables support non-equi joins, e.g. <>, <, >.
* 1. kafka source tables support the not null syntax and conversion of string-typed times.
* 2. rdb side tables periodically re-establish the DB connection to prevent it from dropping; rdbsink checks the connection before writing.
* 3. async side tables support non-equi joins, e.g. <>, <, >.
* 4. add kafka array parsing
* 5. add support for kafka 1.0 and above
* 6. add side-table and result-table support for postgresql, kudu, and clickhouse
* 7. support plugin dependency loading modes; see the pluginLoadMode parameter
* 8. support CEP processing
* 9. support UDAF
* 10. support predicate pushdown

## Bug fixes:
1. fix failure to parse orderby and union syntax in SQL.
2. fix the exception when submission fails in yarnPer mode.
* 1. fix failure to parse orderby and union syntax in SQL.
* 2. fix the exception when submission fails in yarnPer mode.
* 3. assorted bug fixes

# Supported
* source tables: kafka 0.9, 1.x
* side tables: mysql, SQLServer, oracle, hbase, mongo, redis, cassandra, serversocket
* result tables: mysql, SQLServer, oracle, hbase, elasticsearch5.x, mongo, redis, cassandra, console
* source tables: kafka 0.9, 0.10, 0.11, 1.x
* side tables: mysql, SQLServer, oracle, hbase, mongo, redis, cassandra, serversocket, kudu, postgresql, clickhouse
* result tables: mysql, SQLServer, oracle, hbase, elasticsearch5.x, mongo, redis, cassandra, console, kudu, postgresql, clickhouse

# Roadmap
* add CEP support to SQL
* side table snapshots
* SQL optimization (predicate pushdown, etc.)
* kafka avro format
* topN


## 1 Quick start
### 1.1 Run modes

@@ -40,7 +45,7 @@
### 1.2 Runtime environment

* Java: JDK8 or above
* Flink cluster: 1.4, 1.5 (standalone mode does not require a Flink cluster installation)
* Flink cluster: 1.4, 1.5, 1.8 (standalone mode does not require a Flink cluster installation)
* OS: theoretically unrestricted

### 1.3 Build
@@ -150,6 +155,11 @@ sh submit.sh -sql D:\sideSql.txt -name xctest -remoteSqlPluginPath /opt/dtstack
* required: no
* default: false

* **pluginLoadMode**
* description: how plugin packages are loaded in per_job mode. classpath: load the plugin packages from each machine; shipfile: upload the required plugins from the submitting node to HDFS, so they need not be installed on every machine
* required: no
* default: classpath

* **yarnSessionConf**
* description: runtime parameters for yarn session mode, [reference](https://ci.apache.org/projects/flink/flink-docs-release-1.8/ops/cli.html); currently only yid can be specified
* required: no
@@ -163,16 +173,24 @@ sh submit.sh -sql D:\sideSql.txt -name xctest -remoteSqlPluginPath /opt/dtstack
* [elasticsearch result table plugin](docs/elasticsearchSink.md)
* [hbase result table plugin](docs/hbaseSink.md)
* [mysql result table plugin](docs/mysqlSink.md)
* [oracle result table plugin](docs/oracleSink.md)
* [mongo result table plugin](docs/mongoSink.md)
* [redis result table plugin](docs/redisSink.md)
* [cassandra result table plugin](docs/cassandraSink.md)
* [kudu result table plugin](docs/kuduSink.md)
* [postgresql result table plugin](docs/postgresqlSink.md)
* [clickhouse result table plugin](docs/clickhouseSink.md)

### 2.3 Side table plugins
* [hbase side table plugin](docs/hbaseSide.md)
* [mysql side table plugin](docs/mysqlSide.md)
* [oracle side table plugin](docs/oracleSide.md)
* [mongo side table plugin](docs/mongoSide.md)
* [redis side table plugin](docs/redisSide.md)
* [cassandra side table plugin](docs/cassandraSide.md)
* [kudu side table plugin](docs/kuduSide.md)
* [postgresql side table plugin](docs/postgresqlSide.md)
* [clickhouse side table plugin](docs/clickhouseSide.md)

## 3 Performance metrics (new)

@@ -203,7 +221,7 @@ sh submit.sh -sql D:\sideSql.txt -name xctest -remoteSqlPluginPath /opt/dtstack

```

CREATE (scala|table) FUNCTION CHARACTER_LENGTH WITH com.dtstack.Kun
CREATE (scala|table|aggregate) FUNCTION CHARACTER_LENGTH WITH com.dtstack.Kun;
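The new `aggregate` option registers a UDAF. Flink UDAFs follow a createAccumulator / accumulate / getValue contract; below is a minimal, dependency-free sketch of that contract — class and method bodies are illustrative, and a real UDAF would extend `org.apache.flink.table.functions.AggregateFunction` instead of standing alone:

```java
// Illustrative sketch of the UDAF contract behind `CREATE aggregate FUNCTION`.
// A real Flink UDAF extends org.apache.flink.table.functions.AggregateFunction<Long, Acc>;
// the three methods below mirror that contract without the Flink dependency.
public class LongSumUdaf {

    // Mutable accumulator holding the running aggregate state for one group.
    public static class Acc {
        public long sum = 0L;
    }

    // Called once per group to create fresh state.
    public Acc createAccumulator() {
        return new Acc();
    }

    // Called for every input row of the group; nulls are skipped.
    public void accumulate(Acc acc, Long value) {
        if (value != null) {
            acc.sum += value;
        }
    }

    // Called to emit the aggregate result.
    public Long getValue(Acc acc) {
        return acc.sum;
    }
}
```

Packaged into the UDF jar, such a class would then be registered along the lines of `CREATE aggregate FUNCTION mySum WITH com.example.LongSumUdaf;` (function and class names hypothetical).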


CREATE TABLE MyTable(
@@ -36,8 +36,8 @@
import org.apache.calcite.sql.JoinType;
import org.apache.commons.collections.CollectionUtils;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.calcite.shaded.com.google.common.collect.Lists;
import org.apache.flink.calcite.shaded.com.google.common.collect.Maps;
import com.google.common.collect.Lists;
import com.google.common.collect.Maps;
import org.apache.flink.table.typeutils.TimeIndicatorTypeInfo;
import org.apache.flink.types.Row;
import org.apache.flink.util.Collector;
@@ -24,12 +24,10 @@
import com.dtstack.flink.sql.side.SideTableInfo;
import com.dtstack.flink.sql.side.cassandra.table.CassandraSideTableInfo;
import com.dtstack.flink.sql.util.ParseUtils;
import org.apache.calcite.sql.SqlBasicCall;
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.SqlNode;
import org.apache.commons.collections.CollectionUtils;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.calcite.shaded.com.google.common.collect.Lists;
import com.google.common.collect.Lists;

import java.util.List;

@@ -45,7 +45,7 @@
import io.vertx.core.json.JsonArray;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
import com.google.common.collect.Lists;
import org.apache.flink.streaming.api.functions.async.ResultFuture;
import org.apache.flink.table.typeutils.TimeIndicatorTypeInfo;
import org.apache.flink.types.Row;
@@ -29,7 +29,7 @@
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.SqlNode;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.calcite.shaded.com.google.common.collect.Lists;
import com.google.common.collect.Lists;

import java.util.List;

@@ -20,7 +20,7 @@
package com.dtstack.flink.sql.side.cassandra.table;

import com.dtstack.flink.sql.side.SideTableInfo;
import org.apache.flink.calcite.shaded.com.google.common.base.Preconditions;
import com.google.common.base.Preconditions;

/**
* Reason:
@@ -20,7 +20,7 @@
package com.dtstack.flink.sql.sink.cassandra.table;

import com.dtstack.flink.sql.table.TargetTableInfo;
import org.apache.flink.calcite.shaded.com.google.common.base.Preconditions;
import com.google.common.base.Preconditions;

/**
* Reason:
92 changes: 92 additions & 0 deletions clickhouse/clickhouse-side/clickhouse-all-side/pom.xml
@@ -0,0 +1,92 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<artifactId>sql.side.clickhouse</artifactId>
<groupId>com.dtstack.flink</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>

<artifactId>sql.side.all.clickhouse</artifactId>
<name>clickhouse-all-side</name>

<packaging>jar</packaging>

<properties>
<sql.side.clickhouse.core.version>1.0-SNAPSHOT</sql.side.clickhouse.core.version>
</properties>

<dependencies>
<dependency>
<groupId>com.dtstack.flink</groupId>
<artifactId>sql.side.clickhouse.core</artifactId>
<version>${sql.side.clickhouse.core.version}</version>
</dependency>
</dependencies>

<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>

</excludes>
</artifactSet>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>

<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>copy-resources</id>
<!-- here the phase you need -->
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<copy todir="${basedir}/../../../plugins/clickhouseallside">
<fileset dir="target/">
<include name="${project.artifactId}-${project.version}.jar"/>
</fileset>
</copy>

<move file="${basedir}/../../../plugins/clickhouseallside/${project.artifactId}-${project.version}.jar"
tofile="${basedir}/../../../plugins/clickhouseallside/${project.name}-${git.branch}.jar"/>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>


</project>
@@ -0,0 +1,65 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.dtstack.flink.sql.side.clickhouse;

import com.dtstack.flink.sql.side.FieldInfo;
import com.dtstack.flink.sql.side.JoinInfo;
import com.dtstack.flink.sql.side.SideTableInfo;
import com.dtstack.flink.sql.side.rdb.all.RdbAllReqRow;
import com.dtstack.flink.sql.util.DtStringUtil;
import com.dtstack.flink.sql.util.JDBCUtils;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.shaded.guava18.com.google.common.collect.Maps;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.List;
import java.util.Map;

public class ClickhouseAllReqRow extends RdbAllReqRow {

private static final Logger LOG = LoggerFactory.getLogger(ClickhouseAllReqRow.class);

private static final String CLICKHOUSE_DRIVER = "ru.yandex.clickhouse.ClickHouseDriver";

public ClickhouseAllReqRow(RowTypeInfo rowTypeInfo, JoinInfo joinInfo, List<FieldInfo> outFieldInfoList, SideTableInfo sideTableInfo) {
super(new ClickhouseAllSideInfo(rowTypeInfo, joinInfo, outFieldInfoList, sideTableInfo));
}

@Override
public Connection getConn(String dbURL, String userName, String passWord) {
try {
Connection connection;
JDBCUtils.forName(CLICKHOUSE_DRIVER, getClass().getClassLoader());
// ClickHouseProperties contains all properties
if (userName == null) {
connection = DriverManager.getConnection(dbURL);
} else {
connection = DriverManager.getConnection(dbURL, userName, passWord);
}
return connection;
} catch (Exception e) {
LOG.error("failed to get clickhouse connection", e);
throw new RuntimeException("failed to get clickhouse connection", e);
}
}

}
@@ -0,0 +1,34 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.dtstack.flink.sql.side.clickhouse;

import com.dtstack.flink.sql.side.FieldInfo;
import com.dtstack.flink.sql.side.JoinInfo;
import com.dtstack.flink.sql.side.SideTableInfo;
import com.dtstack.flink.sql.side.rdb.all.RdbAllSideInfo;
import org.apache.flink.api.java.typeutils.RowTypeInfo;

import java.util.List;


public class ClickhouseAllSideInfo extends RdbAllSideInfo {
public ClickhouseAllSideInfo(RowTypeInfo rowTypeInfo, JoinInfo joinInfo, List<FieldInfo> outFieldInfoList, SideTableInfo sideTableInfo) {
super(rowTypeInfo, joinInfo, outFieldInfoList, sideTableInfo);
}
}