
Flink SQL Initialization

Date: 2023-02-02


Maven dependencies

<properties>
<java.version>1.8</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
<flink.version>1.12.0</flink.version>
<scala.version>2.12</scala.version>
<hadoop.version>3.1.3</hadoop.version>
</properties>

<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-cep_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-json</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.68</version>
</dependency>

<!-- Needed if checkpoints are saved to HDFS -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>

<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.49</version>
</dependency>

<dependency>
<groupId>com.alibaba.ververica</groupId>
<artifactId>flink-connector-mysql-cdc</artifactId>
<version>1.2.0</version>
</dependency>

<!-- Flink logs through SLF4J, which is only a logging facade; here we use Log4j as the concrete implementation -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>

<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.25</version>
</dependency>

<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-to-slf4j</artifactId>
<version>2.14.0</version>
</dependency>

<!-- Lombok plugin dependency -->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.12</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-jdbc_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.phoenix</groupId>
<artifactId>phoenix-spark</artifactId>
<version>5.0.0-HBase-2.0</version>
<exclusions>
<exclusion>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
</exclusion>
</exclusions>
</dependency>

<!-- commons-beanutils is an Apache toolkit for working with Java beans;
it makes reading and writing bean properties convenient -->
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.9.3</version>
</dependency>

<!-- Guava bundles core libraries widely used across Google's Java projects -->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>29.0-jre</version>
</dependency>

<dependency>
<groupId>redis.clients</groupId>
<artifactId>jedis</artifactId>
<version>3.3.0</version>
</dependency>

<dependency>
<groupId>ru.yandex.clickhouse</groupId>
<artifactId>clickhouse-jdbc</artifactId>
<version>0.2.4</version>
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
</exclusions>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.version}</artifactId>
<version>${flink.version}</version>
</dependency>

<dependency>
<groupId>com.janeluo</groupId>
<artifactId>ikanalyzer</artifactId>
<version>2012_u6</version>
</dependency>
<dependency>
<!-- will stop using ru.yandex.clickhouse starting from 0.4.0 -->
<groupId>com.clickhouse</groupId>
<artifactId>clickhouse-jdbc</artifactId>
<version>0.3.2-patch4</version>
<!-- below is only needed when all you want is a shaded jar -->
<classifier>http</classifier>
<exclusions>
<exclusion>
<groupId>*</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>

</dependencies>

<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
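With the maven-assembly-plugin bound to the `package` phase as configured above, the build produces a fat jar (`*-jar-with-dependencies.jar`) alongside the regular artifact. The commands below are a sketch of the build-and-submit flow; the artifact name and main class are hypothetical placeholders, not values from this project.

```shell
# Build; the assembly plugin attaches the jar-with-dependencies during `package`.
mvn clean package

# Submit the fat jar to a running Flink cluster. The main class and jar name
# below are placeholders -- substitute your own.
flink run -c com.example.FlinkSqlJob target/my-project-1.0-jar-with-dependencies.jar
```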

Core code

// Streaming setup: create a blink-planner table environment on top of the existing
// StreamExecutionEnvironment `env`; `dataStreams` is a DataStream<OrderRecord> built earlier.
EnvironmentSettings settings = EnvironmentSettings.newInstance()
        .useBlinkPlanner()
        .inStreamingMode()
        .build();
StreamTableEnvironment streamTableEnvironment = StreamTableEnvironment.create(env, settings);

// Batch records into one-minute windows. (timeWindowAll is deprecated in Flink 1.12;
// windowAll(TumblingProcessingTimeWindows.of(Time.minutes(1))) is the replacement.)
SingleOutputStreamOperator<List<OrderRecord>> streamOperator = dataStreams
        .timeWindowAll(Time.minutes(1))
        .apply(new AllWindowFunction<OrderRecord, List<OrderRecord>, TimeWindow>() {
            @Override
            public void apply(TimeWindow timeWindow, Iterable<OrderRecord> iterable,
                              Collector<List<OrderRecord>> collector) throws Exception {
                ArrayList<OrderRecord> list = Lists.newArrayList(iterable);
                if (!list.isEmpty()) {
                    collector.collect(list);
                }
            }
        });

// Register the stream as a table so it can be queried with SQL.
Table table = streamTableEnvironment.fromDataStream(dataStreams,
        "user_id,item_id,cate_id,times,name,keyword,factory,price,pro,city,par,brank");
streamTableEnvironment.createTemporaryView("t1", table);
streamOperator.addSink(new OrderSinkFunc());

// Aggregate per item; a tumbling window could be added with tumble(times, interval '1' day).
Table table1 = streamTableEnvironment.sqlQuery(
        "select item_id, name, count(*) as num, sum(price) as total from t1 group by item_id, name");

// Grouped aggregations emit updates, so convert to a retract stream before printing.
streamTableEnvironment.toRetractStream(table1, Row.class).print("result");
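`toRetractStream` is needed above because a `GROUP BY` aggregation keeps updating its result rows: each update arrives as a retraction (flag `false`) of the stale row followed by an addition (flag `true`) of the new one. The plain-Java sketch below (no Flink dependency; class and method names are illustrative, not part of any API) simulates how a grouped count produces that retract/add sequence.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java simulation of retract-stream semantics: a grouped count emits
// (flag, row) pairs, where flag=false retracts a stale aggregate and
// flag=true adds the updated one.
public class RetractDemo {

    static class Change {
        final boolean add;   // true = add, false = retract
        final String key;
        final long count;
        Change(boolean add, String key, long count) {
            this.add = add; this.key = key; this.count = count;
        }
    }

    // Replays a stream of keys through a grouped count, emitting the
    // retract/add pairs a grouped aggregation would produce.
    static List<Change> process(List<String> keys) {
        Map<String, Long> counts = new HashMap<>();
        List<Change> out = new ArrayList<>();
        for (String key : keys) {
            Long old = counts.get(key);
            if (old != null) {
                out.add(new Change(false, key, old)); // retract the stale count
            }
            long next = (old == null ? 1 : old + 1);
            counts.put(key, next);
            out.add(new Change(true, key, next));     // add the updated count
        }
        return out;
    }

    public static void main(String[] args) {
        for (Change c : process(Arrays.asList("item1", "item2", "item1"))) {
            System.out.println((c.add ? "+" : "-") + " " + c.key + "=" + c.count);
        }
    }
}
```

The second occurrence of `item1` first retracts `item1=1` and then adds `item1=2`, which is exactly the pair of messages a downstream sink must handle when consuming a retract stream.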

From: https://blog.51cto.com/u_15063934/6032811
