
Summary of Flink and Hive Integration Errors



1. Caused by: java.lang.ClassNotFoundException: org.apache.hive.common.util.HiveVersionInfo

Cause: Flink's lib directory is missing the hive-exec-3.1.2.jar package.
Fix: cp /usr/local/hive/lib/hive-exec-3.1.2.jar /usr/local/flink/lib/
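If the error persists after copying, it can help to confirm that the missing class really ships inside that jar. A minimal check, assuming the Hive install path used above:

# list the jar's contents and look for the class reported in the stack trace
unzip -l /usr/local/hive/lib/hive-exec-3.1.2.jar | grep HiveVersionInfo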

2. [ERROR] Could not execute SQL statement. Reason:java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream

Cause: the Hadoop dependencies are missing from the classpath, or the Hadoop environment variable is not set.
Fix: export HADOOP_CLASSPATH=$(hadoop classpath) (the command substitution is required; the variable can also be made permanent, as shown below)
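An export only lives for the current shell session. A sketch for making it permanent, assuming hadoop is on the PATH and a system-wide setting is wanted:

# append the export to the profile so every new shell picks it up
echo 'export HADOOP_CLASSPATH=$(hadoop classpath)' >> /etc/profile
source /etc/profile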

3. Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'hive' that implements 'org.apache.flink.table.factories.CatalogFactory' in the classpath

Cause: Flink's lib directory is missing the flink-connector-hive_2.12-1.13.0.jar package (its version must match your Flink and Scala versions).
Fix: download the matching flink-connector-hive jar and copy it into Flink's lib directory, e.g. cp flink-connector-hive_2.12-1.13.0.jar /usr/local/flink/lib/ (unlike hive-exec, this connector jar is not shipped with Hive).
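For reference, this error typically surfaces when registering a Hive catalog in the Flink SQL client; that statement is what triggers the lookup of the 'hive' factory. A minimal example (the catalog name and conf path here are illustrative):

CREATE CATALOG myhive WITH (
    'type' = 'hive',
    'hive-conf-dir' = '/usr/local/hive/conf'
);
USE CATALOG myhive;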

4. Caused by: java.lang.NoClassDefFoundError: com/ctc/wstx/io/InputBootstrapper

Cause: the woodstox-core-5.0.3.jar package is missing.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/woodstox-core-5.0.3.jar /usr/local/flink/lib/

5. Caused by: java.lang.NoClassDefFoundError: org/codehaus/stax2/XMLInputFactory2

Cause: the stax2-api-3.1.4.jar package is missing.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/stax2-api-3.1.4.jar /usr/local/flink/lib/

6. Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration2.Configuration

Cause: the commons-configuration2-2.1.1.jar package is missing.
Fix: cp /usr/local/hadoop/share/hadoop/common/lib/commons-configuration2-2.1.1.jar /usr/local/flink/lib/
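Since errors 4 through 6 all trace back to jars that already ship with Hadoop, the three copies can be done in one pass. A sketch, assuming the install paths used throughout this post:

cd /usr/local/hadoop/share/hadoop/common/lib
cp woodstox-core-5.0.3.jar stax2-api-3.1.4.jar commons-configuration2-2.1.1.jar /usr/local/flink/lib/

After any jar is added to flink/lib, restart the Flink cluster (and the SQL client) so the change takes effect; the lib directory is only scanned at JVM startup.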

From: https://www.cnblogs.com/654wangzai321/p/16922927.html
