
ERROR session.SessionState: Error setting up authorization: Privilege [delete/insert] is not found.


Add the following to hive-site.xml, which grants the owner all privileges (insert/delete/update/select) on any table they create. The Hive version is 2.3.3, and running operations from the Hive CLI works fine with this configuration:

<property>
        <name>hive.security.authorization.enabled</name>
        <value>true</value>
        <description>enable or disable the hive client authorization</description>
</property>
<property>
        <name>hive.security.authorization.createtable.owner.grants</name>
        <value>ALL</value>
        <description>the privileges automatically granted to the owner whenever a table gets created. An example like "select,drop" will grant select and drop privilege to the owner of the table</description>
</property>
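
To double-check what this grants, one can create a table and run SHOW GRANT against it from the owner's session. Below is a minimal sketch over Hive JDBC; the HiveServer2 URL, user, and table name are placeholders rather than details from the original setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class ShowOwnerGrants {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; requires hive-jdbc on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS t_grant_demo (id INT)");
            // Legacy-authorization syntax: list the grants the owner received.
            try (ResultSet rs = stmt.executeQuery(
                    "SHOW GRANT USER hive ON TABLE t_grant_demo")) {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    StringBuilder row = new StringBuilder();
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        row.append(rs.getString(i)).append('\t');
                    }
                    System.out.println(row);
                }
            }
        }
    }
}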

When setting up Spark on Hive, hive-site.xml has to be copied into spark/conf. With that in place, creating a table through Spark threw an exception, and the TBL_PRIV column of the metastore's TBL_PRIVS table held a single "ALL" record instead of the four per-privilege rows [INSERT/UPDATE/DELETE/SELECT]. hive-site.xml was then changed to the following:

<property>
        <name>hive.security.authorization.enabled</name>
        <value>true</value>
        <description>enable or disable the hive client authorization</description>
</property>
<property>
        <name>hive.security.authorization.createtable.owner.grants</name>
        <value>select,update,delete,insert</value>
        <description>the privileges automatically granted to the owner whenever a table gets created. An example like "select,drop" will grant select and drop privilege to the owner of the table</description>
</property>
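
Whether the grants were actually written can be checked straight in the metastore database. Here is a small sketch that dumps TBL_PRIVS joined with TBLS, assuming a MySQL-backed metastore (URL and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TblPrivsDump {
    public static void main(String[] args) throws Exception {
        // Placeholder metastore connection; requires the MySQL JDBC driver.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://metastore-host:3306/hive", "hive", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT t.TBL_NAME, p.PRINCIPAL_NAME, p.TBL_PRIV "
                 + "FROM TBL_PRIVS p JOIN TBLS t ON p.TBL_ID = t.TBL_ID")) {
            // A healthy owner grant shows one row per granted privilege;
            // as described next, only SELECT and UPDATE rows appeared here.
            while (rs.next()) {
                System.out.printf("%s\t%s\t%s%n",
                    rs.getString(1), rs.getString(2), rs.getString(3));
            }
        }
    }
}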

With this configuration, Spark could run the CREATE TABLE statement without problems, but the TBL_PRIV column of Hive's TBL_PRIVS table recorded only two rows, SELECT and UPDATE, with DELETE and INSERT missing. Inserting data then failed with the following exception:

ERROR session.SessionState: Error setting up authorization: Privilege insert is not found.
org.apache.hadoop.hive.ql.metadata.HiveException: Privilege delete is not found.
    at org.apache.hadoop.hive.ql.session.CreateTableAutomaticGrant.validatePrivilege(CreateTableAutomaticGrant.java:110)
    at org.apache.hadoop.hive.ql.session.CreateTableAutomaticGrant.getGrantorInfoList(CreateTableAutomaticGrant.java:91)
    at org.apache.hadoop.hive.ql.session.CreateTableAutomaticGrant.create(CreateTableAutomaticGrant.java:52)
    at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:740)
    at org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:1391)
    at org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:984)
    at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:177)
    at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:119)
    at org.apache.spark.sql.hive.client.HiveClientImpl$.toHiveTable(HiveClientImpl.scala:931)
    at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:485)
    at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:483)
    at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:483)
    at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:278)
    at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:216)
    at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:215)
    at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:261)
    at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:483)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:278)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:236)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:236)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:236)
    at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:94)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:324)
    at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:130)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3369)
    at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3368)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:643)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:62)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:371)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:193)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
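
The top frames point at the legacy grant validation in CreateTableAutomaticGrant: every name listed in hive.security.authorization.createtable.owner.grants has to resolve against the static privilege registry of whichever hive-exec jar is actually on the classpath. A condensed paraphrase of that path (simplified for illustration, not the verbatim Hive source):

import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.security.authorization.Privilege;
import org.apache.hadoop.hive.ql.security.authorization.PrivilegeRegistry;

class OwnerGrantValidation {
    // Simplified stand-in for CreateTableAutomaticGrant.validatePrivilege.
    static void validate(String grantList) throws HiveException {
        for (String name : grantList.split(",")) {
            Privilege priv = PrivilegeRegistry.getPrivilege(name.trim());
            if (priv == null) {
                // the exact message seen in the trace above
                throw new HiveException("Privilege " + name.trim() + " is not found.");
            }
        }
    }
}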

It finally turned out that Spark's jars directory ships hive-exec-1.2.1.spark2.jar. Checking its source shows that the 1.x code supports only the authorization types below, with INSERT and DELETE missing.
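
The original screenshot of that source did not survive here; condensed, the privilege constants registered by the 1.x legacy authorization look roughly as follows (paraphrased from org.apache.hadoop.hive.ql.security.authorization.Privilege, not verbatim):

// INSERT and DELETE are absent from this list, which is exactly what the
// validation above trips over; Hive 2.x defines both.
public static Privilege ALL;
public static Privilege ALTER_DATA;     // surfaces as "Update" in grants
public static Privilege ALTER_METADATA; // surfaces as "Alter"
public static Privilege CREATE;
public static Privilege DROP;
public static Privilege INDEX;
public static Privilege LOCK;
public static Privilege SELECT;
public static Privilege SHOW_DATABASE;

This also matches the TBL_PRIVS observation above: "select" and "update" resolve (ALTER_DATA surfaces as Update), while "insert" and "delete" do not.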


The origin of the problem was eventually found online:

https://blog.csdn.net/zyzzxycj/article/details/104824072/

https://github.com/apache/hive/pull/894/files
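
Not from the original post, but one commonly suggested workaround is to stop using the builtin 1.2.1 client altogether and point Spark at external Hive 2.3.3 jars through spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars. A sketch, with a hypothetical jar path and untested against this exact setup:

import org.apache.spark.sql.SparkSession;

public class NewerHiveClient {
    public static void main(String[] args) {
        // Both settings must be in place before the session is created;
        // /opt/hive-2.3.3/lib/* is a hypothetical location.
        SparkSession spark = SparkSession.builder()
            .appName("hive-2.3.3-client")
            .config("spark.sql.hive.metastore.version", "2.3.3")
            .config("spark.sql.hive.metastore.jars", "/opt/hive-2.3.3/lib/*")
            .enableHiveSupport()
            .getOrCreate();
        spark.sql("SHOW DATABASES").show();
        spark.stop();
    }
}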

 

From: https://www.cnblogs.com/colourness/p/17174396.html
