Reference article: How to remotely submit a Spark job to a YARN cluster from inside IDEA
Code
Note the setJars call: the code you submit must be packaged into a jar ahead of time, otherwise you will get class-not-found errors.
My understanding is that running this main method is effectively launching a spark-submit job: when the job is submitted to the cluster you still have to point it at the job's jar so that it can be copied to each Executor to run the code.
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object TestSparkStandalone {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TestSparkStandalone")
      // Standalone master; the driver runs locally inside IDEA
      .setMaster("spark://xxx.xxx.xxx.xxx:7077")
      // Jar built in advance that contains this class; it is shipped to the executors
      .setJars(List("D:\\CodePlace\\myspark\\target\\myspark-1.0-SNAPSHOT.jar"))
    val spark = new SparkContext(conf)
    val value: RDD[Int] = spark.makeRDD(1 to 5)
    // println runs on the executors, so the output shows up in the executor logs
    value.foreach(println)
  }
}
Driver issue
Error message:
Caused by: java.net.UnknownHostException: LAPTOP-2B1EN4I2
at java.net.InetAddress.getAllByName0(InetAddress.java:1281)
at java.net.InetAddress.getAllByName(InetAddress.java:1193)
at java.net.InetAddress.getAllByName(InetAddress.java:1127)
at java.net.InetAddress.getByName(InetAddress.java:1077)
Solution:
This is a hostname-resolution problem when the executors call back to the driver. On each executor machine, add an entry to the hosts file mapping LAPTOP-2B1EN4I2 (the driver machine's hostname) to the driver's IP address, and the job will run.
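As a possible alternative (not from the original post, just a sketch under the assumption that the development machine has an IP address reachable from the executors), you can make Spark advertise the driver by IP through the spark.driver.host property, so the executors never need to resolve the Windows hostname:

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object TestSparkStandaloneByIp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TestSparkStandalone")
      .setMaster("spark://xxx.xxx.xxx.xxx:7077")
      .setJars(List("D:\\CodePlace\\myspark\\target\\myspark-1.0-SNAPSHOT.jar"))
      // Advertise the driver by a routable IP instead of the Windows hostname,
      // so executors do not have to resolve LAPTOP-2B1EN4I2.
      // "192.168.1.100" is a placeholder; use your development machine's actual IP.
      .set("spark.driver.host", "192.168.1.100")
    val spark = new SparkContext(conf)
    val value: RDD[Int] = spark.makeRDD(1 to 5)
    value.foreach(println)
    spark.stop()
  }
}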
From: https://www.cnblogs.com/zhanggengdi/p/16903914.html