
java.lang.InterruptedException in Hadoop

Date: 2022-10-31 13:34:14 · Views: 68


When running a Hadoop MapReduce job, the following java.lang.InterruptedException warnings appeared:

[root@node-1 text]# hadoop jar hadoop-04-1.0-SNAPSHOT.jar
19/07/21 20:41:48 INFO client.RMProxy: Connecting to ResourceManager at node-1/192.168.52.100:8032
19/07/21 20:41:49 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
19/07/21 20:41:50 INFO input.FileInputFormat: Total input paths to process : 1
19/07/21 20:41:50 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1252)
at java.lang.Thread.join(Thread.java:1326)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:967)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:705)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:894)
19/07/21 20:41:50 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1252)
at java.lang.Thread.join(Thread.java:1326)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:967)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:705)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:894)
19/07/21 20:41:50 INFO mapreduce.JobSubmitter: number of splits:1
19/07/21 20:41:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1563691024394_0002
19/07/21 20:41:52 INFO impl.YarnClientImpl: Submitted application application_1563691024394_0002
19/07/21 20:41:52 INFO mapreduce.Job: The url to track the job: http://node-1:8088/proxy/application_1563691024394_0002/
19/07/21 20:41:52 INFO mapreduce.Job: Running job: job_1563691024394_0002
19/07/21 20:42:10 INFO mapreduce.Job: Job job_1563691024394_0002 running in uber mode : true
19/07/21 20:42:10 INFO mapreduce.Job: map 0% reduce 0%
19/07/21 20:42:12 INFO mapreduce.Job: map 100% reduce 0%
19/07/21 20:42:14 INFO mapreduce.Job: map 100% reduce 100%
19/07/21 20:42:14 INFO mapreduce.Job: Job job_1563691024394_0002 completed successfully
19/07/21 20:42:15 INFO mapreduce.Job: Counters: 52
	File System Counters
		FILE: Number of bytes read=194
		FILE: Number of bytes written=307
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=424
		HDFS: Number of bytes written=306983
		HDFS: Number of read operations=37
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=14
	Job Counters
		Launched map tasks=1
		Launched reduce tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=0
		Total time spent by all reduces in occupied slots (ms)=0
		TOTAL_LAUNCHED_UBERTASKS=2
		NUM_UBER_SUBMAPS=1
		NUM_UBER_SUBREDUCES=1
		Total time spent by all map tasks (ms)=2567
		Total time spent by all reduce tasks (ms)=1905
		Total vcore-milliseconds taken by all map tasks=0
		Total vcore-milliseconds taken by all reduce tasks=0
		Total megabyte-milliseconds taken by all map tasks=0
		Total megabyte-milliseconds taken by all reduce tasks=0
	Map-Reduce Framework
		Map input records=3
		Map output records=6
		Map output bytes=63
		Map output materialized bytes=81
		Input split bytes=104
		Combine input records=0
		Combine output records=0
		Reduce input groups=4
		Reduce shuffle bytes=81
		Reduce input records=6
		Reduce output records=4
		Spilled Records=12
		Shuffled Maps =1
		Failed Shuffles=0
		Merged Map outputs=1
		GC time elapsed (ms)=242
		CPU time spent (ms)=3000
		Physical memory (bytes) snapshot=545247232
		Virtual memory (bytes) snapshot=6020751360
		Total committed heap usage (bytes)=270802944
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=41
	File Output Format Counters
		Bytes Written=33
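The WARN near the top of the log also suggests a separate cleanup: "Implement the Tool interface and execute your application with ToolRunner." A hedged sketch of that recommendation follows; the class, mapper, and reducer names (WordCountTool, path arguments) are placeholders, not taken from the original job.

```java
// Sketch only: implementing org.apache.hadoop.util.Tool and launching via
// ToolRunner so that generic options (-D, -files, -libjars, ...) are parsed,
// which silences the JobResourceUploader WARN seen in the log above.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountTool extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already contains any -D overrides parsed by ToolRunner
        Job job = Job.getInstance(getConf(), "wordcount");
        job.setJarByClass(WordCountTool.class);
        // set mapper/reducer/output key-value classes here ...
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic Hadoop options before calling run()
        System.exit(ToolRunner.run(new Configuration(), new WordCountTool(), args));
    }
}
```

This does not affect the InterruptedException warning itself, which comes from the HDFS client, not from option parsing.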

Explanation:

Root cause: DataStreamer::closeResponder has always printed a warning when it hits an InterruptedException, and since HDFS-9812, DFSOutputStream::closeImpl always forces the streamer thread to close, which triggers that InterruptedException. As the log above shows, the job still completes successfully, so the warning is harmless.
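The mechanism can be reproduced outside Hadoop: a thread blocked in Thread.join() throws InterruptedException the moment it is interrupted, which is exactly what the stack trace above shows inside closeResponder. The thread names below are illustrative, not the actual DataStreamer internals.

```java
// Minimal sketch, assuming nothing Hadoop-specific: interrupting a thread
// that is blocked in Thread.join() raises InterruptedException, mirroring
// DataStreamer.closeResponder being interrupted by a forced close.
public class JoinInterruptDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for the responder thread that is still running
        Thread responder = new Thread(() -> {
            try { Thread.sleep(60_000); } catch (InterruptedException e) { /* exit */ }
        });
        responder.start();

        // Stand-in for the thread executing closeResponder(): it waits in join()
        Thread closer = new Thread(() -> {
            try {
                responder.join();  // blocks, like closeResponder's join
            } catch (InterruptedException e) {
                System.out.println("Caught exception: " + e);  // the WARN in the log
            }
        });
        closer.start();

        Thread.sleep(200);
        closer.interrupt();        // forced close interrupts the waiting join
        closer.join();
        responder.interrupt();     // clean up the sleeper
        responder.join();
    }
}
```

Running this prints "Caught exception: java.lang.InterruptedException", matching the log's "Caught exception" WARN line.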

 

From: https://blog.51cto.com/u_12277263/5809364
