1. Run the command
[hadoop@namenode mapreduce]$ hadoop jar hadoop-mapreduce-examples-3.3.6.jar pi 2 2
- hadoop jar: the Hadoop jar command
- hadoop-mapreduce-examples-3.3.6.jar: the jar containing the program
- pi: the example program to run
- 2 2: the arguments
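The two arguments correspond to the "Number of Maps" and "Samples per Map" lines in the output below. For a more accurate estimate of pi, both can be increased, e.g. (the values here are picked arbitrarily):

[hadoop@namenode mapreduce]$ hadoop jar hadoop-mapreduce-examples-3.3.6.jar pi 10 1000

This runs 10 map tasks with 1000 samples each; the job takes longer, but the estimate should land much closer to 3.14159.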
2. Execution output
Number of Maps = 2
Samples per Map = 2
Wrote input for Map #0
Wrote input for Map #1
Starting Job
2023-10-30 05:05:05,746 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at namenode/192.168.42.134:8032
2023-10-30 05:05:06,669 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1698655691785_0001 # the Hadoop job id is job_1698655691785_0001
2023-10-30 05:05:06,957 INFO input.FileInputFormat: Total input files to process : 2 # matches the first argument (number of maps)
2023-10-30 05:05:07,130 INFO mapreduce.JobSubmitter: number of splits:2 # 2 input splits
2023-10-30 05:05:07,613 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1698655691785_0001
2023-10-30 05:05:07,613 INFO mapreduce.JobSubmitter: Executing with tokens: []
2023-10-30 05:05:07,948 INFO conf.Configuration: resource-types.xml not found # harmless for this run
2023-10-30 05:05:07,948 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2023-10-30 05:05:09,318 INFO impl.YarnClientImpl: Submitted application application_1698655691785_0001
2023-10-30 05:05:09,529 INFO mapreduce.Job: The url to track the job: http://namenode:8088/proxy/application_1698655691785_0001/
2023-10-30 05:05:09,530 INFO mapreduce.Job: Running job: job_1698655691785_0001
2023-10-30 05:05:22,935 INFO mapreduce.Job: Job job_1698655691785_0001 running in uber mode : false
2023-10-30 05:05:22,941 INFO mapreduce.Job: map 0% reduce 0%
2023-10-30 05:05:36,300 INFO mapreduce.Job: map 100% reduce 0%
2023-10-30 05:05:46,544 INFO mapreduce.Job: map 100% reduce 100%
2023-10-30 05:05:47,561 INFO mapreduce.Job: Job job_1698655691785_0001 completed successfully
2023-10-30 05:05:47,714 INFO mapreduce.Job: Counters: 54
File System Counters
FILE: Number of bytes read=50
FILE: Number of bytes written=831506
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=528
HDFS: Number of bytes written=215
HDFS: Number of read operations=13
HDFS: Number of large read operations=0
HDFS: Number of write operations=3
HDFS: Number of bytes read erasure-coded=0
Job Counters
Launched map tasks=2 # number of map tasks: 2, because there are 2 splits
Launched reduce tasks=1 # number of reduce tasks: 1 by default
Data-local map tasks=2
Total time spent by all maps in occupied slots (ms)=20629
Total time spent by all reduces in occupied slots (ms)=5938
Total time spent by all map tasks (ms)=20629
Total time spent by all reduce tasks (ms)=5938
Total vcore-milliseconds taken by all map tasks=20629
Total vcore-milliseconds taken by all reduce tasks=5938
Total megabyte-milliseconds taken by all map tasks=21124096
Total megabyte-milliseconds taken by all reduce tasks=6080512
Map-Reduce Framework
Map input records=2
Map output records=4
Map output bytes=36
Map output materialized bytes=56
Input split bytes=292
Combine input records=0
Combine output records=0
Reduce input groups=2
Reduce shuffle bytes=56
Reduce input records=4
Reduce output records=0
Spilled Records=8
Shuffled Maps =2
Failed Shuffles=0
Merged Map outputs=2
GC time elapsed (ms)=393
CPU time spent (ms)=2200
Physical memory (bytes) snapshot=499138560
Virtual memory (bytes) snapshot=8220651520
Total committed heap usage (bytes)=269922304
Peak Map Physical memory (bytes)=191692800
Peak Map Virtual memory (bytes)=2737623040
Peak Reduce Physical memory (bytes)=115863552
Peak Reduce Virtual memory (bytes)=2745405440
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=236
File Output Format Counters
Bytes Written=97
Job Finished in 42.136 seconds
Estimated value of Pi is 4.00000000000000000000 # the result; with such small arguments the precision is poor
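Where the exact value 4.0 comes from (a rough sketch, assuming the usual Monte Carlo approach taken by the examples jar): the pi program scatters sample points over a unit square, counts how many fall inside the inscribed circle, and estimates

pi ≈ 4 × (points inside the circle) / (total points)

With 2 maps × 2 samples = 4 points in total, all 4 apparently landed inside the circle, giving 4 × 4 / 4 = 4.00000000000000000000. A handful of samples simply cannot resolve pi, which is why the arguments must be much larger for a meaningful result.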