ReduceJoin outputs an empty file, but no error is reported

Source: 10-3 Implementing ReduceJoin

zhangyulei

2019-11-20

At first I assumed my own code was wrong, but even after copying over the instructor's (PK哥's) code and running it, the job still produced an empty output file, with no error messages of any kind.

"C:\Program Files\Java\jdk1.8.0_202\bin\java.exe" "-javaagent:D:\Idea\IntelliJ IDEA 2018.3.1\lib\idea_rt.jar=60253:D:\Idea\IntelliJ IDEA 2018.3.1\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_202\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_202\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_202\jre\lib\rt.jar;D:\Idea\workspace\hadoop-train-v2\target\classes;D:\Idea\maven\repository\org\apache\hadoop\hadoop-client\2.6.0\hadoop-client-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-common\2.6.0\hadoop-common-2.6.0.jar;D:\Idea\maven\repository\com\google\guava\guava\11.0.2\guava-11.0.2.jar;D:\Idea\maven\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;D:\Idea\maven\repository\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;D:\Idea\maven\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;D:\Idea\maven\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;D:\Idea\maven\repository\commons-codec\commons-codec\1.4\commons-codec-1.4.jar;D:\Idea\maven\repository\commons-io\commons-io\2.4\commons-io-2.4.jar;D:\Idea\maven\repository\commons-net\commons-net\3.1\commons-net-3.1.jar;D:\Idea\maven\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;D:\Idea\maven\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;D:\Idea\maven\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;D:\Idea\maven\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;D:\Idea\maven\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;D:\Idea\maven\repository\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;D:\Idea\maven\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;D:\Idea\maven\repository\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;D:\Idea\maven\repository\org\slf4j\slf4j-api\1.7.5\slf4j-api-1.7.5.jar;D:\Idea\maven\repository\org\slf4j\slf4j-log4j12\1.7.5\slf4j-log4j12-1.7.5.jar;D:\Idea\maven\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;D:\Idea\maven\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;D:\Idea\maven\repository\org\apache\avro\avro\1.7.4\avro-1.7.
4.jar;D:\Idea\maven\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;D:\Idea\maven\repository\org\xerial\snappy\snappy-java\1.0.4.1\snappy-java-1.0.4.1.jar;D:\Idea\maven\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;D:\Idea\maven\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-auth\2.6.0\hadoop-auth-2.6.0.jar;D:\Idea\maven\repository\org\apache\httpcomponents\httpclient\4.2.5\httpclient-4.2.5.jar;D:\Idea\maven\repository\org\apache\httpcomponents\httpcore\4.2.4\httpcore-4.2.4.jar;D:\Idea\maven\repository\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0.0-M15.jar;D:\Idea\maven\repository\org\apache\directory\server\apacheds-i18n\2.0.0-M15\apacheds-i18n-2.0.0-M15.jar;D:\Idea\maven\repository\org\apache\directory\api\api-asn1-api\1.0.0-M20\api-asn1-api-1.0.0-M20.jar;D:\Idea\maven\repository\org\apache\directory\api\api-util\1.0.0-M20\api-util-1.0.0-M20.jar;D:\Idea\maven\repository\org\apache\curator\curator-framework\2.6.0\curator-framework-2.6.0.jar;D:\Idea\maven\repository\org\apache\curator\curator-client\2.6.0\curator-client-2.6.0.jar;D:\Idea\maven\repository\org\apache\curator\curator-recipes\2.6.0\curator-recipes-2.6.0.jar;D:\Idea\maven\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;D:\Idea\maven\repository\org\htrace\htrace-core\3.0.4\htrace-core-3.0.4.jar;D:\Idea\maven\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6.jar;D:\Idea\maven\repository\org\apache\commons\commons-compress\1.4.1\commons-compress-1.4.1.jar;D:\Idea\maven\repository\org\tukaani\xz\1.0\xz-1.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-hdfs\2.6.0\hadoop-hdfs-2.6.0.jar;D:\Idea\maven\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;D:\Idea\maven\repository\io\netty\netty\3.6.2.Final\netty-3.6.2.Final.jar;D:\Idea\maven\repository\xerces\xercesImpl\2.9.1\xercesImpl-2.9.1.jar;D:\Idea\
maven\repository\xml-apis\xml-apis\1.3.04\xml-apis-1.3.04.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.6.0\hadoop-mapreduce-client-app-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.6.0\hadoop-mapreduce-client-common-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-yarn-client\2.6.0\hadoop-yarn-client-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-yarn-server-common\2.6.0\hadoop-yarn-server-common-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.6.0\hadoop-mapreduce-client-shuffle-2.6.0.jar;D:\Idea\maven\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-yarn-api\2.6.0\hadoop-yarn-api-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.6.0\hadoop-mapreduce-client-core-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-yarn-common\2.6.0\hadoop-yarn-common-2.6.0.jar;D:\Idea\maven\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;D:\Idea\maven\repository\javax\xml\stream\stax-api\1.0-2\stax-api-1.0-2.jar;D:\Idea\maven\repository\javax\activation\activation\1.1\activation-1.1.jar;D:\Idea\maven\repository\javax\servlet\servlet-api\2.5\servlet-api-2.5.jar;D:\Idea\maven\repository\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;D:\Idea\maven\repository\com\sun\jersey\jersey-client\1.9\jersey-client-1.9.jar;D:\Idea\maven\repository\org\codehaus\jackson\jackson-jaxrs\1.9.13\jackson-jaxrs-1.9.13.jar;D:\Idea\maven\repository\org\codehaus\jackson\jackson-xc\1.9.13\jackson-xc-1.9.13.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.6.0\hadoop-mapreduce-client-jobclient-2.6.0.jar;D:\Idea\maven\repository\org\apache\hadoop\hadoop-annotations\2.6.0\hadoop-annotations-2.6.0.jar" com.derek.bigdata.hadoop.mr.join.ReduceJoinApp
[INFO ] method:org.apache.hadoop.conf.Configuration.warnOnceIfDeprecated(Configuration.java:1049)
session.id is deprecated. Instead, use dfs.metrics.session-id
[INFO ] method:org.apache.hadoop.metrics.jvm.JvmMetrics.init(JvmMetrics.java:76)
Initializing JVM Metrics with processName=JobTracker, sessionId=
[WARN ] method:org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:153)
Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
[WARN ] method:org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:261)
No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
[INFO ] method:org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:281)
Total input paths to process : 2
[INFO ] method:org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:494)
number of splits:2
[INFO ] method:org.apache.hadoop.mapreduce.JobSubmitter.printTokens(JobSubmitter.java:583)
Submitting tokens for job: job_local620867361_0001
[INFO ] method:org.apache.hadoop.mapreduce.Job.submit(Job.java:1300)
The url to track the job: http://localhost:8080/
[INFO ] method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1345)
Running job: job_local620867361_0001
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.createOutputCommitter(LocalJobRunner.java:471)
OutputCommitter set in config null
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.createOutputCommitter(LocalJobRunner.java:489)
OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:448)
Waiting for map tasks
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:224)
Starting task: attempt_local620867361_0001_m_000000_0
[INFO ] method:org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.isAvailable(ProcfsBasedProcessTree.java:181)
ProcfsBasedProcessTree currently is supported only on Linux.
[INFO ] method:org.apache.hadoop.mapred.Task.initialize(Task.java:587)
 Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@43aedc47
[INFO ] method:org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
Processing split: file:/D:/Idea/workspace/hadoop-train-v2/input/join/input/emp.txt:0+703
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.setEquator(MapTask.java:1202)
(EQUATOR) 0 kvi 26214396(104857584)
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:995)
mapreduce.task.io.sort.mb: 100
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:996)
soft limit at 83886080
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:997)
bufstart = 0; bufvoid = 104857600
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:998)
kvstart = 26214396; length = 6553600
[INFO ] method:org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402)
Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)

[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1457)
Starting flush of map output
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1475)
Spilling map output
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1476)
bufstart = 0; bufend = 362; bufvoid = 104857600
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1478)
kvstart = 26214396(104857584); kvend = 26214348(104857392); length = 49/6553600
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1660)
Finished spill 0
[INFO ] method:org.apache.hadoop.mapred.Task.done(Task.java:1001)
Task:attempt_local620867361_0001_m_000000_0 is done. And is in the process of committing
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
map
[INFO ] method:org.apache.hadoop.mapred.Task.sendDone(Task.java:1121)
Task 'attempt_local620867361_0001_m_000000_0' done.
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:249)
Finishing task: attempt_local620867361_0001_m_000000_0
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:224)
Starting task: attempt_local620867361_0001_m_000001_0
[INFO ] method:org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.isAvailable(ProcfsBasedProcessTree.java:181)
ProcfsBasedProcessTree currently is supported only on Linux.
[INFO ] method:org.apache.hadoop.mapred.Task.initialize(Task.java:587)
 Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@47ae5032
[INFO ] method:org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
Processing split: file:/D:/Idea/workspace/hadoop-train-v2/input/join/input/dept.txt:0+106
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.setEquator(MapTask.java:1202)
(EQUATOR) 0 kvi 26214396(104857584)
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:995)
mapreduce.task.io.sort.mb: 100
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:996)
soft limit at 83886080
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:997)
bufstart = 0; bufvoid = 104857600
[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:998)
kvstart = 26214396; length = 6553600
[INFO ] method:org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402)
Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)

[INFO ] method:org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1457)
Starting flush of map output
[INFO ] method:org.apache.hadoop.mapred.Task.done(Task.java:1001)
Task:attempt_local620867361_0001_m_000001_0 is done. And is in the process of committing
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
map
[INFO ] method:org.apache.hadoop.mapred.Task.sendDone(Task.java:1121)
Task 'attempt_local620867361_0001_m_000001_0' done.
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:249)
Finishing task: attempt_local620867361_0001_m_000001_0
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:456)
map task executor complete.
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:448)
Waiting for reduce tasks
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:302)
Starting task: attempt_local620867361_0001_r_000000_0
[INFO ] method:org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.isAvailable(ProcfsBasedProcessTree.java:181)
ProcfsBasedProcessTree currently is supported only on Linux.
[INFO ] method:org.apache.hadoop.mapred.Task.initialize(Task.java:587)
 Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@b172bf2
[INFO ] method:org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:362)
Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@52588ddd
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.<init>(MergeManagerImpl.java:196)
MergerManager: memoryLimit=1320838784, maxSingleShuffleLimit=330209696, mergeThreshold=871753664, ioSortFactor=10, memToMemMergeOutputsThreshold=10
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.EventFetcher.run(EventFetcher.java:61)
attempt_local620867361_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.copyMapOutput(LocalFetcher.java:141)
localfetcher#1 about to shuffle output of map attempt_local620867361_0001_m_000001_0 decomp: 2 len: 6 to MEMORY
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput.shuffle(InMemoryMapOutput.java:100)
Read 2 bytes from map-output for attempt_local620867361_0001_m_000001_0
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.closeInMemoryFile(MergeManagerImpl.java:314)
closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.copyMapOutput(LocalFetcher.java:141)
localfetcher#1 about to shuffle output of map attempt_local620867361_0001_m_000000_0 decomp: 390 len: 394 to MEMORY
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput.shuffle(InMemoryMapOutput.java:100)
Read 390 bytes from map-output for attempt_local620867361_0001_m_000000_0
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.closeInMemoryFile(MergeManagerImpl.java:314)
closeInMemoryFile -> map-output of size: 390, inMemoryMapOutputs.size() -> 2, commitMemory -> 2, usedMemory ->392
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.EventFetcher.run(EventFetcher.java:76)
EventFetcher is interrupted.. Returning
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
2 / 2 copied.
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.finalMerge(MergeManagerImpl.java:674)
finalMerge called with 2 in-memory map-outputs and 0 on-disk map-outputs
[INFO ] method:org.apache.hadoop.mapred.Merger$MergeQueue.merge(Merger.java:597)
Merging 2 sorted segments
[INFO ] method:org.apache.hadoop.mapred.Merger$MergeQueue.merge(Merger.java:696)
Down to the last merge-pass, with 1 segments left of total size: 384 bytes
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.finalMerge(MergeManagerImpl.java:751)
Merged 2 segments, 392 bytes to disk to satisfy reduce memory limit
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.finalMerge(MergeManagerImpl.java:781)
Merging 1 files, 394 bytes from disk
[INFO ] method:org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl.finalMerge(MergeManagerImpl.java:796)
Merging 0 segments, 0 bytes from memory into reduce
[INFO ] method:org.apache.hadoop.mapred.Merger$MergeQueue.merge(Merger.java:597)
Merging 1 sorted segments
[INFO ] method:org.apache.hadoop.mapred.Merger$MergeQueue.merge(Merger.java:696)
Down to the last merge-pass, with 1 segments left of total size: 384 bytes
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
2 / 2 copied.
[INFO ] method:org.apache.hadoop.conf.Configuration.warnOnceIfDeprecated(Configuration.java:1049)
mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
[INFO ] method:org.apache.hadoop.mapred.Task.done(Task.java:1001)
Task:attempt_local620867361_0001_r_000000_0 is done. And is in the process of committing
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
2 / 2 copied.
[INFO ] method:org.apache.hadoop.mapred.Task.commit(Task.java:1162)
Task attempt_local620867361_0001_r_000000_0 is allowed to commit now
[INFO ] method:org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitTask(FileOutputCommitter.java:439)
Saved output of task 'attempt_local620867361_0001_r_000000_0' to file:/D:/Idea/workspace/hadoop-train-v2/input/join/output/_temporary/0/task_local620867361_0001_r_000000
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.statusUpdate(LocalJobRunner.java:591)
reduce > reduce
[INFO ] method:org.apache.hadoop.mapred.Task.sendDone(Task.java:1121)
Task 'attempt_local620867361_0001_r_000000_0' done.
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:325)
Finishing task: attempt_local620867361_0001_r_000000_0
[INFO ] method:org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:456)
reduce task executor complete.
[INFO ] method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1366)
Job job_local620867361_0001 running in uber mode : false
[INFO ] method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1373)
 map 100% reduce 100%
[INFO ] method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1384)
Job job_local620867361_0001 completed successfully
[INFO ] method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1391)
Counters: 33
	File System Counters
		FILE: Number of bytes read=6395
		FILE: Number of bytes written=788332
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
	Map-Reduce Framework
		Map input records=19
		Map output records=13
		Map output bytes=362
		Map output materialized bytes=400
		Input split bytes=589
		Combine input records=0
		Combine output records=0
		Reduce input groups=3
		Reduce shuffle bytes=400
		Reduce input records=13
		Reduce output records=0
		Spilled Records=26
		Shuffled Maps =2
		Failed Shuffles=0
		Merged Map outputs=2
		GC time elapsed (ms)=0
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
		Total committed heap usage (bytes)=913833984
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=8

Process finished with exit code 0

1 Answer

Michael_PK

2019-11-20

Step through it with a debugger and confirm whether your map-side data is actually being read in.
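
One clue in the log above: the second map task (the one processing dept.txt) flushed an essentially empty output (`decomp: 2`), and the final counters show `Reduce output records=0` even though `Reduce input records=13`. A classic silent failure is a strict field-count guard in the mapper dropping every line of one table. This is a minimal, self-contained sketch of that failure mode — the 3-field dept layout and tab delimiter are assumptions, not the course's exact code:

```java
// Sketch: a strict field-count guard in a mapper silently drops malformed
// lines. The 3-field dept layout and tab delimiter are assumptions here.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SplitGuardDemo {
    static List<String> deptKeys(List<String> lines) {
        List<String> keys = new ArrayList<>();
        for (String line : lines) {
            String[] fields = line.split("\t");
            if (fields.length == 3) {
                keys.add(fields[0]);      // deptno becomes the join key
            } else {
                // Without a log line like this, a bad file yields empty output silently
                System.err.println("DROPPED (" + fields.length + " fields): [" + line + "]");
            }
        }
        return keys;
    }

    public static void main(String[] args) {
        // A tab-delimited line parses fine...
        System.out.println(deptKeys(Arrays.asList("10\tACCOUNTING\tNEW YORK"))); // prints [10]
        // ...but if an editor converted the tabs to spaces, every line is dropped
        System.out.println(deptKeys(Arrays.asList("10 ACCOUNTING NEW YORK")));   // prints []
    }
}
```

Adding a counter or a log line in the `else` branch makes the dropped records visible instead of the job "succeeding" with empty output.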

Michael_PK replied to zhangyulei:
Yes, IDEA does have this problem. Never edit your data files inside IDEA, or you will run into all kinds of issues.
2019-11-21
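
To make the "edited in the IDE" failure mode concrete: re-saving a data file can prepend an invisible UTF-8 BOM, which then becomes part of the first record's join key so it never matches its counterpart. A minimal sketch with hypothetical values:

```java
// Sketch: a UTF-8 BOM silently prepended by an editor becomes part of the
// first record's join key, so the key never matches during the reduce join.
public class BomDemo {
    public static void main(String[] args) {
        String clean   = "10\tACCOUNTING";
        String withBom = "\uFEFF10\tACCOUNTING"; // \uFEFF is what a UTF-8 BOM decodes to

        String keyClean = clean.split("\t")[0];
        String keyBom   = withBom.split("\t")[0];

        System.out.println(keyClean.equals("10")); // true
        System.out.println(keyBom.equals("10"));   // false: the key is "\uFEFF10"
        System.out.println(keyBom.length());       // 3, not 2 - the BOM is invisible but counted
    }
}
```

A quick sanity check is a hex dump of the file's first bytes (`EF BB BF` means a BOM is present), or simply regenerating the data files outside the IDE.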