During development I set the log4j log level to ALL, but I still can't see where the problem is. How should I debug this?

Source: 4-17 Upgrading the traffic-statistics project with a custom Partitioner

他门说这就是人生

2020-01-19

If I copy the instructor's code over, it runs, so the Maven dependencies are fine.

But my breakpoints never even get into the Mapper: boolean result = job.waitForCompletion(true) in the main method returns false. How should I track this down?

Here are the code and the log:

main method:

public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException, URISyntaxException {

        Configuration configuration = new Configuration();
//        configuration.set("fs.defaultFS", "hdfs://192.168.230.129:8020");

        Job job = Job.getInstance(configuration);

        job.setJarByClass(QuestionLocalApp.class);

        job.setMapperClass(QuestionMapper.class);
        job.setReducerClass(QuestionReducer.class);

        job.setMapOutputKeyClass(QuestionModel.class);
        job.setMapOutputValueClass(LongWritable.class);

        job.setOutputKeyClass(QuestionModel.class);
        job.setOutputValueClass(LongWritable.class);

//        FileSystem fileSystem = FileSystem.get(new URI("hdfs://192.168.230.129:8020"), configuration, "root");
        FileSystem fileSystem = FileSystem.get(configuration);
        Path outputPath = new Path("data/output");
        if (fileSystem.exists(outputPath)) {
            fileSystem.delete(outputPath, true);
        }

        FileInputFormat.setInputPaths(job, new Path("data/input/question_log.txt"));
        FileOutputFormat.setOutputPath(job, outputPath);

        boolean result = job.waitForCompletion(true);

        System.exit(result ? 0 : -1);

    }

Log (the process finishes with exit code 255, which is just the -1 from System.exit(result ? 0 : -1) seen as an unsigned byte, so the job really did return false):

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[DEBUG] method:org.apache.htrace.core.Tracer$Builder.loadSamplers(Tracer.java:106)
sampler.classes = ; loaded no samplers
[TRACE] method:org.apache.htrace.core.TracerId.<init>(TracerId.java:134)
ProcessID(fmt=%{tname}/%{ip}): computed process ID of "FSClient/192.168.230.130"
[TRACE] method:org.apache.htrace.core.TracerPool.addTracer(TracerPool.java:264)
TracerPool(Global): adding tracer Tracer(FSClient/192.168.230.130)
[DEBUG] method:org.apache.htrace.core.Tracer$Builder.loadSpanReceivers(Tracer.java:128)
span.receiver.classes = ; loaded no span receivers
[TRACE] method:org.apache.htrace.core.Tracer$Builder.build(Tracer.java:165)
Created Tracer(FSClient/192.168.230.130) for FSClient

Process finished with exit code 255

PS: the full code is posted below.

Mapper:

package com.gaojingsi.question.hadoop.statistics;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class QuestionMapper extends Mapper<LongWritable, Text, QuestionModel, LongWritable> {

    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {

        String[] row = value.toString().split(" ");
        String questionDescription = row[0];
        long questionId = Long.parseLong(row[1]);
        context.write(new QuestionModel(questionId, questionDescription), new LongWritable(1L));
    }
}

Reducer:

package com.gaojingsi.question.hadoop.statistics;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class QuestionReducer extends Reducer<QuestionModel, LongWritable, QuestionModel, LongWritable> {

    @Override
    protected void reduce(QuestionModel key, Iterable<LongWritable> values, Context context) throws IOException, InterruptedException {

        long amount = 0;

        for (LongWritable result :
                values) {
            amount += result.get();
        }
        context.write(key, new LongWritable(amount));

    }
}

QuestionModel:

package com.gaojingsi.question.hadoop.statistics;

import org.apache.hadoop.io.Writable;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

public class QuestionModel implements Writable {

    private Long id;
    private String question;

    public QuestionModel() {
    }

    public QuestionModel(Long id, String question) {
        this.id = id;
        this.question = question;
    }

    @Override
    public String toString() {
        return "QuestionModel{" +
                "id=" + id +
                ", question='" + question + '\'' +
                '}';
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getQuestion() {
        return question;
    }

    public void setQuestion(String question) {
        this.question = question;
    }

    @Override
    public void write(DataOutput dataOutput) throws IOException {
        dataOutput.writeLong(this.id);
        dataOutput.writeUTF(this.question);
    }

    @Override
    public void readFields(DataInput dataInput) throws IOException {
        this.id = dataInput.readLong();
        this.question = dataInput.readUTF();
    }
}

question_log.txt:

XXX怎么办理 615
单位帮职工办理XXX的,需要提供哪些资料? 674
XXX需要提供什么资料? 562
XXX办理申领资料有哪些? 615
办理XXX的资料有哪些 615
办理XXX的资料 615

main method (complete class):

package com.gaojingsi.question;

import com.gaojingsi.question.hadoop.statistics.QuestionMapper;
import com.gaojingsi.question.hadoop.statistics.QuestionModel;
import com.gaojingsi.question.hadoop.statistics.QuestionReducer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

/**
 * Hello world!
 */
public class App {
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException, URISyntaxException {

        Configuration configuration = new Configuration();

        Job job = Job.getInstance(configuration);

        job.setJarByClass(App.class);

        job.setMapperClass(QuestionMapper.class);
        job.setReducerClass(QuestionReducer.class);

        job.setMapOutputKeyClass(QuestionModel.class);
        job.setMapOutputValueClass(LongWritable.class);

        job.setOutputKeyClass(QuestionModel.class);
        job.setOutputValueClass(LongWritable.class);

        FileSystem fileSystem = FileSystem.get(configuration);
        Path outputPath = new Path("data/output");
        if (fileSystem.exists(outputPath)) {
            fileSystem.delete(outputPath, true);
        }

        FileInputFormat.setInputPaths(job, new Path("data/input/question_log.txt"));
        FileOutputFormat.setOutputPath(job, outputPath);

        boolean result = job.waitForCompletion(true);

        System.exit(result ? 0 : -1);

    }
}

Note: the first field in the log file isn't actually used; it is just a description. The job counts how many times each question appears, keyed by the second field (the question ID); with the sample above, ID 615 should come out with a count of 4, and 674 and 562 with a count of 1 each.


1 Answer

Michael_PK

2020-01-19

If the breakpoint isn't being hit, reimport the pom and have IDEA recompile the project; you have to make sure the development environment is OK first. Also, there is no error message at all in the log you posted.
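
A side note on the log itself: the SLF4J lines ("Failed to load class org.slf4j.impl.StaticLoggerBinder" followed by "Defaulting to no-operation (NOP) logger implementation") mean that no logging backend is bound on the classpath, so a log4j level of ALL changes nothing for messages routed through SLF4J; that is likely why so little output appears. As a minimal sketch (assuming log4j 1.x, bound through an org.slf4j:slf4j-log4j12 dependency in the pom), a src/main/resources/log4j.properties along these lines should surface Hadoop's own messages, including the actual reason waitForCompletion returns false:

# log4j.properties: route everything at DEBUG and above to the console
log4j.rootLogger=DEBUG, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss,SSS} %-5p [%c] - %m%n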

他门说这就是人生 replied to Michael_PK, 2020-01-22:
After implementing the WritableComparable interface, the problem was solved. Thank you for your help.
(15 replies in total)
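
For anyone landing on the same problem: MapReduce sorts map output keys during the shuffle, so a key class that only implements Writable cannot work as a map output key; it has to implement WritableComparable. Below is a sketch of what the corrected QuestionModel could look like. Comparing by id alone is an assumption based on the note above (the job counts per question ID), and the hashCode/equals overrides are an added precaution for the default HashPartitioner with multiple reducers, not something confirmed in the thread:

package com.gaojingsi.question.hadoop.statistics;

import org.apache.hadoop.io.WritableComparable;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

public class QuestionModel implements WritableComparable<QuestionModel> {

    private Long id;
    private String question;

    public QuestionModel() {
    }

    public QuestionModel(Long id, String question) {
        this.id = id;
        this.question = question;
    }

    // The shuffle sorts map output keys with this method. Comparing by id
    // alone (an assumption matching the stated goal) groups all records for
    // one question ID into a single reduce call.
    @Override
    public int compareTo(QuestionModel other) {
        return Long.compare(this.id, other.id);
    }

    // Keep the default HashPartitioner consistent with compareTo when the
    // job runs with more than one reducer (precaution, not from the thread).
    @Override
    public int hashCode() {
        return id.hashCode();
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof QuestionModel
                && this.id.equals(((QuestionModel) o).id);
    }

    // Serialization between map and reduce; unchanged from the original.
    @Override
    public void write(DataOutput dataOutput) throws IOException {
        dataOutput.writeLong(this.id);
        dataOutput.writeUTF(this.question);
    }

    @Override
    public void readFields(DataInput dataInput) throws IOException {
        this.id = dataInput.readLong();
        this.question = dataInput.readUTF();
    }

    @Override
    public String toString() {
        return "QuestionModel{id=" + id + ", question='" + question + "'}";
    }
}

One consequence of comparing by id only: when several input lines share an id but differ in description, they collapse into a single reduce key, and which description survives in the output is effectively arbitrary.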
