A problem with my first Hadoop program, WordCount

I'm using MapReduce to count words.
Data:

(image: sample input data)

Map program:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        Text word = new Text();
        LongWritable one = new LongWritable(1);
        // Convert the Text line to a String
        String line = value.toString();
        // Tokenize on runs of whitespace
        String[] wordArr = line.split("\\s+");
        // Keep only the first column
        word.set(wordArr[0]);
        // Emit (word, 1) into the context
        context.write(word, one);
    }
}
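The map body above can be checked without a Hadoop cluster. The following is a minimal plain-Java sketch of just the tokenizing step; the class and method names (`FirstTokenDemo`, `firstToken`) are made up for illustration and are not part of the original code.

```java
// Plain-Java sketch of the map() tokenizing logic: split a line on
// runs of whitespace (same "\\s+" regex as MyMapper) and keep only
// the first token, i.e. the first column.
public class FirstTokenDemo {
    static String firstToken(String line) {
        String[] wordArr = line.split("\\s+");
        // Note: a line with leading whitespace would yield an empty
        // first element here, just as in the original mapper.
        return wordArr[0];
    }

    public static void main(String[] args) {
        System.out.println(firstToken("hello 1 2 3")); // hello
        System.out.println(firstToken("word\tx y"));   // word
    }
}
```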

Reduce program:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MyReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text key, Iterable<LongWritable> values, Context context) throws IOException, InterruptedException {
        // Sum every count emitted for this key
        long sum = 0L;
        for (LongWritable value : values) {
            sum += value.get();
        }
        context.write(key, new LongWritable(sum));
    }
}

Result:

(image: actual output)

I only want to count the words in the first column.
The output I expect is:
word 1
hello 1
hahha 1
ahaha 1

Where exactly did I go wrong?

1 answer

Change

sum += value.get();

to

sum = 1;
break;

With this change, the reducer emits 1 for every distinct key instead of the number of times it occurred, which matches the expected output.
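The effect of this fix can be simulated without Hadoop by grouping values per key the way the shuffle phase would. This is a hypothetical plain-Java sketch; `ReduceFixDemo` and `reduceAll` are made-up names used only to demonstrate the change.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simulates the fixed reduce(): with "sum = 1; break;", every
// distinct key gets the count 1, no matter how many times the
// mapper emitted it.
public class ReduceFixDemo {
    static Map<String, Long> reduceAll(Map<String, List<Long>> grouped) {
        Map<String, Long> out = new LinkedHashMap<>();
        for (Map.Entry<String, List<Long>> e : grouped.entrySet()) {
            long sum = 0L;
            for (long v : e.getValue()) {
                sum = 1;   // the suggested fix: count each key once
                break;
            }
            out.put(e.getKey(), sum);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, List<Long>> grouped = new LinkedHashMap<>();
        grouped.put("hello", Arrays.asList(1L, 1L, 1L)); // seen three times
        grouped.put("word", Arrays.asList(1L));          // seen once
        System.out.println(reduceAll(grouped)); // {hello=1, word=1}
    }
}
```

With the original `sum += value.get();` the first key would come out as `hello=3`, which is the behavior the asker wanted to avoid.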

qq_23239685 replied to 式微胡不归 over a year ago: Thanks