【Question Title】: hadoop jar error while copying data from mongoDB to hdfs
【Posted】: 2017-06-06 22:15:40
【Question Description】:

I am trying to copy a collection from MongoDB to HDFS with the mongo-hadoop connector, using the code below:

package hdfs;

import java.io.*;
import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.*;
import org.bson.*;
import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;

public class ImportWeblogsFromMongo {
    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {
        // Each input record is one MongoDB document: emit its md5 field as the key
        // and the url/date/time/ip fields as a tab-separated value
        public void map(Object key, BSONObject value, Context context) throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);
            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();
            String output = "\t" + url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        // Point the mongo-hadoop connector at the source collection (database "clusterdb", collection "fish")
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);
        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        // The output path is resolved against the default filesystem (HDFS), not the local /home directory
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0); // map-only job: mapper output goes straight to the output files
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
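For context, here is a rough sketch of how such a class can be compiled and packaged from the command line, assuming the mongo-hadoop connector and MongoDB Java driver jars have been downloaded into a local lib/ directory (the jar names and version numbers below are illustrative, not taken from the original post). Note that a jar built this way, like one exported from an IDE without bundling dependencies, contains only the ImportWeblogsFromMongo classes, which is what leads to the classpath problem described next.

    # Assumed layout (versions are illustrative):
    #   lib/mongo-hadoop-core-2.0.2.jar
    #   lib/mongo-java-driver-3.4.2.jar
    mkdir -p classes
    # Compile against Hadoop's client classpath plus the connector jars
    javac -cp "$(hadoop classpath):lib/*" -d classes hdfs/ImportWeblogsFromMongo.java
    # Package the compiled classes into the job jar
    jar cf importmongo.jar -C classes .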

1. After exporting a jar file named importmongo.jar,
2. I tried to run the command hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo, but I got the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil
    at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

NB: clusterdb is the database name, fish is its collection, and hdfs.ImportWeblogsFromMongo is the package.class.

Any suggestions?
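The stack trace indicates that the mongo-hadoop connector classes were not on the classpath when hadoop jar launched the driver, which is what happens when the exported jar contains only the application classes. Below is a minimal sketch of one common way to supply the connector jars, assuming they are available locally; all paths and version numbers are assumptions.

    # 1) Put the connector jars on the client classpath so main() can load MongoConfigUtil
    export HADOOP_CLASSPATH=/path/to/mongo-hadoop-core-2.0.2.jar:/path/to/mongo-java-driver-3.4.2.jar

    # 2) Make the same jars available to the map tasks, e.g. by copying them into
    #    Hadoop's shared lib directory on every node (or by building a single fat jar
    #    that bundles them into importmongo.jar)
    cp /path/to/mongo-hadoop-core-2.0.2.jar /path/to/mongo-java-driver-3.4.2.jar \
       "$HADOOP_HOME/share/hadoop/common/lib/"

    # 3) Re-run the job as before
    hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo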

【Question Comments】:

Tags: java mongodb hadoop jar


【Solution 1】:

I did not solve the problem that way, but I found a workaround using mongodump and then copying the file to HDFS. The lines below may help someone get the job done:

   mongodump  --db clusterdb --collection CollectionName

   bsondump file.bson > file.json

   hadoop dfs -copyFromLocal /path/to/file/fish.json mongo
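Strung together, the workaround looks roughly like the sketch below, assuming mongodump's default ./dump output directory and an HDFS target directory named mongo (both assumptions); hdfs dfs is the current form of the deprecated hadoop dfs command:

    mongodump --db clusterdb --collection fish

    # mongodump writes dump/clusterdb/fish.bson by default; convert it to JSON
    bsondump dump/clusterdb/fish.bson > fish.json

    # Copy the JSON file into HDFS
    hdfs dfs -mkdir -p mongo
    hdfs dfs -copyFromLocal fish.json mongo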

【Comments】:
