【Question Title】: Appending data to a file in HDFS using Java fails with an error
【Posted】: 2019-11-09 07:11:13
【Problem Description】:

I created a new CSV file in HDFS using Java, and I am trying to append data to that CSV file, but the append fails with the following error:

Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[192.168.1.25:9866,DS-b6d8a63b-357d-4d39-9f27-1ab76b8b6ccc,DISK]], original=[Dat

Below is the code.

The CSV file is created and uploaded to HDFS from the Java code, but I am not able to append data to that existing file. However, a CSV file newly uploaded through the UI could be appended to from the Java code. Please help me resolve this issue.

private void appendFileToFile(String fileName) throws Exception {

    long testTime1 = System.currentTimeMillis();

    String hdfsHostDetails = "hdfs://192.168.1.25:9000";

    Configuration conf = new Configuration();
    conf.setBoolean("dfs.support.append", true);

    FileSystem fs = FileSystem.get(URI.create(hdfsHostDetails), conf);

    String dirpath = hdfsHostDetails;
    String targetfilepath = dirpath + "/" + fileName;
    int count = 0;
    while (count < 2) {
        int limit = 10000;
        int offset = count * limit; // advance the offset on each iteration, otherwise both passes fetch the same rows
        IgniteTable table = new IgniteTable(ignite, "nok_customer_demand");
        String query = "SELECT * FROM nok_customer_demand  OFFSET "+ offset +" ROWS FETCH NEXT "+ limit +" ROWS ONLY";
        List<List<?>> lists = table._select(query);
        List<String[]> rows = new ArrayList<>();
        System.out.println(":::::::::::::::::: Data Ready for iteration ::::::::::::::"+ count);

        // create a new local temp file on each iteration
        File file = new File("/home/tejatest1"+count+".csv");
        FileWriter outputfile = new FileWriter(file);
        CSVWriter writer = new CSVWriter(outputfile);

        for (List eachlist : lists) {
            String[] eachRowAsString = new String[eachlist.size()];
            int i = 0;
            for (Object eachcol : eachlist) {
                eachRowAsString[i] = String.valueOf(eachcol);
                i++;
            }
            rows.add(eachRowAsString); // collect the row once, after all columns are filled
            writer.writeNext(eachRowAsString);
        }

        // flush and close the CSV writer before reading the local file back
        writer.close();

        // on each iteration append the local file's contents to the file in HDFS
        InputStream in = new BufferedInputStream(new FileInputStream(file));
        FSDataOutputStream out;

        if (!fs.exists(new Path(targetfilepath))) {
            out = fs.create(new Path(targetfilepath));
        } else {
            out = fs.append(new Path(targetfilepath));
        }

        // copyBytes(..., true) closes both streams when it finishes
        IOUtils.copyBytes(in, out, 4096, true);

        lists.clear();
        file.delete();
        count++;

    }
    long testTime2 = System.currentTimeMillis();
    System.out.println("-----total time taken for data fetch for all records in table using limit and offset:-------" + (testTime2 - testTime1) + " ms");

    fs.close();

}

【Question Comments】:

    Tags: java file append hdfs


    【Solution 1】:

    I solved this problem with the following configuration:

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", hdfsHostDetails);
        conf.setInt("dfs.replication", 1);
        conf.setBoolean("dfs.client.block.write.replace-datanode-on-failure.enable", false);
        conf.setBoolean("dfs.support.append", true);
        FileSystem fs = FileSystem.get(URI.create(hdfsHostDetails), conf);
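
    For context (this reasoning is not from the original answer): the "Failed to replace a bad datanode" error typically appears on small clusters where the default replication factor (3) exceeds the number of healthy datanodes, so when a datanode in the write pipeline fails the client cannot find a replacement and the append aborts. The same client settings can also be applied cluster-wide in `hdfs-site.xml`; a minimal sketch, assuming a single-datanode development cluster (values are assumptions, not from the original post):

    ```xml
    <!-- hdfs-site.xml: sketch for a single-datanode dev cluster -->
    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
      <property>
        <!-- do not try to replace a failed pipeline datanode -->
        <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
        <value>false</value>
      </property>
      <property>
        <!-- alternative knob: keep the feature enabled but set its policy to NEVER -->
        <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
        <value>NEVER</value>
      </property>
    </configuration>
    ```

    Disabling datanode replacement is appropriate for dev or single-node setups; on a production cluster with enough datanodes, the default policy should normally be left in place.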

    【Discussion】:
