【Question Title】: how to pipe an archive (zip) to an S3 bucket
【Posted】: 2019-01-27 01:12:23
【Question Description】:

I'm a little confused about how to proceed. I'm using archiver (a Node.js module) as a way to write data to a zip file. My code currently works when writing to a file (local storage):

var fs = require('fs');
var archiver = require('archiver');

var output = fs.createWriteStream(__dirname + '/example.zip');
var archive = archiver('zip', {
     zlib: { level: 9 }  
});

archive.pipe(output);
archive.append(mybuffer, {name: 'msg001.txt'});

I'd like to modify the code so that the archive's destination is an AWS S3 bucket. Looking at code samples, I can specify the bucket name and key (and body) when creating the bucket object, like this:

var s3 = new AWS.S3();
var params = {Bucket: 'myBucket', Key: 'myMsgArchive.zip', Body: myStream};
s3.upload( params, function(err,data){
    … 
});

Or 

s3 = new AWS.S3({ params: {Bucket: 'myBucket', Key: 'myMsgArchive.zip'} });
s3.upload( {Body: myStream})
    .send(function(err,data) {
    …
    });

Regarding my S3 examples, myStream appears to be a readable stream, and I'm confused about how to make this work, since archive.pipe requires a writable stream. Is this a place where we need to use a pass-through stream? I found an example where someone created a pass-through stream, but it's too terse for me to understand properly. The specific example I'm referring to is:

Pipe a stream to s3.upload()

Any help anyone can give me would be greatly appreciated. Thanks.

【Question Discussion】:

    Tags: javascript node.js amazon-s3 archive


    【Solution 1】:

    This may be useful to anyone else wondering how to use pipe.

    Since you correctly referenced the example using a pass-through stream, here is my working code:

    1 - The routine itself, zipping the files with node-archiver

    exports.downloadFromS3AndZipToS3 = () => {
      // These are my input files I'm willing to read from S3 to ZIP them
    
      const files = [
        `${s3Folder}/myFile.pdf`,
        `${s3Folder}/anotherFile.xml`
      ]
    
      // Just in case you like to rename them as they have a different name in the final ZIP
    
      const fileNames = [
        'finalPDFName.pdf',
        'finalXMLName.xml'
      ]
    
      // Use promises to get them all
    
      const promises = files.map((file) => s3client.getObject({
        Bucket: yourBucket,
        Key: file
      }).promise())
    
      // Define the ZIP target archive
    
      let archive = archiver('zip', {
        zlib: { level: 9 } // Sets the compression level.
      })
    
      // Pipe!
    
      archive.pipe(uploadFromStream(s3client, 'someDestinationFolderPathOnS3', 'zipFileName.zip'))
    
      archive.on('warning', function(err) {
        if (err.code === 'ENOENT') {
          // log warning
        } else {
          // throw error
          throw err;
        }
      })
    
      // Good practice to catch this error explicitly
      archive.on('error', function(err) {
        throw err;
      })
    
      // The actual archive is populated here 
    
      return Promise
        .all(promises)
        .then((data) => {
          data.map((thisFile, index) => {
            archive.append(thisFile.Body, { name: fileNames[index] })
          })
    
          archive.finalize()
        })
    }
    

    2 - The helper method

    // Signature matches the call in downloadFromS3AndZipToS3 above
    const uploadFromStream = (s3client, destinationFolder, fileName) => {
      const pass = new stream.PassThrough()

      const s3params = {
        Bucket: yourBucket,
        Key: `${destinationFolder}/${fileName}`,
        Body: pass,
        ContentType: 'application/zip'
      }

      s3client.upload(s3params, (err, data) => {
        if (err)
          console.log(err)

        if (data)
          console.log('Success')
      })

      return pass
    }
    

    【Discussion】:

      【Solution 2】:

      The following example takes the accepted answer and makes it work with local files, as requested.

      const archiver = require("archiver")
      const fs = require("fs")
      const AWS = require("aws-sdk")
      const s3 = new AWS.S3()
      const stream = require("stream")
      
      const zipAndUpload = async () => {
        const files = [`test1.txt`, `test2.txt`]
        const fileNames = [`test1target.txt`, `test2target.txt`]
        const archive = archiver("zip", {
          zlib: { level: 9 } // Sets the compression level.
        })
        files.map((thisFile, index) => {
          archive.append(fs.createReadStream(thisFile), { name: fileNames[index] })
        })
        const uploadStream = new stream.PassThrough()
        archive.pipe(uploadStream)
        archive.on("warning", function (err) {
          if (err.code === "ENOENT") {
            console.log(err)
          } else {
            throw err
          }
        })
        archive.on("error", function (err) {
          throw err
        })
        archive.on("end", function () {
          console.log("archive end")
        })
        // Attach the listeners before finalizing so no events are missed
        archive.finalize()
        await uploadFromStream(uploadStream)
        console.log("all done")
      }
      
      const uploadFromStream = async pass => {
        const s3params = {
          Bucket: "bucket-name",
          Key: `streamtest.zip`,
          Body: pass,
          ContentType: "application/zip"
        }
        return s3.upload(s3params).promise()
      }
      
      zipAndUpload()
      

      【Discussion】:
