【Question Title】: Clone record and copy remote files to new location?
【Posted】: 2023-04-09 03:01:01
【Question Description】:

I have a Job model that can have many attachments. The Attachment model has a CarrierWave uploader mounted on it.

class Job < ActiveRecord::Base
  has_many :attachments
end

class Attachment < ActiveRecord::Base
  mount_uploader :url, AttachmentUploader

  belongs_to :job
end

Jobs can be cloned, and cloning a job should create new job and attachment records. That part is straightforward.

The system then needs to copy the physical files over to the upload location associated with the cloned job.
Is there a simple way to do this with CarrierWave? The solution should support both the local filesystem and AWS S3.

class ClonedJob
  def self.create_from(orig_job)
    @job_clone = orig_job.dup

    if orig_job.attachments.any?
      orig_job.attachments.each do |attach|
        cloned_attachment = attach.dup
        # Need to physically copy files at this point. Otherwise
        # this cloned_attachment will still point to the same file 
        # as the original attachment.
        @job_clone.attachments << cloned_attachment
      end
    end
  end
end
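For context on why the clone would otherwise point at the same file: `dup` copies only the database attributes, including the stored filename, not the file itself. A minimal plain-Ruby illustration, using a stand-in `Struct` rather than ActiveRecord:

```ruby
# Stand-in for an ActiveRecord row: dup copies attributes, not files.
Attachment = Struct.new(:url)

original = Attachment.new("uploads/attachments/1/report.pdf")
clone    = original.dup

# Both records hold the same stored path, so both mounted uploaders
# would resolve to the same physical file.
puts clone.url == original.url   # true
```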

【Question Discussion】:

    Tags: ruby-on-rails ruby amazon-s3 ruby-on-rails-3.2 carrierwave


    【Solution 1】:

    I've pasted the module I pieced together below to accomplish this. It works, but there are still a few things I would improve if it mattered enough. I've left my thoughts as comments in the code.

    require "fileutils"
    
    # IDEA: I think it would make more sense to create another module
    # which I could mix into Job for copying attachments. Really, the
    # logic for iterating over attachments should be in Job. That way,
    # this class could become a more generalized class for copying
    # files whether we are on local or remote storage.
    #
    # The only problem with that is that I would like to not create
    # a new connection to AWS every time I copy a file. If I do then
    # I could be opening loads of connections if I iterate over an
    # array and copy each item. Once I get that part fixed, this
    # refactoring should definitely happen.
    
    module UploadCopier
      # Take a job which is a reprint (ie. it's original_id
      # is set to the id of another job) and copy all of 
      # the original jobs remote files over for the reprint
      # to use.
      #
      # Otherwise, if a user edits the reprints attachment
      # files, the files of the original job would also be
      # changed in the process.
      def self.copy_attachments_for(reprint)
        case storage
        when :file
          UploadCopier::LocalUploadCopier.copy_attachments_for(reprint)
        when :fog 
          UploadCopier::S3UploadCopier.copy_attachments_for(reprint)
        end
      end
    
      # IDEA: Create another method which takes a block. This method
      # can check which storage system we're using and then call
      # the block and pass in the reprint. Would DRY this up a bit more.
    
      def self.copy(old_path, new_path)
        case storage
        when :file
          UploadCopier::LocalUploadCopier.copy(old_path, new_path)
        when :fog 
          UploadCopier::S3UploadCopier.copy(old_path, new_path)
        end
      end
    
      def self.storage
        # HACK: I should ask CarrierWave what method to use
        # rather than relying on the config variable.
        APP_CONFIG[:carrierwave][:storage].to_sym 
      end
    
      class S3UploadCopier
        # Copy the originals of a certain job's attachments over
        # to a location associated with the reprint.
        def self.copy_attachments_for(reprint)
          reprint.attachments.each do |attachment|
            orig_path = attachment.original_full_storage_path
            # full_storage_path defaults to :fog storage here, which
            # is what we want since we're in the S3UploadCopier.
            new_path = attachment.full_storage_path
            copy(orig_path, new_path)
          end
        end
    
        # Copy a file from one place to another within a bucket.
        def self.copy(old_path, new_path)
          # INFO: http://goo.gl/lmgya
          object_at(old_path).copy_to(new_path)
        end
    
        # NOTE: `private` has no effect on methods defined with
        # `def self.`, so `private_class_method` is used below instead.

        def self.object_at(path)
          bucket.objects[path]
        end

        # IDEA: This will be more flexible if I go through
        # Fog when I open the connection to the remote storage.
        # My credentials are already configured there anyway.

        # Get the S3 bucket currently in use.
        def self.bucket
          s3 = AWS::S3.new(access_key_id: APP_CONFIG[:aws][:access_key_id],
            secret_access_key: APP_CONFIG[:aws][:secret_access_key])
          s3.buckets[APP_CONFIG[:fog_directory]]
        end

        private_class_method :object_at, :bucket
      end
    
      # This will only be used in development when uploads are
      # stored on the local file system.
      class LocalUploadCopier
        # Copy the originals of a certain job's attachments over
        # to a location associated with the reprint.
        def self.copy_attachments_for(reprint)
          reprint.attachments.each do |attachment|
            # We have to pass :file in here since the default is :fog.
            orig_path = attachment.original_full_storage_path
            new_path = attachment.full_storage_path(:file)
            copy(orig_path, new_path)
          end
        end
    
        # Copy a file from one place to another within the
        # local filesystem.
        def self.copy(old_path, new_path)
          FileUtils.mkdir_p(File.dirname(new_path))
          FileUtils.cp(old_path, new_path)
        end
      end
    end
    

    I use it like this:

    # Have to save the record first because it needs to have a DB ID.
    if @cloned_job.save
      UploadCopier.copy_attachments_for(@cloned_job)
    end
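    The copier relies on two helpers on Attachment, `full_storage_path` and `original_full_storage_path`, that the answer doesn't show. A plausible sketch of what they might look like; the `uploads/attachments/<job_id>/<filename>` layout is purely illustrative, and a real implementation would mirror the mounted uploader's `store_dir`:

```ruby
# Hypothetical versions of the path helpers the copier relies on.
class AttachmentPaths
  def initialize(job_id:, original_job_id:, filename:)
    @job_id          = job_id
    @original_job_id = original_job_id
    @filename        = filename
  end

  # Storage key for this (cloned) attachment's own copy of the file.
  # The storage argument is accepted for parity with the calls above,
  # but doesn't change the key in this sketch.
  def full_storage_path(_storage = :fog)
    "uploads/attachments/#{@job_id}/#{@filename}"
  end

  # Storage key of the file belonging to the original job.
  def original_full_storage_path
    "uploads/attachments/#{@original_job_id}/#{@filename}"
  end
end
```

    With helpers along these lines, copying from `original_full_storage_path` to `full_storage_path` moves each file into the clone's own namespace.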
    

    【Discussion】:

      【Solution 2】:
      class Job < ActiveRecord::Base
        has_many :attachments
      end
      
      class Attachment < ActiveRecord::Base
        mount_uploader :attachment, AttachmentUploader
        belongs_to :job
      end
      
      class ClonedJob
        def self.create_from(orig_job)
          @job_clone = orig_job.dup
      
          if orig_job.attachments.any?
            orig_job.attachments.each do |attach|
              cloned_attachment = attach.dup
              @job_clone.attachments << cloned_attachment
              # !!! Here is the trick
              cloned_attachment.remote_attachment_url = attach.attachment_url
            end
          end
        end
      end
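      The trick works because CarrierWave's `remote_<attribute>_url=` setter fetches the file at the given URL and stores a fresh copy under the new record's own storage path. A minimal plain-Ruby sketch of that copy-on-assign idea (no CarrierWave; local files stand in for URLs):

```ruby
require "fileutils"
require "tmpdir"

# Mimics what remote_<attr>_url= achieves: fetch the source file and
# store a fresh copy under the clone's own directory, so the clone no
# longer shares a physical file with the original.
def copy_to_own_storage(source_path, clone_store_dir)
  FileUtils.mkdir_p(clone_store_dir)
  new_path = File.join(clone_store_dir, File.basename(source_path))
  FileUtils.cp(source_path, new_path)
  new_path
end

Dir.mktmpdir do |root|
  original = File.join(root, "jobs", "1", "spec.pdf")
  FileUtils.mkdir_p(File.dirname(original))
  File.write(original, "PDF bytes")

  copy = copy_to_own_storage(original, File.join(root, "jobs", "2"))
  puts copy == original     # false: the clone has its own path
  puts File.read(copy)      # PDF bytes: same content, separate file
end
```

      Note that in the real Solution 2 code the cloned records still need to be saved (e.g. `@job_clone.save`) for CarrierWave to actually process the assigned remote URL and write the new file.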
      

      【Discussion】:
