且构网


Ruby Azure Blob Storage: "RequestBodyTooLarge"

Updated: 2023-02-09 08:06:43


The current Ruby Azure SDK does indeed have methods for doing chunked uploads, but there are no usage examples anywhere and everything in the specs is a mock, which doesn't really help.


Getting chunked uploads to work is so fiddly that this is absolutely something that should be included in the library. It took me a number of hours to get this right and I hope this code snippet helps.


Here is a very basic usage example:

require 'azure'      # the azure gem that provides Azure::BlobService
require 'base64'
require 'digest/md5'

# Read a file in fixed-size chunks (1 MiB by default).
class ::File
  def each_chunk(chunk_size = 2**20)
    yield read(chunk_size) until eof?
  end
end

container  = 'your container name'
blob       = 'your blob name'
block_list = []
service    = Azure::BlobService.new
counter    = 1

File.open('path/to/file', 'rb') do |f|
  f.each_chunk do |chunk|
    # Block IDs within a blob must all have the same length, hence the zero padding.
    block_id = counter.to_s.rjust(5, '0')
    block_list << [block_id, :uncommitted]

    # You will likely want to get the MD5 for retries
    options = {
      content_md5: Base64.strict_encode64(Digest::MD5.digest(chunk)),
      timeout:     300 # 5 minutes
    }

    md5 = service.create_blob_block(container, blob, block_id, chunk, options)
    counter += 1
  end
end

# Committing the block list assembles the uploaded blocks into the final blob.
service.commit_blob_blocks(container, blob, block_list)
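The `md5` returned by `create_blob_block` can be compared against the locally computed digest to detect a corrupted upload. Here is a minimal sketch of that idea; the helper name `upload_block_with_retry` and its simple retry loop are my own, not part of the SDK:

```ruby
require 'base64'
require 'digest/md5'

# Hypothetical helper: upload one block, retrying when the MD5 the
# service echoes back does not match the locally computed digest.
def upload_block_with_retry(service, container, blob, block_id, chunk, max_attempts: 3)
  local_md5 = Base64.strict_encode64(Digest::MD5.digest(chunk))
  max_attempts.times do
    remote_md5 = service.create_blob_block(container, blob, block_id, chunk,
                                           content_md5: local_md5, timeout: 300)
    return remote_md5 if remote_md5 == local_md5
  end
  raise "Block #{block_id} failed MD5 verification after #{max_attempts} attempts"
end
```

Failed blocks stay `:uncommitted` on the service side, so re-uploading the same block ID before `commit_blob_blocks` simply replaces the bad data.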


Give me a couple of days and I should have something more reasonably encapsulated committed to https://github.com/dmichael/azure-contrib