
I have been using the aws-sdk gem with Rails for uploading and downloading objects, and I believe it's working fine.

Demo code:

begin
  s3 = AWS::S3.new
  bucket = s3.buckets[ENV['BUCKET_NAME']]
  bucket.objects.create(fname, data)
  object = bucket.objects[fname]
  exported_url = object.url_for(:get, {
    expires: 3.weeks,
    response_content_type: "text/csv",
    response_content_disposition: "attachment; filename=#{fname}.csv"
  }).to_s
rescue AWS::Errors::Base => e
  puts e.message
end

Are there any issues when uploading a large file to S3 that would affect this code? Will it handle large files as-is, or should I use S3's multipart upload options?

asked Nov 18, 2014 at 13:51

1 Answer


There are two issues you should watch out for:

  1. This will block the current thread while it's running. For a large file, this could be a substantial amount of time. You should consider moving this code out to a Sidekiq (or similar) worker, so that you aren't blocking threads in your web app.
  2. When you rescue exceptions, you just puts them, which means they're likely to be lost in production. You should log the exception somewhere you can get hold of it later. Personally, I would not rescue it at all and instead use BugSnag or a similar service to ensure errors are collected and kept until they're resolved.
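Both points can be combined into a background job. This is a minimal sketch, not a drop-in implementation: the class name CsvExportWorker is hypothetical, and it assumes the sidekiq gem plus the v1 aws-sdk gem in a Rails app. The Sidekiq include is commented out so the sketch stays self-contained.

```ruby
# Hypothetical background worker that moves the S3 upload off the web
# thread, logs failures, and re-raises so they aren't silently swallowed.
class CsvExportWorker
  # include Sidekiq::Worker  # enable when running under Sidekiq

  def perform(fname, data)
    # Same upload as the original code, now running outside the request cycle.
    s3 = AWS::S3.new
    bucket = s3.buckets[ENV['BUCKET_NAME']]
    bucket.objects.create(fname, data)
  rescue AWS::Errors::Base => e
    # Log instead of puts so the failure is visible in production logs,
    # then re-raise so Sidekiq retries and an error tracker can record it.
    Rails.logger.error("S3 upload of #{fname} failed: #{e.message}")
    raise
  end
end
```

From the controller you would then enqueue the job (`CsvExportWorker.perform_async(fname, data)`) instead of uploading inline, and the request returns immediately regardless of file size.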
answered Nov 18, 2014 at 18:24
