I have a model called ClientBillingFile that uses CarrierWave to upload a CSV file like this:
mount_uploader :billing_file_name, UsageFileUploader
and I schedule a job to run 5 minutes after committing the creation of a new record:
after_commit :generate_usage_file, on: :create

def generate_usage_file
  Resque.enqueue_in(5.minutes, GenerateUsageFileQueue, id, admin.email)
end
This is my background job:
def self.perform(client_billing_file_id, email)
  cbf = ClientBillingFile.find(client_billing_file_id)
  filepath = cbf.billing_file_name.current_path
  csv_file = CSV.read(filepath, headers: true)
  .
  .
  .
end
This works in my development and test environments, but it fails when the job tries to open the CSV file in the staging environment (where the file is actually uploaded to an S3 bucket). I checked the bucket and the file is uploaded to the specified directory correctly, but for some reason the job throws the following error:
Exception Errno::ENOENT
Error No such file or directory @ rb_sysopen - my_path/my_file.csv
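The error suggests the worker is looking for the file on its local filesystem. A small diagnostic helper (hypothetical, not from my actual job) makes the failure mode explicit: with fog/S3 storage, `current_path` points at a local cache location that may simply not exist on the worker host.

```ruby
require "csv"

# Hypothetical diagnostic helper: checks whether the path CarrierWave reports
# actually exists on this host's disk before trying to parse it.
def read_billing_csv(uploader)
  filepath = uploader.current_path
  unless File.exist?(filepath)
    # The file was never written to this host's disk; with S3 storage it
    # lives only in the bucket, so CSV.read raises Errno::ENOENT.
    raise Errno::ENOENT, filepath
  end
  CSV.read(filepath, headers: true)
end
```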
I tried Jared Beck's idea and it's working now. Basically, I added this condition to my background job:
if Rails.env.production? || Rails.env.staging?
  url = cbf.billing_file_name.url
  cbf.billing_file_name.download!(url)
end
Calling download! fetches the remote file into CarrierWave's local cache, so current_path then resolves to a real file on the worker. The final code looks like this:
def self.perform(client_billing_file_id, email)
  cbf = ClientBillingFile.find(client_billing_file_id)
  if Rails.env.production? || Rails.env.staging?
    url = cbf.billing_file_name.url
    cbf.billing_file_name.download!(url)
  end
  filepath = cbf.billing_file_name.current_path
  csv_file = CSV.read(filepath, headers: true)
  .
  .
  .
end
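An alternative sketch that avoids the environment check entirely (assumptions: CarrierWave's uploader `read` proxy, and a hypothetical `@queue` name): reading the file contents through the uploader works with both the `:file` store used in development and the `:fog`/S3 store used in staging, and `CSV.parse` takes a string where `CSV.read` takes a local path.

```ruby
require "csv"

class GenerateUsageFileQueue
  # Hypothetical queue name; use whatever your Resque setup declares.
  @queue = :generate_usage_file

  def self.perform(client_billing_file_id, email)
    cbf = ClientBillingFile.find(client_billing_file_id)
    # uploader#read returns the file's bytes through the storage adapter,
    # whether the backing store is local disk or S3, so no download! or
    # Rails.env branching is needed.
    csv_file = CSV.parse(cbf.billing_file_name.read, headers: true)
    # ...
  end
end
```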