ruby-on-rails, ruby, amazon-s3, rails-activestorage

Dynamically change Active Storage service on a Rails model?


I'm using Active Storage with S3 to attach a file to a model called Document. I need to add support for users in the EU who want their document files stored in an S3 bucket in the EU.

I have my storage.yml configured like:

amazon:
    service: S3
    access_key_id: <%= ENV['S3_ACCESS_KEY_ID'] %>
    secret_access_key: <%= ENV['S3_SECRET_KEY_ACCESS'] %>
    region: <%= ENV['S3_REGION'] %>
    bucket: <%= ENV['S3_BUCKET_NAME'] %>

amazon_eu:
    service: S3
    access_key_id: <%= ENV['S3_ACCESS_KEY_ID'] %>
    secret_access_key: <%= ENV['S3_SECRET_KEY_ACCESS'] %>
    region: <%= ENV['S3_REGION_EU'] %>
    bucket: <%= ENV['S3_BUCKET_NAME_EU'] %>

Is there a way to dynamically switch the service in the Document model based on the region set on the Account? Something like:

class Document < ApplicationRecord
  belongs_to :account
  
  if account.region == 'eu'
    has_one_attached :file, service: :amazon_eu
  else
    has_one_attached :file, service: :amazon
  end

end

Or really, any way to specify the service I want to use dynamically at runtime?


Solution

  • I figured out how to do it. I created a custom Active Storage service that inherits from the S3 service and builds a second bucket instance on the class for the EU region I want to use.

    # lib/active_storage/service/dynamic_storage_service.rb
    require "active_storage/service/s3_service"

    module ActiveStorage
      class Service::DynamicStorageService < ActiveStorage::Service::S3Service
        # attributes for the additional client and bucket
        attr_reader :client_eu, :bucket_eu

        # Override the initializer so the storage.yml entry can pass in an EU
        # bucket and region; this is where you would create whatever extra
        # buckets you need.
        def initialize(bucket:, bucket_eu:, upload: {}, public: false, **options)
          eu_options = options.except(:region)
          eu_options[:region] = eu_options.delete(:region_eu)

          @client_eu = Aws::S3::Resource.new(eu_options)
          @bucket_eu = @client_eu.bucket(bucket_eu)

          super(bucket: bucket, upload: upload, public: public, **options.except(:region_eu))
        end

        private
          # Override the method where the bucket is used; this is where you add
          # whatever logic you need to select a bucket. This is the
          # implementation that works for me: the method just needs to return
          # the S3 object for the key and everything else works as before.
          def object_for(key)
            # walk from the blob key back to the owning record
            blob = ActiveStorage::Blob.find_by(key: key)
            attachment = ActiveStorage::Attachment.find_by(blob_id: blob.id)
            document = Document.find(attachment.record_id)

            if document.account.region == 'eu'
              bucket_eu.object(key)
            else
              bucket.object(key)
            end
          end
      end
    end
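
    The option reshaping in `initialize` is plain Ruby and can be seen in isolation (Ruby 3.0+ for `Hash#except`; the values here are placeholders, not real credentials):

```ruby
# Stand-alone illustration of how the EU client options are derived
# from the shared service options. Placeholder values only.
options = {
  access_key_id: "AKIA_PLACEHOLDER",
  region: "us-east-1",     # default service region
  region_eu: "eu-west-1"   # extra key consumed by the custom service
}

# Drop the default region, then promote region_eu to region:
eu_options = options.except(:region)
eu_options[:region] = eu_options.delete(:region_eu)
```

    The original `options` hash is left untouched, so the `super` call still sees the default region.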
    

    And then you would use it in the storage.yml instead of the regular S3 service:

    # config/storage.yml
    amazon:
        service: DynamicStorage
        access_key_id: <%= ENV['S3_ACCESS_KEY_ID'] %>
        secret_access_key: <%= ENV['S3_SECRET_KEY_ACCESS'] %>
        region: <%= ENV['S3_REGION'] %>
        bucket: <%= ENV['S3_BUCKET_NAME'] %>
        region_eu: <%= ENV['S3_REGION_EU'] %>
        bucket_eu: <%= ENV['S3_BUCKET_NAME_EU'] %>
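
    One detail worth spelling out: Active Storage resolves the `service:` name in storage.yml by convention. It requires `active_storage/service/<underscored_name>_service` and looks up `ActiveStorage::Service::<Name>Service`, which is why the file above has to live at `lib/active_storage/service/dynamic_storage_service.rb`. A dependency-free sketch of that mapping (with a hand-rolled underscore, since ActiveSupport's `String#underscore` isn't available in plain Ruby):

```ruby
# Sketch of Active Storage's service-name convention. The real implementation
# uses ActiveSupport's String#underscore / camelize; this gsub covers the
# simple CamelCase case used here.
def underscore(name)
  name.gsub(/([a-z\d])([A-Z])/, '\1_\2').downcase
end

service_name = "DynamicStorage"  # the value of `service:` in storage.yml
require_path = "active_storage/service/#{underscore(service_name)}_service"
class_name   = "ActiveStorage::Service::#{service_name}Service"
```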
    

    And the Document.rb can work like normal

    # document.rb
    class Document < ApplicationRecord
      belongs_to :account
      
      has_one_attached :file
    
    end
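
    Worth noting: since Rails 6.1, `has_one_attached` also accepts a `service:` option naming an entry in storage.yml, but it is resolved once when the class is loaded, so on its own it cannot vary per record. That is what makes the custom service necessary:

```ruby
class Document < ApplicationRecord
  belongs_to :account

  # Static per-attachment service (Rails 6.1+): picks one bucket for ALL
  # documents, regardless of account.region.
  has_one_attached :file, service: :amazon_eu
end
```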
    

    You could probably rewrite this to be more dynamic, handling as many buckets as you want, but this is what worked for me.
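
    The per-record choice itself is just a hash lookup with a default, which is how you would extend it to any number of buckets. A dependency-free sketch of that selection logic (strings stand in for the `Aws::S3::Bucket` instances the service would build in its initializer):

```ruby
# Hypothetical region -> bucket table; strings stand in for Aws::S3::Bucket
# objects. The region codes and bucket names are made up for illustration.
REGION_BUCKETS = {
  "eu" => "bucket-eu",
  "ap" => "bucket-ap"
}.freeze
DEFAULT_BUCKET = "bucket-default"

# Mirrors the branch in object_for: unknown or missing regions fall back
# to the default bucket.
def bucket_for(region)
  REGION_BUCKETS.fetch(region, DEFAULT_BUCKET)
end
```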