python, amazon-web-services, amazon-s3, python-s3fs

How to mount an S3 bucket as a local filesystem?


I have a Python app running in a Jupyter notebook on AWS. I load a C library into my Python code, and it expects a path to a file. I would like that file to come from an S3 bucket.

I tried to use s3fs:

import s3fs
s3 = s3fs.S3FileSystem(anon=False)

Calling s3.ls('..') lists all my bucket files, so this works so far. But the library would have to use the s3 object internally, and I have no access to its internals. I can only pass a path to the C library.

Is there a way to mount the S3 bucket so that I don't have to call s3.open() and can just call open(/path/to/s3), with the S3 bucket transparently mounted as a local filesystem somewhere underneath?

I think it should work like the second snippet below, without using s3, because I can't change the library to go through the s3 object internally...

# This works, but it needs the s3 object (df is a pandas DataFrame):
with s3.open("path/to/s3/file", "w") as f:
    df.to_csv(f)

# This is what I would like to be able to do instead:
with open("path/to/s3/file", "w") as f:
    df.to_csv(f)

Or am I doing it completely wrong?

The C library I am using is loaded as a DLL in Python, and I call a function:

lib.OpenFile("path/to/s3/file")

I have to pass the S3 path into the library's OpenFile function.
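
For reference, the call looks roughly like the sketch below; the library name and the exact OpenFile signature are placeholders, not the real vendor details:

import ctypes

# Placeholder name; in reality this is the vendor's DLL/shared object.
lib = ctypes.CDLL("./mylib.so")
lib.OpenFile.argtypes = [ctypes.c_char_p]

# Only a plain path string can be handed over; there is no way to pass the
# s3fs S3FileSystem object into the C code.
lib.OpenFile(b"path/to/s3/file")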


Solution

  • If you're looking to mount the S3 bucket as part of the file system, then use s3fs-fuse

    https://github.com/s3fs-fuse/s3fs-fuse

    That makes the bucket part of the local file system, so regular file operations, including the built-in open() and your C library's OpenFile, work as you would expect; see the sketch below.
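
    A minimal sketch of how that looks end to end; the bucket name, mount point, credentials file, and library name are placeholders, and the OpenFile signature is taken from the question:

    # Mount the bucket once from a shell, for example:
    #   s3fs my-bucket /mnt/s3 -o passwd_file=${HOME}/.passwd-s3fs
    # where my-bucket and /mnt/s3 are placeholders for your bucket and mount point.
    import ctypes

    lib = ctypes.CDLL("./mylib.so")  # placeholder for the question's DLL
    lib.OpenFile.argtypes = [ctypes.c_char_p]

    # The mounted bucket behaves like an ordinary directory, so the built-in
    # open() and the C library's OpenFile() both accept the same local path.
    local_path = "/mnt/s3/path/to/s3/file"

    with open(local_path) as f:
        print(f.read())

    lib.OpenFile(local_path.encode())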