apache-spark, amazon-s3, pyspark, filesystems, python-s3fs

cannot import s3fs in pyspark


When I try to import the s3fs library in PySpark with the following code:

import s3fs

I get the following error:

An error was encountered: cannot import name 'maybe_sync' from 'fsspec.asyn' (/usr/local/lib/python3.7/site-packages/fsspec/asyn.py)

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/s3fs/__init__.py", line 1, in <module>
    from .core import S3FileSystem, S3File
  File "/usr/local/lib/python3.7/site-packages/s3fs/core.py", line 12, in <module>
    from fsspec.asyn import AsyncFileSystem, sync, sync_wrapper, maybe_sync
ImportError: cannot import name 'maybe_sync' from 'fsspec.asyn' (/usr/local/lib/python3.7/site-packages/fsspec/asyn.py)

The fsspec package is installed in my notebook, and I had been using it without problems for a long time before this suddenly started happening. I tried googling, but could not find this specific error. Has anyone come across this before, and if so, do you know how to solve it?
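Based on the traceback, s3fs is importing `maybe_sync` from `fsspec.asyn`, and the installed fsspec no longer provides it. A small diagnostic sketch to confirm that mismatch (the function name here is mine, not part of either library):

```python
# Diagnostic sketch: older s3fs releases import maybe_sync from
# fsspec.asyn; if the installed fsspec no longer exports it, importing
# s3fs fails with exactly the ImportError shown above.
def fsspec_exports_maybe_sync() -> bool:
    """Return True if the installed fsspec still provides maybe_sync."""
    try:
        from fsspec.asyn import maybe_sync  # noqa: F401
        return True
    except ImportError:  # also covers fsspec not being installed at all
        return False

print("fsspec exports maybe_sync:", fsspec_exports_maybe_sync())
```

If this prints `False`, the installed fsspec is too new (or missing) for the s3fs version being imported.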


Solution

  • Glad to hear this wasn't just me. It looks like pinning s3fs==0.5.1 and fsspec==0.8.3 with pip should fix it.
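Concretely, the fix above amounts to installing those two pinned versions together (run this in the notebook's environment; restart the kernel afterwards so the new packages are picked up):

```shell
# Pin a matched pair: s3fs 0.5.1 imports maybe_sync, which fsspec 0.8.3
# still provides; newer fsspec releases removed it.
pip install s3fs==0.5.1 fsspec==0.8.3
```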