python pandas postgresql supabase

Supabase connection error "server didn't return client encoding"


I am trying to connect and write a Pandas DataFrame to a custom schema in a Postgres database on Supabase. Below is the code.

from sqlalchemy import create_engine

def server_access():
    # Build the SQLAlchemy connection string; bracketed values are placeholders
    conn_str = (
        "postgresql+psycopg2://[USER]:[PASSWORD]"
        "@[HOST]:[PORT]/[SCHEMA]?client_encoding=utf8"
    )
    engine = create_engine(url=conn_str)
    return engine

engine = server_access()
df.to_sql('tbl', engine, if_exists='append', index=False)

As per the documentation I've followed the steps to expose a custom schema, but it didn't work. However, the same code works fine against the public schema, even without setting client_encoding in the connection string.

How do I connect and write a Pandas DataFrame to a custom schema on Supabase using SQLAlchemy?

Edit:

Sample connection string: postgresql+psycopg2://user:password@host:6543/custom_schema?client_encoding=utf8

After replacing custom_schema with postgres it works fine.
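
For reference, a minimal connectivity check (assuming SQLAlchemy 2.x and the same placeholder credentials) that separates the connection problem from the to_sql call:

from sqlalchemy import create_engine, text

engine = create_engine(
    "postgresql+psycopg2://user:password@host:6543/postgres?client_encoding=utf8"
)
with engine.connect() as conn:
    # simple round trip; raises OperationalError if the DSN or credentials are wrong
    print(conn.execute(text("select 1")).scalar())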

Error:

conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server didn't return client encoding
(Background on this error at: https://sqlalche.me/e/20/e3q8)

Solution

  • The schema name should be passed to Pandas to_sql, not put in the connection string. With postgresql+psycopg2://user:password@host:6543/custom_schema?client_encoding=utf8, SQLAlchemy treats custom_schema as the database name, not a schema, and no database named custom_schema exists (the Supabase database is postgres). Below is the working code.

    conn_str = "postgresql+psycopg2://user:password@host:6543/postgres?client_encoding=utf8"
    engine = create_engine(url=conn_str)
    df.to_sql('tbl', engine, if_exists='append', index=False, schema='custom_schema')
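
  • Alternatively (a sketch, assuming the connecting role has access to custom_schema), the schema can be put on the connection's search_path via psycopg2's options parameter, so unqualified table names resolve there without passing schema= on every call:

    engine = create_engine(
        "postgresql+psycopg2://user:password@host:6543/postgres?client_encoding=utf8",
        connect_args={"options": "-csearch_path=custom_schema"},
    )
    # unqualified names like 'tbl' now resolve against custom_schema first
    df.to_sql('tbl', engine, if_exists='append', index=False)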