I have a postgres database inside a docker container. I compose it up with this docker-compose.yaml:
services:
  db:
    container_name: postgres
    image: postgres:latest
    restart: always
    volumes:
      - db-data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_HOST: ${POSTGRES_HOST}
      POSTGRES_PORT: ${POSTGRES_PORT}
    healthcheck:
      test: [ "CMD", "pg_isready", "-d", "${POSTGRES_DB}", "-U", "${POSTGRES_USER}" ]
      # The `interval` option specifies how often to run the health check.
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  db-data:
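The environment variables come from a .env file next to the compose file. Mine looks something like this (the values here are placeholders, not my real ones):

POSTGRES_DB=mydb
POSTGRES_USER=myuser
POSTGRES_PASSWORD=changeme
POSTGRES_HOST=localhost
POSTGRES_PORT=5432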
Then I have a Python script that uses SQLAlchemy to define the table schema, plus a function to initialize the table, like this:
from sqlalchemy import Integer, String, inspect
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column
from pgvector.sqlalchemy import Vector


class Base(DeclarativeBase):
    pass


class CallData(Base):
    __tablename__ = "call_data"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    nombre: Mapped[str] = mapped_column(String(255), nullable=False)
    entidad: Mapped[str] = mapped_column(String(255), nullable=False)
    descripcion = mapped_column(Vector(300), nullable=False)

    def __repr__(self):
        return (
            f"<CallData(id={self.id}, nombre={self.nombre}, "
            f"entidad={self.entidad}, descripcion={self.descripcion})>"
        )

    @classmethod
    def init_table(cls, engine):
        """Initialize the table in the database."""
        if not inspect(engine).has_table(cls.__tablename__):
            cls.metadata.create_all(engine)
            print(f"Table {cls.__tablename__} created.")
        else:
            print(f"Table {cls.__tablename__} already exists.")
Then I have the following function to send data to the database:
import os

from dotenv import load_dotenv
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker


def send_to_db(contenido):
    load_dotenv()
    engine = create_engine(
        f"postgresql+psycopg://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
        f"@{os.getenv('POSTGRES_HOST')}:{os.getenv('POSTGRES_PORT')}/{os.getenv('POSTGRES_DB')}"
    )
    Session = sessionmaker(bind=engine)
    CallData.init_table(engine)
    with Session() as session:
        # `contenido` is a list of dictionaries with the data to be inserted
        for entry in contenido:
            call_data = CallData(
                nombre=entry["convocatoria"],
                entidad=entry["entidad"],
                descripcion=entry["descripcion"],
            )
            session.add(call_data)
        session.commit()
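For reference, I call it with data shaped roughly like this (the values here are made up; descripcion has to be a 300-dimensional vector to match Vector(300)):

contenido = [
    {
        "convocatoria": "Ayudas a la investigación 2024",  # made-up example
        "entidad": "Ministerio de Ciencia",                # made-up example
        "descripcion": [0.0] * 300,                        # placeholder embedding
    },
]
send_to_db(contenido)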
However, I am getting this error when CallData.init_table() runs, specifically on the line if not inspect(engine).has_table(cls.__tablename__):
(psycopg.OperationalError) connection failed: connection to server at ${POSTGRES_HOST}, port ${POSTGRES_PORT} failed: FATAL: password authentication failed for user ${POSTGRES_USER}
connection to server at ${POSTGRES_HOST}, port ${POSTGRES_PORT} failed: FATAL: password authentication failed for user ${POSTGRES_USER}
(Background on this error at: https://sqlalche.me/e/20/e3q8)
I have composed the stack down and up again, and I get the same error. I think it's because I have to pass the password through to inspect somehow. How can I solve it?
After two days (no kidding) of trying to solve the issue, I've managed to. FYI, I have been running the Python script outside of a Docker container, and it was trying to connect to the Postgres Docker container. Also, I am running the code inside WSL2. The solution was rather weird, but I'll leave it here in case someone stumbles on the same exact issue. I had PostgreSQL installed natively on my WSL distro, so I uninstalled and purged it:
sudo apt-get --purge remove postgresql
sudo apt-get purge postgresql*
This is the explanation a friend gave me: PostgreSQL installed natively on Ubuntu was likely binding to localhost:5432, blocking the Docker container (which was also trying to expose its PostgreSQL service on the same port). So when Python tried to connect to localhost:5432, it hit the native PostgreSQL server, not the one running in Docker, and the credentials didn't match.
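In hindsight, an easy way to verify this before purging anything is to check which process actually owns port 5432. Something like the following (run inside WSL2; requires iproute2 or lsof) shows whether the listener is the native postgres process or Docker's docker-proxy:

# Show the process listening on TCP port 5432.
# "postgres" = the native server; "docker-proxy" = the container's published port.
sudo ss -ltnp 'sport = :5432'
# or, if you have lsof installed:
sudo lsof -i :5432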
I hope this helps anyone who runs into a similar issue.
EDIT 2: Technically speaking, changing the port to something else (see the sketch below) or disabling native Postgres should also have solved it, as explained by @snakecharmerb. But in my case, I don't need native Postgres.
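For completeness, the port change would look something like the snippet below. Note that my compose file as pasted doesn't publish a port at all; assuming you add a ports mapping, you can bind the container to a free host port:

services:
  db:
    # ... rest of the service as above ...
    ports:
      - "5433:5432"  # host port 5433 -> container port 5432

Then set POSTGRES_PORT=5433 in the .env that the Python script reads, so it connects to the container instead of anything native on 5432.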