I created a sample Pyramid application using the official cookiecutter to illustrate my issue. I am using MySQL as the database server.
My DB model is the original code from the cookiecutter:
from sqlalchemy import Column, Index, Integer, Text
from .meta import Base

class MyModel(Base):
    __tablename__ = 'models'
    id = Column(Integer, primary_key=True)
    name = Column(Text)
    value = Column(Integer)

Index('my_index', MyModel.name, unique=True, mysql_length=255)
I have a simple view with a post:
data = {"error": False, "message": ""}
if request.method == "POST":
    dct = variable_decode(request.POST)
    added, message = add_record(request, dct["code"], dct["name"])
    data["error"] = not added
    data["message"] = message
return {"data": data}
Here is the first version of the function add_record:
from sqlalchemy.exc import IntegrityError

def add_record(request, code, name):
    try:
        new_record = MyModel(id=code, name=name, value=1)
        request.dbsession.add(new_record)
        return True, ""
    except IntegrityError:
        message = "Code {} already exist".format(code)
        print(message)
        return False, message
With this first version, if I duplicate the primary key, the code cannot trap the exception.
...
File "/home/cquiros/temp/pyramid/pyramid/lib/python3.10/site-packages/pyramid_tm/__init__.py", line 82, in _finish
finisher()
...
sqlalchemy.exc.IntegrityError: (mysql.connector.errors.IntegrityError) 1062 (23000): Duplicate entry '1' for key 'models.PRIMARY'
[SQL: INSERT INTO models (id, name, value) VALUES (%(id)s, %(name)s, %(value)s)]
[parameters: {'id': '1', 'name': 'd', 'value': 1}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
In a second version of the function, I flush the session:
def add_record(request, code, name):
    try:
        new_record = MyModel(id=code, name=name, value=1)
        request.dbsession.add(new_record)
        request.dbsession.flush()
        return True, ""
    except IntegrityError:
        message = "Code {} already exist".format(code)
        print(message)
        return False, message
If I flush the session and duplicate the primary key, the code can trap the exception. However, if I use the same session to add another record, like:
add_record(request, 100, "test")
I get the following error, showing the primary-key exception that was trapped before:
sqlalchemy.exc.PendingRollbackError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (mysql.connector.errors.IntegrityError) 1062 (23000): Duplicate entry '1' for key 'models.PRIMARY'
[SQL: INSERT INTO models (id, name, value) VALUES (%(id)s, %(name)s, %(value)s)]
[parameters: {'id': '1', 'name': 'd', 'value': 1}]
The error points to a rollback, so here is version 3 of add_record(), where I roll back the session:
def add_record(request, code, name):
    try:
        new_record = MyModel(id=code, name=name, value=1)
        request.dbsession.add(new_record)
        request.dbsession.flush()
        return True, ""
    except IntegrityError:
        request.dbsession.rollback()
        message = "Code {} already exist".format(code)
        print(message)
        return False, message
However, if I use the same session to add another record, like:
add_record(request, 300, "test")
I get the error:
File "/home/cquiros/temp/pyramid/pyramid/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 943, in _raise_for_prerequisite_state
raise sa_exc.ResourceClosedError("This transaction is closed")
sqlalchemy.exc.ResourceClosedError: This transaction is closed
At this point, I wonder whether this is even the correct way to handle duplicate primary keys. However, I cannot find an alternative in the Pyramid documentation or anywhere else on the Internet.
Any ideas are appreciated.
I'm pretty sure the default cookiecutter uses pyramid_tm to manage an application-wide transaction. The SQLAlchemy/DB transactions are managed by this larger transaction mechanism. The idea behind that is that you can put other items, like sending emails, queuing jobs, etc., into the same all-or-nothing transaction as the database work. The downside is that you lose some granularity when handling DB transactions. For a small case like this, I think failing and retrying the entire request would not make sense and would be total overkill.
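For context, the sqlalchemy cookiecutter joins the request's dbsession to that request-wide transaction via zope.sqlalchemy, roughly like the sketch below (paraphrased, not a verbatim copy of the generated models/__init__.py). This is why a bare dbsession.rollback() inside a view breaks things: the session's transaction belongs to the transaction manager, not to your view code.

import zope.sqlalchemy

def get_tm_session(session_factory, transaction_manager):
    # Create a session and join it to the transaction manager, so that
    # pyramid_tm commits or aborts it at the end of the request.
    dbsession = session_factory()
    zope.sqlalchemy.register(dbsession, transaction_manager=transaction_manager)
    return dbsession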
An alternative, and arguably better, solution to this problem is to use an "upsert", i.e. try to insert, and if that fails because it would create a duplicate, update the pre-existing record as needed. If you need to manually handle the commit/rollback process, you should still be able to use savepoints. I think MySQL has some support for both of these.
You can read about savepoints in the pyramid_tm documentation here: savepoints
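As a minimal sketch of the savepoint approach, using SQLAlchemy's Session.begin_nested() (which issues a SAVEPOINT), so only the failed insert is rolled back and the request-wide transaction stays usable. The add_record name and arguments are taken from the question; adjust to your code:

from sqlalchemy.exc import IntegrityError

def add_record(request, code, name):
    try:
        # The SAVEPOINT is released if the block succeeds; if the flush
        # raises, only the SAVEPOINT is rolled back and the outer
        # transaction managed by pyramid_tm remains usable.
        with request.dbsession.begin_nested():
            new_record = MyModel(id=code, name=name, value=1)
            request.dbsession.add(new_record)
        return True, ""
    except IntegrityError:
        return False, "Code {} already exists".format(code)

With this version you can keep calling add_record() on the same session after a failure.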
You can read about "upserts" in the sqlalchemy documentation here: orm-queryguide-upsert
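And a minimal sketch of the upsert route using the MySQL dialect's INSERT ... ON DUPLICATE KEY UPDATE, assuming the MyModel mapping from the question (upsert_record is a made-up name, and which columns to update on conflict is up to you):

from sqlalchemy.dialects.mysql import insert

def upsert_record(request, code, name):
    stmt = insert(MyModel).values(id=code, name=name, value=1)
    # If a row with this primary key already exists, update it instead
    # of raising an IntegrityError.
    stmt = stmt.on_duplicate_key_update(name=stmt.inserted.name)
    request.dbsession.execute(stmt)

Because the statement runs through request.dbsession, it still takes part in the transaction that pyramid_tm commits or aborts at the end of the request.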