I am trying to do some monkey patch work on an existing third-party library I am using.
I need to override the __new__ method for a class from that library. Assume A is that third-party class. Everything works OK as follows:
class A:
    def __init__(self, x):
        self.x = x

def mynew(cls, *args, **kwargs):
    print("Calling mynew")
    return object.__new__(cls)

A.__new__ = mynew
A(10)
# Calling mynew
# <__main__.A object at 0x7f43806d8440>
The problem is that when I want to reset the class to its original __new__ method (which the third-party class does not implement, so it should be object.__new__), I run into an error:
A.__new__ = object.__new__
A(10)
#TypeError: object.__new__() takes exactly one argument (the type to instantiate)
I somewhat understand this... it has something to do with object.__new__ itself not accepting any extra arguments, while its signature is written to take arguments for compatibility with subclasses.
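For reference, here is a minimal standalone illustration of the rule as I understand it (the class names are made up): object.__new__ only tolerates extra arguments when the class overrides __init__ but not __new__:

class WithInit:
    # overrides __init__, leaves __new__ alone
    def __init__(self, x):
        self.x = x

class Bare:
    # overrides neither __init__ nor __new__
    pass

object.__new__(WithInit, 10)   # OK: extra arguments are ignored because __init__ is overridden
Bare(10)                       # TypeError: Bare() takes no arguments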
What tripped me up, however, is that if I set A.__new__ = object.__new__ without first overriding it with the mynew method, it works fine. That is, running only this code block works:
class A:
    def __init__(self, x):
        self.x = x

A.__new__ = object.__new__
A(10)
# works OK
What is going on?
As noted in the comments, you are hitting a problem that I myself had found while answering another question here years ago.
So, indeed, it is strange behavior, probably caused by CPython optimizations on the fast path to __new__. To be clear: to "restore the original behavior of __new__" for a class that never defined a custom __new__, one should simply delete the monkey-patched method from the class, not try to set it to object.__new__.
So this should work, but it actually doesn't:
class A:
    def __init__(self, x):
        self.x = x

def mynew(cls, *args, **kwargs):
    print("Calling mynew")
    return object.__new__(cls)

A.__new__ = mynew

A(10)
del A.__new__
A(10)
# Still raises TypeError on CPython
Which means: Python marks the class's __new__ slot as "having been used" at some point, and even after the custom method is deleted, object.__new__ no longer ignores the extra arguments. This is tied to the exceptional way __new__ handles its arguments, and is an internal implementation detail.
(Note that in the code snippet above, del A.__new__ only removes the __new__ method explicitly set on that class: after the del you can still access A.__new__; that triggers the usual lookup through the superclasses and retrieves object.__new__. But, as you noted, it now behaves differently from the initial, __new__-less class A.)
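To make that concrete, continuing right after the del A.__new__ in the snippet above, the attribute lookup itself looks perfectly normal again, even though instantiation still fails on CPython:

print(A.__new__ is object.__new__)   # True: the lookup falls through to object.__new__
A(10)                                # ...but this still raises the TypeError on CPython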
The only way I could restore the original behavior of A.__new__ was to recreate class A with a programmatic call to type, passing it the "tainted" A.__dict__ as the namespace:
class Base:
    pass

class A(Base):
    def __init__(self, x):
        self.x = x

def mynew(cls, *args):
    print("At mynew", *args)
    return object.__new__(cls)

A.__new__ = mynew

A(23)
...
del A.__new__
A(23)
# Raises TypeError
# Recreate a shallow copy of A:
A = type(A.__name__, A.__bases__, dict(A.__dict__))
A(23)
...
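If the work-around has to be applied in more than one place, it can be wrapped in a small helper. The sketch below (the name rebuild_class is made up, not from any library) just packages the same type(...) call, dropping any lingering patched __new__ along with the per-class descriptors that type() recreates anyway:

def rebuild_class(cls):
    """Recreate cls from its own namespace so its __new__ slot is pristine again."""
    ns = dict(cls.__dict__)
    # Drop a leftover monkey-patched __new__, plus descriptors type() will rebuild itself
    for name in ("__new__", "__dict__", "__weakref__"):
        ns.pop(name, None)
    return type(cls.__name__, cls.__bases__, ns)

A = rebuild_class(A)
A(23)   # instantiation behaves normally again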
Finally, please keep in mind this is an implementation side-effect of CPython itself, and possibly should even be reported as a bug in the language.
PyPy, arguably the second most conformant Python implementation, doesn't suffer from this problem, and the monkey-patched __new__ method can simply be deleted:
Python 3.10.16 (4e39ed95f7ff, Feb 26 2025, 16:35:58)
[PyPy 7.3.19 ...] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>> class A():
....     def __init__(self, x):
....         self.x = x
....
.... def mynew(cls, *args):
....     print("At mynew", *args)
....     return object.__new__(cls)
....
>>>> A.__new__ = mynew
>>>> A(23)
At mynew 23
<__main__.A object at 0x00007f5dc0541830>
>>>> del A.__new__
>>>> A(23)
<__main__.A object at 0x00007f5dc0554de8>