Given a function:
def foobar(foo: int, bar: str, spam: SpamService) -> str:
    return spam.serve(foo, bar)
This function, similar in appearance to a FastAPI endpoint, defines two "normal" parameters and one "Service" parameter, where Service is an abstract class. I want to "reuse" the foobar function like I reuse a FastAPI endpoint in a router, and register n "versions" of the function given n dependencies.
Example:
foobar_rabbit = inject(foobar, RabbitService)
foobar_snake = inject(foobar, SnakeService)
foobar_rabbit(1, "rabot")
foobar_snake(2, "sniky")
I can use functools.partial to do that, but I want the dependency to be injected as a correct parameter without relying on positional or keyword args.
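For illustration, this is roughly the functools.partial version I mean (assuming RabbitService is the concrete SpamService from the example above); the binding has to be spelled out by keyword at the registration site:

# Partial-based approach: the binding relies on the keyword name "spam".
foobar_rabbit = functools.partial(foobar, spam=RabbitService)
foobar_rabbit(1, "rabot")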
This means that a function that requires two dependencies, like:
def foobar(foo: int, egg: EggService, spam: SpamService) -> str:
    return spam.serve(foo, egg.do_stuff())
can be registered like this:
foobar_1 = inject(foobar, SpamService1, EggService2)
foobar_1_ = inject(foobar, EggService2, SpamService1)  # results in the same partial
To do that, I wrote this code (it should run as-is on Python 3.11, no external dependencies):
import abc
import functools
import inspect
import typing


class Service(abc.ABC):
    ...


class ServiceA(Service):
    @staticmethod
    @abc.abstractmethod
    def method_a(a: int) -> str:
        """
        This method does something.
        """


class ServiceA1(ServiceA):
    @staticmethod
    def method_a(a: int) -> str:
        return f"A1: {a}"


def inject(
    func: typing.Callable,
    *services: typing.Type[Service]
) -> functools.partial:
    annotations = inspect.get_annotations(func)
    del annotations["return"]
    bind_services = {
        key: service
        for key, value in annotations.items()
        if issubclass(value, Service)
        for service in services
        if issubclass(service, value)
    }
    return functools.partial(func, **bind_services)


def foobar(foo: int, spam: ServiceA) -> str:
    return spam.method_a(foo)


foobar_A1 = inject(foobar, ServiceA1)


if __name__ == '__main__':
    print(foobar_A1(1))  # A1: 1
The issue is the signature of foobar_A1. If I don't pass any arguments, PyCharm won't raise a warning and mypy won't find any error.
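For example, this call crashes at runtime but neither tool flags it, because the partial's type hides the real signature:

foobar_A1()  # TypeError at runtime (foo is missing), yet PyCharm and mypy stay silent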
I tried many alternatives using typing.TypeVar, for example, but nothing works.
Here is an example of a non-working solution:
_SERVICE = typing.TypeVar("_SERVICE", bound=Service)
_RETURN = typing.TypeVar("_RETURN")


def inject(
    func: typing.Callable[[..., _SERVICE], _RETURN],
    *services: typing.Type[Service]
) -> functools.partial[typing.Callable[[_SERVICE, ...], _RETURN]]:
But mypy complains and it's not creating the expected signature (I'm not used to this kind of annotation wizardry yet).
Expected signature: (foo: int) -> str
As I stated in my initial comment, the current Python typing system is unfortunately not powerful enough to make your inject function look the way you want. (Just like there is no way to properly annotate functools.partial.)

It can accept a function f with an arbitrary signature and return a function g with a subset of the parameters of f. But which exact parameters of f remain in g depends on which of them were of a specific type. Even with generics (TypeVar and ParamSpec) you cannot express this fully in Python.
What you can do is tell type checkers that inject returns a function of the same type as the function that you pass to it:
from collections.abc import Callable
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")


class Service:
    ...


def inject(func: Callable[P, R], *services: type[Service]) -> Callable[P, R]:  # type: ignore[empty-body]
    ...


class ServiceA(Service):
    @staticmethod
    def meth(n: int) -> str:
        return "a" * n


def foo(n: int, spam: type[ServiceA]) -> str:
    return spam.meth(n)


foo_a = inject(foo, ServiceA)
reveal_type(foo_a)  # def (n: builtins.int, spam: Type[ServiceA]) -> builtins.str
Now, if all you want to do with the inject function is reuse it internally in your own package without documenting/exposing it publicly, you can simply tell a type checker what type the injected "partial" functions have by annotating them accordingly. Here you have a couple of options.
Callable
Knowing what the call signature of your injected functions will be, you can simply annotate them with a corresponding Callable subtype:
...

def foo(n: int, spam: type[ServiceA]) -> str:
    return spam.meth(n)


foo_a: Callable[[int], str] = inject(foo, ServiceA)
reveal_type(foo_a)  # def (builtins.int) -> builtins.str
That assignment will of course cause a type checker to raise an error, because it will see that Callable[[int], str] is not a supertype of what inject returns in that call, namely Callable[[int, Type[ServiceA]], str].

But again, since you are just doing this for your own benefit, you do not need to worry about that too much. You can either 1) silence the type checker with a type: ignore directive or 2) force it to accept the type by casting it explicitly:
...

def foo(n: int, spam: type[ServiceA]) -> str:
    return spam.meth(n)


from typing import cast

foo_a_1: Callable[[int], str] = inject(foo, ServiceA)  # type: ignore[assignment]
foo_a_2 = cast(Callable[[int], str], inject(foo, ServiceA))

reveal_type(foo_a_1)  # def (builtins.int) -> builtins.str
reveal_type(foo_a_2)  # def (builtins.int) -> builtins.str
Both of those work, but the solution has a few drawbacks. Callable is not expressive enough to define parameter names (notice how the revealed type is missing the name n, as opposed to the very first example). You also cannot make distinctions between positional/keyword arguments at all. If those are issues for you, you might want to use a different approach.
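To illustrate the second point with foo_a_1 from the previous snippet (my own addition): the parameters of a Callable are anonymous, so a type checker treats them as positional-only and you cannot express that n may also be passed by keyword:

foo_a_1(1)    # accepted
foo_a_1(n=1)  # likely rejected: Callable parameters have no names, so keyword calls do not match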
Protocol
To have more fine-grained control over the call signature, you can define your own protocol for the function in question. Say, for example, you wanted the n parameter in foo to be positional-only:
...

def foo(n: int, /, spam: type[ServiceA]) -> str:
    return spam.meth(n)


from typing import Protocol, cast


class PartialFoo(Protocol):
    def __call__(self, n: int, /) -> str: ...


foo_a = cast(PartialFoo, inject(foo, ServiceA))

reveal_type(foo_a)  # PartialFoo
foo_a_output = foo_a(1)
reveal_type(foo_a_output)  # builtins.str

foo_a(1, ServiceA)  # error: Too many arguments for "__call__" of "PartialFoo" [call-arg]
foo_a(n=1)  # error: Unexpected keyword argument "n" for "__call__" of "PartialFoo" [call-arg]
Again, you have the option of either using # type: ignore or cast to enforce your type for foo_a; I chose the latter here because PyCharm seems to get confused with the former. (Bug)
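For completeness, the # type: ignore variant of the same assignment would look roughly like this (my own sketch of the alternative just mentioned):

foo_a: PartialFoo = inject(foo, ServiceA)  # type: ignore[assignment]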
As you can see, the drawback here is that mypy, for example, will display the type of foo_a as PartialFoo, because that is what we called the protocol, rather than displaying something like (n: int) -> str. That is of course because a protocol can be much more than just a callable, so reducing its representation to the call signature would not make much sense.
But you can also see that it allows you to correctly specify everything about the callable including parameter names and categories.
TYPE_CHECKING hook
Since you only care about influencing static analysis and don't want to affect the runtime at all, another option is to use the TYPE_CHECKING constant and effectively create a stub for your function within a conditional block that is only ever looked at by a type checker and never actually executed:
...

def foo(n: int, spam: type[ServiceA]) -> str:
    return spam.meth(n)


from typing import TYPE_CHECKING

if TYPE_CHECKING:
    def foo_a(n: int) -> str: ...
else:
    foo_a = inject(foo, ServiceA)

reveal_type(foo_a)  # def (n: builtins.int) -> builtins.str
Just to be clear: this is a literal ellipsis in the body of that foo_a stub! There is no need to provide an actual (even a dummy) implementation.

This may be the best option in your case because you still get the default function representation from the type checker for foo_a (not a protocol name) and you have all the freedom to express the signature with names and such.
The drawback is that it arguably makes the code a bit harder to read, unless you know what purpose those conditionals serve.
Depending on how flexible you are with the design of the functions that you want to pass to inject, you can actually design it in a generic way that makes the tricks mentioned above unnecessary.

If you could restrict yourself to making those functions (like foo above) take the Service classes as their very first arguments only, you could write a generic signature for inject using Concatenate and ParamSpec that expresses this "swallowing" of the first n function parameters.

The idea is then to define multiple specific overloads for the anticipated/common uses of inject and one catch-all signature for the rest.
Something like this could work:
from collections.abc import Callable
from typing import Concatenate, ParamSpec, TypeVar, overload

P = ParamSpec("P")
R = TypeVar("R")
S1 = TypeVar("S1", bound="Service")
S2 = TypeVar("S2", bound="Service")
S3 = TypeVar("S3", bound="Service")


class Service:
    ...


@overload
def inject(
    func: Callable[Concatenate[type[S1], P], R],
    service1: type[S1],
    /,
) -> Callable[P, R]: ...


@overload
def inject(
    func: Callable[Concatenate[type[S1], type[S2], P], R],
    service1: type[S1],
    service2: type[S2],
    /,
) -> Callable[P, R]: ...


@overload
def inject(
    func: Callable[Concatenate[type[S1], type[S2], type[S3], P], R],
    service1: type[S1],
    service2: type[S2],
    service3: type[S3],
    /,
) -> Callable[P, R]: ...


@overload
def inject(
    func: Callable[P, R],
    /,
    *services: type[Service],
) -> Callable[P, R]: ...


def inject(  # type: ignore[empty-body]
    func: Callable[..., R],
    /,
    *services: type[Service],
) -> Callable[..., R]:
    ...
And here is how you would use it:
class ServiceA(Service):
    @staticmethod
    def meth(a: int) -> str:
        return "A..."


class ServiceA1(ServiceA):
    @staticmethod
    def meth(a: int) -> str:
        return f"A1: {a}"


class ServiceA2(ServiceA):
    @staticmethod
    def meth(a: int) -> str:
        return f"A2: {a}"


def foo(spam: type[ServiceA], n: int) -> str:
    return spam.meth(n)


def bar(spam: type[ServiceA], eggs: type[ServiceA], n: int) -> str:
    return spam.meth(n) + eggs.meth(n)


foo_A1 = inject(foo, ServiceA1)
bar_A1 = inject(bar, ServiceA1)
bar_A2 = inject(bar, ServiceA2)
bar_A1_A2 = inject(bar, ServiceA1, ServiceA2)

reveal_type(foo_A1)     # def (n: builtins.int) -> builtins.str
reveal_type(bar_A1)     # def (eggs: Type[ServiceA], n: builtins.int) -> builtins.str
reveal_type(bar_A2)     # def (eggs: Type[ServiceA], n: builtins.int) -> builtins.str
reveal_type(bar_A1_A2)  # def (n: builtins.int) -> builtins.str
As you can see, a competent type checker is able to correctly infer the types of those foo and bar variations after applying the inject function, because our overloads accommodate those injection calls.
Obviously, this particular version will break down once we want to use it with a function that takes more than three leading service arguments, and we will be left in the same situation as the one I laid out at the very beginning.
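To make that concrete (a sketch of my own, reusing ServiceA1 and ServiceA2 from above): with four leading service parameters only the catch-all overload matches, so nothing is stripped from the signature:

def baz(s1: type[ServiceA], s2: type[ServiceA], s3: type[ServiceA], s4: type[ServiceA], n: int) -> str:
    return s1.meth(n) + s2.meth(n) + s3.meth(n) + s4.meth(n)


baz_all = inject(baz, ServiceA1, ServiceA2, ServiceA1, ServiceA2)
reveal_type(baz_all)  # all four service parameters remain in the revealed signature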
Also, as I mentioned above, the limitations of Concatenate necessitate that the functions passed to inject take those service classes before all the other arguments. The parameter specification passed to Concatenate must be at the end, so there is no way to express "swallowing" an argument after the generic parameter type variable.
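The same fall-through happens when the service parameter is not in the leading position (again a sketch of my own): only the catch-all overload matches, so the service stays in the signature:

def qux(n: int, spam: type[ServiceA]) -> str:
    return spam.meth(n)


qux_A1 = inject(qux, ServiceA1)
reveal_type(qux_A1)  # def (n: builtins.int, spam: Type[ServiceA]) -> builtins.str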
But once again, if you are the one in control of where inject is used (as opposed to the user of your package), this may be good enough for you.