Say I have a function that takes a value and an arbitrary number of functions; let's call the function chain_call.
Without types, a simple naive implementation would be:
def chain_call(input_value, *args):
    for function in args:
        input_value = function(input_value)
    return input_value
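Each function is applied to the result of the previous one, left to right. For example:

chain_call('1', int, float, str)  # int('1') -> float(1) -> str(1.0) == '1.0'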
As you can imagine, input_value could be anything really, but it's always the same type as the first and only required argument of the first Callable in *args. From there on, each Callable's first and only required argument has the same type as the previous Callable's return type.
So far I've managed to define a quite generic type, but it's too loose:

from typing import Any, Callable

def chain_call(input_value: Any, *args: Callable[[Any], Any]) -> Any: ...
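To see just how loose this is, here is a sketch with a hypothetical shout() helper: since every link is Any, mypy accepts a chain that crashes at runtime:

from typing import Any, Callable

def chain_call(input_value: Any, *args: Callable[[Any], Any]) -> Any:
    for function in args:
        input_value = function(input_value)
    return input_value

def shout(text: str) -> str:
    return text.upper()

# mypy is silent: both links are typed Any, so it cannot see that
# shout() receives the int produced by int('1') instead of a str.
# At runtime: AttributeError: 'int' object has no attribute 'upper'.
chain_call('1', int, shout)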
What I'd really like is something like:

T = TypeVar('T')
TR = TypeVar('TR')

def chain_call(input_value: T, *args: Callable[[T], TR]) -> TR: ...
where T for Callable n+1 is TR of Callable n, and the final return type is TR of Callable n_max. I'm not sure how to express this with the type system and would love any guidance.
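The closest partial workaround I'm aware of with plain typing is to enumerate fixed arities with @overload; a sketch, which only covers as many chain lengths as you write overloads for:

from typing import Any, Callable, TypeVar, overload

T1 = TypeVar('T1')
T2 = TypeVar('T2')
T3 = TypeVar('T3')

@overload
def chain_call(input_value: T1, f1: Callable[[T1], T2]) -> T2: ...
@overload
def chain_call(input_value: T1, f1: Callable[[T1], T2],
               f2: Callable[[T2], T3]) -> T3: ...

def chain_call(input_value: Any, *args: Callable[[Any], Any]) -> Any:
    for function in args:
        input_value = function(input_value)
    return input_value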
This fully typed function exists in dry-python/returns. We call it flow:
from returns.pipeline import flow
assert flow('1', int, float, str) == '1.0'
The thing is that flow is fully typed via a custom mypy plugin we ship with our library. So, it will catch this error case (and many others):
from returns.pipeline import flow
def convert(arg: str) -> float:
    ...
flow('1', int, convert)
# error: Argument 1 to "convert" has incompatible type "int"; expected "str"
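Note that these errors only show up with the plugin enabled in your mypy configuration. A minimal setup, assuming a recent version of the library (the exact plugin path may differ between releases, so check the docs below):

[mypy]
plugins =
  returns.contrib.mypy.returns_plugin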
Docs: https://returns.readthedocs.io/en/latest/pages/pipeline.html