I've run into an issue with `combineReducers` not being strict enough, and I'm not sure how to get around it:
```typescript
interface Action {
    type: any;
}

type Reducer<S> = (state: S, action: Action) => S;

const reducer: Reducer<string> = (state: string, action: Action) => state;
const reducerCreator = (n: number): Reducer<string> => reducer;

interface ReducersMapObject {
    [key: string]: Reducer<any>;
}

const reducerMap: ReducersMapObject = {
    test: reducer,
    test2: reducerCreator
}
```
I would expect `reducerMap` to produce a compile error, because `reducerCreator` isn't a reducer (it's a function that takes a number and returns a reducer), but TypeScript is fine with this.
It seems that the source of the issue is that `Reducer` essentially boils down to `any => any`, because functions with fewer parameters are assignable to functions that take more parameters. This means that the `ReducersMapObject` type is basically just `{ [key: string]: Function }`.
Is there a way to make the `Reducer` type stricter about requiring both parameters, or another way to get more confidence that the `ReducersMapObject` actually contains reducer functions?
This code all compiles in the TypeScript playground if you're trying to reproduce the behavior.
Nice question... there are two viable options toward the end of this rather long answer. You asked two questions, so I've answered each separately.
> Is there a way to make the Reducer type stricter about requiring both parameters...
It will be difficult to achieve that, because of two obstacles in TypeScript functions.
One obstacle, which you already noted, is documented here under the heading "Comparing Two Functions." It says that "we allow 'discarding' parameters." That is, functions with fewer parameters are assignable to functions with more parameters. The rationale is in the FAQ. In short, the following assignment is safe because the function with fewer parameters "can safely ignore extra parameters."
```typescript
const add: (x: number, y: number) => number =
    (x: number) => { return x; }; // works and is safe
```
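To see why this is safe at runtime, here is a minimal sketch (the names are my own, not from the question): the caller always supplies every declared argument, and the shorter implementation simply never reads the extras.

```typescript
// An implementation with fewer parameters assigned to a wider function type:
const add: (x: number, y: number) => number =
    (x: number) => x; // discards y

// Callers of `add` still pass both arguments; the implementation
// just ignores the second one, so nothing can go wrong at runtime.
console.log(add(1, 2)); // 1
```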
A second obstacle is that function parameters are bivariant. That means we cannot work around this problem with a user-defined parameter type. In some languages, we could define `Pair` along with a function that accepts a `Pair`:
```typescript
class Pair {
    x: number;
    y: number;
}

let addPair: (p: Pair) => number;
```
In languages with sound function variance, the above would restrict arguments to subtypes of `Pair`.
In TypeScript, simple type assignment follows the expected substitutability rules, but functions follow bivariant rules. In simple assignment, TypeScript allows us to assign type `Pair` to type `Single`, but not type `Single` to type `Pair`. This is the expected substitution:
```typescript
class Single {
    x: number;
}

let s: Single = new Pair();  // works
let p: Pair = new Single();  // fails because of a missing property
```
TypeScript's functions, though, are bivariant and are not held to the same restrictions:
```typescript
let addSingle: (s: Single) => number;
addSingle = (p: Pair) => p.x + p.y; // as expected, Pair is assignable to Single

let addPair: (p: Pair) => number;
addPair = (s: Single) => s.x; // surprise, Single is assignable to Pair!
```
The result is that a function that expects a `Pair` will accept a `Single`.
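This can actually bite at runtime. Here is a sketch of the hazard (my own illustration, using method syntax, which remains bivariant even when `strictFunctionTypes` is enabled): an implementation that expects a `Pair` slips in where only a `Single` is guaranteed, then reads a property that isn't there.

```typescript
class Single {
    constructor(public x: number) {}
}
class Pair {
    constructor(public x: number, public y: number) {}
}

// Method syntax keeps parameters bivariant even under strictFunctionTypes.
interface SingleHandler {
    handle(s: Single): number;
}

const handler: SingleHandler = {
    // Accepted via bivariance, although the implementation expects a Pair:
    handle: (p: Pair) => p.x + p.y,
};

// The implementation reads `y`, which a plain Single does not have:
console.log(handler.handle(new Single(1))); // NaN (1 + undefined)
```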
**Reducers**

Neither of the following two techniques will enforce the number of parameters (or object properties) that `Reducer` implementations must accept.
```typescript
class Action { }

// no restriction - TypeScript allows discarding function parameters
type Reducer01<S> = (state: S, action: Action) => S;
const reducer01: Reducer01<number> = (state: number) => 0; // works

// no restriction - TypeScript function parameters are bivariant
class ReducerArgs<S> {
    state: S;
    action: Action;
}
type Reducer02<S> = (args: ReducerArgs<S>) => S;
const reducer02: Reducer02<number> = (args: { state: number }) => 0; // works
```
That probably will not be a problem in practice, because letting `ReducersMapObject` accept a `Reducer` with fewer parameters is safe. The compiler will still ensure that the implementation's parameters all appear in the `Reducer` signature, and that the implementation only operates on its (possibly shorter) parameter list.

> ...or another way to get more confidence that the ReducersMapObject actually contains reducer functions?
One thing that we're trying to do is make the `reducerCreator` function (and other functions of unusual shape) incompatible with the `Reducer<S>` function type. Here are two viable options.

The second technique from above, using a user-defined `ReducerArgs<S>` type, will give us more confidence. It will not provide complete confidence, because we still have bivariance, but it will ensure that the compiler rejects `reducerCreator`. Here is how it might look:
```typescript
interface Action {
    type: any;
}

// define an interface as the single parameter of a Reducer<S> function
interface ReducerArgs<S> {
    state: S;
    action: Action;
}

type Reducer<S> = (args: ReducerArgs<S>) => S;

const reducer: Reducer<string> = (args: ReducerArgs<string>) => args.state;
const reducerCreator = (n: number): Reducer<string> => reducer;

interface ReducersMapObject {
    [key: string]: Reducer<any>;
}

const reducerMap: ReducersMapObject = {
    test: reducer,
    test2: reducerCreator // error!
}
```
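To show that single-argument reducers remain usable, here is a sketch of a `combineReducers`-style helper. The helper (`runReducers`) and the sample reducer are hypothetical names of my own, not Redux's actual API.

```typescript
interface Action { type: any; }

interface ReducerArgs<S> {
    state: S;
    action: Action;
}
type Reducer<S> = (args: ReducerArgs<S>) => S;

// A sample single-argument reducer:
const upper: Reducer<string> = (args) =>
    args.action.type === "UPPER" ? args.state.toUpperCase() : args.state;

// Hypothetical helper: apply each reducer in the map to its slice of state.
function runReducers(
    map: { [key: string]: Reducer<any> },
    state: { [key: string]: any },
    action: Action
): { [key: string]: any } {
    const next: { [key: string]: any } = {};
    for (const key of Object.keys(map)) {
        next[key] = map[key]({ state: state[key], action });
    }
    return next;
}

console.log(runReducers({ text: upper }, { text: "hi" }, { type: "UPPER" }));
// { text: 'HI' }
```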
Another option is to use a generic `ReducersMapObject<T>` like this:
```typescript
interface ReducersMapObject<T> {
    [key: string]: Reducer<T>;
}
```
And then to parameterize it with a union type that lists the state types of all the reducers:
```typescript
const reducer: Reducer<string> = (state: string, action: Action) => state;
const reducer1: Reducer<number> = (state: number, action: Action) => state;

const reducerMap: ReducersMapObject<string | number> = {
    test: reducer,
    test1: reducer1,
    test2: reducerCreator // error!
}
```
The result is that `any => any` becomes `T => T`, where `T` is one of the types listed in the union. (As an aside, it would be great to have a type that says, "x can be any type, so long as it is the same type as y.")
While both of the above involve more code and are a bit clunky, they do serve your purpose. This was an interesting research project. Thank you for the question!