I'm trying to create the type of an array of objects. The types of key1 and key2 in each object must be the same, but that type can be anything. Here's an example of a valid array:
[{
  key1: "hi",
  key2: "world"
}, {
  key1: 1,
  key2: 2
}, {
  key1: true,
  key2: false
}]
This is what I've come up with, but it doesn't quite work. I have a generic type to define the objects in the array, but when I use it to build the array type, an error is raised.
type ArrayItem<T> = {
  key1: T,
  key2: T
}
// This raises an error: Generic type 'ArrayItem' requires 1 type argument(s).
type Array = ArrayItem<T>[]
What is the best way to type a nested object like this (with type inference support)?
If you don't have a finite list of possible types for T in ArrayItem<T>, there's no concrete type in TypeScript corresponding to Array<ArrayItem<T>>. To represent such a thing as a non-generic type would require something like existential types, which TypeScript doesn't directly support.
(If you do have a finite list, like ArrayItem<string> | ArrayItem<number> | ArrayItem<boolean>, then you can just use a union like in the other answer.)
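For example, a minimal sketch of that union approach, assuming string, number, and boolean are the only element types you need:

type ArrayItem<T> = { key1: T, key2: T }

// A union over a known, finite list of element types:
type MyArray = (ArrayItem<string> | ArrayItem<number> | ArrayItem<boolean>)[]

const ok: MyArray = [
  { key1: "hi", key2: "world" },
  { key1: 1, key2: 2 },
  { key1: true, key2: false }
]

const bad: MyArray = [{ key1: "hi", key2: 2 }] // error!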
The closest you can come to this in TypeScript is as a generic type, and the best you'll do in terms of inference and compiler warnings will be to represent this as something like a generic constraint.
One way to do this is to write a generic helper function asMyArray() accepting a tuple, and the compiler will check each element of the tuple to make sure it meets the constraint. One snag is that {key1: "hi", key2: 2} does meet the constraint if you allow things like string | number as T, as the sketch just below shows.
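Here's a sketch of what goes wrong without any inference tweaks (naiveAsMyArray is a hypothetical name, just for illustration):

const naiveAsMyArray = <T extends readonly any[]>(
  x: [...({ [K in keyof T]: { key1: T[K], key2: T[K] } })]) => x;

// Nothing stops T from being [string | number], and the mismatched
// pair happily satisfies that constraint:
naiveAsMyArray<[string | number]>([{ key1: "hi", key2: 2 }]); // no error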
To prevent the compiler from happily accepting all pairs of types, I will try to make it infer T from key1 only (see microsoft/TypeScript#14829 for ways to prevent inferring from a particular inference site), and then just check that key2 matches that:
// NoInfer<T> evaluates to T, but blocks it from being used as an
// inference site (one of the tricks from microsoft/TypeScript#14829):
type NoInfer<T> = [T][T extends any ? 0 : 1]

const asMyArray = <T extends readonly any[]>(
  x: [...({ [K in keyof T]: { key1: T[K], key2: NoInfer<T[K]> } })]
) => x;
The generic type parameter T is a tuple corresponding to the key1 values for each element of the passed-in array. The passed-in array, x, is of a mapped tuple type. The NoInfer<T[K]> bit lowers the inference priority of key2, so T is inferred from key1 alone. The [...] bit just prompts the compiler to infer a tuple and not an array (where it wouldn't be able to tell the different elements apart).
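Here's a minimal sketch of the difference that [...] makes (tupleId and arrayId are hypothetical names, just for illustration):

declare const tupleId: <T extends readonly any[]>(x: [...T]) => T;
declare const arrayId: <T extends readonly any[]>(x: T) => T;

const t = tupleId([1, "a"]); // T inferred as [number, string]
const a = arrayId([1, "a"]); // T inferred as (string | number)[]

With that in place, let's test asMyArray out: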
const myArray = asMyArray([{
  key1: "hi",
  key2: "world"
}, {
  key1: 1,
  key2: 2
}, {
  key1: true,
  key2: false
}])
// const asMyArray: <[string, number, boolean]>(...)
You can see that T is inferred as [string, number, boolean]. This succeeds, while the following, in which T is inferred the same way, fails:
const badArray = asMyArray([{
  key1: "hi", key2: 123 // error!
  // -------> ~~~~
  // number not assignable to string
}, {
  key1: 1, key2: "world" // error!
  // ----> ~~~~
  // string not assignable to number
}, {
  key1: true, key2: false
}]);
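And since T is inferred element by element, each entry of myArray keeps its own type, which is the inference support the question asks about. A quick check (first and second are just illustrative names):

const first = myArray[0].key2;  // string
const second = myArray[1].key2; // number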
Looks like what you want.