We are working with the following interface:
interface A {
  a: string
  b: string
  c?: number
  d?: number
}
And we have a type that makes every key in T optional if its type is string, and required if its type is number:
type B<T> = {
  [K in keyof T as T[K] extends (number | undefined) ? K : never]-?: T[K]
} & {
  [K in keyof T as T[K] extends (string | undefined) ? K : never]+?: T[K]
}
/* The resulting type will be:
type B<A> = {
  c: number;
  d: number;
} & {
  a?: string | undefined;
  b?: string | undefined;
}
*/
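For completeness, this is how B<A> behaves in practice (the variable names here are mine, added purely for illustration):

const ok: B<A> = { c: 1, d: 2 }              // OK: a and b may be omitted.
const alsoOk: B<A> = { a: "x", c: 1, d: 2 }  // OK: optional strings may be given.
// const bad: B<A> = { c: 1 }                // Error: property 'd' is missing.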
However, if we change the interface we are working on so that it includes only one of the types specified in the condition, then {}, which behaves almost like any, will be added to the resulting type:
interface A1 {
  a: string
  b: string
}
/* The resulting type will be:
type B<A1> = {} & {
  a?: string | undefined;
  b?: string | undefined;
}
*/
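To see why that {} member is harmful, note that every value except null and undefined is assignable to {} (a small sketch of my own, not part of the original question):

const s: {} = "a string"   // OK: {} accepts strings...
const n: {} = 42           // ...numbers...
const f: {} = () => {}     // ...even functions.
// const u: {} = undefined // Error: only null and undefined are rejected (under strictNullChecks).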
This allows assigning many unwanted types to B, defeating our purpose. For example:
const b: B<A1> = "We don't want this to happen." // <-- We need an error here.
How can I prevent the resulting type from including {}? I want B<A1> to resolve to the following type:
{
  a?: string | undefined;
  b?: string | undefined;
}
I have simplified the type by removing the generic so that the resulting types are visible. You can check it here.
@Aplet123 points to https://github.com/microsoft/TypeScript/issues/42864 as the root of the problem. One workaround (there are probably even simpler ones): a helper mapped type that flattens the intersection into a single object type:
// Re-mapping over the keys collapses the {} & {...} intersection into one object type.
type Obj<T> = { [K in keyof T]: T[K] };

const b: Obj<B<A1>> = "We don't want this to happen." // Error! Good.
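And, assuming the original interface A from above, the helper does not break the case that already worked, since the homomorphic mapped type preserves the optionality modifiers:

const stillOk: Obj<B<A>> = { c: 1, d: 2 }  // OK: numbers required, strings optional.
// const stillBad: Obj<B<A>> = { c: 1 }    // Error: property 'd' is missing.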