typescript

Intersection of discriminated unions behaves strangely


Here we have an example with discriminated unions:

type A = { a: true } | { a?: false }
type B = { b: true } | { b?: false }

type C = A & B

let c: C = {}

c = {
    a: true,
}

c = {
    b: false,
}

c = {
    a: true as boolean,
    b: false as boolean
}

c = { // ts err
    a: true as boolean,
}

c = { // ts err
    b: false as boolean,
}

How should I write the type C so that all of these examples are valid?


Solution

  • Generally speaking, TypeScript does not consider an object type with union-typed properties to be assignable to a union of object types:

    const v: { x: string } | { x: number } =
        { x: Math.random() < 0.5 ? "a" : 1 }; // error
    

    You might say "well, {x: string | number} is always either a {x: string} or a {x: number}, so they should be equivalent." But such equivalences tend to be fragile (they break if you add other properties; see the sketch just below) and hard to discover without a lot of extra work (they scale badly with the size of the union), so TypeScript mostly just fails to verify them.
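
    For instance, here's a short sketch (the type U is just an illustrative name, not from the question) of how the equivalence breaks once a second, correlated property is involved: the object-of-unions reading admits combinations that the union-of-objects type does not.

    type U = { x: string; tag: "s" } | { x: number; tag: "n" }

    // Read as an object of unions, U would be { x: string | number; tag: "s" | "n" },
    // which also allows { x: 1, tag: "s" }, a value that is not a member of U,
    // so the compiler rightly rejects this assignment:
    const u: U = { x: 1 as string | number, tag: "s" as "s" | "n" } // error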

    Before TypeScript 3.5 it always failed to see such equivalences. But TypeScript 3.5 introduced so-called smarter union checking, as implemented in microsoft/TypeScript#30779. Now, if you try to assign an object-of-unions to a union-of-objects, it might succeed: the type you're assigning to has to be a discriminated union, the object you're assigning from has to expand to fewer than 25 union members when the compiler analyzes it, and... well, it's fragile and complicated; you can read about it in microsoft/TypeScript#30779. And there are known bugs and limitations in it, such as microsoft/TypeScript#57013 and microsoft/TypeScript#59716.
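
    For example, this sort of assignment (Flag is a made-up type with a boolean discriminant, not from the question) is an error before TypeScript 3.5 but is accepted afterward, because the compiler splits the boolean-valued discriminant into true and false and matches each piece against a union member:

    type Flag = { on: true, label: string } | { on: false, label: string }

    // okay in TypeScript 3.5 and later, thanks to smarter union checking
    const flag: Flag = { on: Math.random() < 0.5, label: "maybe" }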

    When in doubt, you should not rely on such equivalences in your code.


    So instead of trying to assign {a?: boolean, b?: boolean} to A & B, which expands to {a: true, b: true} | {a: true, b?: false} | {a?: false, b: true} | {a?: false, b?: false}, I'd collapse each union before intersecting: compute {a?: boolean} from A and {b?: boolean} from B, so that the intersection behaves like a single {a?: boolean, b?: boolean}.

    Maybe like this:

    // re-infer [T] as U so the mapped type runs on U[0], which is not a
    // bare type parameter and therefore doesn't distribute over unions
    type Merge<T> = [T] extends infer U extends [any] ?
        { [K in keyof U[0]]: U[0][K] } : never
    
    type C = Merge<A> & Merge<B>
    /* type C = {
        a?: boolean | undefined;
    } & {
        b?: boolean | undefined;
    } */
    

    The Merge<T> type mostly just applies an identity mapped type to the properties of T, but it keeps T as a single object type. If you refactored it to {[K in keyof T]: T[K]}, TypeScript would automatically distribute the mapped type across unions in T, because T is a bare generic type parameter. You explicitly don't want Merge<X | Y> to be evaluated as Merge<X> | Merge<Y>.

    To avoid distribution over unions, we need to write the mapped type over something other than a bare generic type parameter. I've used infer to "copy" [T] into a new generic type parameter U, so U[0] is the original T. But since U[0] is not a bare type parameter, mapping over K in keyof U[0] doesn't trigger the distributive behavior.
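
    To see the difference, here's a quick comparison (DistributiveMerge is just an illustrative name) of the naive mapped type with Merge when both are applied to A:

    // mapping over the bare type parameter T distributes over unions,
    // so A stays a two-member union: { a: true } | { a?: false }
    type DistributiveMerge<T> = { [K in keyof T]: T[K] }
    type DA = DistributiveMerge<A>

    // Merge collapses A into the single object type { a?: boolean | undefined }
    type MA = Merge<A>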

    Anyway, Merge<A> produces {a?: boolean} and Merge<B> produces {b?: boolean}, and the intersection works with all your examples:

    let c: C;
    c = {}; // okay
    c = { a: true }; // okay
    c = { b: false }; // okay
    c = { a: Math.random() < 0.5, b: Math.random() < 0.5 }; // okay
    c = { a: Math.random() < 0.5 }; // okay
    c = { b: Math.random() < 0.5 }; // okay
    

    Looks good!
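
    And as a quick sanity check (these two assignments are mine, not from the question), the merged C still rejects values outside the original property types, as well as unknown properties:

    c = { a: "yes" }; // error, string is not assignable to boolean | undefined
    c = { a: true, z: 1 }; // error, z is not a known property of C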

    Playground link to code