
Type to cast from string into number


I want to create a type that can extract a number from a string that can be converted into a number.

The best I could achieve is this:

type StringToNumber<T extends `${number}`> = T extends `${infer N extends number}` ? N : "NaN";

I experimented in the playground and wrote these tests:

type test1 = StringToNumber<"255">; // 255
type test2 = StringToNumber<"0xff">; // number
type test3 = StringToNumber<"0b11111111">; // number

type test4 = StringToNumber<"NotANumber">; // error

As you can see, TypeScript can easily tell that the string is not in a valid number format, so test4 produces an error.

But when I pass a number in binary or hex format, it just returns number. That's a valid answer, but not as precise as when I pass an integer or a float.

Is there any way to get 255 in the first three tests?


In more detail:

I'm writing a TypeScript library in which I define overloads for standard JS functions and methods.

At the moment I'm writing overloads for the Number() function and have defined two of them:

interface NumberConstructor {
    <Value extends number>(value: Value): Value;

    <Value extends number>(value: `${Value}`): Value;
}

And I wrote some simple tests:

import { expectTypeOf } from 'vitest';

const TEST_1 = Number(255);
expectTypeOf(TEST_1).toEqualTypeOf<255>();

const TEST_2 = Number(0xff);
expectTypeOf(TEST_2).toEqualTypeOf<255>();

const TEST_3 = Number('255');
expectTypeOf(TEST_3).toEqualTypeOf<255>();

const TEST_4 = Number('0xff');
expectTypeOf(TEST_4).toEqualTypeOf<255>();

And here is the actual result:

[screenshot of the test results: the first three assertions pass, but TEST_4 fails]

As you can see, I can't pass the fourth test, and I can't understand the reason:

  1. TypeScript can easily tell whether a string is in a valid number format or not
  2. TypeScript can convert 0xff into 255 when it is written as a number

So if TypeScript is able to parse the string by itself, and is able to convert hex (and other) formats into decimal by itself, why can't it combine these two capabilities?
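For reference, a quick runtime check shows that all four calls really do agree, so the fourth overload is only asking for something JavaScript itself already does:

```typescript
// Runtime sanity check: every spelling below denotes the same numeric value,
// because Number() also parses hex (and binary/octal) string literals at runtime.
console.log(Number(255));    // 255
console.log(Number(0xff));   // 255
console.log(Number("255"));  // 255
console.log(Number("0xff")); // 255
```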


Solution

  • The type

    type StringToNumber<T extends `${number}`> = T extends `${infer N extends number}` ? N : "NaN";
    

    only works at all since TypeScript 4.8 introduced improved inference for infer types in template literal types, as implemented in microsoft/TypeScript#48094. Note the semantics of what it means: it tries to infer a numeric type N such that T extends `${N}` is satisfied.

    So if T is "255", then N can be inferred as the numeric literal type 255, because `${255}` is "255". In JavaScript if you evaluated the template literal string `${255}` you'd get "255", and TypeScript's template literal types mostly mirror JavaScript's behavior.
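A quick check that the two levels really do mirror each other:

```typescript
// Runtime: template literal strings always stringify a number in decimal.
const s1 = `${255}`;  // "255"
const s2 = `${0xff}`; // "255" as well — 0xff *is* 255

// Type level: template literal types behave the same way.
type S1 = `${255}`;  // "255"
type S2 = `${0xff}`; // "255"

console.log(s1, s2);
```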

    But if T is "0xff", then N cannot be inferred as the numeric literal type 0xff (which is the same as the numeric literal type 255, you can demonstrate this by evaluating type X = 0xff and inspecting X). Because the template literal string `${0xff}` in JavaScript still produces the string "255". After all, the number 0xff and 255 are the same, and when you convert either into a string, you get "255". Since `${0xff}` is "255", then you're checking "0xff" extends `${0xff}` which is "0xff" extends "255", which is false. So the inference fails and you fall back to number (personally I think the inference should just fail, since there's no numeric value n such that `${n}` produces "0xff", see my rant discussion if you're interested).
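Each of those claims can be verified directly in the compiler. Here is a small sketch using a homemade type-level equality helper (Eq is my own name for it, not a built-in):

```typescript
// Eq<A, B> is true only when A and B are mutually assignable.
type Eq<A, B> = [A] extends [B] ? ([B] extends [A] ? true : false) : false;

type X = 0xff;                   // displayed as 255: hex and decimal literals are one type
type SameLiteral = Eq<X, 255>;   // true

type S = `${0xff}`;              // "255", not "0xff"
type Stringified = Eq<S, "255">; // true

// The inference question TypeScript actually asks: does "0xff" extend `${0xff}`?
type Inferable = "0xff" extends `${0xff}` ? true : false; // false — so inference falls back

// Runtime anchors so the compile-time answers are visible as values:
const sameLiteral: SameLiteral = true;
const inferable: Inferable = false;
console.log(sameLiteral, inferable);
```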

    Again, the reason you can't use StringToNumber to emulate the output of Number() is because it actually emulates the input of String(). It would be wonderful if TypeScript had a native StringToNumber operator, perhaps as part of a suite of tools to do type literal juggling for numeric types (see microsoft/TypeScript#26382). But it doesn't have this. So whenever you run into an obstacle like this you can either give up, or you have to start jumping through lots of crazy hoops to implement user-defined types that behave how you want. You can try to do this (see one possible approach linked by @mattkantor in a comment above), but it's not easy. You'd need to split "ff" into characters and then implement both base-10 addition and multiplication yourself (remember to carry). And even if that works it's likely to have weird edge cases, especially around compiler performance. So giving up seems like the more sane approach, unless you can be sure that your use cases warrant the complexity.