When working with numbers in JavaScript there are two primitive numeric types to choose from: BigInt and Number. One might expect an implicit conversion from the "smaller" type to the "bigger" type, but that is not the case in JavaScript.
When an expression combines a BigInt and a Number, one might expect the Number to be implicitly cast to BigInt, as in the example below:
const number = 16n + 32; // DOESN'T WORK
// Expected: Evaluates to 48n
Instead, expressions that mix BigInt and Number throw a TypeError:
const number = 16n + 32;
// Throws "TypeError: Cannot mix BigInt and other types, use explicit conversions"
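The error message itself suggests explicit conversions, and converting one operand by hand does work; a minimal sketch of both directions:
const asBigInt = 16n + BigInt(32); // 48n
const asNumber = Number(16n) + 32; // 48
So why is the conversion not done automatically?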
Or, in other words: what is the reason behind this design?
This is documented in the original BigInt proposal: https://github.com/tc39/proposal-bigint/blob/master/README.md#design-goals-or-why-is-this-like-this
"When a messy situation comes up, this proposal errs on the side of throwing an exception rather than rely on type coercion and risk giving an imprecise answer."
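To see the "imprecise answer" risk concretely, consider values beyond Number.MAX_SAFE_INTEGER; the following is a sketch with illustrative values (not taken from the proposal):
const big = 2n ** 53n + 1n;     // 9007199254740993n, exact
const coerced = Number(big);    // 9007199254740992, already off by one
const exact = big + BigInt(32); // 9007199254741025n, exact
const lossy = Number(big) + 32; // 9007199254741024, silently imprecise
If mixed arithmetic implicitly converted BigInt to Number (or vice versa), the lossy result above could appear without any visible hint, which is exactly the outcome the proposal chooses to prevent by throwing instead and leaving the conversion, and its risks, explicit in the code.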