Note: when working with non-primitive types, it is customary to use UpperCamelCase to better distinguish them from primitive types and from variable/property names. That convention is followed below.
This question has several parts, so the answer does too.
Let's start with the simpler ones:
type G = 1 & 2 // results in never
Starting with version 3.6, TypeScript reduces empty intersections like `1 & 2` to `never`. Before that, `1 & 2` already behaved essentially like `never`, since no value can satisfy the intersection. Conceptually there is no distinction between `1 & 2` and `never`, although minor differences can arise from compiler implementation details.
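One way to observe the reduction is with an identity-checking conditional type (a sketch; the `Equal` helper below is a commonly used trick, not part of the standard library):

```typescript
// Equal<A, B> resolves to true only when A and B are identical types
type Equal<A, B> = (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
  ? true
  : false;

type G = 1 & 2;

// This assignment compiles only because the compiler
// has already reduced 1 & 2 to never:
const gIsNever: Equal<G, never> = true;
console.log(gIsNever); // true
```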
Moving on to another example:
type I = ((x: 1 & 2) => 0) // why x not never
In this case, even though the type of `x` does reduce to `never`, the reduction is deferred until the parameter type is actually used:
type IParam = Parameters<I>[0]; // results in never
This deferral was introduced in TypeScript 3.9 by microsoft/TypeScript#36696. Before that change, `x` would have been eagerly reduced to `never`, as in the previous example.
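The deferral can be observed directly: the function type itself still displays its parameter as `1 & 2`, but extracting the parameter type forces the reduction. A sketch (reusing the hypothetical `Equal` identity-check helper):

```typescript
type I = (x: 1 & 2) => 0;

// Hovering I in an editor still shows (x: 1 & 2) => 0,
// but extracting the parameter type triggers the reduction:
type IParam = Parameters<I>[0];

// Identity-check helper (not part of the standard library)
type Equal<A, B> = (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
  ? true
  : false;

const paramIsNever: Equal<IParam, never> = true;
console.log(paramIsNever); // true
```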
Now let's delve into more complex scenarios:
type H = ((x: 1) => 0) & ((x: 2) => 0) // why H not never
There are various reasons why `H` does not result in `never`:
From TypeScript's perspective, an intersection of function types behaves like an overloaded function with multiple call signatures:
...
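For example, a value of type `H` can be produced and called like a two-signature overload (a sketch):

```typescript
type H = ((x: 1) => 0) & ((x: 2) => 0);

// One implementation accepting the union of parameter types
// satisfies both call signatures, so H is clearly inhabited:
const h: H = (_x: 1 | 2): 0 => 0;

const r1 = h(1); // resolves against the first signature
const r2 = h(2); // resolves against the second signature
console.log(r1, r2); // 0 0
```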
Furthermore, function parameter types are contravariant, so an intersection of function types accepts the union of their parameter types. The effective parameter type here is `1 | 2`, which is certainly not `never`.
Even if `H` were somehow uninhabited, TypeScript currently reduces intersections to `never` only in specific cases, and intersections of function types are not among them in the compiler's implementation.
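The contrast can be seen side by side (a sketch, checked against recent TypeScript versions; the `Equal` helper is the usual identity-check trick, not a standard library type): intersections of disjoint unit or primitive types reduce to `never`, while intersections of function types are left alone:

```typescript
// Identity-check helper (not part of the standard library)
type Equal<A, B> = (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
  ? true
  : false;

type Units = 1 & 2;                          // reduced to never
type Prims = string & number;                // reduced to never
type Funcs = ((x: 1) => 0) & ((x: 2) => 0);  // NOT reduced

const unitsReduced: Equal<Units, never> = true;
const primsReduced: Equal<Prims, never> = true;
const funcsKept: Equal<Funcs, never> = false;
console.log(unitsReduced, primsReduced, funcsKept); // true true false
```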
Lastly, we'll consider one final example:
type E = (((x: 1) => 0) & ((x: 2) => 0)) extends (x: infer L) => 0 ? L : never;
// why E is 2 not never or 1?
In this instance, due to a design limitation in how TypeScript infers types from overloaded function types, the compiler infers from just one call signature (the last one) and ignores the rest. So in `E`, the compiler infers `L` as `2` rather than considering `1 | 2`. This behavior is acknowledged as a limitation in GitHub discussions such as microsoft/TypeScript#27027.
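Concretely, `infer` against the intersection draws only from the last call signature (a sketch; `Equal` is the usual identity-check helper, not a standard library type):

```typescript
type H = ((x: 1) => 0) & ((x: 2) => 0);

// Inference from a type with multiple call signatures
// is made from the last signature only:
type E = H extends (x: infer L) => 0 ? L : never; // 2

// Identity-check helper (not part of the standard library)
type Equal<A, B> = (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
  ? true
  : false;

const eIsTwo: Equal<E, 2> = true;
console.log(eIsTwo); // true
```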
Access the Playground link here