I want to write a type (GoodNestedIterableType below) that transforms an Iterable<Iterable<A>> into just A.
To illustrate, let's consider the following code snippet:
const arr = [
  [1, 2, 3],
  [4, 5, 6],
]

type GoodNestedIterableType<A> = A extends Iterable<infer B>
  ? B extends Iterable<infer C>
    ? C : never
  : never

type GoodExtractedType = GoodNestedIterableType<typeof arr> // number

type BadNestedIterableType<A> = A extends Iterable<Iterable<infer B>>
  ? B
  : never

type BadExtractedType = BadNestedIterableType<typeof arr> // unknown
GoodExtractedType correctly resolves to number, but surprisingly, BadExtractedType ends up as unknown.
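For what it's worth, the value itself does seem compatible with the nested Iterable shape. This sanity check (my own addition; the variable name is just for illustration) compiles without error:

const check: Iterable<Iterable<number>> = arr // OK: number[][] is assignable to Iterable<Iterable<number>>

So the surprise is that assignability holds, yet the nested infer still produces unknown.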
Even more confusingly, the same type works as expected when given the Iterable type directly:
BadNestedIterableType<Iterable<Iterable<number>>> // number
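Along the same lines (again my own experiment; iterableArr is just an illustrative name), widening arr to the interface type before applying the helper also yields number:

const iterableArr: Iterable<Iterable<number>> = arr
type WidenedExtractedType = BadNestedIterableType<typeof iterableArr> // number

So the difference seems to hinge on inferring from number[][] versus from Iterable<Iterable<number>>.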
Can anyone explain the mechanics behind this behavior?