It appears that TypeScript infers the type of an accessor property based solely on its backing field (x_ below), rather than taking the getter's declared return type union into account (1) or inferring the type from the getter itself (2):
test('field type inference', () => {
  class A {
    x_: number;
    // 1: No type checking against the getter's return type
    get x(): number | undefined {
      if (this.x_ === 1) return undefined;
      return this.x_;
    }
    set x(v: number | undefined) {
      this.x_ = +v;
    }
  }
  const a = new A();
  a.x = 1;
  // 2: Inferred type is number (based on x_, not the getter)
  const x: number = a.x;
  console.log(a.x); // outputs 'undefined'
});
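For reference, a minimal call-site sketch (the ?? 0 fallback is illustrative, not part of the original example) of how a caller can guard against the undefined result at runtime, regardless of which type the compiler infers for a.x:

const a = new A();
a.x = 1;
// a.x can be undefined at runtime even if its inferred type is number,
// so an explicit fallback avoids relying on that inference
const safe: number = a.x ?? 0;
console.log(safe); // outputs 0, since the getter returned undefined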
Is this behavior specified or expected?

Note that even with strictNullChecks these issues are not caught; the compiler only reports the missing initialization of x_ and the setter error. Here is an updated example that fixes those warnings under strictNullChecks:
test('field type inference', () => {
  class A {
    x_: number;
    get x(): number | undefined {
      if (this.x_ === 1) return undefined;
      return this.x_;
    }
    set x(v: number | undefined) {
      this.x_ = +(v ?? 0);
    }
    constructor(value: number) {
      this.x_ = value;
    }
  }
  const a = new A(2);
  a.x = 1;
  const x: number = a.x;
  console.log(a.x); // still outputs 'undefined'
});
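For comparison, a minimal sketch (the class name B and the ?? 0 fallback are illustrative, not from the original report) in which the backing field is declared with the same number | undefined union as the accessors, so the field, getter, and setter agree and the caller is forced to handle the undefined case explicitly:

test('optional backing field', () => {
  class B {
    // the backing field carries the same union as the accessors
    private x_: number | undefined;
    get x(): number | undefined {
      return this.x_;
    }
    set x(v: number | undefined) {
      this.x_ = v;
    }
  }
  const b = new B();
  b.x = 1;
  // the annotated read type is number | undefined, so a fallback
  // (or a narrowing check) is needed before using it as a plain number
  const x: number = b.x ?? 0;
  console.log(b.x); // outputs 1
});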