How do you specify that a class property is an integer?


Typescript Problem Overview


I'm experimenting with TypeScript, and in the process of creating a class with an ID field that should be an integer, I have gotten a little confused.

First off, in Visual Studio 2012 with the TypeScript plugin, I see int in the IntelliSense list of types. But I get a compile error that says:

> the name 'int' does not exist in the current scope.

I reviewed the language specs and see only the following primitive types: number, string, boolean, null, and undefined. No integer type.

So, I'm left with two questions:

  1. How should I indicate to users of my class that a particular field is not just a number but an integer (and never a floating point or decimal number)?

  2. Why do I see int in the IntelliSense list if it's not a valid type?

Update: All the answers I've gotten so far are about how JavaScript doesn't have an int type, it would be hard to enforce an int type at runtime... I know all that. I am asking if there is a TypeScript way to provide an annotation to users of my class that this field should be an integer. Perhaps a comment of some particular format?

Typescript Solutions


Solution 1 - Typescript

  1. I think there is no direct way to specify whether a number is an integer or floating point. In the TypeScript specification, section 3.2.1, we can see: > "...The Number primitive type corresponds to the similarly named JavaScript primitive type and represents double-precision 64-bit format IEEE 754 floating point values..."

  2. I think int in the IntelliSense list is a Visual Studio bug. The correct type is number.
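To illustrate the first point (a small sketch, not part of the original answer): because number is an IEEE 754 double, both whole and fractional values type-check, and there is no built-in annotation that would reject the fractional one.

```typescript
// number is IEEE 754 double precision, so both of these compile;
// TypeScript offers no primitive type that rejects the second one.
const id: number = 42;
const fractional: number = 42.5; // compiles without complaint
```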

Solution 2 - Typescript

TypeScript is a superset of JavaScript, which doesn't have a concept of an int. It only has the concept of a number, which is a double-precision floating-point value.

Generally speaking, the amount of work the compiler would have to do to enforce only whole numbers for a TypeScript int type could be massive, and in some cases it still would not be possible to ensure at compile time that only whole numbers would be assigned. That is why an int type can't reliably be added to TypeScript.

When you initially get IntelliSense in Visual Studio, the tooling can't yet determine what to supply, so you get everything, including int - but once you are dealing with a value of a known type, you'll get sensible IntelliSense.

Examples

var myInt: number;
var myString: string;

myInt. // toExponential, toFixed, toPrecision, toString
myString. // charAt, charCodeAt, concat, indexOf, lastIndexOf, length and many more...
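Since the compiler can't enforce whole numbers, a common workaround (a sketch, not part of the original answer) is to validate at runtime instead:

```typescript
// Runtime integer check: throws if the value has a fractional part.
// Number.isInteger also returns false for NaN and Infinity.
function assertInteger(n: number): number {
  if (!Number.isInteger(n)) {
    throw new TypeError(`Expected an integer, got ${n}`);
  }
  return n;
}
```

Callers still see a plain number, but invalid values fail fast at the point of assignment rather than propagating silently.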

Solution 3 - Typescript

In TypeScript you can approximate what is sometimes called an opaque type using a marker.

// Helper for generating Opaque types.
type Opaque<T, K> = T & { __opaque__: K };

// 2 opaque types created with the helper
type Int = Opaque<number, 'Int'>;
type ID = Opaque<number, 'ID'>;

// using our types to differentiate our properties at compile time
// (at runtime they are still just numbers)
class Foo {
  someId!: ID;   // definite assignment (!) keeps strict mode happy
  someInt!: Int;
}

let foo = new Foo();

// compiler won't let you do this due to our markers
foo.someId = 2;
foo.someInt = 1;

// when assigning, you have to cast to the specific type
// NOTE: This is not completely type safe as you can trick the compiler 
// with something like foo.someId = 1.45 as ID and it won't complain.
foo.someId = 2 as ID;
foo.someInt = 1 as Int;

// you can still consume as numbers
let sum: number = foo.someId + foo.someInt;

Doing this allows you to be more explicit in your code about what types your properties expect, and the compiler won't allow you to assign a primitive value without a cast. This doesn't produce any additional .js output, and you can still consume and use the values as whatever types they are based on. In this example I'm using numbers, but you can use the technique on strings and other types as well.

You can still trick the compiler into accepting something that isn't an Int or an ID in this example, but something like 1.45 as Int should jump out during review. You also have the option of creating helper functions that you use to create your values, to provide runtime validation.
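For example, a hypothetical factory helper (repeating the Opaque and Int definitions so the sketch is self-contained) could combine the cast with a runtime check, so every Int in the program has been validated:

```typescript
type Opaque<T, K> = T & { __opaque__: K };
type Int = Opaque<number, 'Int'>;

// Hypothetical helper: the only sanctioned way to obtain an Int.
// The cast lives here, behind a runtime check, instead of at every call site.
function toInt(n: number): Int {
  if (!Number.isInteger(n)) {
    throw new RangeError(`${n} is not an integer`);
  }
  return n as Int;
}
```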

There's a number of different ways you can create "marked" types. Here's a good article: https://michalzalecki.com/nominal-typing-in-typescript/

Solution 4 - Typescript

TypeScript, like JavaScript, has no integer or float type, only number. But if you want to tell the programmer that you expect an integer, you can use a type alias:

type integer = number;
type float = number;

// example:
function setInt(id: integer) {}

but this is still the number type, and you can still receive a float.
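A quick demonstration of that limitation (a hypothetical variant of setInt that returns its argument, purely so the lack of checking is observable):

```typescript
type integer = number;

// The alias documents intent, but the compiler treats it as plain number.
function setInt(id: integer): integer {
  return id;
}

const ok = setInt(3);     // intended use
const oops = setInt(3.5); // compiles just as happily: the alias adds no checks
```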

Part of description from documentation:
"Aliasing doesn’t actually create a new type - it creates a new name to refer to that type. Aliasing a primitive is not terribly useful, though it can be used as a form of documentation."

Solution 5 - Typescript

This is the top result on Google for me so I figure I should provide the solutions I found.

Using bigint

Now that it's 2020 and bigint has been accepted, it deserves a mention. You can simply do the below. Beware that bigint comes with a greater performance cost than number.

const myNumber: bigint = 10n

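One consequence worth noting (my addition, not from the original answer): bigint arithmetic can never produce a fractional result, which is exactly the integer behaviour number lacks.

```typescript
// bigint arithmetic stays integral: / truncates toward zero
// rather than producing a fraction.
const q: bigint = 7n / 2n; // 3n
const p: bigint = q * 2n;  // 6n, not 7n -- the remainder is gone
```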

Using a nominal type / tagged type / opaque type

An alternative is to use a nominal type, but it's arguably less ergonomic and I'm not sure whether it's any faster than bigint; however, the pattern generalises to any type, not just number. TypeScript doesn't have "first-class" support for this, so you have to do a cheeky hack. There's a library for this called newtype-ts that includes common types like Integer, so you might just want to use that, but I'll explain the workings below.

To start out with we define the integer type.

const TAG = Symbol()
type integer = number & { readonly [TAG]: typeof TAG }

The TAG symbol ensures we have a unique key, so we can't accidentally collide with an object that happens to have the same field, and typing the field as typeof TAG (itself a unique symbol) serves the same purpose. Your integer values won't actually carry this field at runtime, but that's fine.

With this you can still add integer to number using +. Not good. So you can enforce type safety on the arguments here by massaging the type system with a function. I'm just gonna call it guard, and again as you can see it isn't specific to integers – you could make more opaque types and use this again.

type guard = <A>(f: (...ns: Array<A>) => A, ...ns: Array<A>) => A
const guard: guard = (f, ...ns) => f(...ns)

If you try to call that with a number

const bad: integer = guard((a, b) => a + b as integer, myCoolInteger, 10)

you'll get an error like below

Argument of type '10' is not assignable to parameter of type 'integer'.
  Type '10' is not assignable to type '{ readonly [TAG]: unique symbol; }'.(2345)

Note that you aren't enforcing the return type here (because you have to use as integer), and some operators like / will return floating-point numbers, so you probably still want to do runtime checks or add a Math.round to a specialised version of guard. But this will at least ensure you're not mixing two separate numeric types: imagine you have GBP and USD and try to add them; that is likely not what you intended.
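A hypothetical specialised guard along those lines (repeating the TAG and integer definitions so the sketch is self-contained) could round its result, so that even / yields a whole number:

```typescript
const TAG = Symbol()
type integer = number & { readonly [TAG]: typeof TAG }

// Hypothetical specialisation: accepts only integers and rounds the
// result, so the returned value really is a whole number at runtime.
const intGuard = (f: (...ns: number[]) => number, ...ns: integer[]): integer =>
  Math.round(f(...ns)) as integer

// Math.round(7 / 2) === 4, so total is a whole number
const total = intGuard((a, b) => a / b, 7 as integer, 2 as integer)
```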

Solution 6 - Typescript

AssemblyScript supports integer values (with its asm.js backend). You can generate a JavaScript file using --jsFile.

To do it manually, you can force a value to be a 32-bit integer by applying | 0, which truncates the fractional part.

This improves performance because the engine is then able to use integer operations, which are faster than their floating-point equivalents.

Say you intend to write:

function add(a: i32, b: i32) {
  return a + b
}

You should write it as follows:

type i32 = number
function add(a: i32, b: i32) {
  a = a | 0
  b = b | 0
  return a + b
}

or in JavaScript (this is what asc --jsFile generates)

function add(a, b) {
  a = a | 0
  b = b | 0
  return a + b
}

For class properties:

type i32 = number
class MyClass {
  a: i32
  constructor(a: number) {
    // truncate to a 32-bit integer on assignment
    this.a = a | 0
  }
}

Solution 7 - Typescript

Well, as you have seen, TypeScript has no separate integer or float data type, just like JavaScript: the single number type covers both. One option is to write a setter that takes a number, stores it only if it is an integer, and returns a success/error status. Something like this as a method of your class:

class MyClass {
  n: number = 0;

  // stores the value and returns true only when it is an integer
  setN(x: number): boolean {
    const isInt = Number.isInteger(x);
    if (isInt) this.n = x;
    return isInt;
  }
}

//..
const y = 10.5;
if (new MyClass().setN(y)) {
  // OK
} else {
  // error: y isn't an integer, nothing was set
}

Note: this cannot reject a value like 10.0, because in JavaScript 10.0 and 10 are the exact same value. If you really need that distinction, you would have to keep the input as a string and look for a '.' in it.

Solution 8 - Typescript

int was a reserved-for-future-use keyword in earlier versions of JavaScript (ECMAScript, if you prefer), but it is a valid identifier now (where "now" means "in the latest spec").

For instance, in ECMA-262 it was still reserved: http://www.ecma-international.org/publications/files/ECMA-ST/Ecma-262.pdf

It would make a nice addition to TypeScript to have an int datatype with all the compile-time type checking and casting rules available.

Solution 9 - Typescript

Here is an implementation of the number interface that doesn't do boxing. I think it would be possible to use this design to create an Integer type.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: Josh
Solution 1 - Typescript: Diullei
Solution 2 - Typescript: Fenton
Solution 3 - Typescript: bingles
Solution 4 - Typescript: Mariusz Charczuk
Solution 5 - Typescript: Shou
Solution 6 - Typescript: Amin
Solution 7 - Typescript: Jack
Solution 8 - Typescript: Arek Bal
Solution 9 - Typescript: James Wakefield