With integers, we generally leave off the decimal point.
1 is technically written:
...000001.000000... in decimal, so there are a load of ways of representing it: 1.0, 01.0, 1.000, etc. Most of the time you leave the extra 0s off, since they don't add any information.
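
If it helps to see this concretely, here's a tiny sketch (Python is just an assumption for illustration) showing that all of those written forms compare equal as values:

```python
from decimal import Decimal

# Different ways of writing the same number 1.
forms = ["1", "1.0", "01.0", "1.000"]

# Parsing each string as an exact decimal gives the same value every time;
# the leading and trailing zeros carry no extra information.
values = [Decimal(f) for f in forms]
print(all(v == Decimal(1) for v in values))  # True
```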
