JavaScript has only one type of number, whether the value has a decimal point or not.

const age = 100;
const money = 1000.50;
console.log(typeof age); // number
console.log(typeof money); // number
"10" * "10" // 100 (number) - converts the strings to numbers

The above works with multiplication, division, and subtraction, but not with addition, because the + operator is also used for string concatenation.
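A quick sketch of this coercion behavior:

```javascript
// With *, /, and -, JavaScript coerces the strings to numbers.
console.log("10" * "10"); // 100 (number)
console.log("10" / "2");  // 5 (number)
console.log("10" - "5");  // 5 (number)

// With +, strings win: the values are concatenated instead.
console.log("10" + "10"); // "1010" (string)
console.log(1 + "1");     // "11" (string)
```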

0.1 + 0.2 // 0.30000000000000004

Why? JavaScript numbers are IEEE 754 floating-point values, and numbers like 0.1 and 0.2 cannot be represented exactly in binary, so tiny rounding errors creep into the result.
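One common way to compare floats despite this is to check that the difference is below a tiny tolerance; Number.EPSILON works as that tolerance for values near 1. A minimal sketch:

```javascript
const sum = 0.1 + 0.2;

// Strict equality fails because of the rounding error.
console.log(sum === 0.3); // false

// Comparing within a tiny tolerance succeeds.
console.log(Math.abs(sum - 0.3) < Number.EPSILON); // true
```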

So, when working with money, don't store it as dollars and cents. Store all of the money in cents, so you only ever deal with whole numbers, never fractions. When you need to display it to the user, convert it back.

typeof Infinity; // number

typeof -Infinity; // number

10 / 'dog' // NaN

typeof NaN // number