We are going to talk about the various types of numbers in Swift. Let's begin with integers. An integer is a whole number with no fractional component. Integers can be signed or unsigned: signed integers can hold positive or negative values, while unsigned integers can hold only non-negative values. Swift's basic integer type is Int. Let's declare a variable of type Int:
var x : Int
An Int is signed, so we can assign it a negative value:
var x : Int = -23
To make it unsigned, we simply declare our variable of type UInt. Most of the time, though, we recommend simply using Int to declare integers.
// UInt types need to be positive
var x : UInt = -23
// error: Integer literal overflows when stored into 'UInt'
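To see why Int usually suffices, we can inspect its range. The exact bounds depend on the platform's word size; the values shown in the comments assume a 64-bit platform.

```swift
// On 64-bit platforms, Int is a 64-bit signed integer,
// which comfortably covers most counting tasks.
let smallest = Int.min   // -9223372036854775808 on 64-bit platforms
let largest = Int.max    //  9223372036854775807 on 64-bit platforms

// UInt trades the sign for extra positive range.
let largestUnsigned = UInt.max

print(smallest, largest, largestUnsigned)
```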
How would we add a decimal component to our number? Swift has two floating-point types, but you'll be using one more than the other: Double and Float. The difference is that a Double is a 64-bit floating-point number and a Float is a 32-bit floating-point number. In most cases you can simply use a Double.
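A quick sketch of the size difference. MemoryLayout reports how many bytes each type occupies, and the extra bits in a Double translate into roughly 15 decimal digits of precision versus about 6 for a Float.

```swift
let preciseValue: Double = 3.141592653589793  // 64-bit, ~15 decimal digits
let roughValue: Float = 3.1415927             // 32-bit, ~6 decimal digits

// MemoryLayout confirms the storage sizes in bytes.
print(MemoryLayout<Double>.size)  // 8
print(MemoryLayout<Float>.size)   // 4
```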
Now we should talk briefly about type inference. If we declare a variable with a value of 5 without explicitly specifying a type, Swift infers that it is an Int.
// explicit integer
var x : Int = 5

// implicit integer
var x = 5
Now let's declare a variable with a value of 5.5. Swift infers that x is a Double. As we said before, Doubles and Floats can both have decimal components, but Swift will always choose Double over Float. So in this case the value of x is a Double rather than a Float.
// implicit double
var x = 5.5
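We can confirm what the compiler inferred by asking for the value's type with type(of:):

```swift
let inferredInt = 5
let inferredDouble = 5.5

// type(of:) reports the inferred type at runtime.
print(type(of: inferredInt))     // Int
print(type(of: inferredDouble))  // Double
```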
There are other ways we can write these numbers, using what are called numeric literals. We can write a binary number by using a 0b prefix and a series of 1s and 0s.
var x = 0b10001010
We can make it into a hex number with a 0x prefix and something like 4f.
var x = 0x4f
We can even use exponential notation: something like 5.6e3 is equal to 5,600.
var x = 5.6e3
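All three literal forms produce ordinary numeric values; here is a quick check of what each one evaluates to:

```swift
let binaryValue = 0b10001010   // 138 in decimal
let hexValue = 0x4f            // 79 in decimal
let exponentValue = 5.6e3      // 5600.0 (5.6 × 10³)

print(binaryValue, hexValue, exponentValue)
```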
Let’s move on to things we can do to numbers without actually changing their values. Say we have a large number with a lot of zeroes.
var x = 1000000000
In this case our number is 1 billion, but to someone reading it that may not be immediately obvious. What we can do is add underscores, without changing the value of the number, to make it more readable.
var x = 1000000000
var y = 1_000_000_000
// x = y = 1000000000
Finally, let's talk about the keyword typealias, and it is exactly what it says: an alias for another type. Say we want to refer to an unsigned integer, but in this context it makes more sense to call our type FileSize rather than UInt. Under the hood it is still being treated as a UInt, but it might be easier for someone reading over this code to see FileSize.
typealias FileSize = UInt
var x : FileSize
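A typealias is fully interchangeable with its underlying type. As a sketch, here is a hypothetical helper function (formatSize is our own invention, not a standard API) whose signature reads more clearly with the alias; the alias is restated so the snippet is self-contained:

```swift
typealias FileSize = UInt

// FileSize and UInt are the same type under the hood,
// so either can be passed where the other is expected.
func formatSize(_ size: FileSize) -> String {
    return "\(size) bytes"
}

let size: FileSize = 1_024
print(formatSize(size))        // "1024 bytes"
print(formatSize(UInt(2_048))) // a plain UInt works too
```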