The code below always logs 0 for the screen width and height. I've tried all the samples I can find, but it still prints 0. I'm on Xcode 9.4.1.
// Screen Size
let screenSize = UIScreen.main.bounds
let screenWidth = screenSize.width
let screenHeight = screenSize.height
NSLog("Screen Height is %i", screenHeight);
NSLog("Screen Width is %i", screenWidth);
Just print the values with print():
print("Screen Height is \(screenHeight)")
print("Screen Width is \(screenWidth)")
In your case, %i is requesting an integer, but screenHeight and screenWidth are floating-point values (CGFloat). Therefore use %f:
NSLog("Screen Height is %f", screenHeight);
NSLog("Screen Width is %f", screenWidth);