I am trying to draw a ruler, for any available iOS device, with accurate 1 mm spacing between the start of the lines/ticks. Normally I would get the PPI and work out the distance in pixels, but with Objective-C / UIKit it does not seem to work that way.
linesDist should contain my 1 mm distance in "screen coordinate pixels" (points).
Any ideas how I can achieve this?
My basic code looks like this (RulerView.m, which is a UIView subclass):
-(void)drawRect:(CGRect)rect
{
    [[UIColor blackColor] setFill];

    float linesDist = 3.0; // 1mm * ppi ??
    float linesWidthShort = 15.0;
    float linesWidthLong = 20.0;

    for (NSInteger i = 0, count = 0; i <= self.bounds.size.height; i = i + linesDist, count++)
    {
        bool isLong = (int)i % 5 == 0;
        float linesWidth = isLong ? linesWidthLong : linesWidthShort;
        UIRectFill( (CGRect){0, i, linesWidth, 1} );
    }
}
EDIT: PPI detection (really ugly), based on the answer below:
float ppi = 0;
switch ((int)[UIScreen mainScreen].bounds.size.height) {
    case 568: // iPhone 5*
    case 667: // iPhone 6
        ppi = 163.0;
        break;
    case 736: // iPhone 6+
        ppi = 154.0;
        break;
    default:
        return;
}
iPhones (with the possible exception of the iPhone 6 Plus) are 163 "logical" points per inch. Phones from the iPhone 4 onwards obviously have double or more that resolution in physical pixels, but that makes no difference to the coordinate system.
1 mm is therefore 163/25.4, or approximately 6.4 points. A full-size iPad is about 5.2 points per mm (132 points per inch); the iPad mini is the same as the iPhone.
-(void)drawRect:(CGRect)rect
{
    [[UIColor blackColor] setFill];

    float i;
    NSInteger count;
    float linesDist = 163.0 / 25.4; // points per inch / mm per inch (a regular-size iPad would be 132.0)
    float linesWidthShort = 15.0;
    float linesWidthLong = 20.0;

    for (i = 0, count = 0; i <= self.bounds.size.height; i = i + linesDist, count++)
    {
        bool isLong = count % 5 == 0;
        float linesWidth = isLong ? linesWidthLong : linesWidthShort;
        UIRectFill( (CGRect){0, i, linesWidth, 1} );
    }
}
You want to use a float for i in order to avoid truncation errors as the distances are added up, and to avoid unnecessary conversions.
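If the ruler gets long and you are worried about those repeated additions drifting, an alternative sketch (my own variation, not part of the answer above) keeps an integer tick counter and multiplies for each position, so floating-point error never accumulates:

-(void)drawRect:(CGRect)rect
{
    [[UIColor blackColor] setFill];

    float linesDist = 163.0 / 25.4; // points per mm (132.0 on a regular-size iPad)
    float linesWidthShort = 15.0;
    float linesWidthLong = 20.0;

    // Multiply instead of accumulating additions, so rounding error
    // does not build up over hundreds of ticks.
    for (NSInteger count = 0; count * linesDist <= self.bounds.size.height; count++)
    {
        float y = count * linesDist;
        bool isLong = count % 5 == 0;
        float linesWidth = isLong ? linesWidthLong : linesWidthShort;
        UIRectFill( (CGRect){0, y, linesWidth, 1} );
    }
}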