The method I'm using to get a bold-faced iOS UIFont, given only a font family name, seems to work for just some fonts. For example:
UIFont* font = [UIFont fontWithName:@"TimesNewRomanPS-BoldMT" size:12];
NSLog(@"A: Font name: %@", [font fontName]);
// desired method works for Helvetica Neue
UIFontDescriptor *desc = [[UIFontDescriptor alloc] init];
desc = [desc fontDescriptorWithFamily:@"Helvetica Neue"];
desc = [desc fontDescriptorWithSymbolicTraits:UIFontDescriptorTraitBold];
font = [UIFont fontWithDescriptor:desc size:12];
NSLog(@"B: Font name: %@", [font fontName]);
// desired method fails for Times New Roman
desc = [[UIFontDescriptor alloc] init];
desc = [desc fontDescriptorWithFamily:@"Times New Roman"];
NSLog(@"desc: %@", desc);
desc = [desc fontDescriptorWithSymbolicTraits:UIFontDescriptorTraitBold];
NSLog(@"desc bold: %@", desc);
This prints (on the iOS 8.1 simulator):
A: Font name: TimesNewRomanPS-BoldMT
B: Font name: HelveticaNeue-Bold
desc: UICTFontDescriptor <0x7f9be0d05e80> = { NSFontFamilyAttribute = "Times New Roman"; }
desc bold: (null)
Is this a bug, or is it simply not designed to work for every font family that actually has a bold variant? I really do not want to be forced to parse font names looking for "Bold" or some such thing.
This is a bug. fontDescriptorWithSymbolicTraits: is guaranteed to return a font descriptor; returning nil is therefore unexpected behavior.
In fact, if you rewrite the same thing in Swift, it crashes, because desc isn't optional:
var desc = UIFontDescriptor()
desc = desc.fontDescriptorWithFamily("Times New Roman")
desc = desc.fontDescriptorWithSymbolicTraits(.TraitBold)
println(desc) // crash
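Until the bug is fixed, Objective-C callers can at least guard against the unexpected nil return. A minimal defensive sketch, reusing the Times New Roman descriptor from the question:
UIFontDescriptor *plain = [[[UIFontDescriptor alloc] init] fontDescriptorWithFamily:@"Times New Roman"];
UIFontDescriptor *bold = [plain fontDescriptorWithSymbolicTraits:UIFontDescriptorTraitBold];
// fontDescriptorWithSymbolicTraits: shouldn't return nil, but on affected
// versions it can; fall back to the plain descriptor rather than pass nil on.
UIFont *font = [UIFont fontWithDescriptor:(bold ?: plain) size:12];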
Whether or not the resulting UIFontDescriptor would actually resolve to a UIFont is a separate question. You should file a radar.
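In the meantime, one possible workaround (a sketch, not an officially documented fix) is to skip fontDescriptorWithSymbolicTraits: entirely and put the bold trait into the descriptor's attribute dictionary up front, via fontDescriptorWithFontAttributes: and the UIFontDescriptorTraitsAttribute key:
UIFontDescriptor *boldDesc = [UIFontDescriptor fontDescriptorWithFontAttributes:@{
    UIFontDescriptorFamilyAttribute: @"Times New Roman",
    UIFontDescriptorTraitsAttribute: @{
        UIFontSymbolicTrait: @(UIFontDescriptorTraitBold)
    }
}];
// The trait is part of the descriptor from the start, so the nil-returning
// trait-matching call is never involved.
UIFont *boldFont = [UIFont fontWithDescriptor:boldDesc size:12];
NSLog(@"C: Font name: %@", [boldFont fontName]);
Since the family/trait matching then happens inside fontWithDescriptor:size:, it's worth verifying the resolved font name on the OS versions you target.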