Unicode Character from Code in String (Obj-C)

I have a Unicode hex value in an NSString – how do I output the corresponding character? Here’s what I have:

NSLog(@"\U0001D000");

NSMutableString *hexString = [[NSMutableString alloc] initWithString:@"0001D000"];
[hexString insertString:@"\\U" atIndex:0];
NSLog(@"%@", hexString);

The first NSLog outputs the character; the second just prints the literal text “\U0001D000”.

I’ve tried lots of combinations and am at a loss – for example, I tried:

    NSLog(@"\U%@", hexString);
    

But this gives a compiler error, since it expects hex digits directly after the \U in the source literal.

Answer:

If your character requires a surrogate pair (U+10000 to U+10FFFF), use CFStringGetSurrogatePairForLongCharacter to convert the Unicode code point into a UTF-16 surrogate pair, and then -initWithCharacters:length: to convert it into an NSString. For example:

    UniChar c[2];
    CFStringGetSurrogatePairForLongCharacter(0x1D000, c);
    NSString *s = [[NSString alloc] initWithCharacters:c length:2];
    

For characters in the Basic Multilingual Plane (where CFStringGetSurrogatePairForLongCharacter returns false), you can skip the conversion and pass the single UniChar straight to -initWithCharacters:length: with a length of 1.
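
Tying this back to the question, here is a minimal sketch that parses the hex digits out of hexString at runtime and handles both cases. It assumes hexString holds only the bare hex digits – the “\\U” prefix the question prepends would make scanHexInt: fail, so leave it off:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSString *hexString = @"0001D000";

            // Parse the hex digits into a numeric code point at runtime.
            unsigned int codePoint = 0;
            [[NSScanner scannerWithString:hexString] scanHexInt:&codePoint];

            UniChar c[2];
            NSString *s;
            if (CFStringGetSurrogatePairForLongCharacter(codePoint, c)) {
                // Non-BMP code point: encode it as a UTF-16 surrogate pair.
                s = [[NSString alloc] initWithCharacters:c length:2];
            } else {
                // BMP code point: a single UTF-16 code unit is enough.
                c[0] = (UniChar)codePoint;
                s = [[NSString alloc] initWithCharacters:c length:1];
            }
            NSLog(@"%@", s); // U+1D000 prints the Byzantine musical symbol
        }
        return 0;
    }

Note that scanHexInt: also tolerates a leading 0x or 0X in the string, so the same code works if the input carries that prefix instead.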