Extract the CGPathRef from a Kanji

I would like to get the outline of a Kanji (Japanese) character.
The following code is working for latin characters:
[letter drawInRect:brect withAttributes:attributes];
[...]
CGGlyph glyph;
glyph = [font glyphWithName: letter];
CGPathRef glyphPath = CTFontCreatePathForGlyph((__bridge CTFontRef) font, glyph, NULL);
CGPathAddPath(path0, &transform, glyphPath);
When letter is a Kanji, for example 男, the character is correctly drawn, but the CGPathRef is a square. What do I need to do to extract the outline of a Kanji?

The method glyphWithName: expects a glyph name, not the character. For simple Latin characters the glyph name is the same as the character, e.g. @"A". As far as I know, Kanji do not have glyph names, although hiragana and katakana do. Kanji are simply too numerous, and many of the glyphs are variants of the same Kanji.
So you have to use a different method with Kanji. Here is an example that works for me.
// Convert a single character to a bezier path
- (UIBezierPath *)bezierPathFromChar:(NSString *)aChar inFont:(CTFontRef)aFont {
    // Buffers
    unichar chars[1];
    CGGlyph glyphs[1];

    // Copy the character into a buffer
    chars[0] = [aChar characterAtIndex:0];

    // Encode the glyph for the single character into another buffer
    CTFontGetGlyphsForCharacters(aFont, chars, glyphs, 1);

    // Get the single glyph
    CGGlyph aGlyph = glyphs[0];

    // Find a reference to the Core Graphics path for the glyph
    CGPathRef glyphPath = CTFontCreatePathForGlyph(aFont, aGlyph, NULL);

    // Create a bezier path from the CG path
    UIBezierPath *glyphBezierPath = [UIBezierPath bezierPath];
    [glyphBezierPath moveToPoint:CGPointZero];
    [glyphBezierPath appendPath:[UIBezierPath bezierPathWithCGPath:glyphPath]];
    CGPathRelease(glyphPath);

    return glyphBezierPath;
}
Use like this:
NSString *theChar = @"男";
CTFontRef font = CTFontCreateWithName(CFSTR("HiraKakuProN-W6"), 114.0, NULL);
UIBezierPath *glyphBezierPath = [self bezierPathFromChar:theChar inFont:font];
Edit: here is another way of defining the font that can be localized:
CTFontRef font = CTFontCreateWithName((CFStringRef)@"Helvetica-Bold", 114.0, NULL);

Replace some characters in a string with the next unicode character

I have an input text as following:
inputtext = "This is a test";
I need to replace some of the characters (based on certain criteria) with the next Unicode character:
let i = 0;
for c in inputtext.chars() {
    if somecondition {
        // Replace char here
        inputtext.replace_range(i..i+1, newchar);
        // println!("{}", c);
    }
}
What is the best way to do this?
You can't easily update a string in-place because a Rust string is not just an array of characters, it's an array of bytes (in UTF-8 encoding), and different characters may use different numbers of bytes. For example, the character ߿ (U+07FF "Nko Taman Sign") uses two bytes, whereas the next Unicode character ࠀ (U+0800 "Samaritan Letter Alaf") uses three.
It's therefore simplest to turn the string into an iterator of characters (using .chars()), manipulate that iterator as appropriate, and then construct a new string using .collect().
For example:
let old = "abcdef";
let new = old.chars()
    // note: there's an edge case if ch == char::MAX which we must decide
    // how to handle. In this case I chose not to change the
    // character, but this may be different from what you need.
    .map(|ch| {
        if somecondition {
            char::from_u32(ch as u32 + 1).unwrap_or(ch)
        } else {
            ch
        }
    })
    .collect::<String>();

HTML5 Canvas: Rendering aliased text

When you render text on an HTML5 canvas (using the fillText command, for example), the text is rendered anti-aliased, which makes it look smoother. The downside is that anti-aliasing becomes very noticeable with small text or with fonts that are meant to be aliased (such as Terminal). Because of this, I want to render text aliased rather than anti-aliased.
Is there any way to do so?
Unfortunately, there is no native way to turn off anti-aliasing for text.
The solution is to use the old-school approach of bitmap fonts: in the case of HTML5 canvas, a sprite-sheet from which you copy each bitmap letter to the canvas. By using a sprite-sheet with a transparent background you can easily change its color/gradient etc. as well.
An example of such bitmap:
For it to work you need to know which characters the bitmap contains (the "map"), the width and height of each character cell, and the width of the font bitmap.
Note: in most cases you'll probably end up with a mono-spaced font where all cells have the same size. You can use a proportional font, but then you need to map each character to an absolute position and include the width and height of its cell as well.
An example with comments:
const ctx = c.getContext("2d"), font = new Image;

font.onload = () => {
  // define some meta-data
  const charWidth = 12;                          // character cell, in pixels
  const charHeight = 16;
  const sheetWidth = (font.width / charWidth)|0; // width, in characters, of the image itself

  // map so we can use the index of a char. to calc. its position in the bitmap
  const charMap = " !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~§";

  // Draw some demo text
  const timeStart = performance.now();
  fillBitmapText(font, "Demo text using bitmap font!", 20, 20);
  fillBitmapText(font, "This is line 2...", 20, 45);
  const timeEnd = performance.now();
  console.log("Text above rendered in", timeEnd - timeStart, "ms");

  // main example function
  function fillBitmapText(font, text, x, y) {
    // always make sure x and y are integer positions
    x = x|0;
    y = y|0;

    // current x position
    let cx = x;

    // now, iterate over the text per char.
    for(let char of text) {
      // get index in map:
      const i = charMap.indexOf(char);
      if (i >= 0) { // valid char
        // use index to calculate position in bitmap:
        const bx = (i % sheetWidth) * charWidth;
        const by = ((i / sheetWidth)|0) * charHeight;
        // draw the character on the canvas
        ctx.drawImage(font,
          // position and size in the font bitmap
          bx, by, charWidth, charHeight,
          // position on the canvas, same size
          cx, y, charWidth, charHeight);
      }
      cx += charWidth; // increment current canvas x position
    }
  }
};
font.src = "//i.stack.imgur.com/GeawH.png";
body {background:#fff}
<canvas id=c width=640></canvas>
This should produce an output similar to this:
You can modify this to suit your needs. Notice that the bitmap used here is not transparent - I'll leave that to OP.

How to include emoticons in Swift string?

Here is a pretty good article that references iOS emoticons and their code. For example \ue008 for the small camera.
I tried this in my code :
var myText: String = "\ue008"
This is not accepted by Xcode. How can I include it?
If I understand what you are trying to achieve, then:
Press "ctrl + cmd + space" while in XCode. A sample usage of 'hearts' emoticon
cell.textLabel?.text = "❤️" + " \(liker) liked \(userBeingliked)'s photo"
This is from the Swift documentation:
let dollarSign = "\u{24}" // $, Unicode scalar U+0024
let blackHeart = "\u{2665}" // ♥, Unicode scalar U+2665
let sparklingHeart = "\u{1F496}" // 💖, Unicode scalar U+1F496
You don't need the unicode constants at all. Just use the character viewer and type the character directly. 😝
let sparklingHeart = "💖"
1. Decoding the Unicode:
extension String {
    var decodeEmoji: String {
        let data = self.data(using: .utf8)
        let decodedStr = NSString(data: data!, encoding: String.Encoding.nonLossyASCII.rawValue)
        if let str = decodedStr {
            return str as String
        }
        return self
    }
}
Usage
let decodedString = yourString.decodeEmoji
2. Encoding the Unicode:
extension String {
    var encodeEmoji: String {
        if let encodeStr = NSString(cString: self.cString(using: .nonLossyASCII)!, encoding: String.Encoding.utf8.rawValue) {
            return encodeStr as String
        }
        return self
    }
}
Usage
let encodedString = yourString.encodeEmoji
You could insert the emoji directly using ⌘ ^ Space.
Or, based on Greg's answer:
var myText: String = "\u{e008}"
As Greg posted above, you can directly input the emoji into Swift using the macOS character viewer. The character viewer is disabled by default; here is how to enable it:
Go to System Preferences > Language and Region > Keyboard Preferences > Keyboard, then check Show Keyboard, Emoji, & Symbol Viewers in menu bar. Once checked, you can open the character viewer from the top right menu bar, next to your Wi-Fi and date/time icons.
From your hex value (e.g. 0x1F52D) to the actual emoji:
let c = 0x1F602
The next step is getting a UInt32 from your hex value:
let intEmoji = UnicodeScalar(UInt32(c))!.value
From this you can do something like:
titleLabel.text = String(UnicodeScalar(intEmoji)!)
Here you have a "😂".
It works with ranges of hexadecimal values too:
let emojiRanges = [
    0x1F600...0x1F636,
    0x1F645...0x1F64F,
    0x1F910...0x1F91F,
    0x1F30D...0x1F52D
]

var data = [UInt32]()
for range in emojiRanges {
    for i in range {
        let c = UnicodeScalar(UInt32(i))!.value
        data.append(c)
    }
}
This gets you multiple UInt32 values from your hex ranges, for example.
Chris Slowik's and Greg's answers are close.
The easiest answer is just to "rephrase" your String from this:
var myText: String = "\ue008"
To this:
var myText: String = "\u{e008}"
The Unicode values found in the link you've attached are not wrong, as someone else claimed. You just need to rephrase them inside the String.
The important piece in the example above is the "e008" part.
I've created a simple function to convert these kinds of Unicode values to their corresponding emoji:
func convertHexToEmoji(_ u: Int) -> String {
    return "\(UnicodeScalar(UInt32(u))!)"
}
To use:
let myText = convertHexToEmoji(0xE008)
print(myText)
This took me a bit of time to figure out on macOS 11, so I thought I would share.
If you prefer to input the unicode characters rather than pasting literal emojis, you can find out the unicode for the system emojis like this:
Focus/click into a text field (e.g. the search bar in your web browser).
Press ctrl+cmd+space or go to Edit->Emoji & Symbols in the menu bar.
Scroll up in the character viewer until you see the window expand icon in the upper right:
In the expanded Character Viewer window, press the upper left button and select Customize List....
Scroll down to the Code Tables minimized list, expand the list, toggle on Unicode, and press Done.
Now, click the different emojis and you should see the unicode underneath the image.
Then you inject the unicode like this:
var myText: String = "\u{e008}"

How to remove accents / diacritic marks from a string in Qt?

How to remove diacritic marks from a string in Qt. For example, this:
QString test = QString::fromUtf8("éçàÖœ");
qDebug() << StringUtil::removeAccents(test);
should output:
ecaOoe
There is no straightforward, built-in solution in Qt. A simple solution, which should work in most cases, is to loop through the string and replace each character with its equivalent:
QString StringUtil::diacriticLetters_;
QStringList StringUtil::noDiacriticLetters_;

QString StringUtil::removeAccents(QString s) {
    if (diacriticLetters_.isEmpty()) {
        diacriticLetters_ = QString::fromUtf8("ŠŒŽšœžŸ¥µÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýÿ");
        noDiacriticLetters_ << "S"<<"OE"<<"Z"<<"s"<<"oe"<<"z"<<"Y"<<"Y"<<"u"<<"A"<<"A"<<"A"<<"A"<<"A"<<"A"<<"AE"<<"C"<<"E"<<"E"<<"E"<<"E"<<"I"<<"I"<<"I"<<"I"<<"D"<<"N"<<"O"<<"O"<<"O"<<"O"<<"O"<<"O"<<"U"<<"U"<<"U"<<"U"<<"Y"<<"s"<<"a"<<"a"<<"a"<<"a"<<"a"<<"a"<<"ae"<<"c"<<"e"<<"e"<<"e"<<"e"<<"i"<<"i"<<"i"<<"i"<<"o"<<"n"<<"o"<<"o"<<"o"<<"o"<<"o"<<"o"<<"u"<<"u"<<"u"<<"u"<<"y"<<"y";
    }

    QString output = "";
    for (int i = 0; i < s.length(); i++) {
        QChar c = s[i];
        int dIndex = diacriticLetters_.indexOf(c);
        if (dIndex < 0) {
            output.append(c);
        } else {
            QString replacement = noDiacriticLetters_[dIndex];
            output.append(replacement);
        }
    }

    return output;
}
Note that noDiacriticLetters_ needs to be a QStringList, since some source characters map to more than one character. For example, œ => oe.
Your question is a bit misleading. You seem to want to do more than merely remove diacritical marks (œ is a ligature letter without diacritics). I guess you want to turn any Unicode string into a roughly corresponding ASCII string?
For diacritics, you could perform a decomposing Unicode normalization (NFD or NFKD, depending on your specific needs) and then remove all characters of the "Mark" categories (QChar::Mark_NonSpacing, QChar::Mark_SpacingCombining and QChar::Mark_Enclosing).
For everything else (e.g. œ), I don't know of a generic solution. Create a look-up table with all your desired replacements and then search and replace (see Laurent's answer).
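For illustration, here is a minimal sketch of the decompose-and-strip approach described above, using NFKD (removeDiacritics is a hypothetical helper name; as noted, it leaves ligatures such as œ alone):
#include <QString>
#include <QChar>

// Hypothetical helper: decompose the string, then drop the combining marks.
QString removeDiacritics(const QString &input)
{
    // NFKD splits e.g. "é" into "e" followed by U+0301 COMBINING ACUTE ACCENT.
    const QString decomposed = input.normalized(QString::NormalizationForm_KD);
    QString output;
    output.reserve(decomposed.size());
    for (const QChar &c : decomposed) {
        switch (c.category()) {
        case QChar::Mark_NonSpacing:
        case QChar::Mark_SpacingCombining:
        case QChar::Mark_Enclosing:
            break; // skip the marks left over from the decomposition
        default:
            output.append(c);
        }
    }
    return output;
}
With this, removeDiacritics(QString::fromUtf8("éçàÖ")) should return "ecaO", while "œ" passes through unchanged.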
A partial solution is to use QString::normalized, then remove the special characters.
QString test = QString::fromUtf8("éçàÖœ");
QString stringNormalized = test.normalized (QString::NormalizationForm_KD);
stringNormalized.remove(QRegExp("[^a-zA-Z\\s]"));
This is however a partial solution because it will not convert "œ" into "oe".
There is a crude way to partially solve your problem (it handles accents, but not ligatures such as "œ"):
QString title=QString::fromUtf8("éçàÖ");
qDebug("%s\n", title.toLocal8Bit().data());

Win32 Text Drawing Puzzle

I've got a little text drawing puzzle under Win32. I'm trying to draw some instructions for users of my application at the top of the window.
Please refer to the following window (I've changed the background color on the text so you can see the boundaries)
(source: billy-oneal.com)
I'm currently using DrawTextEx to draw the text to my window, but the problem is that it does not fill the entire RECTangle that I give it. Not drawing that area is just fine, until the window resizes:
(source: billy-oneal.com)
When the text is rewrapped as the window resizes, these artifacts are left over, because DrawTextEx doesn't clear its background.
I tried using FillRect to fill in the area behind the text drawing call, which does eliminate the visual artifacts, but then causes the text to flicker constantly, as it is completely erased and then completely redrawn to the display.
Any ideas on how one might get the area not containing text to be drawn with the background color?
EDIT: I'd like to avoid having to double buffer the form if at all possible.
EDIT2: I solved the problem by only redrawing the text when I detect that the wrapping changes during a resize.
Use double buffering?
Draw everything to a bitmap and draw the bitmap to the window. Flickering is commonly a double buffering issue.
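For reference, a minimal sketch of what that can look like in a WM_PAINT handler (the instruction text and layout here are placeholders; error handling omitted):
#include <windows.h>

// Render everything into an off-screen bitmap, then blit the finished frame
// to the window in a single call.
void PaintDoubleBuffered(HWND hWnd)
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hWnd, &ps);

    RECT client;
    GetClientRect(hWnd, &client);

    // Off-screen surface matching the client area.
    HDC memDC = CreateCompatibleDC(hdc);
    HBITMAP bmp = CreateCompatibleBitmap(hdc, client.right, client.bottom);
    HBITMAP oldBmp = (HBITMAP)SelectObject(memDC, bmp);

    // Draw the background and the text off-screen; nothing reaches the
    // screen yet, so nothing can flicker.
    FillRect(memDC, &client, (HBRUSH)(COLOR_WINDOW + 1));
    wchar_t instructions[] = L"Sample instruction text...";
    RECT textRect = client;
    DrawTextEx(memDC, instructions, -1, &textRect, DT_WORDBREAK, NULL);

    // Copy the finished frame to the window in one blit.
    BitBlt(hdc, 0, 0, client.right, client.bottom, memDC, 0, 0, SRCCOPY);

    SelectObject(memDC, oldBmp);
    DeleteObject(bmp);
    DeleteDC(memDC);
    EndPaint(hWnd, &ps);
}
To get rid of the remaining flicker you would typically also suppress the default background erase, e.g. by returning nonzero from WM_ERASEBKGND.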
There are many possible solutions, and without seeing your code it's hard to tell which method would be best, so I'd suggest taking a look at this article on flicker-free drawing.
SetBkMode + SetBkColor ?
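To expand on that suggestion: with the background mode set to OPAQUE, GDI fills each character cell with the current background color as it draws, so the pixels behind the glyphs are overwritten without a separate FillRect. A sketch under that assumption (the function and parameter names are placeholders):
// Draw the text with an opaque per-glyph background.
void DrawInstructionsOpaque(HDC hdc, wchar_t *text, RECT &rect, COLORREF bannerColor)
{
    int oldMode = SetBkMode(hdc, OPAQUE);
    COLORREF oldColor = SetBkColor(hdc, bannerColor);

    // Each character cell is filled with bannerColor as it is drawn.
    DrawTextEx(hdc, text, -1, &rect, DT_WORDBREAK, NULL);

    SetBkColor(hdc, oldColor);
    SetBkMode(hdc, oldMode);
}
Note this only repaints the cells the text actually occupies; lines freed up by rewrapping still need to be filled separately.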
Well, since nobody seems to know what to do about it, I implemented it this way:
std::vector<std::wstring> wrapString(HDC hDC, const std::wstring& text, const RECT& targetRect, HFONT font)
{
    std::vector<std::wstring> result;
    RECT targetRectangle;
    CopyRect(&targetRectangle, &targetRect);

    //Calculate the width of the bounding rectangle.
    int maxWidth = targetRectangle.right - targetRectangle.left;

    //Build the lines one at a time
    std::wstring currentLine;
    for(std::wstring::const_iterator it = text.begin(); it != text.end(); currentLine.push_back(*it), it++)
    {
        if(*it == L'\r' || *it == L'\n')
        {   //Hard return
            while(it != text.end() && (*it == L'\r' || *it == L'\n')) it++;
            result.push_back(currentLine);
            currentLine.clear();
            if (it == text.end())
                break; //Don't dereference end() in the loop increment.
        }
        else
        {   //Check for soft return
            SIZE sizeStruct;
            GetTextExtentPoint32(hDC, currentLine.c_str(), static_cast<int>(currentLine.length()), &sizeStruct);
            if (sizeStruct.cx > maxWidth)
            {
                std::wstring::size_type lineLength = currentLine.find_last_of(L' ');
                if (lineLength == currentLine.npos)
                {   //Word is longer than a line.
                    for(; it != text.end() && !iswspace(*it); it++) currentLine.push_back(*it);
                }
                else
                {   //Clip word to line.
                    //Backtrack our scan of the source text.
                    it -= currentLine.length() - lineLength - 1;
                    //Remove the clipped word
                    currentLine.erase(lineLength);
                }
                result.push_back(currentLine);
                currentLine.clear();
                if (it == text.end())
                    break; //Same guard as above.
            }
        }
    }

    //Last remaining text.
    result.push_back(currentLine);
    return result;
}

void DrawInstructionsWithFilledBackground(HDC hDC, const std::wstring& text, RECT& targetRectangle, HFONT font, COLORREF backgroundColor)
{
    //Set up our background color.
    int dcIdx = SaveDC(hDC);
    HBRUSH backgroundBrush = CreateSolidBrush(backgroundColor);
    SelectObject(hDC, backgroundBrush);
    SelectObject(hDC, font);
    SetBkColor(hDC, backgroundColor);

    std::vector<std::wstring> lines(wrapString(hDC, text, targetRectangle, font));
    for(std::vector<std::wstring>::const_iterator it = lines.begin(); it != lines.end(); it++)
    {
        //Measure the line, then fill the area to its right.
        RECT backgroundRect = targetRectangle;
        DrawText(hDC, const_cast<LPWSTR>(it->c_str()), static_cast<int>(it->length()), &backgroundRect, DT_CALCRECT | DT_NOCLIP | DT_SINGLELINE);
        backgroundRect.left = backgroundRect.right;
        backgroundRect.right = targetRectangle.right;
        if (backgroundRect.right >= backgroundRect.left)
            FillRect(hDC, &backgroundRect, backgroundBrush);
        //Draw the line itself with an opaque background.
        ExtTextOut(hDC, targetRectangle.left, targetRectangle.top, ETO_OPAQUE, NULL, it->c_str(), static_cast<UINT>(it->length()), NULL);
        targetRectangle.top += backgroundRect.bottom - backgroundRect.top;
    }
    //Remember the wrapping so we can skip redrawing when it hasn't changed (see EDIT2).
    instructionsWrap = lines;

    //Restore the DC to its former glory.
    RestoreDC(hDC, dcIdx);
    DeleteObject(backgroundBrush);
}
Get/calculate the rect used by the DrawText call and clip it with something like ExcludeClipRect before calling FillRect.
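A sketch of that idea, assuming hdc, text, targetRect, backgroundBrush and backgroundColor come from the surrounding paint code:
// Measure where the text will go, without drawing anything yet.
RECT textRect = targetRect;
DrawTextEx(hdc, text, -1, &textRect, DT_WORDBREAK | DT_CALCRECT, NULL);

int saved = SaveDC(hdc);
// Remove the measured text area from the clipping region so the fill
// below cannot touch it - and therefore cannot make it flicker.
ExcludeClipRect(hdc, textRect.left, textRect.top, textRect.right, textRect.bottom);
FillRect(hdc, &targetRect, backgroundBrush);
RestoreDC(hdc, saved);

// Now draw the text opaquely so it repaints its own cells.
SetBkMode(hdc, OPAQUE);
SetBkColor(hdc, backgroundColor);
DrawTextEx(hdc, text, -1, &targetRect, DT_WORDBREAK, NULL);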
