I am creating a UI engine and am tackling font rendering. I have just started and am trying to use FreeType to load and render .ttf files. This is the code I have for loading the .ttf and creating a Direct2D Bitmap:
FT_Error Error = FT_Init_FreeType(&Library);
assert(!Error);
FT_Face Face;
Error = FT_New_Face(Library, "Roboto.ttf", 0, &Face);
assert(!Error);
Error = FT_Set_Char_Size(Face, 0, 16 * 64, 96, 96);
assert(!Error);
std::cout << "Loading Font " << Face->family_name << "\n\tNum Glyphs: " << Face->num_glyphs << std::endl;
uint32 Index = FT_Get_Char_Index(Face, TEXT('L'));
std::cout << "\tCharacter 'L' is at index " << Index << "\n";
Error = FT_Load_Glyph(Face, Index, 0);
assert(!Error);
Error = FT_Render_Glyph(Face->glyph, FT_RENDER_MODE_NORMAL);
assert(!Error);
for (unsigned int row = 0; row < Face->glyph->bitmap.rows; row++)
{
    for (unsigned int col = 0; col < Face->glyph->bitmap.width; col++)
    {
        // pitch is the number of bytes per bitmap row
        std::cout << (int)Face->glyph->bitmap.buffer[row * Face->glyph->bitmap.pitch + col] << ' ';
    }
    std::cout << '\n';
}
FT_Bitmap FTBitmap;
FT_Bitmap_New(&FTBitmap);
FT_Bitmap Src = Face->glyph->bitmap;
Error = FT_Bitmap_Convert(Library, &Src, &FTBitmap, 8);
assert(!Error);
auto render = static_cast<Direct2DRenderer*>(static_cast<WindowsWindow*>(Window)->GetRenderer());
D2D1_PIXEL_FORMAT format;
format.format = DXGI_FORMAT_B8G8R8A8_UNORM; // I have tried using R8_UNORM but it is not supported
format.alphaMode = D2D1_ALPHA_MODE_IGNORE;
render->GetRenderTarget()->CreateBitmap(D2D1::SizeU(FTBitmap.width, FTBitmap.pitch), FTBitmap.buffer, FTBitmap.pitch, D2D1::BitmapProperties(format), &Bitmap);
FT_Bitmap_Done(Library, &FTBitmap);
This seems to work... but when I render the Bitmap, the output is garbled: [screenshot of the corrupted glyph omitted]
I've tried many pixel formats for D2D, as well as FT_RENDER_MODE_LCD (three 8-bit channels) and FT_RENDER_MODE_NORMAL (one 8-bit channel), but either I don't fully understand this or I am doing something wrong from the start. I have also tried changing the FT_Set_Char_Size values, the D2D pixel format values, and the FT_Render_Glyph values, but nothing changes (I don't think it's supposed to anyway). Any help is appreciated.
(The Direct2DRenderer class is just a wrapper around an ID2D1HwndRenderTarget and renders the frame continuously. Also, the code I've posted is just me trying to get it to work, not shipping code; I know it's not pretty!)
With FT_RENDER_MODE_NORMAL, the FreeType glyph bitmap is essentially an opacity mask: one byte (a 0-255 coverage level) per pixel, like an alpha-only bitmap.
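You can verify this by inspecting the rendered bitmap's fields (a minimal check, assuming the glyph was rendered with FT_RENDER_MODE_NORMAL as in your code):
FT_Bitmap& bm = Face->glyph->bitmap;
assert(bm.pixel_mode == FT_PIXEL_MODE_GRAY); // one byte per pixel
assert(bm.num_grays == 256);                 // each byte is a 0-255 coverage/opacity value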
Direct2D's ID2D1HwndRenderTarget only supports a limited set of pixel formats; the most useful is DXGI_FORMAT_B8G8R8A8_UNORM + D2D1_ALPHA_MODE_PREMULTIPLIED, i.e. a 4-byte (BGRA) per-pixel format.
Here is how you can convert the glyph bitmap into a Direct2D ID2D1Bitmap (error checks omitted):
auto ftb = Face->glyph->bitmap;
// build a BGRA memory buffer (1 pixel => 4 bytes)
// from the glyph bitmap (1 pixel => 1 byte)
auto bits = new unsigned char[ftb.width * 4 * ftb.rows];
for (unsigned int row = 0; row < ftb.rows; row++)
{
    // destination stride is ftb.width pixels (4 bytes each)
    auto ptr = (unsigned int*)bits + row * ftb.width;
    for (unsigned int i = 0; i < ftb.width; i++)
    {
        // source stride is ftb.pitch bytes per row
        auto opacity = (unsigned int)ftb.buffer[row * ftb.pitch + i];
        *ptr = opacity << 24; // BGRA in memory: B = G = R = 0 (premultiplied black), A = glyph coverage
        ptr++;
    }
}
auto bmpSize = D2D1::SizeU(ftb.width, ftb.rows);
D2D1_BITMAP_PROPERTIES props{};
props.pixelFormat.format = DXGI_FORMAT_B8G8R8A8_UNORM;
props.pixelFormat.alphaMode = D2D1_ALPHA_MODE_PREMULTIPLIED;
ID2D1Bitmap* bmp;
m_pRenderTarget->CreateBitmap(bmpSize, (const void*)bits, ftb.width * 4, props, &bmp);
delete[] bits;
...
// do some work with bitmap ...
...
bmp->Release();
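For the "do some work with bitmap" part, inside BeginDraw()/EndDraw() you can position the bitmap using the glyph's bearing metrics. A rough sketch, assuming a hypothetical pen position (penX, penY) on the text baseline and the same m_pRenderTarget, ftb and bmp as above:
float penX = 100.0f, penY = 100.0f; // placeholder baseline origin, in pixels
float left = penX + Face->glyph->bitmap_left;  // horizontal bearing
float top = penY - Face->glyph->bitmap_top;    // bitmap_top is measured up from the baseline
D2D1_RECT_F dest = D2D1::RectF(left, top, left + ftb.width, top + ftb.rows);
m_pRenderTarget->DrawBitmap(bmp, dest);
penX += Face->glyph->advance.x / 64.0f; // advance is in 26.6 fixed point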