I'm trying to create an SDL texture from Perlin noise. I have created an array of uint8_t
and filled it with the noise values multiplied by 255. When I print the values, they look fine:
127, 137, 134, 136, 127, 127, 127, 118, 120, 117, 137, 148, 144,
146, 137, 137, 137, 128, 130, 127, 134, 144, 141, 143, 134, 134,
134, 125, 127, 124, 136, 146, 143, 145, 136, 136, 136, 127, 129,
126, 127, 137, 134, 136, 127, 127, 127, 118, 120, 117, 127, 137,
134, 136, 127, 127, 127, 118, 120, 117, 127, 137, 134, 136, 127,
127, 127, 118, 120, 117, 118, 128, 125, 127, 118, 118, 118, 109,
111, 108, 120, 130, 127, 129, 120, 120, 120, 111, 113, 110, 117,
127, 124, 126, 117, 117, 117, 108, 110, 106
(a 10x10 block of Perlin noise)
After that I call SDL's CreateSurfaceFrom
with (I think) valid width, height and pitch, and SDL_PIXELFORMAT_INDEX8 as the pixel format.
Then I create a texture from this surface and try to render it. Unfortunately, the output is not what I want: a mostly black or white screen.
I've also tried the following pixel formats; only the last one produced any output at all, and even that was not correct:
SDL_PIXELFORMAT_UNKNOWN
SDL_PIXELFORMAT_INDEX4LSB
SDL_PIXELFORMAT_INDEX4MSB
SDL_PIXELFORMAT_INDEX1LSB
SDL_PIXELFORMAT_INDEX1MSB
And here is the output: [screenshot: SDL_PIXELFORMAT_INDEX1MSB output]
Also here is the code:
uint8_t sdlPixels[_perlinWidth*_perlinHeight];
for (int i = 0; i < _perlinWidth*_perlinHeight; i++) {
    sdlPixels[i] = static_cast<uint8_t>(_pixels[i] * 255);
    std::cout << (int)sdlPixels[i] << ", ";
}
SDL_Surface* surface = Renderer::GetInstance()->CreateSurfaceFrom(
    sdlPixels, _perlinWidth, _perlinHeight, _perlinWidth, SDL_PIXELFORMAT_INDEX8);
_perlinTexture = Renderer::GetInstance()->CreateTextureFromSurface(surface);
delete[] _pixels; _pixels = nullptr; // free the raw noise data
And here are the Renderer functions (thin wrappers around the SDL functions):
SDL_Surface* Renderer::CreateSurfaceFrom(void* pixels, int width, int height,
                                         int pitch, Uint32 format) {
    return SDL_CreateSurfaceFrom(pixels, width, height, pitch, format);
}

SDL_Texture* Renderer::CreateTextureFromSurface(SDL_Surface* surface) {
    SDL_Texture* out = SDL_CreateTextureFromSurface(_renderer, surface);
    SDL_DestroySurface(surface); // free the surface (we already have a texture)
    return out;
}
Is my pixel data invalid for the format I pass as an argument, or have I missed something while creating the texture from the SDL_Surface in SDL3?
The pixel format you want is SDL_PIXELFORMAT_INDEX8
. This means that your input sdlPixels
points to an array of 8-bit indices into a palette.
In order to create a texture from that, the palette needs to contain meaningful values; for grayscale noise, that's a 256-entry gray ramp.
You can set one up with SDL_SetPaletteColors
and SDL_SetSurfacePalette
as follows:
SDL_Palette* pal = SDL_CreatePalette(256 /* colors */);
std::vector<SDL_Color> colors(256);
for (int i = 0; i < 256; i++) {
    // grayscale ramp; the casts avoid a narrowing error (int -> Uint8) in the brace init
    colors[i] = { Uint8(i), Uint8(i), Uint8(i), 0xFF };
}
SDL_SetPaletteColors(pal, colors.data(), 0, 256);

SDL_Surface* surface = Renderer::GetInstance()->CreateSurfaceFrom(
    sdlPixels, _perlinWidth, _perlinHeight, _perlinWidth, SDL_PIXELFORMAT_INDEX8);
SDL_SetSurfacePalette(surface, pal); // <-- use the palette
SDL_DestroyPalette(pal);             // the surface keeps its own reference
_perlinTexture = Renderer::GetInstance()->CreateTextureFromSurface(surface);
Although, from reading the SDL code, this does modify the pixel format's palette ... 🤔