Tags: opengl, glsl, devil

Texture loading with DevIL, equivalent code to texture loading with Qt?


I am working with OpenGL and GLSL in Visual Studio C++ 2010, writing shaders, and I need to load a texture. The book I am reading loads textures with Qt, but I need to do it with DevIL. Can someone please write the equivalent texture-loading code for DevIL? I am new to DevIL and don't know how to translate this.

// Load texture file
const char * texName = "texture/brick1.jpg";
QImage timg = QGLWidget::convertToGLFormat(QImage(texName,"JPG"));

// Copy file to OpenGL
glActiveTexture(GL_TEXTURE0);
GLuint tid;
glGenTextures(1, &tid);
glBindTexture(GL_TEXTURE_2D, tid);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, timg.width(), timg.height(), 0,
             GL_RGBA, GL_UNSIGNED_BYTE, timg.bits());
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

Solution

  • Given that DevIL is no longer maintained, and that the ILUT part assumes power-of-two texture dimensions (its convenience functions rescale images accordingly), it actually makes sense to take the detour of doing it manually.
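    As an aside, the power-of-two restriction mentioned above is easy to test for; a minimal sketch, with a helper name of my own choosing:

```cpp
#include <cassert>

// True when x is a non-zero power of two (1, 2, 4, 8, ...).
// Legacy OpenGL (pre-2.0, without ARB_texture_non_power_of_two)
// and ILUT's convenience path expect dimensions satisfying this.
inline bool isPowerOfTwo(unsigned int x)
{
    return x != 0 && (x & (x - 1)) == 0;
}
```

    Loading manually, as below, avoids any silent rescaling when this does not hold.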

    Loading an image from a file with DevIL works much like loading a texture from an image in OpenGL: first you create a DevIL image name and bind it

    GLuint loadImageToTexture(char const * const thefilename)
    {
    
        ILuint imageID;
        ilGenImages(1, &imageID);
        ilBindImage(imageID);
    

    now you can load an image from a file

    ilLoadImage(thefilename);
    

    check that the image actually contains data; if not, clean up

    ILubyte * const data = ilGetData();
    if(!data) {
        ilBindImage(0);
        ilDeleteImages(1, &imageID);
        return 0;
    }
    

    retrieve the important parameters

    int const width  = ilGetInteger(IL_IMAGE_WIDTH);
    int const height = ilGetInteger(IL_IMAGE_HEIGHT);
    int const type   = ilGetInteger(IL_IMAGE_TYPE); // matches OpenGL
    int const format = ilGetInteger(IL_IMAGE_FORMAT); // matches OpenGL
    

    Generate a texture name

    GLuint textureID;
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    

    next we set the pixel store parameters (your original code missed that crucial step)

    glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_FALSE);
    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0); // rows are tightly packed
    glPixelStorei(GL_UNPACK_SKIP_PIXELS, 0);
    glPixelStorei(GL_UNPACK_SKIP_ROWS, 0);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // pixels are tightly packed
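    The effect of GL_UNPACK_ALIGNMENT can be made concrete with a small helper (the name is mine) that computes the byte stride OpenGL assumes per image row; with alignment 1 the stride equals the raw row size, which is why the setting above makes tightly packed data safe to upload:

```cpp
#include <cassert>

// Row stride in bytes that glTexImage2D assumes when reading
// client memory under GL_UNPACK_ALIGNMENT = alignment:
// each row is padded up to the next multiple of the alignment.
inline int unpackRowStride(int width, int bytesPerPixel, int alignment)
{
    int const raw = width * bytesPerPixel;
    return ((raw + alignment - 1) / alignment) * alignment;
}
```

    With the default alignment of 4, a 3-pixel-wide RGB row (9 bytes) would be read as 12 bytes, misinterpreting the image data.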
    

    finally we can upload the texture image and return the ID

        glTexImage2D(GL_TEXTURE_2D, 0, format, width, height, 0, format, type, data);
    

    next, for convenience, we set the minification filter to GL_LINEAR, so that we don't have to supply mipmap levels (the default minification filter expects mipmaps and would leave the texture incomplete without them).

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    

    finally release the DevIL image, whose contents have now been copied into the OpenGL texture, and return the textureID

        ilBindImage(0);
        ilDeleteImages(1, &imageID);

        return textureID;
    }
    

    If you want to use mipmapping you can call the OpenGL glGenerateMipmap after the upload; use glTexParameteri with GL_TEXTURE_BASE_LEVEL and GL_TEXTURE_MAX_LEVEL to control the span of the image pyramid generated. Also note that ilInit() must be called once at program start, before any other DevIL function.
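    For reference, a complete pyramid for a width × height base image has floor(log2(max(width, height))) + 1 levels; a quick sketch for computing that count (helper name mine), useful when choosing a value for GL_TEXTURE_MAX_LEVEL:

```cpp
#include <cassert>

// Number of mipmap levels in a complete pyramid for a
// width x height base image: floor(log2(max(w, h))) + 1.
inline int mipLevelCount(unsigned int width, unsigned int height)
{
    unsigned int size = width > height ? width : height;
    int levels = 1;           // level 0 is the base image
    while (size > 1) {        // halve until we reach the 1x1 level
        size /= 2;
        ++levels;
    }
    return levels;
}
```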