During the rendering process, the texture files have a gamma of 2.2. But plain colors do not need this gamma correction, especially if the RGB values come from a spectral characterisation.
So why do I have to apply a gamma of 2.2 to my texture files? Why does the CG world use 2.2 gamma images? Why don't we use a gamma of 1 for images with real colors?
Most hardware and drivers assume that you're working with a gamma of 2.2 (or sRGB, which is very similar). So if you try to display a raw file with linear RGB info (i.e., a gamma of 1.0), the displayed image will come out too dark, because the display applies its own decoding gamma on top.
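To make that concrete, here is a minimal Python sketch, using a pure 2.2 power as an approximation of the real sRGB curve (the exact function and the mid-grey value are just illustrative):

```python
DISPLAY_GAMMA = 2.2

def display_output(stored):
    """Approximate light emitted by an sRGB-like monitor for a stored value in [0, 1]."""
    return stored ** DISPLAY_GAMMA

linear_grey = 0.5                                  # intended linear-light mid-grey
print(display_output(linear_grey))                 # ~0.22 -> shown far too dark

encoded_grey = linear_grey ** (1 / DISPLAY_GAMMA)  # ~0.73 after gamma encoding
print(display_output(encoded_grey))                # ~0.50 -> displayed as intended
```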
3D graphics hardware is designed for game developers more than for graphics researchers, so it expects the inputs (for things like texture buffers) to be assets encoded with a 2.2 gamma, which gives more uniform steps in perceptual intensity for the same number of bits than a linear encoding would.
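You can check the bit-allocation argument with a few lines of Python; the 10% threshold and the pure 2.2 power are arbitrary illustrative choices:

```python
GAMMA = 2.2

# Count how many of the 256 8-bit codes decode to the darkest 10% of
# linear intensity, where the eye is most sensitive to banding.
linear_codes = sum(1 for c in range(256) if c / 255 < 0.1)
gamma_codes  = sum(1 for c in range(256) if (c / 255) ** GAMMA < 0.1)

print(linear_codes)  # 26  codes for the shadows with linear storage
print(gamma_codes)   # 90  codes for the same range with gamma-2.2 storage
```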
If you control the entire rendering pipeline, then you can certainly work with linear RGB (or other linear spectral sampling) and apply a gamma just at the end, when you send the result to a display or a file.
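As a rough sketch of that workflow (single-channel values and a made-up `shade()` function, just to show where the conversions sit):

```python
GAMMA = 2.2

def texel_to_linear(stored):
    """Decode a gamma-encoded texel to linear light."""
    return stored ** GAMMA

def linear_to_display(lit):
    """Encode a linear-light result for an sRGB-like display."""
    return lit ** (1.0 / GAMMA)

def shade(albedo, light_intensity):
    """Placeholder lighting: simple diffuse-style scaling, clamped to [0, 1]."""
    return min(albedo * light_intensity, 1.0)

stored_texel = 0.5                        # value read from a gamma-encoded texture
albedo = texel_to_linear(stored_texel)    # ~0.22 in linear light
lit = shade(albedo, light_intensity=1.5)  # all lighting math stays linear
result = linear_to_display(lit)           # gamma applied once, at the very end
print(result)                             # ~0.60, ready for the display or an sRGB file
```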