AFAIK HDR images in PNG are saved as integer values with 16-bit depth. But in the Android SDK I can read an HDR image with the configuration Config.RGBA_F16
, which has a very interesting description:
Each pixel is stored on 8 bytes. Each channel (RGB and alpha for translucency) is stored as a half-precision floating point value. This configuration is particularly suited for wide-gamut and HDR content.
Is a single channel of a pixel (e.g. red) stored in 8 bits, as the description seems to say, or in 16 bits, as the name RGBA_F16
may suggest?
And the second question: how do I get all 4 channels of a single pixel? E.g. where is the top-left pixel stored: at indices 0, 1, 2, 3 (channels interleaved)
or at indices 0, 1*w*h, 2*w*h, 3*w*h (channels in separate planes)?
Regarding the number of bits, you are mixing up bits and bytes. Each pixel is stored in 8 bytes, using a 16-bit format for each channel. There is no contradiction here: each pixel has 4 channels, stored in 16 bits each, so each pixel takes up 4 * 16 = 64 bits of memory, which is the same as 8 bytes.
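As a quick arithmetic sketch (the 1920x1080 dimensions are just example numbers; the total should match what Bitmap.getByteCount() returns for an RGBA_F16 bitmap of that size):

```java
public class F16Size {
    public static void main(String[] args) {
        int channels = 4;           // R, G, B, A
        int bitsPerChannel = 16;    // half-precision float
        int bytesPerPixel = channels * bitsPerChannel / 8;
        System.out.println(bytesPerPixel);                   // 8

        // Whole-bitmap size for an example 1920x1080 image
        int width = 1920, height = 1080;
        System.out.println(width * height * bytesPerPixel);  // 16588800
    }
}
```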
And for the second question, the channels of a pixel are stored as consecutive chunks of memory, as the description of Config.RGBA_F16
suggests:
long color = (A & 0xffff) << 48 | (B & 0xffff) << 32 | (G & 0xffff) << 16 | (R & 0xffff);
This can also be verified in the source code of Bitmap.getColor
. It means that the top-left pixel is stored at bytes with indices 0-7, in channel order R, G, B, A.
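The byte index of a given channel follows directly from that layout: channel c (0 = R, 1 = G, 2 = B, 3 = A) of pixel (x, y) starts at byte ((y * width + x) * 4 + c) * 2. A minimal pure-Java sketch of reading such a value; the halfToFloat helper is hand-rolled here so it runs outside Android (on a device you could use android.util.Half instead), and little-endian byte order is assumed, which is what Android hardware uses in practice:

```java
public class Rgba16Layout {
    // Byte offset of channel c (0=R, 1=G, 2=B, 3=A) of pixel (x, y)
    static int channelOffset(int x, int y, int width, int c) {
        return ((y * width + x) * 4 + c) * 2;
    }

    // Minimal IEEE 754 half -> float conversion
    static float halfToFloat(int h) {
        int sign = (h >>> 15) & 1;
        int exp = (h >>> 10) & 0x1F;
        int mant = h & 0x3FF;
        if (exp == 0) {   // zero / subnormal
            return (sign == 1 ? -1f : 1f) * mant * (float) Math.pow(2, -24);
        }
        if (exp == 31) {  // infinity / NaN
            return mant == 0
                ? (sign == 1 ? Float.NEGATIVE_INFINITY : Float.POSITIVE_INFINITY)
                : Float.NaN;
        }
        int bits = (sign << 31) | ((exp - 15 + 127) << 23) | (mant << 13);
        return Float.intBitsToFloat(bits);
    }

    // Read channel c of pixel (x, y) from little-endian RGBA_F16 pixel data
    static float readChannel(byte[] pixels, int width, int x, int y, int c) {
        int i = channelOffset(x, y, width, c);
        int h = (pixels[i] & 0xFF) | ((pixels[i + 1] & 0xFF) << 8);
        return halfToFloat(h);
    }

    public static void main(String[] args) {
        // A fake 1x1 bitmap: R=1.0 (half 0x3C00), G=0.5 (0x3800), B=0.0, A=1.0
        byte[] px = { 0x00, 0x3C, 0x00, 0x38, 0x00, 0x00, 0x00, 0x3C };
        System.out.println(readChannel(px, 1, 0, 0, 0)); // 1.0
        System.out.println(readChannel(px, 1, 0, 0, 1)); // 0.5
    }
}
```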
One thing that I noticed is that you can't get 16-bit channel values using Bitmap.getColor
, because it converts colors to 8 bits. Bitmap.getPixels
converts 16-bit values to 8 bits too. Using Bitmap.getPixel
doesn't help either, because it returns an int.
Therefore the only way to access the raw 16-bit pixel values is to use Bitmap.copyPixelsToBuffer.
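A sketch of that route. The actual Bitmap.copyPixelsToBuffer call only exists on Android, so it is shown in a comment and the buffer contents are faked here with known half-float values; everything else is plain java.nio and runs anywhere:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class CopyPixelsSketch {
    // Return the 4 raw half-float channel values (R, G, B, A) of pixel p
    static int[] pixelHalves(ShortBuffer halves, int p) {
        int base = p * 4;
        return new int[] {
            halves.get(base) & 0xFFFF,
            halves.get(base + 1) & 0xFFFF,
            halves.get(base + 2) & 0xFFFF,
            halves.get(base + 3) & 0xFFFF,
        };
    }

    public static void main(String[] args) {
        int width = 2, height = 1;

        // On Android you would fill the buffer from a real RGBA_F16 bitmap:
        //   ByteBuffer buf = ByteBuffer.allocate(bitmap.getByteCount())
        //                              .order(ByteOrder.LITTLE_ENDIAN);
        //   bitmap.copyPixelsToBuffer(buf);
        //   buf.rewind();
        // Here the contents are faked so the sketch runs without Android.
        ByteBuffer buf = ByteBuffer.allocate(width * height * 8)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        short[] fake = {
            0x3C00, 0x0000, 0x0000, 0x3C00,  // pixel (0,0): red, opaque
            0x0000, 0x3C00, 0x0000, 0x3C00,  // pixel (1,0): green, opaque
        };
        for (short s : fake) buf.putShort(s);
        buf.rewind();

        ShortBuffer halves = buf.asShortBuffer();  // 4 shorts per pixel
        for (int p = 0; p < width * height; p++) {
            int[] ch = pixelHalves(halves, p);
            System.out.printf("pixel %d: R=%#06x G=%#06x B=%#06x A=%#06x%n",
                              p, ch[0], ch[1], ch[2], ch[3]);
        }
        // On Android (API 26+), android.util.Half.toFloat((short) ch[0])
        // converts a raw half value back into a float.
    }
}
```

Note that copyPixelsToBuffer copies in the bitmap's internal order, so for RGBA_F16 each pixel comes out as 4 consecutive 16-bit values in R, G, B, A order.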