From my camera, I get a byte array in which every two bytes (two elements of the array) define one pixel. If I understand correctly, most current devices support only 256 shades of gray, so for my image I only need the MSB of each pixel. The problem is that this is streaming video, and creating a new array for the bitmap out of every second byte takes too long. Is it possible to output a "degraded" (16 bit -> 8 bit) picture using some existing classes?
Edit: If there is a class that can create a grayscale bitmap with 16 bits per pixel, that would also interest me.
To speed up the conversion, you might consider:
OpenGL ES
Render your color bitmap as a texture into a pbuffer surface and produce the grayscale version in your GLSL fragment shader. The gray value can be computed from the RGB representation as luminance = 0.21 * R + 0.72 * G + 0.07 * B. This approach takes a fair amount of work to set up the GLES environment, and loading the bitmap and reading the result back may not be very efficient.
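A minimal sketch of what such a fragment shader could look like, held as a Java String the way shader source is usually passed to glShaderSource(). The class name and the uniform/varying names (uTexture, vTexCoord) are hypothetical; only the luminance formula comes from the answer above:

```java
// Hypothetical GLSL ES fragment shader that samples an RGB texture and
// outputs grayscale using the luminance weights mentioned above.
public class GrayscaleShader {
    static final String FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "uniform sampler2D uTexture;\n" +   // assumed uniform name
        "varying vec2 vTexCoord;\n" +       // assumed varying name
        "void main() {\n" +
        "    vec3 rgb = texture2D(uTexture, vTexCoord).rgb;\n" +
        "    float lum = 0.21 * rgb.r + 0.72 * rgb.g + 0.07 * rgb.b;\n" +
        "    gl_FragColor = vec4(lum, lum, lum, 1.0);\n" +
        "}\n";

    public static void main(String[] args) {
        System.out.println(FRAGMENT_SHADER);
    }
}
```

The surrounding EGL/pbuffer setup and the glReadPixels readback are omitted; they are exactly the parts that make this option heavyweight.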
Renderscript
Write a simple RenderScript kernel that extracts the MSB from each pair of original bytes, then output the result to a bitmap. I would expect this to make the extraction faster than a direct implementation in Java.
If you haven't used RenderScript before, take a look at the official documentation here; it's not complicated and shouldn't take much time to accomplish this task.
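For reference, here is the direct Java implementation the answer compares against, i.e. copying every second byte into a new 8-bit array. It assumes little-endian frames (low byte first, MSB second); verify the byte order your camera actually delivers. The equivalent RenderScript kernel (written in its C99 dialect in a .rs file) would be roughly `uchar RS_KERNEL extractMsb(ushort in) { return in >> 8; }`:

```java
// Plain-Java MSB extraction: one output byte per two input bytes.
// Assumes the MSB is the second byte of each pixel pair (little-endian).
public class MsbExtractor {
    public static byte[] extractMsb(byte[] frame) {
        byte[] out = new byte[frame.length / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = frame[2 * i + 1]; // keep only the high byte of each pixel
        }
        return out;
    }

    public static void main(String[] args) {
        // Two pixels: 0x1234 and 0xFF00, stored low byte first.
        byte[] frame = {0x34, 0x12, 0x00, (byte) 0xFF};
        byte[] gray = extractMsb(frame);
        System.out.println(gray.length + " pixels");
    }
}
```

A RenderScript (or GLES) version wins over this loop mainly by running the per-pixel work in parallel instead of one byte at a time on the UI thread.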