Lots of questions have been asked about the camera2 API and the RAW image format, but searching online I still have not found the answer (that's why I am here, btw).
I am trying to do some real-time image processing on camera-captured frames using an ImageReader and setRepeatingRequest with the front-facing camera. As suggested in some previous posts, I am acquiring the frames in an uncompressed YUV format (specifically ImageFormat.YUV_420_888) in order to reach a frame rate of around 30fps:
imageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
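For completeness, this is roughly how the reader is wired into the repeating request; the cameraDevice, captureSession and backgroundHandler names below are just placeholders for fields I set up elsewhere:

imageReader.setOnImageAvailableListener(reader -> {
    Image image = reader.acquireLatestImage();
    if (image == null) return;
    Bitmap rgb = YUV_420_888_toRGBIntrinsics(image); // conversion method shown below, closes the Image itself
    // ... feed rgb to the processing algorithm ...
}, backgroundHandler);

try {
    CaptureRequest.Builder builder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(imageReader.getSurface());
    captureSession.setRepeatingRequest(builder.build(), null, backgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}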
My image-processing algorithm requires an RGB image as input, so I need to convert from YUV to RGB. To do that, I use ScriptIntrinsicYuvToRGB:
private static Bitmap YUV_420_888_toRGBIntrinsics(Image image) {
    if (image == null) return null;

    int W = image.getWidth();
    int H = image.getHeight();

    Image.Plane Y = image.getPlanes()[0];
    Image.Plane U = image.getPlanes()[1];
    Image.Plane V = image.getPlanes()[2];

    int Yb = Y.getBuffer().remaining();
    int Ub = U.getBuffer().remaining();
    int Vb = V.getBuffer().remaining();

    // Pack the planes as Y followed by interleaved VU (effectively NV21).
    // This relies on the U/V planes being interleaved with a pixel stride of 2,
    // which is the common YUV_420_888 layout on current devices.
    byte[] data = new byte[Yb + Ub + Vb];
    Y.getBuffer().get(data, 0, Yb);
    V.getBuffer().get(data, Yb, Vb);
    U.getBuffer().get(data, Yb + Vb, Ub);

    // rs and context are fields of the enclosing class; creating the
    // RenderScript context and the intrinsic once instead of per frame would be cheaper.
    rs = RenderScript.create(context);
    ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(data.length);
    Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

    Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(W).setY(H);
    Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

    final Bitmap bmpout = Bitmap.createBitmap(W, H, Bitmap.Config.ARGB_8888);

    in.copyFromUnchecked(data);
    yuvToRgbIntrinsic.setInput(in);
    yuvToRgbIntrinsic.forEach(out);
    out.copyTo(bmpout);

    image.close();
    return bmpout;
}
This method is quite fast: I can convert a 1080p frame in less than 20 ms. The only issue is that the resulting image is rotated by 270 degrees (i.e. it comes out as if the picture were taken in landscape mode). Even if I set JPEG_ORIENTATION in the capture request builder,

captureRequestBuilder.set(CaptureRequest.JPEG_ORIENTATION, characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION));

the result is still the same.
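The obvious workaround would be rotating the converted Bitmap afterwards, something like the snippet below right after the conversion, but that adds yet another full-frame copy and allocation per frame, which is exactly what I'd like to avoid:

Matrix matrix = new Matrix();
matrix.postRotate(90);  // the actual value would come from SENSOR_ORIENTATION / device rotation
Bitmap upright = Bitmap.createBitmap(bmpout, 0, 0, bmpout.getWidth(), bmpout.getHeight(), matrix, true);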
Here is my question: is there any way to obtain the frame already rotated to the correct orientation, without performing an extra rotation step that costs time?
No, there's no built-in rotation for YUV output. To minimize overhead, it's always produced as-is from the image sensor. You can read the SENSOR_ORIENTATION field to determine how the image sensor is placed on the device; typically the long edge of the image sensor lines up with the long edge of the Android device, but that still leaves two rotations that are valid.
Also, if your goal is to have the image 'upright', then you also need to read the device's orientation from the accelerometer and factor that into the rotation.
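Something along these lines, adapted from the getJpegOrientation sample in the CaptureRequest.JPEG_ORIENTATION documentation (the OrientationEventListener wiring is assumed to exist elsewhere):

private int getRotationDegrees(CameraCharacteristics c, int deviceOrientation) {
    // deviceOrientation is the value reported by an OrientationEventListener,
    // or ORIENTATION_UNKNOWN when the device is lying flat.
    if (deviceOrientation == OrientationEventListener.ORIENTATION_UNKNOWN) return 0;
    int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);
    // Round the device orientation to the nearest multiple of 90 degrees
    deviceOrientation = (deviceOrientation + 45) / 90 * 90;
    // Front-facing sensors are mirrored, so the device rotation is subtracted
    boolean facingFront =
            c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT;
    if (facingFront) deviceOrientation = -deviceOrientation;
    // Total rotation to apply to the sensor frame to make it upright
    return (sensorOrientation + deviceOrientation + 360) % 360;
}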
You're already doing a copy when getting the frame from the Image into the Allocation, so doing a 90/180/270-degree rotation at that point is relatively straightforward, though memory-bandwidth-intensive.
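For example, since the byte[] packed in your method is effectively NV21 (Y followed by interleaved VU on most devices), one option is to rotate that buffer just before copying it into the input Allocation. A minimal sketch for the 90-degree case, assuming even width and height (the 180/270 cases and buffer reuse are left out); note that the rotated frame is height x width, so the RGBA Type and the output Bitmap need W and H swapped as well:

private static byte[] rotateNV21By90(byte[] input, int width, int height) {
    // Rotates an NV21 frame 90 degrees clockwise into a new height x width frame.
    byte[] output = new byte[input.length];
    int frameSize = width * height;
    int i = 0;
    // Luma (Y) plane
    for (int x = 0; x < width; x++) {
        for (int y = height - 1; y >= 0; y--) {
            output[i++] = input[y * width + x];
        }
    }
    // Interleaved VU chroma plane (2x2 subsampled, so V and U move together)
    for (int x = 0; x < width; x += 2) {
        for (int y = height / 2 - 1; y >= 0; y--) {
            output[i++] = input[frameSize + y * width + x];
            output[i++] = input[frameSize + y * width + x + 1];
        }
    }
    return output;
}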
You can also take a look at one of Google's sample apps, HdrViewfinderDemo, which pipes camera data into RenderScript without the intermediate copy you're doing, and then converts to RGB to draw to a SurfaceView. It doesn't have a rotation in it now, but you could adjust the lookup done via rsGetElementAtYuv_uchar_* to handle 90-degree increments.