I'm trying to capture Android views as bitmaps and save them as an .mp4 file.
I'm using MediaCodec to encode the bitmaps and MediaMuxer to mux them into the .mp4.
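In outline, my setup looks like this (a simplified sketch; the bitrate, framerate and variable names here are illustrative, the full code is in the repository linked below):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import java.io.IOException;

    // Simplified encoder setup; buffer feeding, draining and error handling omitted.
    MediaCodec createConfiguredEncoder(int resWidth, int resHeight) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", resWidth, resHeight);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        return encoder;
    }

    // Muxer side (the video track is added once the encoder reports its output
    // format, then encoded buffers go to muxer.writeSampleData(...)):
    // MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);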
With the YUV420p color format I expect MediaCodec's input buffers to be of size resWidth * resHeight * 1.5, but Qualcomm's OMX.qcom.video.encoder.avc gives me more than that (no matter what resolution I choose). I believe it wants me to apply some alignment to my input byte stream, but I have no idea how to find out what exactly it expects me to do.
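The only introspection I have found is listing the color formats the encoder advertises (sketch below, using the API 16 MediaCodecList APIs); it tells me which YUV layout the codec wants, but nothing about stride or plane alignment, which is exactly the part I'm missing:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    // Dump every AVC encoder on the device and the color formats it advertises.
    void dumpAvcEncoderColorFormats() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                for (int colorFormat : caps.colorFormats) {
                    Log.i("CodecDump", info.getName()
                            + " supports color format 0x" + Integer.toHexString(colorFormat));
                }
            }
        }
    }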
This is what I get when I pack my data tightly in the input buffers on a Nexus 7 (2013) using Qualcomm's codec: https://www.youtube.com/watch?v=JqJD5R8DiC8
And this video was made by the very same app run on a Nexus 10 (codec OMX.Exynos.AVC.Encoder): https://www.youtube.com/watch?v=90RDXAibAZI
So it looks like the luma plane is fine in the faulty video, but what happened to the chroma plane is a mystery to me.
I prepared a minimal (two classes) working code example exposing this issue: https://github.com/eeprojects/MediaCodecExample
You can reproduce the videos shown above just by running the app (you will get the same artefacts if your device uses Qualcomm's codec).
There are multiple ways of storing YUV 420 in buffers; you need to check which individual pixel format you chose. MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar and MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar are in practice the same, called planar or I420 for short, while the others, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar and MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar, are called semiplanar or NV12.
In the semiplanar layouts, you don't have two separate planes for U and V; instead you have a single plane with interleaved U,V pairs.
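As an illustration, here is a minimal sketch of repacking tightly packed I420 planes into an NV12 buffer; it assumes even dimensions and no row padding, which, as the question shows, is not necessarily what a particular encoder accepts:

    // Repack tightly packed I420 planes (separate Y, U, V) into NV12
    // (Y plane followed by interleaved U,V pairs). Assumes even width/height
    // and no row padding; real encoders may require aligned strides.
    byte[] i420ToNv12(byte[] y, byte[] u, byte[] v, int width, int height) {
        byte[] nv12 = new byte[width * height * 3 / 2];
        System.arraycopy(y, 0, nv12, 0, width * height); // luma is unchanged
        int chromaSamples = (width / 2) * (height / 2);
        for (int i = 0; i < chromaSamples; i++) {
            nv12[width * height + 2 * i] = u[i];     // U comes first in NV12
            nv12[width * height + 2 * i + 1] = v[i];
        }
        return nv12;
    }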
See https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java (lines 925-949) for an example of how to fill the buffer for the semiplanar formats.
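That test also contains a helper for telling the two families apart; a condensed version, using only the public constants from MediaCodecInfo.CodecCapabilities, looks roughly like this:

    import android.media.MediaCodecInfo.CodecCapabilities;

    // Decide between the planar (I420) and semiplanar (NV12) fill paths
    // based on the color format the codec was configured with.
    boolean isSemiPlanarYuv(int colorFormat) {
        switch (colorFormat) {
            case CodecCapabilities.COLOR_FormatYUV420Planar:
            case CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                return false;
            case CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
            case CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
            case CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                return true;
            default:
                throw new IllegalArgumentException("unknown color format " + colorFormat);
        }
    }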