Tags: java, android, android-mediacodec, mediamuxer

Encoding an ArrayList&lt;Bitmap&gt; image collection into a video: the resulting video is too short


The bitmaps list contains 300 frames and the frame rate is set to 30, yet the resulting video is only 3 seconds long; with 600 frames it is 6 seconds, and so on. The video contains all the frames, but it plays back too fast, at roughly double or triple speed. What causes this, and how should encodeVideo(ArrayList&lt;Bitmap&gt; bitmaps) be fixed?

private static final String MIME_TYPE = "video/avc"; 
private static final int BIT_RATE = 3000000; 
private static final int FRAME_RATE = 30;
private static final int I_FRAME_INTERVAL = 10;



public void encodeVideo(ArrayList<Bitmap> bitmaps) {
    Executor executor = Executors.newSingleThreadExecutor();
    executor.execute(() -> {
        if (bitmaps == null || bitmaps.isEmpty()) {
            return;
        }


        try {
            width = bitmaps.get(0).getWidth();
            height = bitmaps.get(0).getHeight();
            String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
            String fileName1 = "VID_" + timeStamp + ".mp4";
            outputFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), fileName1);

            
            MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, I_FRAME_INTERVAL);

            
            mediaCodec = MediaCodec.createEncoderByType(MIME_TYPE);
            mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            inputSurface = mediaCodec.createInputSurface();
            mediaCodec.start();

            
            mediaMuxer = new MediaMuxer(outputFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

            for (Bitmap bitmap : bitmaps) {
                if (Thread.currentThread().isInterrupted()) {
                    return;
                }
                
                Canvas canvas = inputSurface.lockCanvas(null);
                if (canvas != null) {
                    try {
                        canvas.drawBitmap(bitmap, 0, 0, null);
                    } finally {
                        inputSurface.unlockCanvasAndPost(canvas);
                    }
                }

                
                MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                while (true) {
                    int encoderStatus = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
                    if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                        break;
                    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                        if (muxerStarted) {
                            releaseResources();
                            return;
                        }
                        MediaFormat newFormat = mediaCodec.getOutputFormat();
                        videoTrackIndex = mediaMuxer.addTrack(newFormat);
                        mediaMuxer.start();
                        muxerStarted = true;
                    } else if (encoderStatus < 0) {
                        Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                    } else {
                        ByteBuffer encodedData = mediaCodec.getOutputBuffer(encoderStatus);
                        if (encodedData == null) {
                            releaseResources();
                            return;
                        }
                        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                            bufferInfo.size = 0;
                        }
                        if (bufferInfo.size != 0) {
                            if (!muxerStarted) {
                                releaseResources();
                                return;
                            }
                            encodedData.position(bufferInfo.offset);
                            encodedData.limit(bufferInfo.offset + bufferInfo.size);
                            mediaMuxer.writeSampleData(videoTrackIndex, encodedData, bufferInfo);
                        }
                        mediaCodec.releaseOutputBuffer(encoderStatus, false);
                        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                            break;
                        }
                    }
                }
            }

            
            releaseResources();

           
            ContentValues values = new ContentValues();
            values.put(MediaStore.Video.Media.DATA, outputFile.getAbsolutePath());
            values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
            Uri uri = context.getContentResolver().insert(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, values);


        } catch (IOException e) {
            releaseResources();
        }
    });
}

Solution

  • Use per-frame timestamps to control the playback speed.

    Setting KEY_FRAME_RATE alone is not enough: the muxer derives the video's duration from the presentation timestamp of each frame it writes, so you need to assign a timestamp to each frame explicitly.

    Try to use the BufferInfo object for this:

    bufferInfo.presentationTimeUs = nextFrameTimeUs();
    mediaMuxer.writeSampleData(videoTrackIndex, encodedData, bufferInfo);
    

    The implementation of nextFrameTimeUs() is up to you. One way is to map the current frame index to a presentation time based on the per-frame duration, which you can compute from the frame rate (1,000,000 / FRAME_RATE microseconds per frame at a constant rate).

    Note that the presentation time must be given in microseconds (hence the Us suffix).
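A minimal sketch of one such nextFrameTimeUs() implementation, assuming a constant frame rate; the FrameClock class name and the frameIndex counter are illustrative, not part of the question's code — in practice this would just be a field and a method on the encoder class:

```java
// Maps each frame's index to a presentation time in microseconds,
// derived from a constant frame rate. Frame N is presented at
// N / FRAME_RATE seconds = N * 1_000_000 / FRAME_RATE microseconds.
class FrameClock {
    private static final int FRAME_RATE = 30; // matches the question's constant
    private long frameIndex = 0;              // reset before each encode run

    long nextFrameTimeUs() {
        return frameIndex++ * 1_000_000L / FRAME_RATE;
    }
}
```

With FRAME_RATE = 30, successive calls return 0, 33333, 66666, … µs, so 300 frames span the expected 10 seconds instead of 3.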