Tags: java, android, streaming, webrtc, webrtc-android

Custom video source for WebRTC on Android


Overview

I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras of an Android phone. The classes relevant to this scenario appear in the code below.

Currently, to use the front-facing camera on an Android phone, I do the following:

CameraEnumerator enumerator = new Camera1Enumerator(false);
// deviceName is one of the names returned by enumerator.getDeviceNames()
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);

My scenario

I have a callback handler that receives the video buffer as a byte array from the custom video source:

public void onReceive(byte[] videoBuffer, int size) {}

How can I send this byte-array buffer as a video frame? I'm not sure about the solution, but I suspect I would have to implement a custom VideoCapturer.

Existing questions

This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.


Solution

There are two possible solutions to this problem:

    1. Implement a custom VideoCapturer and create a VideoFrame from the byte[] stream data in the onReceive handler. There is a very good example in FileVideoCapturer, which implements VideoCapturer.
    2. Simply construct a VideoFrame from an NV21Buffer created from the byte-array stream data. Then we only need to use the previously created VideoSource to capture this frame. Example:

    public void onReceive(byte[] videoBuffer, int size, int width, int height) {
        // Capture timestamps must come from a monotonic clock, in nanoseconds.
        long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
        // Wrap the raw NV21 bytes; the last argument is an optional release callback.
        NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);

        // Rotation is 0 here; adjust if your source delivers rotated frames.
        VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, timestampNS);
        videoSource.getCapturerObserver().onFrameCaptured(videoFrame);

        // Release our reference once the frame has been handed off.
        videoFrame.release();
    }
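One pitfall with this approach: NV21Buffer expects the buffer to match the NV21 layout exactly (a full-resolution Y plane followed by an interleaved, half-resolution VU plane), so a frame must be width * height * 3 / 2 bytes with even dimensions. A small sanity-check helper can catch mismatched dimensions before a frame is constructed. This is a sketch, not part of the WebRTC API; the class name Nv21Util is hypothetical:

```java
public class Nv21Util {
    // NV21 packs a full-resolution Y plane followed by an interleaved,
    // half-resolution VU plane, so one frame occupies width * height * 3 / 2 bytes.
    public static int expectedNv21Size(int width, int height) {
        if (width <= 0 || height <= 0 || width % 2 != 0 || height % 2 != 0) {
            throw new IllegalArgumentException("NV21 requires positive, even dimensions");
        }
        return width * height * 3 / 2;
    }

    // Check an incoming buffer before wrapping it in an NV21Buffer; a size
    // mismatch usually means the reported width/height are wrong.
    public static boolean isValidNv21(byte[] buffer, int width, int height) {
        return buffer != null && buffer.length == expectedNv21Size(width, height);
    }

    public static void main(String[] args) {
        System.out.println(expectedNv21Size(640, 480)); // 460800
        System.out.println(isValidNv21(new byte[460800], 640, 480)); // true
    }
}
```

Calling isValidNv21 at the top of onReceive (and dropping or logging bad frames) is cheaper than letting a malformed buffer reach the encoder.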