c++ opengl ffmpeg ffserver

Stream OpenGL framebuffer over HTTP (via FFmpeg)


I have an OpenGL application whose rendered images need to be streamed over the internet to mobile clients. Previously, it sufficed to simply record the rendering into a video file, which already works; now this should be extended to subsequent streaming.

What is working right now:

None of these steps involves FFmpeg or any other library so far. I now want to replace the last step with "stream the current frame's byte array over the internet", and I assume that using FFmpeg and FFserver would be a reasonable choice for this. Am I correct? If not, what would be the proper way?

If so, how do I approach this within my C++ code? As pointed out, the frame is already encoded. Also, there is no sound or other data, simply an H.264-encoded frame as a byte array that is updated irregularly and should be converted into a steady video stream. I assume that this would be FFmpeg's job and that the subsequent streaming via FFserver would be simple from there. What I don't know is how to feed my data to FFmpeg in the first place, as all FFmpeg tutorials I found (in a non-exhaustive search) work on a file or a webcam/capture device as the data source, not volatile data in main memory.

The file mentioned above, which I am already able to create, is written via a C++ file stream to which I append each single frame. This means that differing framerates of the video and the rendering are not handled correctly, which also needs to be taken care of at some point.

Can somebody point me in the right direction? Can I forward data from my application to FFmpeg to build a proper video feed without writing to the hard disk? Tutorials are greatly appreciated. By the way, FFmpeg/FFserver is not mandatory. If you have a better idea for streaming OpenGL framebuffer contents, I'm eager to hear it.


Solution

  • You can feed the ffmpeg process readily encoded H.264 data (-f h264) and tell it to simply copy the stream into the output multiplexer (-c:v copy). Note that -f h264 must come before -i so that it applies to the input; placed after -i it would set the output format instead. To get the data into ffmpeg, launch it as a child process with a pipe connected to its stdin and specify stdin as the input source:

    FILE *ffmpeg_in = popen("ffmpeg -f h264 -i pipe:0 -c:v copy ...", "w");
    

    You can then write your encoded H.264 stream to ffmpeg_in with fwrite.
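Putting the pieces together, a minimal sketch could look like the following. The feed URL and the framerate are assumptions, not part of the answer above: the URL is a placeholder for a typical ffserver feed, and `-r 30` supplies a nominal input framerate because a raw H.264 elementary stream carries no timestamps of its own.

```cpp
#include <cstdio>
#include <string>

// Build the ffmpeg command line. "-f h264" *before* "-i" declares the
// input format: a raw H.264 elementary stream read from stdin (pipe:0).
// "-r 30" is an assumed nominal framerate, since the raw stream has no
// timestamps. "-c:v copy" muxes the frames without re-encoding.
std::string build_ffmpeg_cmd(const std::string &output) {
    return "ffmpeg -f h264 -r 30 -i pipe:0 -c:v copy " + output;
}

// Push one already-encoded H.264 frame (its NAL units) into the pipe.
bool write_frame(std::FILE *ffmpeg_in, const unsigned char *data,
                 std::size_t size) {
    return std::fwrite(data, 1, size, ffmpeg_in) == size;
}

// Sketch of the overall flow; the feed URL is a hypothetical ffserver
// endpoint and must be adapted to your setup.
void stream_frames() {
    std::FILE *ffmpeg_in = popen(
        build_ffmpeg_cmd("http://localhost:8090/feed1.ffm").c_str(), "w");
    if (!ffmpeg_in) return;

    // Render loop: whenever your encoder emits a frame, forward it.
    // write_frame(ffmpeg_in, frame_bytes, frame_size);

    pclose(ffmpeg_in);  // closes ffmpeg's stdin so it can finalize the stream
}
```

Calling `stream_frames()` from your application keeps everything in memory: the frames never touch the hard disk, and ffmpeg handles multiplexing and delivery to the feed.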