Tags: python, image, sockets, opencv, gstreamer

Efficient way of sending a large number of images from client to server


I'm working on a project where a client needs to take several snapshots from a camera (i.e. it's actually capturing a short-duration video, hence a stream of frames), then send all the images to a server, which performs some processing on them and returns a result to the client.

Both client and server run Python 3 code.

The critical part is sending the images.

Some background first: images are *640×480 JPEG* files. JPEG was chosen as a default, but lower-quality encodings can be selected as well. The frames are captured in sequence by a camera, so there are roughly 600 frames to send, each around 110 KiB.

The client is a Raspberry Pi 3 Model B+. It sends the frames over Wi-Fi to a 5c server. Server and client both reside on the same LAN for the prototype version, but future deployments might differ, both in terms of connectivity medium (wired or wireless) and area (LAN or metro).

I've implemented several solutions for this:

  1. Using Python sockets on the server and the client: I either send each frame right after it's captured, or I send all images in sequence once the whole capture is done (see the socket sketch after this list).

  2. Using GStreamer: I launch a GStreamer endpoint on my client and send the frames directly to the server as I stream. On the server side, I capture the stream with OpenCV compiled with GStreamer support, then save the frames to disk (see the receive sketch after this list).
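
For reference, here is a minimal sketch of the socket-based approach (solution 1), length-prefixing each JPEG so the server knows where one frame ends and the next begins. The address and the frame source are assumptions:

```python
import socket
import struct

SERVER_ADDR = ("192.168.1.10", 5000)  # hypothetical server address

def send_frames(jpeg_frames):
    """Client side: send an iterable of JPEG-encoded frames, each prefixed with its length."""
    with socket.create_connection(SERVER_ADDR) as sock:
        for frame in jpeg_frames:
            # 4-byte big-endian length header, then the JPEG payload
            sock.sendall(struct.pack(">I", len(frame)) + frame)
        # A zero-length header signals end of stream
        sock.sendall(struct.pack(">I", 0))

def recv_exact(sock, n):
    """Read exactly n bytes from a socket, or raise if it closes early."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frames(conn):
    """Server side: yield JPEG frames until the zero-length marker arrives."""
    while True:
        (length,) = struct.unpack(">I", recv_exact(conn, 4))
        if length == 0:
            return
        yield recv_exact(conn, length)
```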
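
And a sketch of the server side of the GStreamer approach (solution 2), reading an RTP/JPEG stream through OpenCV built with GStreamer support. The port, caps, and client pipeline are assumptions and must match how the client actually sends:

```python
import cv2

# Hypothetical matching client-side pipeline (e.g. via gst-launch-1.0):
#   v4l2src ! video/x-raw,width=640,height=480 ! jpegenc ! rtpjpegpay
#   ! udpsink host=<server> port=5000
pipeline = (
    'udpsrc port=5000 caps="application/x-rtp, media=video, '
    'encoding-name=JPEG, payload=26" '
    "! rtpjpegdepay ! jpegdec ! videoconvert ! appsink"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
frame_id = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(f"frame_{frame_id:04d}.jpg", frame)  # save to disk as described
    frame_id += 1
cap.release()
```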

Now, the issue is that even though both solutions work 'well' (they get the final job done: send data to a server and receive a result based on some remote processing), I'm convinced there is a better way to send a large amount of data to a server, using either the Python socket library or any other available tool.

All the personal research I've done on the matter led me either to solutions similar to mine (using Python sockets) or to out-of-context ones (relying on backends other than pure Python).

By a better way, I mean:

  1. A solution that saves bandwidth as much as possible.
  2. A solution that sends all data as fast as possible.

For 1., I slightly modified my first solution to archive and compress all captured frames into a .tgz file that I send to the server. It does decrease bandwidth usage, but it also increases the time spent on both ends (due to the compression/decompression steps). This is especially true when the dataset is large.
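
That modification is essentially the following sketch; the directory, archive name, and address are assumptions:

```python
import socket
import tarfile

FRAME_DIR = "frames"                   # hypothetical directory of captured JPEGs
ARCHIVE = "frames.tgz"
SERVER_ADDR = ("192.168.1.10", 5000)   # hypothetical server address

# Bundle every captured frame into one gzip-compressed tarball
with tarfile.open(ARCHIVE, "w:gz") as tar:
    tar.add(FRAME_DIR)

# Ship the single archive over a plain TCP socket
with socket.create_connection(SERVER_ADDR) as sock, open(ARCHIVE, "rb") as f:
    sock.sendfile(f)
```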

For 2., GStreamer gave me a negligible delay between capture and reception on the server. However, there is no compression at all, and for the reasons stated above, I cannot really use this framework for further development.

How can I send a large number of images from one client to one server with minimal bandwidth usage and delay in Python?


Solution

  • If you want to transfer the images as frames, you can use an existing app like MJPEG-Streamer, which encodes images from a webcam interface as JPEG and thereby reduces the image size. If you need a more robust transfer with more advanced encoding, you can use a Linux tool like FFmpeg, which supports streaming.

    If you want a lower-level implementation and to control the whole stream from your own code, you can use a web framework like Flask and transfer your images directly over the HTTP protocol.
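
    As a rough illustration, here is a minimal Flask sketch serving frames as an MJPEG (multipart/x-mixed-replace) stream; `get_jpeg_frames()` is a hypothetical generator standing in for the camera capture, and the processing server would pull frames from `/video_feed`:

    ```python
    from flask import Flask, Response

    app = Flask(__name__)

    def get_jpeg_frames():
        """Hypothetical placeholder: yield JPEG-encoded frames from the camera."""
        raise NotImplementedError

    def mjpeg_stream():
        # Wrap each JPEG in a multipart boundary so the receiver replaces
        # the previous frame with the next one as it arrives
        for jpeg in get_jpeg_frames():
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + jpeg + b"\r\n")

    @app.route("/video_feed")
    def video_feed():
        return Response(mjpeg_stream(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)
    ```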

    If you don't want to stream, you can encode the whole set of images into a video format like H.264 and then transfer the bytes over the network. You can use OpenCV to do this. There are also some good libraries written in Python, like pyffmpeg.
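
    A sketch of that last option: encode the captured frames into a single video file with OpenCV, then send the file's bytes. Which codecs are available depends on your OpenCV/FFmpeg build, so "mp4v" is used here as a common fallback ("avc1" gives H.264 where supported); the file names and address are assumptions:

    ```python
    import socket
    import cv2

    FRAMES = "frame_%04d.jpg"              # hypothetical numbered JPEGs on disk
    OUTPUT = "capture.mp4"
    SERVER_ADDR = ("192.168.1.10", 5000)   # hypothetical server address

    # Read the image sequence and re-encode it as one video file
    reader = cv2.VideoCapture(FRAMES)
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter(OUTPUT, fourcc, 30.0, (640, 480))
    while True:
        ok, frame = reader.read()
        if not ok:
            break
        writer.write(frame)
    reader.release()
    writer.release()

    # Transfer the encoded file in one go
    with socket.create_connection(SERVER_ADDR) as sock, open(OUTPUT, "rb") as f:
        sock.sendfile(f)
    ```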