I am trying to live-stream the camera from a Raspberry Pi to a PC over UDP. I followed the example code provided by the Picamera2 library to send the livestream data with Python, and I am using a C# WinForms PictureBox to show the livestream on the PC.
Once the C# app receives the data byte array, I use a MemoryStream and Image.FromStream() to convert the byte array to an image. However, this throws a "Parameter is not valid" exception.
Python code (Server):
import socket

from picamera2 import Picamera2
from picamera2.encoders import H264Encoder
from picamera2.outputs import FileOutput

host = #IP Address
port = 24325

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as server_socket:
    # Initial connection
    server_socket.bind((host, port))
    print("UDP Server is listening...")
    while True:
        data, addr = server_socket.recvfrom(1024)
        rx = data.decode()
        ## Protocol Code...
        # if...
        # Stream code
        elif rx == "STREAM":
            tx = "STREAMING"
            server_socket.sendto(tx.encode(), addr)
            print("Starting stream")
            # Note: 2 cameras are being used on the Pi but only 1 is being streamed for testing
            picam1 = Picamera2(0)
            picam2 = Picamera2(1)
            config1 = picam1.create_preview_configuration()
            config2 = picam2.create_preview_configuration()
            picam1.configure(config1)
            picam2.configure(config2)
            picam1.controls.ExposureTime = 30000
            picam2.controls.ExposureTime = 30000
            encoder = H264Encoder(1000000)
            server_socket.connect(addr)
            stream = server_socket.makefile("wb")
            picam1.start_recording(encoder, FileOutput(stream))
        else:
            print(f"Received from {addr}: {rx}")
            tx = "Your message received"
            server_socket.sendto(tx.encode(), addr)
C# code (client):
// Connection code...
// Protocol code...
// Receive livestream data
IPEndPoint serverEndPoint = new IPEndPoint(IPAddress.Any, 0);
bool isReceiving = true;

if (rx == "STREAMING")
{
    logToConsole("Stream started"); // Custom function
    try
    {
        while (isReceiving)
        {
            byte[] streamData = udpClient.Receive(ref serverEndPoint);
            using (MemoryStream ms = new MemoryStream(streamData))
            {
                Image receivedImage = Image.FromStream(ms);
                if (receivedImage != null)
                {
                    pictureBoxStream.Invoke(new Action(() => pictureBoxStream.Image = receivedImage));
                }
            }
        }
    }
    catch (Exception ex)
    {
        logToConsole("Error receiving video stream: " + ex.Message);
    }
}
I'm no Python developer, so I may have misunderstood something, but it seems to me that there are several fundamental problems:
A UDP datagram has a maximum size of roughly 64 KB, and anything larger than the network MTU (typically around 1500 bytes) gets fragmented along the way. The receiving side assumes each packet is a complete frame, and there is just no way to guarantee that unless the images are really tiny.
Normally you would define a protocol on top of UDP that allows you to send and reassemble larger messages, as well as handle things like out-of-order packet delivery. It would also need to deal with missing packets, either with some kind of resend mechanism or built-in redundancy.
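For illustration, here is a minimal sketch of what the sender side of such framing could look like in Python. The header layout (frame id, chunk index, chunk count) and the send_frame helper are made up for this example; a real protocol would also need receiver-side buffering and loss handling.

import struct

MAX_PAYLOAD = 1400  # stay below a typical MTU so datagrams are not IP-fragmented

def send_frame(sock, addr, frame_id, frame_bytes):
    # Split one encoded frame into numbered datagrams.
    chunks = [frame_bytes[i:i + MAX_PAYLOAD]
              for i in range(0, len(frame_bytes), MAX_PAYLOAD)]
    for index, chunk in enumerate(chunks):
        # Header: frame id, chunk index, total chunk count (three uint32, big-endian).
        header = struct.pack("!III", frame_id, index, len(chunks))
        sock.sendto(header + chunk, addr)

# The receiver would buffer chunks per frame id and only decode a frame once all
# of its chunks have arrived, discarding frames that never complete.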
It looks like the Python code uses H.264 compression, but there is nothing on the C# side to decode such a video stream. Image.FromStream should handle most common still-image formats, like JPEG and PNG, but you will need a third-party decoder for most kinds of real video streams. One workaround is to send individual JPEG frames instead, as sketched below.
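Picamera2 also ships an MJPEG encoder, so a rough, untested sketch of that workaround on the Pi side could look like the following (the client address and bitrate are placeholders, and you would still need the framing above, since a single JPEG rarely fits in one datagram):

import socket

from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder
from picamera2.outputs import FileOutput

CLIENT_ADDR = ("192.168.1.100", 24325)  # placeholder; in the question this comes from the STREAM handshake

picam1 = Picamera2(0)
picam1.configure(picam1.create_video_configuration())

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.connect(CLIENT_ADDR)

# Each frame MJPEGEncoder produces is a complete JPEG, which Image.FromStream can decode,
# but the receiver still has to reassemble whole frames before decoding them.
encoder = MJPEGEncoder(bitrate=1000000)
picam1.start_recording(encoder, FileOutput(sock.makefile("wb")))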
From the documentation of Image.FromStream:
You must keep the stream open for the lifetime of the Image.
So remove the using block, or copy the decoded image (for example into a new Bitmap) before the MemoryStream is disposed.
I would recommend using an existing protocol instead. The Real Time Streaming Protocol (RTSP), with RTP carrying the actual media, might be a good choice; RTP typically runs over UDP and has built-in handling of out-of-order delivery, detection of missing packets, and so on.
Another alternative might be something as simple as sending JPEG frames over HTTP, gRPC or similar; those run over TCP, so transfer is reliable, and they are fairly easy to use. A rough sketch of the HTTP variant follows.
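As a minimal sketch only (the port and camera configuration are made up): a tiny HTTP server on the Pi that returns the latest camera frame as one complete JPEG per request, which the WinForms client can poll and hand straight to Image.FromStream.

import io
from http.server import BaseHTTPRequestHandler, HTTPServer

from picamera2 import Picamera2

picam = Picamera2(0)
picam.configure(picam.create_preview_configuration())
picam.start()

class FrameHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Capture the current frame as a complete JPEG and send it as one response.
        buffer = io.BytesIO()
        picam.capture_file(buffer, format="jpeg")
        self.send_response(200)
        self.send_header("Content-Type", "image/jpeg")
        self.send_header("Content-Length", str(buffer.getbuffer().nbytes))
        self.end_headers()
        self.wfile.write(buffer.getvalue())

HTTPServer(("0.0.0.0", 8080), FrameHandler).serve_forever()

Polling like this gives a lower frame rate than a real video stream, but every response is a whole image, so the "Parameter is not valid" problem goes away.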