javascript · websocket · ros · ros2 · roslibjs

ROS2: stream video images from a USB camera through a websocket


I am new to ROS2 and trying to stream images from a USB camera to a simple website on localhost. The camera works: I am able to add it to rviz2 and see the video in grayscale.

My approach is:

1- Create a topic for this camera. I use the command ros2 launch realsense2_camera rs_launch.py, and the topic with the video is /camera/fisheye1/image_raw. When I check this topic's type I get sensor_msgs/msg/Image.

2- I have set up a rosbridge_websocket server that works, and I can send messages to my simple HTML website.

3- I want to retrieve this topic's data and convert it to an image that I can stream. For this I wrote the following JavaScript code to capture the message:

  var imageTopic = new ROSLIB.Topic({
    ros: ros, // Your ROS connection
    name: '/camera/fisheye1/image_raw', // The ROS topic you want to subscribe to
    messageType: 'sensor_msgs/msg/Image' // The message type for the topic
  });

  // and the subscriber
  imageTopic.subscribe(function(message) {
    // code to convert from message's data in mono8 to image
  });
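One detail worth knowing at this step: rosbridge serializes uint8[] fields such as data as base64 strings in its JSON messages, so the payload has to be decoded before the pixels can be drawn. A minimal sketch of that conversion, assuming a <canvas id="viewer"> element on the page (the id is hypothetical) and the default uncompressed transport:

```javascript
// Decode a base64 mono8 payload into an RGBA pixel buffer for a canvas.
// Assumption: rosbridge delivered the uint8[] `data` field as a base64 string.
function mono8ToRGBA(base64Data, width, height, step) {
  const raw = atob(base64Data); // base64 -> binary string, one char per byte
  const rgba = new Uint8ClampedArray(width * height * 4);
  for (let row = 0; row < height; row++) {
    for (let col = 0; col < width; col++) {
      const gray = raw.charCodeAt(row * step + col); // mono8: one byte per pixel
      const i = (row * width + col) * 4;
      rgba[i] = gray;     // R
      rgba[i + 1] = gray; // G
      rgba[i + 2] = gray; // B
      rgba[i + 3] = 255;  // A: fully opaque
    }
  }
  return rgba;
}

// Draw one sensor_msgs/msg/Image message onto the canvas.
function renderMono8(message) {
  const canvas = document.getElementById('viewer'); // hypothetical element id
  canvas.width = message.width;
  canvas.height = message.height;
  const ctx = canvas.getContext('2d');
  const pixels = mono8ToRGBA(message.data, message.width,
                             message.height, message.step);
  ctx.putImageData(new ImageData(pixels, message.width, message.height), 0, 0);
}
```

renderMono8 could then be called from inside the subscribe callback shown above; note that step (bytes per row) rather than width is used for row offsets, since the two can differ for padded images.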

I know that my image is in mono8 encoding; to double-check, when I run ros2 topic echo /camera/fisheye1/image_raw --no-arr I get something like:

header:
  stamp:
    sec: 1698430962
    nanosec: 551425280
  frame_id: camera_fisheye1_optical_frame
height: 800
width: 848
encoding: mono8 // <---- 
is_bigendian: 0
step: 848
data: '<sequence type: uint8, length: 678400>'  

So this indicates the encoding.

But when I receive the message in the JavaScript subscriber and inspect it, the data field is an alphanumeric string instead of pixel values. I know the image stream works in rviz2, and the topic and message type match those in my code. I am not sure what is causing this faulty streaming.

Thanks in advance.


Solution

  • I actually managed to do this another way. I used realsense2 to open a stream from the camera in Python. Then, with OpenCV, I encode each frame from the stream into binary image data. After that, I yield (using Flask's generator-based streaming) the binary image to my web app, which I built with Flask.

    The realsense2 library is only necessary because I am using an Intel RealSense T265 camera. You can use any other library to create the stream.
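    A rough sketch of that pipeline, under these assumptions: pyrealsense2 opens the T265 fisheye stream, OpenCV encodes each frame to JPEG, and Flask streams a generator as multipart/x-mixed-replace (the route name, boundary string, and helper names are illustrative, not from the original post):

```python
from flask import Flask, Response

app = Flask(__name__)

def mjpeg_part(jpeg_bytes):
    # Wrap one JPEG frame as a part of a multipart/x-mixed-replace stream;
    # each yielded part replaces the previous frame in the browser.
    return (b'--frame\r\n'
            b'Content-Type: image/jpeg\r\n\r\n' + jpeg_bytes + b'\r\n')

def generate_frames():
    # Camera/vision dependencies imported here so the rest of the module
    # can be loaded without a T265 attached.
    import numpy as np
    import cv2
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.fisheye, 1)  # first fisheye sensor
    pipeline.start(config)
    try:
        while True:
            frames = pipeline.wait_for_frames()
            frame = np.asanyarray(frames.get_fisheye_frame(1).get_data())
            ok, jpeg = cv2.imencode('.jpg', frame)  # mono8 frame -> JPEG bytes
            if ok:
                yield mjpeg_part(jpeg.tobytes())
    finally:
        pipeline.stop()

@app.route('/video_feed')  # illustrative route name
def video_feed():
    return Response(generate_frames(),
                    mimetype='multipart/x-mixed-replace; boundary=frame')
```

    On the web page, an <img src="/video_feed"> tag is then enough to display the live stream, since browsers render multipart/x-mixed-replace JPEG streams natively.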