I want to get mapping working with a Picamera. I have a Raspberry Pi running a cv_camera_node and an Ubuntu 20.04.1 machine running roscore, as well as SLAM and rviz. I have OpenCV 4.2.0 and installed the following version of ORB-SLAM2: https://github.com/appliedAI-Initiative/orb_slam_2_ros. I am running ROS Noetic. I have written the following launch file for SLAM:
<launch>
  <node name="orb_slam2_mono" pkg="orb_slam2_ros"
        type="orb_slam2_ros_mono" output="screen">
    <param name="publish_pointcloud" type="bool" value="true" />
    <param name="publish_pose" type="bool" value="true" />
    <param name="localize_only" type="bool" value="false" />
    <param name="reset_map" type="bool" value="true" />

    <!-- static parameters -->
    <param name="load_map" type="bool" value="false" />
    <param name="map_file" type="string" value="map.bin" />
    <param name="voc_file" type="string" value="/home/dragonros/catkin_ws/src/orb_slam_2_ros/orb_slam2/Vocabulary/ORBvoc.txt" />
    <param name="pointcloud_frame_id" type="string" value="map" />
    <param name="camera_frame_id" type="string" value="camera_link" />
    <param name="min_num_kf_in_map" type="int" value="5" />

    <!-- ORB parameters -->
    <param name="/ORBextractor/nFeatures" type="int" value="2000" />
    <param name="/ORBextractor/scaleFactor" type="double" value="1.2" />
    <param name="/ORBextractor/nLevels" type="int" value="8" />
    <param name="/ORBextractor/iniThFAST" type="int" value="20" />
    <param name="/ORBextractor/minThFAST" type="int" value="7" />

    <!-- Camera parameters -->
    <!-- Camera frames per second -->
    <param name="camera_fps" type="int" value="30" />
    <!-- Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale) -->
    <param name="camera_rgb_encoding" type="bool" value="true" />
    <!-- If the node should wait for a camera_info topic to take the camera calibration data -->
    <param name="load_calibration_from_cam" type="bool" value="false" />
    <!-- Camera calibration and distortion parameters (OpenCV) -->
    <param name="camera_fx" type="double" value="615.546" />
    <param name="camera_fy" type="double" value="631.457" />
    <param name="camera_cx" type="double" value="354.361" />
    <param name="camera_cy" type="double" value="232.799" />
    <!-- Camera calibration and distortion parameters (OpenCV) -->
    <param name="camera_k1" type="double" value="0.0" />
    <param name="camera_k2" type="double" value="0.0" />
    <param name="camera_p1" type="double" value="0.0" />
    <param name="camera_p2" type="double" value="1.0" />
  </node>
</launch>
Then I run another custom catkin package containing the following Python script:
#!/usr/bin/env python
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def callback(data):
    frame = bridge.imgmsg_to_cv2(data, "bgr8")
    cv2.imshow('Video Feed', frame)
    cv2.waitKey(1)
    rospy.loginfo('Image feed received!')

def listener():
    rospy.init_node('vid_rec')
    # first parameter is the topic you want to subscribe to for sensor_msgs/Image
    rospy.Subscriber('/orb_slam2_mono/debug_image', Image, callback)
    rospy.spin()

if __name__ == '__main__':
    listener()
I am supposed to see a camera feed overlaid with all the points SLAM detects, but /orb_slam2_mono/debug_image
publishes no data. I confirmed that by running rostopic echo /orb_slam2_mono/debug_image
. I know that there is a camera feed, because both rviz
and rqt_image_view
can display images coming off of /cv_camera/image_raw
. I have thoroughly followed this guide: https://medium.com/@mhamdaan/implementing-orb-slam-on-ubuntu-18-04-ros-melodic-606e668deffa. What is the problem and how can I fix it?
Your camera feed probably isn't reaching the SLAM node. You are using cv_camera_node, so by default it publishes images under the /cv_camera namespace (i.e. /cv_camera/image_raw), but orb_slam2_ros_mono subscribes under /camera. One way to fix this is to edit cv_camera_node.cpp, which looks like this:
// Copyright [2015] Takashi Ogura<t.ogura@gmail.com>
#include "cv_camera/driver.h"

int main(int argc, char **argv)
{
  ros::init(argc, argv, "cv_camera");
  ros::NodeHandle private_node("~");
  cv_camera::Driver driver(private_node, private_node);
  try
  {
    driver.setup();
    while (ros::ok())
    {
      driver.proceed();
      ros::spinOnce();
    }
  }
  catch (cv_camera::DeviceError &e)
  {
    ROS_ERROR_STREAM("cv camera open failed: " << e.what());
    return 1;
  }
  return 0;
}
Change the line that says ros::init(argc, argv, "cv_camera");
to ros::init(argc, argv, "camera");
. Because cv_camera advertises its image topic in the node's private namespace, renaming the node makes it publish on /camera/image_raw instead of /cv_camera/image_raw. Rebuild the package, rerun everything, and it should work.
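That said, you don't strictly need to edit and recompile the source: ROS can rename a node or remap its topics at launch time. Here is a minimal sketch of that approach, assuming (as in the appliedAI orb_slam2_ros README) that the mono node subscribes to /camera/image_raw; the commented-out variant shows topic remapping instead of node renaming:

```xml
<launch>
  <!-- Launch cv_camera under the name "camera" so its private topic
       resolves to /camera/image_raw, which orb_slam2_ros_mono expects. -->
  <node name="camera" pkg="cv_camera" type="cv_camera_node" output="screen" />

  <!-- Alternative: keep the default node name and remap the topic.
  <node name="cv_camera" pkg="cv_camera" type="cv_camera_node" output="screen">
    <remap from="/cv_camera/image_raw" to="/camera/image_raw" />
  </node>
  -->
</launch>
```

Either way, you can verify the topic name with rostopic list before starting the SLAM node.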