I am trying to run a SLAM algorithm (ElasticFusion) using my custom .klg file. I tried the following two ways:
The first way was to build the .klg file manually from separate depth and RGB image (.png) files and their timestamp information. I tried a conversion script on the 'freiburg1_desk' sequence of the TUM RGB-D dataset and then ran ElasticFusion, and I got a good result and point cloud. But when I recorded an environment with my own device following the same steps, I did not get the desired result or point cloud. The result I get with live logging is much better. I guess the problem is in the code I am using to convert the depth images:
import numpy as np

def raw_disparity_to_depth_mm(depth):
    # Clip raw Kinect disparity values to the valid 10-bit range.
    np.clip(depth, 0, 2**10 - 1, depth)
    # Stéphane Magnenat's approximation:
    # distance = 0.1236 * tan(rawDisparity / 2842.5 + 1.1863), in meters
    depth2 = depth / 2842.5
    depth2 += 1.1863
    depth = np.tan(depth2)
    depth *= 0.1236
    # The final -0.037 offset centers the original ROS data.
    depth -= 0.037
    # Convert meters to millimeters for the 16-bit depth map.
    depth *= 1000
    return depth.astype(np.uint16)
I got the above formula from here:
A better approximation is given by Stéphane Magnenat in this post:

distance = 0.1236 * tan(rawDisparity / 2842.5 + 1.1863), in meters.

Adding a final offset term of -0.037 centers the original ROS data.
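For reference, here is a minimal sketch of the packing step, assuming the .klg layout that Logger2 writes (an int32 frame count header, then per frame an int64 timestamp, an int32 compressed depth size, an int32 compressed image size, the zlib-compressed uint16 depth in millimeters, and the JPEG-compressed RGB image); the helper name write_klg and the frames structure are only for illustration:

import io
import struct
import zlib

from PIL import Image

def write_klg(out_path, frames):
    # frames: list of (timestamp, depth_mm, rgb) tuples, where depth_mm is a
    # 480x640 uint16 array in millimeters and rgb is a 480x640x3 uint8 array.
    with open(out_path, 'wb') as f:
        f.write(struct.pack('<i', len(frames)))  # int32 frame count header
        for timestamp, depth_mm, rgb in frames:
            depth_bytes = zlib.compress(depth_mm.astype('<u2').tobytes())
            jpeg = io.BytesIO()
            Image.fromarray(rgb).save(jpeg, format='JPEG')
            rgb_bytes = jpeg.getvalue()
            # int64 timestamp, int32 depth size, int32 image size, then payloads.
            f.write(struct.pack('<qii', int(timestamp),
                                len(depth_bytes), len(rgb_bytes)))
            f.write(depth_bytes)
            f.write(rgb_bytes)

ElasticFusion expects 640x480 frames with depth in millimeters, which is why the conversion above multiplies by 1000 before casting to uint16.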
The second way I tried was using this Logger, which is suggested by Thomas Whelan (the author of ElasticFusion). I ran the Logger without any errors:
Number devices connected: 1
1. device on bus 001:14 is a Xbox NUI Camera (2AE) from Microsoft (45E) with serial id 'A00366911101042A'
searching for device with index = 1
Opened 'Xbox NUI Camera' on bus 1:14 with serial number 'A00366911101042A'
But I am getting a black screen for both the depth and RGB images.
I am using Ubuntu 16.04 and a Kinect v1. Any suggestions or help would be appreciated.
Solved
The second way worked after re-installing OpenNI. Probably in the previous runs the Logger was somehow unable to find OpenNI for streaming the depth and RGB frames.
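In case someone hits the same black-screen symptom: a quick way to check that OpenNI itself can open the Kinect and stream depth, independently of the Logger, is the openni2 Python bindings from the primesense package (a minimal sketch; the Redist path is an assumption and depends on where your OpenNI2 build is installed):

from primesense import openni2

# Point this at your OpenNI2 redistributable directory (hypothetical path).
openni2.initialize('/usr/lib/OpenNI2/Redist')
dev = openni2.Device.open_any()
depth_stream = dev.create_depth_stream()
depth_stream.start()

frame = depth_stream.read_frame()
pixels = frame.get_buffer_as_uint16()
# All-zero values here would match the black depth window in the Logger.
print('depth frame:', frame.width, 'x', frame.height, 'sample:', pixels[:8])

depth_stream.stop()
openni2.unload()

If this fails to initialize or returns no frames, the problem is with the OpenNI installation rather than with the Logger itself.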