python, compass, coordinate-transformation, lidar, webots

Getting global coordinates of a lidar point cloud from relative coordinates in Webots


I need to do custom mapping of the surroundings with a lidar on a mobile robot in Webots. What I use for that:

Is anyone familiar with Webots who can show a basic code example or explain the math behind it? Or is there a Webots method that I missed?

I did the translation and rotation of the relative points from the lidar, which worked well when the robot is on a flat surface (2D rotation). But no matter how much I tried, I can't figure out how to get accurate global coordinates from the point cloud's relative points when the robot is even slightly tilted (3D rotation).

My guess is that I am supposed to use spatial transformation matrices, but I am not sure how to use the Webots Compass values in a rotation matrix.
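To illustrate the math, here is a standalone sketch (made-up attitude values, no Webots API calls, and the extrinsic x-y-z Euler convention is an assumption) of why a yaw-only 2D rotation diverges from the full 3D rotation as soon as the robot has nonzero roll or pitch:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical robot attitude in radians, chosen only for this illustration
roll, pitch, yaw = 0.1, 0.2, 1.5

full_3d = Rotation.from_euler("xyz", [roll, pitch, yaw])  # full 3D rotation
yaw_only = Rotation.from_euler("z", yaw)                  # flat-ground 2D approximation

relative_point = np.array([2.0, 0.0, 0.0])  # a lidar return in the robot frame
robot_position = np.array([5.0, 3.0, 0.3])  # robot position in the world frame

global_3d = full_3d.apply(relative_point) + robot_position
global_2d = yaw_only.apply(relative_point) + robot_position

# The two results differ once the robot is tilted; rotation preserves distance,
# so the point stays 2 m from the robot in both cases.
print(np.linalg.norm(global_3d - global_2d) > 0)  # True
```

The takeaway: build one 3D rotation from the full orientation (a quaternion, or roll/pitch/yaw) and apply it to every relative point before translating by the robot's position.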


Solution

  • After getting some useful info on StackExchange, here is a basic example of a solution in Python:

    from scipy.spatial.transform import Rotation
    
    RobotPoint = gps.getValues()  # robot position in the world frame (GPS)
    STR = Rotation.from_quat(InertialUnit.getQuaternion())  # orientation quaternion (x, y, z, w)
    
    GlobalCloud = []
    for RelativeCloudPoint in lidar.getPointCloud():
        # each LidarPoint exposes x, y, z attributes (robot frame)
        Point2 = STR.apply([RelativeCloudPoint.x, RelativeCloudPoint.y, RelativeCloudPoint.z])
        GlobalCloudPoint = Point2 + RobotPoint  # rotate first, then translate
        GlobalCloud.append(GlobalCloudPoint)
    

    Use the InertialUnit to get the quaternion for the spatial rotation. Apply it to the relative coordinates first, then add the robot's real coordinates from the GPS. The result is the global coordinates of the points you need.
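As a quick sanity check of the rotate-then-translate order, here is a standalone round trip with an arbitrary made-up pose instead of live Webots readings: express a known world point in the robot frame (what the lidar would report), then run it through the pipeline and confirm the world point comes back.

```python
import numpy as np
from scipy.spatial.transform import Rotation

rotation = Rotation.from_quat([0.1, 0.2, 0.3, 0.927])  # hypothetical orientation (x, y, z, w)
robot_position = np.array([1.0, -2.0, 0.5])            # hypothetical GPS reading

world_point = np.array([3.0, 4.0, 1.0])

# What the lidar would report: the world point expressed in the robot frame
relative_point = rotation.inv().apply(world_point - robot_position)

# The pipeline from the answer: rotate into the world frame, then translate
recovered = rotation.apply(relative_point) + robot_position

print(np.allclose(recovered, world_point))  # True
```

If you instead translate before rotating, the round trip fails, which is an easy way to catch an ordering bug.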