python-3.x, math, lidar, htc-vive, openvr

Having trouble aligning 2D Lidar pointcloud to match the coordinate system of HTC Vive Controller


I have strapped an RPLidar A1 to an HTC Vive controller and written a Python script that converts the lidar's pointcloud to XY coordinates and then transforms those points to match the rotation and position of the Vive controller. The end goal is to scan a 3D space, using the controller for tracking.

Sadly, nothing I try works across all movement/rotation axes: not the native quaternion from the triad_openvr library, not the raw transformation matrix, not even Euler angles.

import math

# This converts the angle and distance measurement of the lidar into x, y coordinates
# (divided by 1000 to convert from mm to m)

coord_x = (float(polar_point[0])/1000)*math.sin(math.radians(float(polar_point[1])))
coord_y = (float(polar_point[0])/1000)*math.cos(math.radians(float(polar_point[1])))
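
For reference, here is the same conversion wrapped into a self-contained sketch over a whole scan; the (distance_mm, angle_deg) tuple format is an assumption about the RPLidar driver output:

import math

def polar_to_xy(polar_point):
    # polar_point assumed to be (distance_mm, angle_deg)
    r = float(polar_point[0]) / 1000.0   # mm -> m
    a = math.radians(float(polar_point[1]))
    return r * math.sin(a), r * math.cos(a)

scan = [(1200.0, 0.0), (1185.0, 1.0), (903.0, 90.0)]  # hypothetical samples
xy_points = [polar_to_xy(p) for p in scan]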


# I then tried to use the transformation matrix of the
# Vive controller on these points, to no avail

coord_z = 0  # the 2D scan lies in the lidar's plane, so z is 0 in lidar space
matrix = v.devices["controller_1"].get_pose_matrix()
x = matrix[0][0]*coord_x + matrix[0][1]*coord_y + matrix[0][2]*coord_z + (pos_x - float(position_x))
y = matrix[1][0]*coord_x + matrix[1][1]*coord_y + matrix[1][2]*coord_z + (pos_y - float(position_y))
z = matrix[2][0]*coord_x + matrix[2][1]*coord_y + matrix[2][2]*coord_z + (pos_z - float(position_z))
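
Note: the OpenVR pose matrix is 3x4, with the rotation in the left 3x3 block and the controller's world position already in the fourth column, so the whole transform can be applied in one step. A minimal numpy sketch (not the original code; it only assumes matrix is indexable as matrix[i][j], as above):

import numpy as np

def apply_pose(pose_mat, point_local):
    # pose_mat: 3x4 OpenVR pose; left 3x3 block is the rotation,
    # column 3 is the controller position in world space
    m = np.array([[pose_mat[i][j] for j in range(4)] for i in range(3)])
    return m[:, :3] @ np.asarray(point_local, dtype=float) + m[:, 3]

# world-space position of one lidar sample:
# x, y, z = apply_pose(matrix, (coord_x, coord_y, coord_z))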

# I tried making quaternions using the euler angles and world axes
# and noticed that the math for getting euler angles does not correspond
# to the math included in the triad_openvr library, so I tried both, to no avail

# my euler angles:
angle_x = math.atan2(matrix[2][1], matrix[2][2])
angle_y = math.atan2(-matrix[2][0], math.sqrt(math.pow(matrix[2][1], 2) + math.pow(matrix[2][2], 2)))
angle_z = math.atan2(matrix[1][0], matrix[0][0])

euler = v.devices["controller_1"].get_pose_euler()

# their euler angles (pose_mat = matrix):
yaw = math.pi * math.atan2(pose_mat[1][0], pose_mat[0][0])
pitch = math.pi * math.atan2(pose_mat[2][0], pose_mat[0][0])
roll = math.pi * math.atan2(pose_mat[2][1], pose_mat[2][2])

# the quaternion was just a generic conversion from the transformation matrix, etc.
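
For completeness, such a "generic conversion" is typically the standard Shepperd-style routine; a sketch of that standard method, not my actual code:

import math

def matrix_to_quaternion(m):
    # standard rotation-matrix -> quaternion conversion, returning (w, x, y, z)
    tr = m[0][0] + m[1][1] + m[2][2]
    if tr > 0:
        s = 2.0 * math.sqrt(tr + 1.0)
        return (0.25 * s,
                (m[2][1] - m[1][2]) / s,
                (m[0][2] - m[2][0]) / s,
                (m[1][0] - m[0][1]) / s)
    # otherwise pivot on the largest diagonal element for numerical stability
    i = max(range(3), key=lambda k: m[k][k])
    j, k = (i + 1) % 3, (i + 2) % 3
    s = 2.0 * math.sqrt(1.0 + m[i][i] - m[j][j] - m[k][k])
    q = [0.0, 0.0, 0.0, 0.0]          # (w, x, y, z)
    q[0] = (m[k][j] - m[j][k]) / s
    q[i + 1] = 0.25 * s
    q[j + 1] = (m[j][i] + m[i][j]) / s
    q[k + 1] = (m[k][i] + m[i][k]) / s
    return tuple(q)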

The expected result is a correctly oriented 2D slice of data in 3D space which, appended scan after scan, would eventually map the whole 3D space. So far I have only managed a correct scan along the single Z axis with pitch rotation. I have tried a near-infinite number of combinations, some found in other posts, some based on raw linear algebra, and some simply random. What am I doing wrong?


Solution

  • Well, we figured it out by working with the Euler rotations and converting those to a quaternion:

    We had to modify the whole definition of how triad_openvr calculates the Euler angles:

    def convert_to_euler(pose_mat):
        pitch = 180 / math.pi * math.atan2(pose_mat[2][1], pose_mat[2][2])
        yaw = 180 / math.pi * math.asin(pose_mat[2][0])
        roll = 180 / math.pi * math.atan2(-pose_mat[1][0], pose_mat[0][0])
        x = pose_mat[0][3]
        y = pose_mat[1][3]
        z = pose_mat[2][3]
        return [x, y, z, yaw, pitch, roll]
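
    With this patched definition, get_pose_euler() returns [x, y, z, yaw, pitch, roll] in meters and degrees. A minimal usage sketch, assuming the usual triad_openvr setup:

    import triad_openvr

    v = triad_openvr.triad_openvr()
    # position in meters, angles in degrees, per the patched function above
    x, y, z, yaw, pitch, roll = v.devices["controller_1"].get_pose_euler()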
    

    And then we had to further rotate the Euler coordinates originating from the controller (roll corresponds to the X axis, yaw to Y, and pitch to Z, for some unknown reason):

    import numpy as np

    r0 = np.array([math.radians(-180-euler[5]), math.radians(euler[3]), -math.radians(euler[4]+180)])
    

    As well as pre-rotate our LiDAR points to correspond to the axis displacement of our real-world construction:

    coord_x = (float(polar_point[0])/1000)*math.sin(math.radians(float(polar_point[1])))*(-1)
    coord_y = (float(polar_point[0])/1000)*math.cos(math.radians(float(polar_point[1])))*(math.sqrt(3))*(0.5)-0.125
    coord_z = (float(polar_point[0])/1000)*math.cos(math.radians(float(polar_point[1])))*(-0.5)
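
    Reading the constants: the (sqrt(3)/2, -1/2) pair amounts to a fixed -30 degree tilt of the scan plane about the X axis plus a 0.125 m offset, presumably matching the physical rig; that is an interpretation, not something stated in the code. An equivalent formulation, with hypothetical coord_x_raw/coord_y_raw names for the untransformed scan point:

    import math
    import numpy as np

    tilt = math.radians(-30)                      # inferred mounting angle
    mount = np.array([[1, 0, 0],
                      [0, math.cos(tilt), -math.sin(tilt)],
                      [0, math.sin(tilt),  math.cos(tilt)]])
    offset = np.array([0.0, -0.125, 0.0])         # inferred mounting offset, m

    # raw 2D scan point, with x mirrored to match the *(-1) factor above
    p_lidar = np.array([-coord_x_raw, coord_y_raw, 0.0])
    coord_x, coord_y, coord_z = mount @ p_lidar + offset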
    

    It was finally a case of rebuilding the quaternions from the Euler angles (a workaround, we are aware) and then doing the rotation and translation, in that order.
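
    A sketch of that last step, rotating first and translating second; scipy's Rotation is a stand-in here (the original used its own quaternion code), and the intrinsic 'XYZ' axis order is an assumption:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    rot = R.from_euler('XYZ', r0)                 # r0 in radians, as above
    quat = rot.as_quat()                          # the rebuilt quaternion (x, y, z, w)
    point = np.array([coord_x, coord_y, coord_z]) # pre-rotated lidar point
    translation = np.array([euler[0], euler[1], euler[2]])  # controller x, y, z
    world_point = rot.apply(point) + translation  # rotate, then translate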