python, matrix, scipy, rotation, robotics

Calculating the rotation matrix between two coordinate systems in Python in order to determine the target location for a UR5 robot end effector


I have a UR5 CB3 robotic arm that is being trained to pick up watermelons. For that, I installed a camera that determines the location of the watermelon with respect to the TCP (the end effector, i.e. the gripping "hand"). This TCP coordinate system is called K1. I also know the location and orientation of the TCP with respect to the base coordinate system, which is called K0. To simplify the problem, I removed all the code used to determine the watermelon location and just inserted arbitrary coordinates given in the K1 coordinate system, for example [0, 0, 0.1]. Now I want to calculate what these coordinates are in the K0 coordinate system. Mathematically, this should be fairly simple: I use the orientation [rx, ry, rz] to calculate the rotation matrix rot_mat_K1_to_K0, then matrix-multiply rot_mat_K1_to_K0 with the vector pointing to W (the watermelon) in the K1 system. Then I add the vector from K0 to K1 (given in K0 coordinates) to the rotated vector from K1 to W in a simple vector addition, and send the resulting vector to the robot as the target position.
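For reference, here is the math above as a minimal offline sketch with scipy, assuming the pose's rotation vector maps K1 coordinates into K0 (the pose values are illustrative, not from the robot):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical TCP pose in K0: translation t and rotation vector (axis-angle)
t_K0_K1 = np.array([0.2, 0.5, 0.5])       # origin of K1 expressed in K0
rotvec = np.array([0.0, 0.0, np.pi / 2])  # 90 deg about the base z-axis

# R maps the coordinates of a vector from K1 into K0
R_K0_K1 = Rotation.from_rotvec(rotvec).as_matrix()

p_K1 = np.array([0.0, 0.0, 0.1])          # watermelon position w.r.t. the TCP
p_K0 = R_K0_K1 @ p_K1 + t_K0_K1           # same point expressed in the base frame
print(p_K0)                               # -> [0.2 0.5 0.6]
```

Since the rotation here is about the z-axis, a point offset along z is unchanged by it, which makes the result easy to verify by hand.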

But with the following Python code, I run into the problem that the robot moves, but in a completely unrelated direction. At least the error is repeatable: the arm always moves in the same, but wrong, direction.

The following code is called "raw_motion_tester_6.py":

from time import sleep
import numpy as np
from scipy.spatial import transform as tf
import rtde_receive
import rtde_control

print("Environment Ready")

# RTDE configuration
ROBOT_IP = "192.168.1.102"  # IP address of your UR5 CB3 robot
FREQUENCY = 125  # Frequency of data retrieval in Hz

# Establish RTDE connections
rtde_receiver = rtde_receive.RTDEReceiveInterface(ROBOT_IP)
rtde_controler = rtde_control.RTDEControlInterface(ROBOT_IP)

# Define the variables to retrieve from RTDE
variables = ["actual_TCP_pose"] #What does this do?

# Start data synchronization
rtde_receiver.startFileRecording("recorded_data.csv")

####################################################################
# Define the movement distance in the K1 coordinate system
move_distance_x = 0.0  # Move X.X m along the x-axis of K1
move_distance_y = 0.0  # Move X.X m along the y-axis of K1
move_distance_z = -0.1  # Move X.X m along the z-axis of K1
####################################################################
try:
    while True:
        # Receive and parse RTDE data
        tcp_position = rtde_receiver.getActualTCPPose()

        # Matrix transformation
        origin_K1_wrt_K0 = tcp_position[:3]  # Origin of the TCP w.r.t. the Robot Base
        angles_K1_wrt_K0 = tcp_position[3:6]  # Orientation of the origin of the TCP w.r.t. the Robot Base

        # Create a rotation matrix based on the angles with scipy transform
        rot_K0_to_K1 = tf.Rotation.from_rotvec(angles_K1_wrt_K0) # the pose angles are a rotation vector (axis-angle), not Euler angles
        rot_mat_K0_to_K1 = rot_K0_to_K1.as_matrix()
        print(f"Rotation Matrix K0 to K1: {rot_mat_K0_to_K1}")
        rot_mat_K1_to_K0 = rot_mat_K0_to_K1.T

        # Create the movement vector in the K1 coordinate system
        move_vector_K1 = np.array([move_distance_x, move_distance_y, move_distance_z])

        # Convert the movement vector from K1 to K0 coordinate system
        move_vector_K0 = rot_mat_K1_to_K0 @ move_vector_K1
        print(f"Move Vector K0: {move_vector_K0}", f"Shape of Move Vector K0: {move_vector_K0.shape}")
        # Calculate the new TCP position in the K0 coordinate system
        new_tcp_position_K0 = origin_K1_wrt_K0 + move_vector_K0

        # Move the robot to the new TCP position
        rtde_controler.moveL(new_tcp_position_K0.tolist() + [tcp_position[3], tcp_position[4], tcp_position[5]], 0.1, 0.2)

        # sleep(1)
        break

except KeyboardInterrupt:
    # Stop file recording and disconnect on keyboard interrupt
    rtde_receiver.stopFileRecording()
    rtde_receiver.disconnect()
    rtde_controler.disconnect()

finally:
    print("Disconnected from the UR5 CB3 robot")

The simplified version without the rotation matrix, just with translation, called "raw_motion_tester_7.py", works just fine, for translation as well as rotation of the TCP (both applied directly in K0). So it is logical to assume that the problem is somewhere in the mathematical part of the original code.

from time import sleep
import numpy as np
from scipy.spatial import transform as tf
import rtde_receive
import rtde_control
from math import pi

print("Environment Ready")

# RTDE configuration
ROBOT_IP = "192.168.1.102"  # IP address of your UR5 CB3 robot
FREQUENCY = 125  # Frequency of data retrieval in Hz

# Establish RTDE connections
rtde_receiver = rtde_receive.RTDEReceiveInterface(ROBOT_IP)
rtde_controler = rtde_control.RTDEControlInterface(ROBOT_IP)

# Define the variables to retrieve from RTDE
variables = ["actual_TCP_pose"] #What does this do?

# Start data synchronization
rtde_receiver.startFileRecording("recorded_data.csv")

####################################################################
# Define the movement distance in the K1 coordinate system
move_distance_x = 0.0  # Move X.X m along the x-axis of K1
move_distance_y = -0.0  # Move X.X m along the y-axis of K1
move_distance_z = -0.0  # Move X.X m along the z-axis of K1
rotate_tcp_x = 0 # Rotate X.X rad around the x-axis of K0
rotate_tcp_y = 0 # Rotate X.X rad around the y-axis of K0
rotate_tcp_z = 0 # Rotate X.X rad around the z-axis of K0
####################################################################
try:
    while True:
        # Receive and parse RTDE data
        tcp_position = rtde_receiver.getActualTCPPose()

        # Matrix transformation
        origin_K1_wrt_K0 = tcp_position[:3]  # Origin of the TCP w.r.t. the Robot Base
        angles_K1_wrt_K0 = tcp_position[3:6]  # Orientation of the origin of the TCP w.r.t. the Robot Base

        # Create the movement vector in the K0 coordinate system
        move_vector_K0 = np.array([move_distance_x, move_distance_y, move_distance_z])

        # Calculate the new TCP position in the K0 coordinate system
        new_tcp_position_K0 = origin_K1_wrt_K0 + move_vector_K0

        # Move the robot to the new TCP position
        rtde_controler.moveL(new_tcp_position_K0.tolist() + [tcp_position[3]+rotate_tcp_x, tcp_position[4]+rotate_tcp_y, tcp_position[5]+rotate_tcp_z], 0.1, 0.2)

        # sleep(1)
        break

except KeyboardInterrupt:
    # Stop file recording and disconnect on keyboard interrupt
    rtde_receiver.stopFileRecording()
    rtde_receiver.disconnect()
    rtde_controler.disconnect()

finally:
    print("Disconnected from the UR5 CB3 robot")

In order to understand what rtde_receiver.getActualTCPPose() actually fetches, I saved its output to a txt file: "TCP Position: [0.17477931598731355, 0.5513537036720785, 0.524021777855706, 0.4299023783620231, -1.6571432341558983, 1.3242450163108708]". The first three values are the coordinates in meters and the last three are the orientation in rad. The documentation describes the function as "Actual Cartesian coordinates of the tool: (x,y,z,rx,ry,rz), where rx, ry and rz is a rotation vector representation of the tool orientation".
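As a sanity check that needs no robot, the rotation-vector part of that pose can be inspected with scipy. One pitfall worth noting: a rotation vector is an axis-angle representation, so feeding the same three numbers to from_rotvec and from_euler generally produces different rotations:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# The rx, ry, rz values from the recorded TCP pose above
rotvec = np.array([0.4299023783620231, -1.6571432341558983, 1.3242450163108708])

R_rotvec = Rotation.from_rotvec(rotvec).as_matrix()
R_euler = Rotation.from_euler("xyz", rotvec).as_matrix()  # same numbers misread as Euler angles

# A valid rotation matrix is orthonormal with determinant +1
assert np.allclose(R_rotvec @ R_rotvec.T, np.eye(3))
assert np.isclose(np.linalg.det(R_rotvec), 1.0)

# Interpreting the rotation vector as Euler angles gives a different rotation
print(np.allclose(R_rotvec, R_euler))  # -> False for this pose
```

So "rotvec or euler should not make a difference" does not hold in general; for this pose the two interpretations disagree.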

As described above, I wrote a simpler version of the code to rule out basic connectivity issues or the like. I am also quite certain that I calculate the rotation matrix correctly: I investigated the libraries I use and, to the best of my knowledge, I am using everything correctly; in particular scipy's rot_K0_to_K1 = tf.Rotation.from_rotvec(angles_K1_wrt_K0) seems correct. If you have any input, please let me know.


Edit: I forgot to add a version of the code that runs while not connected to a robot. This one has arbitrary, but realistic, position data for the UR5. I hope that makes it easier to understand my issue.

from time import sleep
import numpy as np
from scipy.spatial import transform as tf

print("Environment Ready")

####################################################################
# Define the movement distance in the K1 coordinate system
move_distance_x = 0.0  # Move X.X m along the x-axis of K1
move_distance_y = 0.0  # Move X.X m along the y-axis of K1
move_distance_z = -0.1  # Move X.X m along the z-axis of K1
####################################################################

# Pseudo TCP Position for test purposes
tcp_position = [0.17477931598731355, 0.5513537036720785, 0.524021777855706, 0.4299023783620231, -1.6571432341558983, 1.3242450163108708] # Arbitrary robot pose: the first three are the xyz coordinates of the TCP in the base frame, the last three are the rotation vector of the tool orientation

# Matrix transformation
origin_K1_wrt_K0 = tcp_position[:3]  # Origin of the TCP w.r.t. the Robot Base
angles_K1_wrt_K0 = tcp_position[3:6]  # Orientation of the origin of the TCP w.r.t. the Robot Base

# Create a rotation matrix based on the angles with scipy transform
rot_K0_to_K1 = tf.Rotation.from_rotvec(angles_K1_wrt_K0) # the pose angles are a rotation vector (axis-angle), not Euler angles
rot_mat_K0_to_K1 = rot_K0_to_K1.as_matrix()
print(f"Rotation Matrix K0 to K1: {rot_mat_K0_to_K1}")
rot_mat_K1_to_K0 = rot_mat_K0_to_K1.T

# Create the movement vector in the K1 coordinate system
move_vector_K1 = np.array([move_distance_x, move_distance_y, move_distance_z])

# Convert the movement vector from K1 to K0 coordinate system
move_vector_K0 = rot_mat_K1_to_K0 @ move_vector_K1
print(f"Move Vector K0: {move_vector_K0}", f"Shape of Move Vector K0: {move_vector_K0.shape}")
# Calculate the new TCP position in the K0 coordinate system
new_tcp_position_K0 = origin_K1_wrt_K0 + move_vector_K0

# Move the robot to the new TCP position
new_complete_tcp_description = new_tcp_position_K0.tolist() + [tcp_position[3], tcp_position[4], tcp_position[5]]
print("new_complete_tcp_description: ", new_complete_tcp_description)
# sleep(1)


Solution

  • Well, I solved it. It turns out the TCP position and orientation had been calibrated incorrectly before I started using this robot. The calibration was done on the robot itself, not in my code, so I could not find it directly. My motions were actually correct, but the coordinate system K1 was not what I expected it to be. I hope someone sees this before they waste as much time as I did hunting for the error. Once again, we see how important systematic debugging is.
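A systematic check that would catch such a calibration issue early: command a small move along one tool axis, then compare the measured base-frame displacement against the one predicted from the reported orientation. A miscalibrated tool frame shows up as a large angle between the two. A minimal sketch (the helper name and the synthetic poses below are my own, not from the original code):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def tcp_frame_error_deg(pose_before, pose_after, move_K1):
    """Angle between the measured displacement and the one predicted
    from the reported orientation.

    pose_*: [x, y, z, rx, ry, rz] as returned by getActualTCPPose().
    move_K1: commanded offset in the tool frame K1.
    """
    measured = np.asarray(pose_after[:3]) - np.asarray(pose_before[:3])
    R = Rotation.from_rotvec(pose_before[3:6]).as_matrix()
    predicted = R @ np.asarray(move_K1)
    cos = np.dot(measured, predicted) / (
        np.linalg.norm(measured) * np.linalg.norm(predicted))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Synthetic example: tool rotated 90 deg about the base x-axis,
# commanded 0.1 m along the tool z-axis (which then points along base -y)
before = [0.2, 0.5, 0.5, np.pi / 2, 0.0, 0.0]
after = [0.2, 0.4, 0.5, np.pi / 2, 0.0, 0.0]
print(tcp_frame_error_deg(before, after, [0.0, 0.0, 0.1]))  # ~0 deg: frames agree
```

With real robot data, an error of tens of degrees here would have pointed at the tool-frame calibration rather than at the math.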