Hand-Eye Calibration

With a hand-eye calibration, you can determine the position of the camera relative to a robot. The NxLib supports two different types of setup. The camera can either be fixed with respect to the robot or it can be mounted on the robot hand and move with it.

After mounting a camera at a new position, it might be useful to perform a recalibration. This can be done with the patterns that you are collecting for the hand-eye calibration anyway. See the documentation on Dynamic Recalibration for a code snippet that performs a recalibration based on collected pattern observations and that can be inserted into the code example below.

Note

In the following guide, the involved transformations are written in the notation Reference_from_Target (see the Transformation page for details on the frames).
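
For example, transformations in this notation chain from right to left, with each transformation mapping points from its Target frame into its Reference frame. In the fixed setup below, a point on the calibration pattern is mapped into camera coordinates via Camera_from_Pattern = Camera_from_RobotOrigin * RobotOrigin_from_Hand * Hand_from_Pattern.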

Fixed Camera

When the camera is fixed, the calibration determines the camera pose relative to the robot origin as the transformation Camera_from_RobotOrigin (see below).

In order to do this, you need to grab a calibration pattern with the robot, move it to different poses and capture an image of the calibration pattern in each of them. The calibration can then calculate the camera position as well as the position of the calibration pattern relative to the robot hand (which does not need to be known beforehand).

The following image shows the fixed hand-eye calibration with its coordinate systems and transformations:

[Image: hand_eye_fixed.svg]

Transformation Overview

  • The user has to supply the robot hand poses RobotOrigin_from_Hand of the collected patterns.

  • The pattern poses Camera_from_Pattern are supplied by the collected patterns from the camera (see the Steps section below).

  • After the calibration, the transformation Camera_from_RobotOrigin and the pattern pose Hand_from_Pattern have been estimated. If the default Target has not been changed, Camera_from_RobotOrigin is written to the camera’s Link node as Camera_from_Workspace.
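
Once the calibration has finished, the estimated transformation can be read back from the camera’s Link node. The following is a minimal sketch, assuming the camera serial “1234” from the example below and the default Target:

// Read the estimated Camera_from_Workspace transformation from the camera's
// Link node as JSON.
std::string cameraFromWorkspace = NxLibItem()[itmCameras]["1234"][itmLink].asJson(true);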

Moving Camera

When the camera is mounted on the robot’s hand, the calibration determines the camera pose relative to the robot hand as the transformation Camera_from_Hand (see below).

In order to do this, you need to place a calibration pattern in the robot’s workspace. Then you can move the robot to different poses and capture an image of the calibration pattern from each of them. The calibration can then calculate the camera’s exact position on the robot as well as the position of the calibration pattern in the robot’s workspace (which does not need to be known beforehand).

The following image shows the moving hand-eye calibration with its coordinate frames and transformations:

[Image: hand_eye_moving.svg]

Transformation Overview

  • The user has to supply the robot hand poses RobotOrigin_from_Hand of the collected patterns.

  • The pattern poses Camera_from_Pattern are supplied by the collected patterns from the camera (see next section).

  • After the calibration, the transformation Camera_from_Hand and the pattern pose RobotOrigin_from_Pattern have been estimated. If the default Target has not been changed, Camera_from_Hand is written to the camera’s Link node.
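
For reference, each collected pose in the moving setup satisfies Camera_from_Pattern = Camera_from_Hand * Hand_from_RobotOrigin * RobotOrigin_from_Pattern, where Hand_from_RobotOrigin is the inverse of the supplied robot pose. The calibration estimates Camera_from_Hand and RobotOrigin_from_Pattern so that this relation holds as well as possible for all collected observations.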

Steps to Perform a Hand-Eye Calibration

You can follow almost the same steps for both types of hand-eye calibration. The following instructions are written for a fixed camera setup; where they differ, the instructions for a moving camera setup are given in parentheses.

  • Mount the camera inside of the robot’s workspace (or on the robot’s hand).

  • Place the calibration pattern in the robot’s hand (or somewhere in the robot’s workspace).

  • Move the robot into the start pose, i.e. a position and orientation from which the camera can detect the calibration pattern well.

  • Set the GridSpacing to the correct value for your pattern. Alternatively, you can use the DecodeData parameter of the CollectPattern command (see the sketch after this list).

  • Make sure that you don’t have any collected patterns in the PatternBuffer by checking PatternCount or executing the DiscardPatterns command.

  • Move the robot into different poses around the start pose. Random angle and position deltas work fairly well and are easy to implement. At each of these positions, perform the following steps.

    • Stop for a while (between 0.5 and 2 seconds). Most robots tend to oscillate around the target position a little, even after signaling that they have reached the target pose. Not waiting long enough can therefore reduce the calibration accuracy significantly.

    • Capture an image.

    • Collect an observation of the calibration pattern using the CollectPattern command. The observed pattern position will be stored in the PatternBuffer.

    • Retrieve the exact pose of the robot hand from your robot and store it in your own program. You can also store it in a temporary node inside of the NxLib tree.

  • The given robot poses must correspond exactly to the stereo pattern observations in the PatternBuffer. You should check the result of the CollectPattern command and only save the robot pose if the pattern could be observed successfully in both sensors.

  • After observing the calibration pattern at least 5 times, you can start the calibration. The more observations you store, the more accurate the calibration will be (at the cost of a longer calibration time).

  • Consider a recalibration as described above.

  • Execute the CalibrateHandEye command. You will need to pass your stored robot poses in the Transformations parameter. Additionally, you can specify the type of calibration that you are performing (fixed or moving) with the Setup parameter.

  • Wait for the command to finish. The calibration might take around 15 to 120 seconds to complete.
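
If you prefer the DecodeData alternative mentioned in the GridSpacing step above, the pattern collection can look like the following minimal sketch (assuming the camera serial “1234” used in the example below):

// Let CollectPattern decode the grid spacing from the pattern itself instead
// of relying on a previously configured GridSpacing value.
NxLibCommand collectPattern(cmdCollectPattern);
collectPattern.parameters()[itmCameras] = "1234";
collectPattern.parameters()[itmDecodeData] = true;
collectPattern.execute();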

Example Code

The following code outlines the process of a hand-eye calibration. Note that it only contains the parts that are done by the NxLib. You will have to insert the necessary code to move your robot and query its current position. We assume that the camera with the serial “1234” is already open; see the NxLib documentation on opening cameras for how this can be done.

// Required headers: the NxLib C++ API and the standard library facilities used
// below for sleeping between robot poses.
#include "nxLib.h"

#include <chrono>
#include <thread>

// Move your robot into a suitable starting position here. Make sure that the
// pattern can be seen from the selected position.
// ...

// Set the pattern's grid spacing so that we don't need to decode the data from
// the pattern later. You will need to adapt this line to the size of the
// calibration pattern that you are using.
NxLibItem()[itmParameters][itmPattern][itmGridSpacing] = 12.5;

// Discard any pattern observations that might already be in the pattern buffer.
NxLibCommand(cmdDiscardPatterns).execute();

// Turn off the camera's projector so that we can observe the calibration
// pattern.
NxLibItem()[itmCameras]["1234"][itmParameters][itmCapture][itmProjector] = false;
NxLibItem()[itmCameras]["1234"][itmParameters][itmCapture][itmFrontLight] = true;

// We will observe the pattern 20 times. You can adapt this number depending on
// how accurate you need the calibration to be.
for (int i = 0; i < 20; i++) {
	// Move your robot to a new position from which the pattern can be seen. It
	// might be a good idea to choose these positions randomly.
	// ...

	// Make sure that the robot is not moving anymore. You might want to wait
	// for a few seconds to avoid any oscillations.
	std::this_thread::sleep_for(std::chrono::seconds(2));

	// Observe the calibration pattern and store the observation in the pattern
	// buffer.
	NxLibCommand capture(cmdCapture);
	capture.parameters()[itmCameras] = "1234";
	capture.execute();
	bool foundPattern = false;
	try {
		NxLibCommand collectPattern(cmdCollectPattern);
		collectPattern.parameters()[itmCameras] = "1234";
		collectPattern.execute();
		foundPattern = collectPattern.result()[itmStereo].exists();
	} catch (NxLibException&) {
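		// CollectPattern failed with an exception. foundPattern stays false, so
		// this pose is simply skipped.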
	}

	if (foundPattern) {
		// We actually found a pattern. Get the current pose of your robot (from
		// which the pattern was observed) and store it somewhere.
		// ...
	} else {
		// The calibration pattern could not be found in the camera image. When your
		// robot poses are selected randomly, you might want to choose a different
		// one.
	}
}

// You can insert a recalibration here, as you already captured stereo patterns
// anyway. See here for a code snippet that does a recalibration.

// We collected enough patterns and can start the calibration.
NxLibCommand calibrateHandEye(cmdCalibrateHandEye);
calibrateHandEye.parameters()[itmCameras] = "1234";
calibrateHandEye.parameters()[itmSetup] = valFixed; // Or valMoving if you have a moving setup.

// At this point, you need to put your stored robot poses into the command's Transformations parameter.
// We support a range of transformation formats:
//  - as object with Axis-angle rotation:
//     [{"Rotation": {"Angle" : 3.141592, "Axis" : [0.26726, 0.53452, 0.80178]}, "Translation": [-10,30,200]},
//     {"Rotation": ..., "Translation": ...}, {...}, ...]
//  - as object with Euler angles and known manufacturer:
//     [{"Rotation": {"Angles" : [3.141592, 1.12234, 0.66544], "Convention" : "Kuka"}, "Translation": [-44,89,120]},
//     {"Rotation": ..., "Translation": ...}, {...}, ...]
//  - as object with Euler angles and convention:
//     [{"Rotation": {"Angles" : [2.14, 0.12234, 0.16333], "Convention" : {"Axes":"ZYZ", "Extrinsic":true}},
//     "Translation": [-74,9,12]}, {"Rotation": ..., "Translation": ...}, {...}, ...]
//  - as homogeneous matrix:
//     [..., [[0.5,0.0,-0.86603,0.0],[0.0,1.0,0.0,0.0],[0.86603,0.0,0.5,0.0],[50.0,-6.0,300.0,1.0]], ...]
// They all have in common that distances are interpreted in millimeters and angles in radians.
calibrateHandEye.parameters()[itmTransformations] << transformationsInJsonFormat;

// Start the calibration. Note that this might take a few minutes if you did a
// lot of pattern observations.
calibrateHandEye.execute();

// Store the new calibration to the camera's EEPROM.
NxLibCommand storeCalibration(cmdStoreCalibration);
storeCalibration.parameters()[itmCameras] = "1234";
storeCalibration.parameters()[itmLink] = true;
storeCalibration.execute();

After the Calibration

When the calibration is finished, the command will set the Link node of your camera to the estimated transformation as described above. The corresponding link Target will be set to the Target parameter specified in the parameters of the hand-eye calibration command. The new link data, however, is not stored in the camera permanently unless you use the StoreCalibration command to write it to the EEPROM.

Additionally, you can find the estimated PatternPose in the result node.
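
A minimal sketch for reading it, assuming the calibrateHandEye command object from the example above and the item constant itmPatternPose:

// Read the estimated pattern pose from the command's result node.
std::string patternPoseJson = calibrateHandEye.result()[itmPatternPose].asJson(true);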

In order to understand how the hand-eye calibration command affects your coordinate systems, the following sections explain the effect of the corresponding setups and provide further code examples. If you are not familiar with the transformation system of the NxLib, please read the topic on Links, Coordinate Systems and LinkTree first.

Fixed Camera

The camera is now linked to the robot origin via the coordinate system whose name is defined by Target, defaulting to “Workspace”. This coordinate system is at the same time the world coordinate system in which the 3D data of the camera will be returned.

If you want a different world coordinate system, you have two options, which are effectively equivalent:

  1. Specify your desired world coordinate system by adding a link from the Target coordinate system to e.g. “World” to the user defined links node.

  2. Repeat the calibration with a different Target - e.g. “RobotOrigin” - then perform step 1.
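
A minimal sketch of option 1, assuming the default Target “Workspace” and a hypothetical JSON transformation workspaceFromWorldJson that you provide (see the Links page for the exact direction and node layout):

// Add a user defined link that relates the "Workspace" frame to a new "World"
// frame. The link's Target subnode names the frame the link points to.
NxLibItem workspaceLink = NxLibItem()[itmLinks][valWorkspace];
workspaceLink << workspaceFromWorldJson;
workspaceLink[itmTarget] = "World";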

Moving Camera

The camera is now linked to the coordinate system with the name defined by Target, defaulting to “Hand”. This means that the 3D data of the camera will be returned in a moving target coordinate system. In order to get the 3D data in a static coordinate system (e.g. “RobotOrigin”), the NxLib always needs to know the current transformation between target and that static coordinate system. This is done by specifying the corresponding transformation in the user defined links node, where you can find an example for the case “Hand” to “Workspace”. After specifying the link, the static coordinate system “Workspace” becomes the world coordinate system in which the 3D data of the camera will be returned.

Example Code

void grabDataInWorldCoordinates()
{
	// Get the current transformation between the robot origin and the desired
	// world coordinate system.
	// ...

	// Report this transformation to the NxLib by setting the "Workspace" link.
	// Note that an empty target node marks the world coordinate system.
	NxLibItem()[itmLinks][valWorkspace] << transformationInJsonFormat;

	// Now you can execute the usual commands for computing a point map. The
	// result will automatically be transformed to world coordinates.
	NxLibCommand(cmdCapture).execute();
	NxLibCommand(cmdComputeDisparityMap).execute();
	NxLibCommand(cmdComputePointMap).execute();

	// The point map (in world coordinates) is now stored in the camera's
	// PointMap node.
}

Recalibrating Robot Geometry

The NxLib hand-eye calibration command can also take robot poses in the form of joint angles together with a geometric robot model. This allows the internal optimization to estimate the offset angles by which the real-world zero angle of each joint deviates from the zero angle assumed by the robot controller. The computed offset angles (see the Angles result) can then be fed back into the controller to achieve a more accurate hand-eye calibration, thanks to improved absolute accuracy in the calibrated working volume. Please have a look at the Links parameter of the CalibrateHandEye command for an example robot description.
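
A minimal sketch for reading the estimated offsets, assuming a CalibrateHandEye command object that was parameterized with joint angles and a robot model as described above:

// Read the estimated joint offset angles (in radians) from the result node
// and feed them back into your robot controller.
NxLibItem angles = calibrateHandEye.result()["Angles"];
for (int i = 0; i < angles.count(); i++) {
	double offset = angles[i].asDouble();
	// Apply the offset to the corresponding joint in your controller here.
	// ...
}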

Calibrating Robots with Less Than 6 Degrees of Freedom

To calibrate a robot with fewer degrees of freedom, you need to lock some parameters using the Estimate parameter. Please refer to the documentation of the CalibrateHandEye command on how to use this parameter for different robot geometries.

Calibration Result Improvement

The following is a list of possible measures that can be taken to improve the results of a hand-eye calibration:

  • Check the camera calibration beforehand

  • Use a camera working distance similar to that used during operation

  • Use robot poses for capturing that are similar to the ones used during operation

  • Try to use the same arm configuration as during operation

  • Perform the calibration with the same weight on the robot arm as used during operation

  • Add enough waiting time after stopping at each pose (some controllers improve the position even after reporting that they have reached the pose)