Hand-Eye Calibration

With a hand-eye calibration, you can determine the position of the camera relative to a robot. The NxLib supports two different types of setup. The camera can either be fixed with respect to the robot’s base or it can be mounted on the robot’s hand and move with it.

Fixed Camera

When the camera is fixed, the calibration can find out the position of the camera relative to the robot’s base.

In order to do this, you need to grab a calibration pattern with the robot and move it to different positions. Capture an image of the calibration pattern in each of these poses. The calibration can then calculate the camera’s position as well as the position of the calibration pattern relative to the robot’s hand (which does not need to be known beforehand).

Moving Camera

When the camera is mounted on the robot’s hand, the calibration can find out the exact position of the camera relative to the robot’s hand.

In order to do this, you need to place a calibration pattern in the robot’s workspace. Then you can move the robot to different poses and capture an image of the calibration pattern from each of them. The calibration can then calculate the camera’s exact position on the robot as well as the position of the calibration pattern in the robot’s workspace (which does not need to be known beforehand).

Steps to Perform a Hand-Eye Calibration

You can follow almost the same steps for performing both types of hand-eye calibration. The following instructions are meant for a fixed camera setup. When they differ, the instructions for a moving camera setup are given in parentheses.

  • Mount the camera inside of the robot’s workspace (or on the robot’s hand).

  • Place the calibration pattern in the robot’s hand (or somewhere in the robot’s workspace).

  • Move the robot into the start pose, i.e. a position and orientation from which the camera can detect the calibration pattern well.

  • Set the GridSpacing to the correct value for your pattern. Alternatively, you can use the DecodeData parameter of the CollectPattern command below.

  • Make sure that you don’t have any collected patterns in the pattern buffer by checking PatternCount or executing the DiscardPatterns command.

  • Move the robot into different poses around the start pose. Random angle and position deltas work fairly well and are easy to implement. At each of these positions, perform the following steps.

    • Stop for a while (between 0.5 and 2 seconds). Most robots tend to oscillate around the target position a little, even after signaling that they have reached the target pose. Not waiting long enough can therefore reduce the calibration accuracy significantly.

    • Capture an image.

    • Collect an observation of the calibration pattern using the CollectPattern command. The observed pattern position will be stored in the pattern buffer.

    • Retrieve the exact pose of the robot hand from your robot and store it in your own program. You can also store it in a temporary node inside of the NxLib tree.

  • After observing the calibration pattern at least 5 times, you can start the calibration. The more observations you store, the more accurate the calibration will be (at the cost of a longer calibration time).

  • Execute the CalibrateHandEye command. You will need to pass your stored robot poses in the Transformations parameter. Additionally, you can specify the type of calibration that you are performing (fixed or moving) with the Setup parameter.

  • Wait for the command to finish. The calibration might take around 15 to 120 seconds to complete.
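The random angle and position deltas mentioned in the steps above can be generated with a few lines of code. The following sketch is plain C++ and independent of the NxLib; the ±50 mm and ±10° ranges are illustrative assumptions that you should adapt to your robot and workspace so that the pattern stays visible from every resulting pose.

```cpp
#include <array>
#include <random>

// A pose delta around the start pose: translation offsets in mm and rotation
// offsets in degrees (e.g. roll/pitch/yaw).
struct PoseDelta
{
	std::array<double, 3> translation;
	std::array<double, 3> rotation;
};

// Draw a random pose delta. The ranges below are illustrative assumptions;
// choose them so that the camera can still see the pattern from all poses.
PoseDelta randomPoseDelta(std::mt19937& rng)
{
	std::uniform_real_distribution<double> positionDelta(-50.0, 50.0); // mm
	std::uniform_real_distribution<double> angleDelta(-10.0, 10.0);    // degrees
	return {{positionDelta(rng), positionDelta(rng), positionDelta(rng)},
	        {angleDelta(rng), angleDelta(rng), angleDelta(rng)}};
}
```

In the loop of the example code below, you would apply such a delta to the start pose before commanding the robot to move.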

Note

When executing CalibrateHandEye, the given robot poses must correspond exactly to the stereo pattern observations in the pattern buffer. You should check the result of the CollectPattern command and only save the robot pose if the pattern was observed successfully in both sensors.

When the calibration is finished, it will set the Link node of your camera to the estimated transformation (either between the camera and the robot’s hand or between the camera and the robot’s base). Additionally, it will set the camera’s Target node to the Target parameter that was given to the CalibrateHandEye command. This is important, as you can use this link to get 3D data in world coordinates later. See below for detailed instructions on how this works.

Additionally, you can find the estimated pose of the pattern in the command’s result node. This pose is either relative to the robot’s hand (for a fixed camera) or to the robot’s base (for a moving camera).

Note

  • The new link data is not permanently stored on the camera unless you use the StoreCalibration command.

  • After mounting a camera at a new position, it might be useful to perform a recalibration. This can be done with the patterns that you collected for the hand-eye calibration anyway. See this how-to for a code snippet that performs a recalibration and can be inserted into the example code below.

Example Code

The following code outlines the process of a hand-eye calibration. Note that it only contains the parts that are done by the NxLib. You will have to insert the necessary code to move your robot and get its current position.

Note

  • We assume that the camera with the serial “1234” is already open. See here for information on how this can be done.

  • See here for a code snippet that does a recalibration.

// Move your robot into a suitable starting position here. Make sure that the
// pattern can be seen from the selected position.
// ...

// Set the pattern's grid spacing so that we don't need to decode the data from
// the pattern later. You will need to adapt this line to the size of the
// calibration pattern that you are using.
NxLibItem()[itmParameters][itmPattern][itmGridSpacing] = 12.5;

// Discard any pattern observations that might already be in the pattern buffer.
NxLibCommand(cmdDiscardPatterns).execute();

// Turn off the camera's projector so that we can observe the calibration
// pattern.
NxLibItem()[itmCameras][itmBySerialNumber]["1234"][itmParameters][itmCapture][itmProjector] = false;
NxLibItem()[itmCameras][itmBySerialNumber]["1234"][itmParameters][itmCapture][itmFrontLight] = true;

// We will observe the pattern 20 times. You can adapt this number depending on
// how accurate you need the calibration to be.
for (int i = 0; i < 20; i++) {
	// Move your robot to a new position from which the pattern can be seen. It
	// might be a good idea to choose these positions randomly.
	// ...

	// Make sure that the robot is not moving anymore. You might want to wait
	// for a few seconds to avoid any oscillations.
	std::this_thread::sleep_for(std::chrono::seconds(2));

	// Observe the calibration pattern and store the observation in the pattern
	// buffer.
	NxLibCommand capture(cmdCapture);
	capture.parameters()[itmCameras] = "1234";
	capture.execute();
	bool foundPattern = false;
	try {
		NxLibCommand collectPattern(cmdCollectPattern);
		collectPattern.parameters()[itmCameras] = "1234";
		collectPattern.execute();
		foundPattern = collectPattern.result()[itmStereo].exists();
	} catch (NxLibException&) {
	}

	if (foundPattern) {
		// We actually found a pattern. Get the current pose of your robot (from
		// which the pattern was observed) and store it somewhere.
		// ...
	} else {
		// The calibration pattern could not be found in the camera image. When your
		// robot poses are selected randomly, you might want to choose a different
		// one.
	}
}

// You can insert a recalibration here, as you already captured stereo patterns
// anyway. See here for a code snippet that does a recalibration.

// We collected enough patterns and can start the calibration.
NxLibCommand calibrateHandEye(cmdCalibrateHandEye);
calibrateHandEye.parameters()[itmSetup] = valFixed; // Or valMoving if you have a moving setup.

// At this point, you need to put your stored robot poses into the command's
// Transformations parameter.
calibrateHandEye.parameters()[itmTransformations] = ...;

// Start the calibration. Note that this might take a few minutes if you did a
// lot of pattern observations.
calibrateHandEye.execute();

// Store the new calibration to the camera's EEPROM.
NxLibCommand storeCalibration(cmdStoreCalibration);
storeCalibration.parameters()[itmCameras] = "1234";
storeCalibration.parameters()[itmLink] = true;
storeCalibration.execute();
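The Transformations parameter above expects one robot pose per collected pattern observation. The NxLib typically represents a transformation as an angle-axis rotation plus a translation; verify the exact node layout against the documentation of the Transformations parameter. The helper below is a hypothetical sketch (makePoseJson is not part of the NxLib) showing how such a pose could be assembled as JSON.

```cpp
#include <sstream>
#include <string>

// Sketch: build a transformation in an angle-axis-plus-translation layout as
// a JSON string. The node layout is an assumption here; check it against the
// Transformations parameter documentation. Angle in radians, axis as a unit
// vector, translation in mm.
std::string makePoseJson(double angle, double ax, double ay, double az,
                         double tx, double ty, double tz)
{
	std::ostringstream json;
	json << "{\"Rotation\":{\"Angle\":" << angle
	     << ",\"Axis\":[" << ax << "," << ay << "," << az << "]},"
	     << "\"Translation\":[" << tx << "," << ty << "," << tz << "]}";
	return json.str();
}
```

A stored pose could then be written into the parameter tree with something like calibrateHandEye.parameters()[itmTransformations][i].setJson(...), filling one slot per collected pattern.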

Grabbing 3D Data in World Coordinates

After performing a hand-eye calibration, you can get your 3D data in world coordinates. This works with the NxLib’s transformation system. If you do not know this system yet, you can read about it in the documentation of the Links node or in this overview.

Note that you can change the coordinate system to which the camera is linked after the hand-eye calibration by specifying the hand-eye calibration’s Target parameter.

Fixed Camera

For a fixed camera, the transformation to robot coordinates is done automatically. The camera’s Link node is set to the transformation between the camera and the robot’s base, so any point map will automatically be transformed into robot base coordinates.

The hand-eye calibration sets the camera’s Target node to the “Workspace” coordinate system by default. When your world frame differs from your robot base frame, you can therefore specify the transformation between the two by setting the “Workspace” coordinate system in the Links node. An example function that grabs 3D data in world coordinates could look like this.

void grabDataInWorldCoordinates()
{
	// Get the current transformation between the robot base and the world frame.
	// ...

	// Report this transformation to the NxLib by setting the "Workspace" link.
	NxLibItem()[itmLinks][valWorkspace] = ...;

	// Now you can execute the usual commands for computing a point map. The
	// result will automatically be transformed to world coordinates.
	NxLibCommand(cmdCapture).execute();
	NxLibCommand(cmdComputeDisparityMap).execute();
	NxLibCommand(cmdComputePointMap).execute();

	// The point map (in world coordinates) is now stored in the camera's PointMap
	// node.
}

Moving Camera

For a moving camera, the calibration only sets the camera’s Link node to the transformation between the camera and the robot’s hand. This means that when you compute a point map, it will be transformed from camera coordinates to robot hand coordinates. For getting the data in robot base or world coordinates, you need to tell the NxLib the current pose of your robot. This is done by specifying the corresponding transformation in the Links node. By default, the hand-eye calibration sets your camera’s Target node to the “Hand” link. This means that you need to put the robot’s pose into the Hand link node.

As for the fixed camera, you can point the Hand frame’s Target node to another frame (e.g. “Workspace”) and put the transformation between the “Hand” and “Workspace” frames into the Links node. This way you can get the 3D data in world coordinates.

An example function that grabs 3D data in robot base coordinates could look like this.

void grabDataInRobotBaseCoordinates()
{
	// Get the current pose of your robot.
	// ...

	// Report this pose to the NxLib by setting the Hand link.
	NxLibItem()[itmLinks][valHand] = ...;

	// Now you can execute the usual commands for computing a point map. The
	// result will automatically be transformed to robot base coordinates.
	NxLibCommand(cmdCapture).execute();
	NxLibCommand(cmdComputeDisparityMap).execute();
	NxLibCommand(cmdComputePointMap).execute();

	// The point map (in robot base coordinates) is now stored in the camera's
	// PointMap node.
}
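Conceptually, the link chain described above is just a composition of rigid 4x4 transformations: a point in camera coordinates is mapped to hand coordinates by the camera’s Link, and then to base coordinates by the Hand link. The NxLib performs this chaining for you; the following plain C++ sketch (not NxLib API) only illustrates the underlying composition.

```cpp
#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>;

// Matrix product a * b, i.e. the transformation that first applies b and
// then a.
Mat4 compose(const Mat4& a, const Mat4& b)
{
	Mat4 r{};
	for (int i = 0; i < 4; i++)
		for (int j = 0; j < 4; j++)
			for (int k = 0; k < 4; k++)
				r[i][j] += a[i][k] * b[k][j];
	return r;
}

// A pure translation (identity rotation), e.g. in mm.
Mat4 translation(double x, double y, double z)
{
	Mat4 m{};
	for (int i = 0; i < 4; i++) m[i][i] = 1.0;
	m[0][3] = x;
	m[1][3] = y;
	m[2][3] = z;
	return m;
}
```

With cameraToHand being the result of the hand-eye calibration and handToBase the current robot pose, compose(handToBase, cameraToHand) maps camera coordinates to robot base coordinates, which is exactly what the Links mechanism computes for the point map.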

Recalibrating Robot Geometry

The NxLib’s CalibrateHandEye command can also take robot poses in the form of joint angles together with a geometric robot model. This allows the internal iteration procedure to estimate offset angles by which the real-world zero angle of each joint deviates from the zero angle assumed by the robot controller. The computed offset angles (see the Angles result) can then be fed back into the controller to achieve a more accurate hand-eye calibration thanks to improved absolute accuracy in the calibrated working volume. Please have a look at the Links parameter of CalibrateHandEye for an example robot description.

Calibrating Robots with Less Than 6 Degrees of Freedom

To calibrate a robot with fewer than 6 degrees of freedom, you need to lock some parameters using the Estimate parameter. Please refer to the documentation of cmdCalibrateHandEye for how to use this parameter with different robot geometries.