The initialization step builds on the ideas proposed in . During training, each image is projected as a point into the eigenspace, and the corresponding pose of the object is stored with that point. For each object, we used 96 training images (8 rotations for each angle at 4 different depths). One reason for choosing this low number of training images is the limited workspace of the PUMA560 robot used; for our applications this discretization proved satisfactory. To enhance robustness with respect to intensity variations, all images are normalized. At this stage, the training samples are 100×100-pixel color images. The training procedure takes about 3 minutes on a Pentium III 550 running Linux.
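The training step above can be sketched as follows. This is a minimal illustration, not the paper's implementation: function and parameter names are our own, and the number of retained eigenvectors `k` is an assumption.

```python
import numpy as np

def build_eigenspace(images, poses, k=20):
    """Sketch of the training step: build an eigenspace from normalized
    training images and store each image's eigenspace coordinates together
    with its pose (names and k are illustrative assumptions).

    images: (N, H*W*C) array of flattened, intensity-normalized images
    poses:  (N, 6) array of pose parameters (3 translations + 3 angles)
    """
    X = np.asarray(images, dtype=float)
    mean = X.mean(axis=0)
    # Principal components of the mean-subtracted training set via SVD
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:k]                    # (k, H*W*C) leading eigenvectors
    coeffs = (X - mean) @ basis.T     # each image as a point in eigenspace
    return mean, basis, coeffs, np.asarray(poses)
```

Each row of `coeffs` is one training image's point in the eigenspace; the pose stored alongside it is what the recognition step later retrieves.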
Given an input image, it is first projected into the eigenspace. The corresponding pose parameters are found as the closest point on the pose manifold, after which the wire-frame model of the object can easily be overlaid on the image. Since only a small number of images is used in training, the retrieved pose parameters will not correspond exactly to the input image. Therefore, a local refinement method is used for the final fitting, see Fig. . The details are given in the next section.
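The retrieval step can be sketched as a nearest-neighbour lookup in the eigenspace (a simplified illustration under the assumptions of the previous sketch; the local refinement that follows is not shown):

```python
import numpy as np

def estimate_pose(image, mean, basis, coeffs, poses):
    """Project an input image into the eigenspace and return the pose
    stored with the closest training point (illustrative sketch; the
    paper's method additionally refines this initial pose)."""
    p = (np.asarray(image, dtype=float).ravel() - mean) @ basis.T
    i = np.argmin(np.linalg.norm(coeffs - p, axis=1))
    return poses[i]
```

In practice the closest point would be sought on an interpolated pose manifold rather than only among the discrete training samples, which is one reason the subsequent refinement is needed.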
During the training step, the object is assumed to be approximately centered in the image. During task execution, however, the object can occupy an arbitrary part of the image. Since the recognition step delivers the image position of the object, it is easy to estimate the object's offset from the image center and compensate for it. In this way, the pose of the object relative to the camera frame can also be arbitrary.
An example of the pose initialization is presented in Fig. . Here, the pose of the object in the training image (far left) was: X = -69.3, Y = 97.0, Z = 838.9, with rotation angles 21.0, 8.3 and -3.3. After the fitting step the pose was: X = 55.9, Y = 97.3, Z = 899.0, with rotation angles 6.3, 14.0 and 1.7 (far right), showing the ability of the system to cope with significant differences in pose parameters (units are mm and degrees).