
Camera Calibration

A complete camera calibration process for a given camera results in a computational model that characterizes how the camera forms images and where the camera is positioned. The calibration process is typically divided into two steps:

  1. Solve for the unknown variables that govern how the camera takes pictures (referred to as the intrinsic parameters).
  2. Solve for the unknown variables that specify the position and orientation of the camera in the world/global coordinate frame (referred to as the extrinsic parameters).
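
As a point of reference (this is the standard pinhole-camera notation, not something taken from this page's software), the two parameter sets combine in the projection model:

x \sim K \, [\, R \mid t \,] \, X, \qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}

where the intrinsic matrix K holds the focal lengths (f_x, f_y) and principal point (c_x, c_y), and the extrinsic parameters (R, t) rotate and translate a world point X into the camera frame before projection to pixel coordinates x.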

In our case, the extrinsic parameters we need to compute relate the camera pose, i.e., the position and orientation of the camera, to the coordinate frame of the motion capture system in the laboratory (made by OptiTrack).

My approach was to first perform a complete calibration using small-scale calibration boards/patterns to work out the calibration process. Then, to improve accuracy, I plan to scale up the calibration patterns so that the same software pipeline can be reused, changing only the metric configuration of the calibration boards.

Creating a camera calibration board

ROS Intrinsic (Monocular) Camera Calibration

Since I often switch my workspace between the ROS Indigo and Jade platforms, I compile all of the nodes in my ROS calibration workspace from source. In this particular example I will be calibrating the RGB camera in my RGBD sensors (Asus Xtion Pro Live / Kinect Xbox 360). To do so I needed to install the following ROS packages into my workspace: image_pipeline, vision_opencv, rgbd_launch, openni2_launch, openni2_camera, and freenect_stack. The package rgbd_launch is required for either RGBD sensor, the packages openni2_launch and openni2_camera provide the drivers for the Asus Xtion, and the package freenect_stack provides the drivers for the Kinect Xbox 360.
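
As a sketch of that workspace setup (the workspace path ~/calib_ws and the repository locations are my assumptions, not from the original page), the packages can be cloned into a catkin workspace and built together:

~/calib_ws/src$ git clone https://github.com/ros-perception/image_pipeline.git
~/calib_ws/src$ git clone https://github.com/ros-perception/vision_opencv.git
~/calib_ws/src$ git clone https://github.com/ros-drivers/rgbd_launch.git
~/calib_ws/src$ git clone https://github.com/ros-drivers/openni2_camera.git
~/calib_ws/src$ git clone https://github.com/ros-drivers/openni2_launch.git
~/calib_ws/src$ git clone https://github.com/ros-drivers/freenect_stack.git
~/calib_ws/src$ cd .. && catkin_make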

The image_pipeline package consists of several other packages. The package of note for calibration is camera_calibration, within the image_pipeline package. For my initial process I printed a small calibration pattern, check-108.pdf. This pattern has 1 inch (0.0254 m) squares in a chessboard pattern. Make sure you measure the dimensions of the squares if you print this file, as the size of the squares is particularly important for determining the scale when calibrating the intrinsics of your camera.
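
With the RGBD driver running, the intrinsic calibration is then a single command. The sketch below assumes the default openni2_launch topic names and an 8x6 grid of interior checkerboard corners, which is my assumption; adjust --size to match your printout and --square to your measured square size in meters:

~/calib_ws$ roslaunch openni2_launch openni2.launch
# in a second terminal
~/calib_ws$ rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0254 image:=/camera/rgb/image_raw camera:=/camera/rgb

Once enough checkerboard views have been collected, the calibrator GUI lets you compute the intrinsic parameters and commit them to the camera driver.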

ArUco - Augmented Reality library from the University of Cordoba

ArUco is a lightweight library designed for use in augmented reality (AR) projects. The markers are easy to detect and, once detected, each yields an unambiguous pose, which makes them particularly useful for visual pose estimation; classical chessboard-based pose methods suffer from orientation ambiguities due to the symmetric nature of the pattern. For this reason we will use ArUco markers to estimate the camera pose (extrinsic parameters).

The first thing to do is to download the ArUco library and create/print your calibration pattern. While some patterns are openly available on the net, I needed a board appropriate to the context of our laboratory.

Download and compile ArUco

I downloaded and decompressed the latest (1.3.0) version of the aruco library from http://www.uco.es/investiga/grupos/ava/node/26

I compiled from the uncompressed folder of source files as shown on their site, repeated below:

~/NRC/aruco-1.3.0$ mkdir build
~/NRC/aruco-1.3.0$ cd build
~/NRC/aruco-1.3.0/build$ cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=../dist
...
~/NRC/aruco-1.3.0/build$ make
...
~/NRC/aruco-1.3.0/build$ make install
...
~/NRC/aruco-1.3.0/build$ ls ../dist/bin
aruco_board_pix2meters  aruco_hrm_create_board       aruco_hrm_test_board        aruco_simple_board  aruco_test_board_gl
aruco_create_board      aruco_hrm_create_dictionary  aruco_selectoptimalmarkers  aruco_test          aruco_test_board_gl_mask
aruco_create_marker     aruco_hrm_test               aruco_simple                aruco_test_board    aruco_test_gl

A notable change is that I prefer to install binaries and libraries locally (hence -DCMAKE_INSTALL_PREFIX=../dist) to prevent the conflicts that can occur when you put libraries in system folders, e.g., /usr/local.

On my Jade computer I had several installations of OpenCV, which initially confused the CMake process. I had to manually edit the CMakeLists.txt in the aruco-1.3.0 folder to set the OpenCV install directory by adding the following lines to the beginning of the file.

set(OpenCV_DIR /usr/share/OpenCV)
set(OpenCV_FOUND true)

The first line (OpenCV_DIR) ensures that I compile against the Ubuntu-installed version of OpenCV, i.e., the default one shipped with the other standard OS libraries; the second (OpenCV_FOUND) overrides a problem where CMake finds OpenCV but then reports OpenCV_FOUND as false.
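
Alternatively (a sketch of the same idea, not taken from the original configuration), the OpenCV location can be hinted directly on the CMake command line without editing CMakeLists.txt, though the OpenCV_FOUND override may still be needed if find_package misbehaves:

~/NRC/aruco-1.3.0/build$ cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=../dist -DOpenCV_DIR=/usr/share/OpenCV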

Create the calibration pattern

Here we use the aruco tools to create a calibration pattern.

~/NRC/aruco-1.3.0/dist/bin$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:../lib
~/NRC/aruco-1.3.0/dist/bin$ ls
aruco_board_pix2meters  aruco_create_marker         aruco_simple_board  aruco_test_board_gl       CMakeFiles
aruco_calibration       aruco_selectoptimalmarkers  aruco_test          aruco_test_board_gl_mask  cmake_install.cmake
aruco_create_board      aruco_simple                aruco_test_board    aruco_test_gl             Makefile
~/NRC/aruco-1.3.0/dist/bin$ ./aruco_create_board 
Usage: X:Y boardImage.png boardConfiguration.yml [pixSize] [Type(0: panel,1: chessboard, 2: frame)] [interMarkerDistance(0,1)]
~/NRC/aruco-1.3.0/dist/bin$ ./aruco_create_board 4:6 pose_calib_00.png pose_calib_00.yml 100 0 0.1
~/NRC/aruco-1.3.0/dist/bin$ 

The generated pattern is written to pose_calib_00.png.

Print the pattern, mount it to something VERY flat (with glue/tape), and then measure the dimensions of the printed squares. In my case, each square measured 35mm. Hence the space between the squares was 10% of this dimension, or 3.5mm, as dictated by the interMarkerDistance value of 0.1 in the aruco_create_board command above. Once the physical dimensions of the board are known, the physical description of the calibration pattern can be generated.

~/NRC/aruco-1.3.0/dist/bin$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:../lib
~/NRC/aruco-1.3.0/dist/bin$ ./aruco_board_pix2meters 
Usage:  in_boardConfiguration.yml markerSize_meters out_boardConfiguration.yml
~/NRC/aruco-1.3.0/dist/bin$ ./aruco_board_pix2meters pose_calib_00.yml 0.035 pose_calib_00_phy.yml
0.035 100 0.00035
~/NRC/aruco-1.3.0/dist/bin$

You now have an ArUco marker board for pose estimation and two YAML-format descriptions of the calibration pattern:

  1. pose_calib_00.yml - a description of the calibration pattern giving marker locations (top left corner of each ArUco marker) in pixel coordinates, where the calibration board plane is the (X,Y) plane and the center of the image is the origin.
  2. pose_calib_00_phy.yml - a description of the calibration pattern giving marker locations (top left corner of each ArUco marker) in physical coordinates (meters), where the calibration board plane is the (X,Y) plane and the center of the calibration pattern is the origin.

In our ROS experiments using ar_sys we will only need the first, pixel-coordinate description of the calibration pattern (pose_calib_00.yml). This is because the ROS node takes the marker_size parameter as input and computes the physical coordinates of the calibration pattern internally (marker_size=0.035 for our example).

ROS Pose Estimation (ar_sys)

To get the extrinsic parameters, we will estimate the pose of the calibration board coordinate system. This coordinate system is defined by an origin, the point at the center of the calibration pattern, and an orientation: Z perpendicular to the calibration board plane, X across the columns (4-marker/short dimension), and Y across the rows (6-marker/long dimension). The YAML files (pose_calib_00.yml/pose_calib_00_phy.yml) place the ArUco markers symmetrically about the origin, i.e., the center of the calibration pattern. Note that the top left marker in the ArUco pattern will be on the top left of the printed page. If you get confused orienting your pattern, you can always look in the .yml file; the first entry lists the ArUco marker id of the top left marker. For the YAML files I generated, pose_calib_00_phy.yml contains the entries:

%YAML:1.0
aruco_bc_nmarkers: 24
aruco_bc_mInfoType: 1
aruco_bc_markers:
   - { id:532, corners:[ [ -7.5249999761581421e-02,
       -1.1374999582767487e-01, 0. ], [ -4.0249999612569809e-02,
       -1.1374999582767487e-01, 0. ], [ -4.0249999612569809e-02,
       -7.8749999403953552e-02, 0. ], [ -7.5249999761581421e-02,
       -7.8749999403953552e-02, 0. ] ] }
   - { id:716, corners:[ [ -3.6749999970197678e-02,
       -1.1374999582767487e-01, 0. ], [ -1.7499999376013875e-03,
       -1.1374999582767487e-01, 0. ], [ -1.7499999376013875e-03,
       -7.8749999403953552e-02, 0. ], [ -3.6749999970197678e-02,
       -7.8749999403953552e-02, 0. ] ] }
   - { id:442, corners:[ [ 1.7499999376013875e-03,
       -1.1374999582767487e-01, 0. ], [ 3.6749999970197678e-02,
       -1.1374999582767487e-01, 0. ], [ 3.6749999970197678e-02,
       -7.8749999403953552e-02, 0. ], [ 1.7499999376013875e-03,
....

One can see that, for this pattern, the top left marker has id 532 (from the standard aruco marker set).
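
A quick way to check this from the command line (a minimal sketch; grep simply prints the first marker entry):

~/NRC/aruco-1.3.0/dist/bin$ grep -m1 'id:' pose_calib_00_phy.yml
   - { id:532, corners:[ [ -7.5249999761581421e-02,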

One can check the pattern image and orientation with an online marker generator: http://terpconnect.umd.edu/~jwelsh12/enes100/markergen.html

This should always allow you to know where the top left marker is located on your pattern.

For ROS integration I used git to clone the ar_sys package (https://github.com/Sahloul/ar_sys) into my workspace and then compiled the ROS source code. I then placed pose_calib_00.yml in the data/single folder and modified a launch file to run the camera driver and load the board configuration (YAML) file for my calibration pattern. My launch file is below:
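
As a rough sketch of those steps (the catkin workspace path ~/calib_ws is my assumption, not from the original page):

~/calib_ws/src$ git clone https://github.com/Sahloul/ar_sys.git
~/calib_ws/src$ cp ~/NRC/aruco-1.3.0/dist/bin/pose_calib_00.yml ar_sys/data/single/
~/calib_ws/src$ cd .. && catkin_make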

<?xml version="1.0" encoding="utf-8"?>

<launch>
<include file="$(find openni2_launch)/launch/openni2.launch"/>
    <arg name="camera_info" default="file://$(find ar_sys)/launch/rgb_PS1080_PrimeSense.yaml"/>
    <arg name="result_display" default="true"/>
    <arg name="result_autosize" default="true"/>
    <arg name="result_draw_markers" default="true" />
    <arg name="result_draw_markers_cube" default="true" />
    <arg name="result_draw_markers_axis" default="true" />

    <arg name="uid" default=""/>
    <arg name="video_namespace" default="camera/rgb"/>
    <arg name="video_image_topic" default="image_raw"/>
    <arg name="video_info_topic" default="camera_info"/>
    <arg name="video_rectified" default="false" />

    <arg name="board_config" default="$(find ar_sys)/data/single/pose_calib_00.yml" />
    <arg name="board_frame_id" default="board1" />
    <arg name="marker_size" default="0.035" />
    <arg name="publish_tf" default="false" />

    <node ns="/" pkg="ar_sys" type="single_board" name="ar_single_board$(arg uid)" output="screen">
        <remap from="/camera_info" to="$(arg video_namespace)/$(arg video_info_topic)" />
        <remap from="/image" to="$(arg video_namespace)/$(arg video_image_topic)" />

        <param name="image_is_rectified" type="bool" value="$(arg video_rectified)"/>
        <param name="board_config" type="string" value="$(arg board_config)"/>
        <param name="board_frame" type="string" value="/$(arg board_frame_id)" />
        <param name="marker_size" type="double" value="$(arg marker_size)"/>
        <param name="draw_markers" type="bool" value="$(arg result_draw_markers)" />
        <param name="draw_markers_cube" type="bool" value="$(arg result_draw_markers_cube)" />
        <param name="draw_markers_axis" type="bool" value="$(arg result_draw_markers_axis)" />
        <param name="publish_tf" value="$(arg publish_tf)" />
    </node>

    <node ns="/" pkg="topic_tools" type="relay" name="ar_single_board_relay$(arg uid)" args="/ar_single_board$(arg uid)/transform /arsys_single_board/transform" />

    <group if="$(arg result_display)">
        <node ns="/" pkg="image_view" type="image_view" name="image_view$(arg uid)" output="screen">
            <remap from="image" to="/ar_single_board$(arg uid)/result" />

            <param name="autosize" type="bool" value="$(arg result_autosize)" />
        </node>
    </group>

</launch>
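
To run the pose estimation, launch the file and watch the board-to-camera transform republished by the relay node (the launch file name below is my own choice; substitute whatever name you saved the file under):

~/calib_ws$ roslaunch ar_sys single_board_pose_calib.launch
# in a second terminal
~/calib_ws$ rostopic echo /arsys_single_board/transform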

You might edit the launch file and change the following parameters to see just the global pose estimate for the pattern, without the per-marker overlays:

    <arg name="result_draw_markers" default="false" />
    <arg name="result_draw_markers_cube" default="false" />
    <arg name="result_draw_markers_axis" default="false" />