Sensor Alignment

Sensor Alignment

Chris Flesher
Hello,

I am pretty new to PCL. I have an underwater vehicle with a multibeam sonar on it that provides range data in a fan, similar to the output of a SICK laser. I am trying to determine the exact orientation of this sensor with respect to my vehicle axes. I was wondering if anyone might have an idea on the best way to go about this; here is my initial thought:

- use a custom point cloud type with a quaternion that specifies the vehicle-to-world rotation at the time the measurement was taken
- use non-linear ICP, but with a cost function that optimizes for the sensor-to-vehicle rotation rather than the sensor-to-world rotation
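
The per-point correction in the second bullet amounts to composing two rotations for every sonar return: first the unknown (but fixed) sensor-to-vehicle quaternion, then that point's stored vehicle-to-world quaternion. A minimal pure-Python sketch of that composition; the function names are my own illustration, not PCL API:

```python
import math

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # t = 2 * (q_vec x v)
    tx = 2.0 * (y * v[2] - z * v[1])
    ty = 2.0 * (z * v[0] - x * v[2])
    tz = 2.0 * (x * v[1] - y * v[0])
    # v' = v + w*t + (q_vec x t)
    return (v[0] + w * tx + y * tz - z * ty,
            v[1] + w * ty + z * tx - x * tz,
            v[2] + w * tz + x * ty - y * tx)

def sensor_point_to_world(p_sensor, q_sensor_to_vehicle, q_vehicle_to_world):
    """Map a sonar return into the world frame. Each point carries its own
    vehicle-to-world attitude; the sensor-to-vehicle rotation is the single
    fixed quantity being calibrated."""
    return quat_rotate(q_vehicle_to_world,
                       quat_rotate(q_sensor_to_vehicle, p_sensor))

# Example: identity sensor mounting, vehicle yawed 90 degrees.
# Quaternion for a 90 degree yaw: (cos 45, 0, 0, sin 45).
q_yaw90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
q_identity = (1.0, 0.0, 0.0, 0.0)
p = sensor_point_to_world((1.0, 0.0, 0.0), q_identity, q_yaw90)
# p is approximately (0, 1, 0)
```

The optimizer would adjust only q_sensor_to_vehicle; the per-point q_vehicle_to_world values are fixed measurements from the vehicle's attitude sensor.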

The problem is that I am new to PCL and not sure how to actually implement something like this. I was thinking about using non-linear ICP and overriding the computeTransformation function from icp.h.

Or if there is an alternate approach that might work better I would be interested in hearing any thoughts.

Thanks!
Chris

Re: Sensor Alignment

Chris Flesher
After digging through the code I found that it would not be possible to use ICP as-is to solve for sensor misalignment. The problem is that the transformCloud calls in icp.hpp apply a single transform to the whole cloud, but I need a different transform for each point depending on how the sensor was oriented at the moment that point was measured.

I think I will still use PCL to compute a distance measure between two point clouds as input to an optimizer. The optimizer can modify the sensor rotation and use the distance measure coming out of the PCL program as feedback.
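
The distance measure itself is straightforward: pair each point in one cloud with its nearest neighbor in the other and sum the distances. A brute-force Python sketch of that cost, assuming points are plain 3-tuples (PCL's CorrespondenceEstimation computes the same pairing, but with a kd-tree):

```python
import math

def correspondence_cost(source, target):
    """Sum of distances from each source point to its nearest target point.
    Brute force, O(n*m); fine for a sketch, too slow for large clouds."""
    return sum(min(math.dist(s, t) for t in target) for s in source)

cloud_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
cloud_b = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0)]
cost = correspondence_cost(cloud_a, cloud_b)
# cost is 0.5: the first point matches exactly, the second is 0.5 away
```

As the candidate sensor rotation approaches the true mounting, overlapping patches line up and this cost shrinks, which is exactly the feedback signal the optimizer needs.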

_______________________________________________
[hidden email] / http://pointclouds.org
http://pointclouds.org/mailman/listinfo/pcl-users



Re: Sensor Alignment

VictorLamoine
Administrator
Hello,

You are looking for an extrinsic calibration algorithm; here is an implementation example:
https://github.com/ros-industrial/industrial_calibration/tree/hydro-devel/industrial_extrinsic_cal

Unfortunately there is not much documentation for this project.

Bye

Re: Sensor Alignment

Chris Flesher
The solution I came up with was to perturb the sensor orientation with respect to the vehicle, use correspondence estimation to compute a cost, and print that cost to standard output. An outer optimizer (scipy.optimize.fmin) then minimizes the cost.
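
A dependency-free sketch of that outer loop: the mock cost below stands in for running the compiled PCL program and parsing the value it prints, and a tiny coordinate-descent search stands in for scipy.optimize.fmin. The assumed "true" misalignment angles and all names here are mine, for illustration only:

```python
def mock_alignment_cost(rpy):
    """Stand-in for invoking the PCL tool with a perturbed sensor rotation
    and reading back its printed cost. Assumes the true misalignment is
    roll=0.02, pitch=-0.01, yaw=0.03 rad."""
    target = (0.02, -0.01, 0.03)
    return sum((a - b) ** 2 for a, b in zip(rpy, target))

def coordinate_descent(cost, x0, step=0.1, shrink=0.5, iters=40):
    """Tiny derivative-free minimizer: try +/- step on each coordinate,
    keep improvements, halve the step when stuck."""
    x = list(x0)
    best = cost(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                c = cost(trial)
                if c < best:
                    x, best, improved = trial, c, True
        if not improved:
            step *= shrink
    return x, best

est, residual = coordinate_descent(mock_alignment_cost, [0.0, 0.0, 0.0])
# est converges to roughly (0.02, -0.01, 0.03)
```

In the real setup, the cost function would rotate the sensor frame by the candidate angles, re-run the PCL correspondence program on the two patches, and return the printed distance measure.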

int main(int argc, char** argv) {

    // print help
    if (pcl::console::find_switch(argc, argv, "--help") ||
        pcl::console::find_switch(argc, argv, "-h")) {
        printHelp(argv);
        return 0;
    }
    
    // parse the input arguments
    if (argc < 3) {
        printHelp(argv);
        return -1;
    }

    PointCloud::Ptr patch1 = loadPointCloud(string(argv[1]));
    PointCloud::Ptr patch2 = loadPointCloud(string(argv[2]));

    // get distance measure
    pcl::registration::CorrespondenceEstimation<PointType, PointType> correspondenceEst;
    pcl::Correspondences correspondences;
    correspondenceEst.setInputSource(patch1);
    correspondenceEst.setInputTarget(patch2);
    correspondenceEst.determineCorrespondences (correspondences,
        std::sqrt(std::numeric_limits<float>::max()));
    float err(0.0);
    for (size_t i=0; i < correspondences.size(); i++) {
        const PointType& p_src = patch1->points[correspondences[i].index_query];
        const PointType& p_tgt = patch2->points[correspondences[i].index_match];
        Eigen::Vector4f s (p_src.x, p_src.y, p_src.z, 0);
        Eigen::Vector4f t (p_tgt.x, p_tgt.y, p_tgt.z, 0);
        err += ((s - t).norm());
    }
    
    // print distance measure to standard output
    std::cout << std::setprecision(9) << std::scientific << err << std::endl;

    // display the result (optional)
    if (pcl::console::find_switch(argc, argv, "--visualize") ||
        pcl::console::find_switch(argc, argv, "-v")) {
        typedef boost::shared_ptr<pcl::visualization::PCLVisualizer>  ViewerPtr;
        ViewerPtr viewer (new pcl::visualization::PCLVisualizer ("3D Viewer"));
        ColorHandler::ConstPtr green (new ColorHandler(patch1, 0, 255, 0));
        ColorHandler::ConstPtr blue (new ColorHandler(patch2, 0, 0, 255));
        viewer->setBackgroundColor(0, 0, 0);
        viewer->addPointCloud<PointType>(patch1, *green, "patch 1");
        viewer->addPointCloud<PointType>(patch2, *blue,  "patch 2");
        //viewer->addCoordinateSystem(1.0);
        viewer->initCameraParameters();
        
        // main loop, keep running visualizer until the user hits 'q'
        while (!viewer->wasStopped()) {
            viewer->spinOnce(100);
            boost::this_thread::sleep(boost::posix_time::microseconds(100000));
        }
        
    }