Point cloud data calibration with real world [screenshot inside]


Median
This post has NOT been accepted by the mailing list yet.
Hello everyone

I've been doing some experiments with PCL, and I have to say it is awesome; congratulations to the developers for such great work.
I am studying it and I think there is a very good chance I'm going to use it in a major human-computer interaction project.

In this project a user interacts with a flat surface, which can be detected using the plane segmentation algorithm, which I have already tried (check the snapshot). It is also necessary to recognize the fingers in order to know whether the user is touching the surface (the surface would act as a touch screen).
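For reference, the plane detection I tried looks roughly like the sketch below (the function name and the 1 cm distance threshold are placeholders I chose, not anything prescribed by PCL):

// Rough sketch: extract the dominant plane with PCL's RANSAC-based
// SACSegmentation, which is what I used to detect the surface.
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/segmentation/sac_segmentation.h>

pcl::ModelCoefficients::Ptr
segmentDominantPlane (const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud)
{
  pcl::ModelCoefficients::Ptr coefficients (new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers (new pcl::PointIndices);

  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setOptimizeCoefficients (true);
  seg.setModelType (pcl::SACMODEL_PLANE);
  seg.setMethodType (pcl::SAC_RANSAC);
  seg.setDistanceThreshold (0.01);  // 1 cm tolerance, tuned by hand
  seg.setInputCloud (cloud);
  seg.segment (*inliers, *coefficients);

  // coefficients->values holds (a, b, c, d) of the plane ax + by + cz + d = 0,
  // inliers->indices are the points belonging to the surface.
  return coefficients;
}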

However, the key point of the project is to calibrate the point cloud data against the real world, that is, to transform point cloud data into real-world coordinates. For example, if I detect that the user is touching the surface at index X in point cloud Y, how would I transform that into real surface coordinates? Has this issue already been addressed?
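What I have in mind is something like the sketch below: take the plane coefficients from the segmentation above and express the touched point in a 2D frame attached to the surface. The origin and axis choice here are arbitrary assumptions on my part; a real calibration would fix them against known reference points (for example the projected corners):

// Hedged sketch: express a touched point in surface coordinates, assuming
// the plane (a, b, c, d) comes from the segmentation above and 'origin' is a
// known point on the surface (both are placeholders, not PCL-provided values).
#include <cmath>
#include <Eigen/Dense>
#include <pcl/point_types.h>

// Returns (u, v, h): u/v in meters along the surface axes, h = height above it.
Eigen::Vector3f
toSurfaceCoordinates (const pcl::PointXYZ &touch,
                      const Eigen::Vector4f &plane,    // (a, b, c, d)
                      const Eigen::Vector3f &origin)   // a known point on the surface
{
  Eigen::Vector3f n = plane.head<3> ().normalized ();  // plane normal
  // Build an arbitrary orthonormal basis (u_axis, v_axis) lying in the plane.
  Eigen::Vector3f helper = (std::abs (n.z ()) < 0.9f) ? Eigen::Vector3f::UnitZ ()
                                                      : Eigen::Vector3f::UnitX ();
  Eigen::Vector3f u_axis = n.cross (helper).normalized ();
  Eigen::Vector3f v_axis = n.cross (u_axis);

  Eigen::Vector3f p (touch.x, touch.y, touch.z);
  Eigen::Vector3f rel = p - origin;
  return Eigen::Vector3f (rel.dot (u_axis), rel.dot (v_axis), rel.dot (n));
}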

I found this MIT video that suggests this is possible, since the user interacts with the projection: http://video.mit.edu/watch/kinect-hand-detection-12073/



Hoping to get some guidance, thanks in advance.
Re: Point cloud data calibration with real world [screenshot inside]

VictorLamoine
Administrator
Check out the YouTube video description.
Since the code is released you should be able to get the information you need.

I also know that the NiTE component is able to do hand tracking. Perhaps they are using NiTE!
Re: Point cloud data calibration with real world [screenshot inside]

Median
This post has NOT been accepted by the mailing list yet.
Thanks Victor

Can you confirm that the coordinates of a point within a point cloud are given in meters in this library? And that they represent the distance to the sensor?