Open big files

Open big files

Raph
Hi all!
I have a problem:
I must open a big file and then work on it. I mean, I must check for neighbors of points more than 1000 times, and even with 700 MB files it is already really slow. I will have to work with 10 GB+ files, so it will take extremely long. Is there a way to import binary files (lighter than the common formats)?

For example, what I have to do:
Load the point cloud // slow => I'm using the loadCloud function
Load a file with point coordinates (x, y, z) // fast => I put it in a vector
Find the nearest point to (x, y, z) in the point cloud, call it (a, b, c) // slow => I'm using a kd-tree
Work with the neighbors of (a, b, c) // slow => kd-tree again.

Is there a way to work on big files without having to wait 3 days?
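The hope with a binary file is that loading becomes one bulk read instead of millions of text-to-float conversions (PCL's PCD format supports a binary layout for exactly this reason). A minimal sketch of the difference in plain C++, standard library only (`saveBinary`/`loadBinary` are illustrative names, not PCL functions):

```cpp
#include <fstream>
#include <string>
#include <vector>

struct Point { float x, y, z; };

// Write points as raw binary: 12 bytes per point, no text formatting.
void saveBinary(const std::string& path, const std::vector<Point>& pts) {
    std::ofstream out(path, std::ios::binary);
    out.write(reinterpret_cast<const char*>(pts.data()),
              static_cast<std::streamsize>(pts.size() * sizeof(Point)));
}

// Loading is a single bulk read: no per-value parsing at all.
std::vector<Point> loadBinary(const std::string& path, std::size_t n) {
    std::vector<Point> pts(n);
    std::ifstream in(path, std::ios::binary);
    in.read(reinterpret_cast<char*>(pts.data()),
            static_cast<std::streamsize>(n * sizeof(Point)));
    return pts;
}
```

An ASCII loader has to tokenize and convert every coordinate; the binary version just copies bytes into the vector, which is why binary PCD files typically load much faster.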

Thanks for all! :)




Re: Open big files

noname
Hi,

I've made similar observations: the loading routines for point
clouds are extremely slow in PCL.
The only solution I could come up with, but never got around to finishing
(the project is already over ...), was to implement the file I/O
manually, maybe even with streaming support, so one doesn't have to wait
for all the data to be loaded before processing it (and doesn't fill up all the
memory, triggering swapping/paging and making things even worse ...).
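The streaming idea can be sketched in plain C++: read the binary file in fixed-size chunks and hand each point to a callback, so the whole cloud never has to be resident in memory at once (`streamPoints` is an illustrative name, not an existing PCL routine):

```cpp
#include <fstream>
#include <string>
#include <vector>

struct Point { float x, y, z; };

// Stream a raw binary point file chunk by chunk; `visit` is called once
// per point, so memory use stays bounded by pointsPerChunk.
template <typename Visit>
std::size_t streamPoints(const std::string& path, Visit visit,
                         std::size_t pointsPerChunk = 4096) {
    std::ifstream in(path, std::ios::binary);
    std::vector<Point> buf(pointsPerChunk);
    std::size_t total = 0;
    while (in) {
        in.read(reinterpret_cast<char*>(buf.data()),
                static_cast<std::streamsize>(buf.size() * sizeof(Point)));
        // gcount() also handles the final, partially filled chunk.
        std::size_t got = static_cast<std::size_t>(in.gcount()) / sizeof(Point);
        for (std::size_t i = 0; i < got; ++i) visit(buf[i]);
        total += got;
    }
    return total;
}
```

This only works for sequential passes (filtering, accumulation); random-access queries like nearest-neighbor search still need an index over the data.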

So I'm also very interested in alternative solutions. :-)


kind regards

On 17.03.2017 12:01, Raph wrote:


_______________________________________________
[hidden email] / http://pointclouds.org
http://pointclouds.org/mailman/listinfo/pcl-users

Re: Open big files

engin
In reply to this post by Raph
You can try the LAZ format. It is a compressed binary format that comes with an official open-source library (LASlib).

I'm not sure it will make loading faster, but the files will definitely be smaller (about 7-10 times). On the other hand, I also work with big files, and I don't remember waiting 3 days to query a thousand points. Maybe check again how you are using the kd-trees?

You could try comparing the I/O times and share the results. :)
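For comparing the I/O times, a tiny `std::chrono` helper is enough (illustrative sketch; `secondsTaken` is a made-up name):

```cpp
#include <chrono>
#include <functional>

// Time an arbitrary piece of work in seconds, so two loading strategies
// (e.g. ASCII vs. binary, or PCD vs. LAZ) can be compared fairly.
double secondsTaken(const std::function<void()>& work) {
    auto t0 = std::chrono::steady_clock::now();
    work();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}
```

For meaningful numbers, run each loader a few times and ignore the first run (the OS file cache makes repeated reads of the same file much faster).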


Re: Open big files

Raph
Hi engin!
Thanks for your response, I'll take a look at the LAZ format. Maybe it'll help me.
I use the kd-tree as explained in the kd-tree tutorial. I believe I have to use it several times, which makes the execution unbelievably slow. I just had an idea.
But when I talk about big files, I mean files of 230 GB, with several billion points.
(3 days was of course a figure of speech; I'm French, I have to exaggerate everything I say!)
I'll take a look at everything again!

Thanks :)

Re: Open big files

Raph
For now, I'm searching for the nearest neighbors of 950 points in a 4 GB file, and it has already taken more than one night to process 220 of the 950 (and that is with parallelism)...


Do you think I should use an octree instead? Could it be better? Has anyone already tried an octree? How does it perform?
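PCL does ship an octree search (`pcl::octree::OctreePointCloudSearch`), which is worth benchmarking against the kd-tree. As a rough illustration of why spatial partitioning helps, here is a flat voxel-grid bucket sketch in plain C++ (standard library only, names are made up; it only scans the 3x3x3 cell neighborhood around the query, so it is approximate when the true nearest point lies farther than one cell away):

```cpp
#include <cmath>
#include <cstdint>
#include <limits>
#include <unordered_map>
#include <vector>

struct Point { float x, y, z; };

// Buckets points into a uniform voxel grid; nearest-neighbor queries then
// only scan the query's cell and its 26 neighbors instead of the whole
// cloud. This is the same locality an octree exploits, in a flat form.
// Note: keeps a reference, so the cloud must outlive the grid.
class VoxelGrid {
public:
    VoxelGrid(const std::vector<Point>& pts, float cell)
        : pts_(pts), cell_(cell) {
        for (std::size_t i = 0; i < pts.size(); ++i)
            cells_[key(pts[i])].push_back(i);
    }

    // Index of the nearest point found in the 3x3x3 neighborhood,
    // or pts_.size() if that neighborhood is empty (caller falls back).
    std::size_t nearest(const Point& q) const {
        std::size_t best = pts_.size();
        float bestD = std::numeric_limits<float>::max();
        std::int64_t cx = coord(q.x), cy = coord(q.y), cz = coord(q.z);
        for (std::int64_t dx = -1; dx <= 1; ++dx)
            for (std::int64_t dy = -1; dy <= 1; ++dy)
                for (std::int64_t dz = -1; dz <= 1; ++dz) {
                    auto it = cells_.find(pack(cx + dx, cy + dy, cz + dz));
                    if (it == cells_.end()) continue;
                    for (std::size_t i : it->second) {
                        float d = dist2(pts_[i], q);
                        if (d < bestD) { bestD = d; best = i; }
                    }
                }
        return best;
    }

private:
    std::int64_t coord(float v) const {
        return static_cast<std::int64_t>(std::floor(v / cell_));
    }
    // Hash-style key; a collision only merges two buckets, which adds
    // candidates to scan but never produces a wrong distance winner.
    static std::int64_t pack(std::int64_t x, std::int64_t y, std::int64_t z) {
        return (x * 73856093) ^ (y * 19349663) ^ (z * 83492791);
    }
    std::int64_t key(const Point& p) const {
        return pack(coord(p.x), coord(p.y), coord(p.z));
    }
    static float dist2(const Point& a, const Point& b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return dx * dx + dy * dy + dz * dz;
    }
    const std::vector<Point>& pts_;
    float cell_;
    std::unordered_map<std::int64_t, std::vector<std::size_t>> cells_;
};
```

The grid is built once and queried many times; picking the cell size near the expected neighbor distance keeps each query down to a handful of buckets.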


Thanks for all! :)
Nice evening and day!

Re: Open big files

Chris Flesher
Are you running out of RAM on the 4 GB problem? If your computer is forced to use swap memory, it will slow things down a lot. What about dividing the large point cloud into several subsets and then comparing the closest points found in each subset?
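The subset idea works because the global nearest point is exactly the best of the per-subset winners, and the subsets can be processed in parallel (one per core) or one at a time to stay within RAM. A plain C++ sketch, with brute force standing in for a per-subset kd-tree (names are made up):

```cpp
#include <vector>

struct Point { float x, y, z; };

static float dist2(const Point& a, const Point& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Nearest point within one subset (a per-subset kd-tree would go here).
Point nearestIn(const std::vector<Point>& subset, const Point& q) {
    Point best = subset.front();
    for (const Point& p : subset)
        if (dist2(p, q) < dist2(best, q)) best = p;
    return best;
}

// Reduce the per-subset candidates to the global nearest point.
Point nearestBySubsets(const std::vector<std::vector<Point>>& subsets,
                       const Point& q) {
    Point best = subsets.front().front();
    for (const auto& s : subsets) {
        Point c = nearestIn(s, q);
        if (dist2(c, q) < dist2(best, q)) best = c;
    }
    return best;
}
```

Only one subset needs to be in memory at a time, which also avoids the swap-thrashing problem.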

On Wed, Mar 22, 2017 at 4:08 AM, Raph <[hidden email]> wrote:

Re: Open big files

lucasamparo
In reply to this post by Raph
Raph,

the NN search is naturally slow.

You can implement your own NN search. If you'll search the same collection more than once, you can sort the vector (O(n log n)) and then use binary search (O(log n)) per query.
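For scalar keys the sort-then-binary-search pattern looks like this (plain C++ sketch, made-up names; true 3-D nearest-neighbor search still needs a spatial index, but the build-once/query-many principle is the same):

```cpp
#include <algorithm>
#include <vector>

// Sort once (O(n log n))...
std::vector<float> buildSortedIndex(std::vector<float> keys) {
    std::sort(keys.begin(), keys.end());  // pay this cost a single time
    return keys;
}

// ...then every lookup is O(log n) instead of an O(n) linear scan.
bool containsKey(const std::vector<float>& sortedKeys, float key) {
    return std::binary_search(sortedKeys.begin(), sortedKeys.end(), key);
}
```

The payoff grows with the number of queries: sorting is paid once, while each of the thousands of lookups afterwards is logarithmic.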

Re: Open big files

Raph
Hi all, thanks for all your response.

I believe I found my main problem. In fact, I was rebuilding the kd-tree of my whole point cloud on every iteration, and I did it twice per iteration, which made my program suffer!

Now that I removed those lines, my program runs 260% faster, and using my 8 cores it's 8 times faster on top of that, which makes it more or less acceptable.
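The fix here is the classic "hoist the expensive build out of the loop" pattern. A minimal sketch in plain C++ (a sorted vector stands in for the kd-tree; `treeBuilds` just counts constructions to show the index is built once, and all names are made up):

```cpp
#include <algorithm>
#include <vector>

static int treeBuilds = 0;  // counts how often the "tree" is constructed

// Stand-in for an expensive spatial index (e.g. a kd-tree build).
struct Index {
    explicit Index(std::vector<float> data) : keys(std::move(data)) {
        std::sort(keys.begin(), keys.end());
        ++treeBuilds;
    }
    // Nearest key to q via binary search on the sorted data.
    float nearest(float q) const {
        auto it = std::lower_bound(keys.begin(), keys.end(), q);
        if (it == keys.end()) return keys.back();
        if (it == keys.begin()) return keys.front();
        return (q - *(it - 1) <= *it - q) ? *(it - 1) : *it;
    }
    std::vector<float> keys;
};

float queryMany(const std::vector<float>& cloud,
                const std::vector<float>& queries) {
    Index index(cloud);  // build ONCE, outside the query loop
    float last = 0.f;
    for (float q : queries) last = index.nearest(q);
    return last;
}
```

Rebuilding the index inside the loop (let alone twice per iteration) multiplies the dominant cost by the number of iterations; building it once reduces that cost to a constant.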

Thanks to all and for all :)