Project Tango


Project Tango

airuno2l
Does anyone know what type of sensor is being used in the project tango phone?

http://www.google.com/atap/projecttango/

Re: Project Tango

erickulcyk
I’m not sure, but the page says it can capture a quarter million measurements every second. At 640×480 pixels ≈ 307K depth samples per frame, that works out to a bit less than one frame per second 😊.  Actually, this project looks really neat.

Eric
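Eric's back-of-the-envelope estimate can be checked quickly. The VGA resolution (640×480) is a guess from the thread, not a published spec:

```python
# Sanity-check the frame-rate estimate: measurements/second divided by
# depth samples per frame gives frames per second.
measurements_per_second = 250_000      # "a quarter million measurements every second"
pixels_per_frame = 640 * 480           # assumed VGA depth image: 307,200 samples

frames_per_second = measurements_per_second / pixels_per_frame
print(f"{pixels_per_frame} samples/frame -> {frames_per_second:.2f} frames/second")
```

So at an assumed VGA resolution the quoted throughput gives roughly 0.8 fps; a lower depth resolution would raise the frame rate proportionally.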

_______________________________________________
[hidden email] / http://pointclouds.org
http://pointclouds.org/mailman/listinfo/pcl-users


Re: Project Tango

nizar sallem
It uses the Myriad 1 vision processor from Movidius. It looks Kinect-like, since in the video you can see a calibration step with a pattern board.

--
Nizar


Re: Project Tango

Radu B. Rusu
Administrator
In reply to this post by airuno2l
The depth sensor has not been announced, and they might never announce it. From the video, either the resolution of the depth sensor is poor or the FPS is, or both. It’s nowhere near Kinect quality, since you don’t have an equally powerful texture projector. There are lots of different constraints in a smartphone.

However, it’s AWESOME to see a push to get this kind of hardware integrated into smaller devices! Between this, Apple<->PrimeSense, Pelican Imaging, SoftKinetic, PMD, etc., consumers and app developers can only benefit. This should also help robotics in the long term.

There is a Movidius chip for accelerating some basic computer vision building blocks in the device (http://www.movidius.com/) — nothing to do with the depth camera per se, just a separate chip that can accelerate certain operators. Think of NVIDIA’s Tegra, which has a powerful GPU where you could accelerate your CV algorithms, thus freeing the CPU. Not a big deal for desktops, but pretty important for mobile. I’ve not seen any performance-per-watt comparisons between Movidius and Tegra, so I can’t comment on which is better. Both should be ok.

Now what we all need is a few hundred million of these devices out in the world, and we’ll have something cool to play with ;) Unfortunately that’ll still take some time, but we’re on the right track! Competition is awesome.

PS. This should put some pressure on the community to better optimize PCL for Android ;) 3D for Android is coming!

Best,
Radu.


Re: Project Tango

nizar sallem
Just to follow up on this,

We were right to suspect a structured-light projector: the Tango phone indeed encapsulates a PrimeSense chip. Here is the teardown:
http://www.ifixit.com/Teardown/Project+Tango+Teardown/23835.
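For anyone curious how a structured-light sensor like PrimeSense’s turns its measurements into depth: it triangulates from the disparity between the projected pattern and where the camera observes it, using the standard z = f·b/d relation. The focal length and baseline below are made-up placeholders purely for illustration, not the Tango’s (unpublished) calibration:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulation used by stereo and structured-light sensors: z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical calibration values, chosen only to illustrate the formula:
f_px = 580.0   # focal length in pixels (placeholder)
b_m = 0.075    # projector-to-camera baseline in meters (placeholder)

# Larger disparity means a closer object:
print(depth_from_disparity(20.0, f_px, b_m))  # farther point
print(depth_from_disparity(40.0, f_px, b_m))  # nearer point
```

This is also why baseline matters for a phone form factor: squeezing projector and camera closer together shrinks b, which reduces depth precision at range.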

--
Nizar