Tiny LIDAR Chip Could Add Cheap 3D Sensing to Cellphones and Tablets

There are expensive dedicated devices that do 3D scanning (like the high-end tablet in Google's Project Tango), and versatile but bulky add-ons, like the Sense from 3D Systems, but it's not a capability built into the typical cellphone or tablet. That could change, thanks to a microsensor being prototyped now (at low resolution) at Caltech. From The Verge's coverage: The tiny chip, called a nanophotonic coherent imager, uses a form of LIDAR (Light Detection And Ranging) technology to capture height, width, and depth information from each pixel. LIDAR, which shines a laser on the target and then analyzes the light waves that are reflected back to the sensor, is best known for its use in precision-guided missile systems and self-driving cars.

While LIDAR itself isn't new, [project lead Ali] Hajimiri explains that "by having an array of tiny LIDARs on our coherent imager, we can simultaneously image different parts of an object or a scene without the need for any mechanical movements within the imager." Each "pixel" on the new sensor can individually analyze the phase, frequency, and intensity of the reflected waves, producing a single piece of 3D data. The data from all of the pixels combined can produce a full 3D scan. In addition, the researchers' implementation allows for an incredibly tiny and low-cost scanner, all while maintaining accuracy. According to the researchers, the chip can produce scans that are within microns of the original.
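The summary says each pixel analyzes the phase and frequency of coherently detected light. One common way to turn that kind of information into distance is frequency-modulated continuous-wave (FMCW) ranging, sketched below; the sweep bandwidth, sweep time, and target distance here are made-up illustrative numbers, not parameters of the Caltech chip.

```python
import numpy as np

# Minimal FMCW-style coherent ranging sketch (illustrative values only).
# The laser's optical frequency is swept linearly over bandwidth B during
# time T.  Light reflected from a target at distance d returns after a
# round-trip delay tau = 2*d/c, so mixing it with the outgoing light gives
# a beat tone at f_beat = (B/T) * tau.  Measuring that tone gives d.

c = 3e8           # speed of light, m/s
B = 100e9         # assumed sweep bandwidth, Hz
T = 1e-3          # assumed sweep duration, s
fs = 10e6         # sample rate of the detected beat signal, Hz
d_true = 0.30     # assumed target distance, m

tau = 2 * d_true / c                 # round-trip delay
f_beat = (B / T) * tau               # beat frequency from coherent mixing

t = np.arange(0, T, 1 / fs)
signal = np.cos(2 * np.pi * f_beat * t)   # idealized detector output

# Recover the beat frequency with an FFT and convert it back to distance.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
f_est = freqs[np.argmax(spectrum)]
d_est = f_est * T * c / (2 * B)

print(f"true distance: {d_true:.3f} m, estimated: {d_est:.3f} m")
```

Roughly speaking, the beat-frequency estimate above only sets a coarse range bin (about c/2B, millimetres here); a coherent imager can also use the optical phase of the return to refine that, which is how micron-level figures like the ones quoted become plausible.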
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    You can bet the porn industry will be the first to take advantage of this.

  • In phones, maybe. But for automated cars, it's an obvious application.

    • Drones are the obvious use, actually. The next Parrot AR.Drone will be able to do indoor mapping. Heck, even if data collection and processing need to be offloaded, rescue, fire, and hostage-negotiation work, and even real estate, could all benefit.

  • This is the imager only, the sensor, not the part that sends out the lasers. You still need laser emitters to use with the sensor.
  • Smartphone power (Score:4, Interesting)

    by Twinbee ( 767046 ) on Sunday April 05, 2015 @11:34AM (#49410171)
    Honestly, it's insane how ludicrously powerful phones are getting. I mean, take this one article for example [phonearena.com]: 4K 120fps slow-motion video recording is coming to smartphones in 2016.

    4K videoing at 120fps?! From a smartphone?

    Isn't progress wonderful.
    • by Rei ( 128717 )

      Still tiny apertures, though :(

      I'll take more light coming into the sensor over high frame rates and pixel counts any day.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Sure, as long as you're happy with progress only in information processing. How long does it take a passenger plane to fly over the Atlantic? Same as in 1969, the year of the maiden flight of the Boeing 747.

      We have made no social progress towards a leisure society for all; all your "progress" is only used to make the rich richer.

      Keep playing with your phone while you're being robbed.

      • Not quite true. Current 747s have a higher cruise speed and lower fuel usage than the 1969 model. It isn't a breathtaking difference, but it does exist. Basically all new business jets run at Mach 0.95 instead of the Mach 0.6-0.7 of 30 years ago. Why not faster? Well, fuel usage goes up, and viable flight paths go down due to supersonic restriction laws.

        Like batteries, we are at the limits of chemical energy storage. Even hyper velocities are tough to maintain at a practical level. Improvements are only c

        • by Holi ( 250190 )
          Anything to back that up with? Because everything I have seen seems to explain why speeds haven't changed, and that to increase speed by, say, 10% takes 21% more energy. So airlines would rather save fuel than reduce flight times. Drag scales with the square of speed. (A quick numeric check of that figure appears after this thread.)
        • Firstly: try catching a Concorde flight today.

          Secondly: all new-design contemporary turbofan jet aircraft travel slightly SLOWER than their turbojet predecessors (Mach 0.85-0.855 for a 747-400 instead of Mach 0.89-0.91 for turbojet designs such as the Convair 990 and Boeing 707; the 747 has always been a turbofan aircraft). Going faster than this carries severe fuel penalties, as it causes major issues with the fixed-pitch fans (and the larger the fan, the lower the top speed).

          There's a push to make civil tran
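A quick numeric check of the 21% figure mentioned up-thread, under the rough assumption that drag (and hence fuel burned per unit distance) scales with the square of airspeed:

```python
# Back-of-envelope check: if drag ~ v^2, then energy per unit distance
# also ~ v^2 (power rises ~ v^3, but the trip takes proportionally less
# time).  So a 10% speed increase costs about 21% more fuel per mile.
v_ratio = 1.10                     # 10% faster
extra_energy = v_ratio ** 2 - 1    # = 0.21
print(f"{extra_energy:.0%} more energy per unit distance")
```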

  • No small amount. 123D Catch is usually broken and buggy, and being able to do terrain scans with my phone is important to me.

    Plus, I dream of the day when we can capture full 3D video on our cell phones. Not silly stereoscopy, but actual 3D, with the ability to choose any perspective. With good enough quality on the 3D imager to understand transparency and reflection (which are painfully common in real life), and with a good enough software stack to assume "that which I couldn't see between times T1 and T2 is

    • You're actually using 123D Catch to 'scan' a landscape? Does it even work at all? Are you then converting the results to DEM files?

      Seems like it would never be nearly accurate enough for a landscape. If so, I'm impressed.

      • by Rei ( 128717 )

        Trying... so far I've only gotten little strips to work, not enough to make a coherent whole. But I've been doing it during the winter, when there was snow, i.e. reflections... I'm hoping I'll have better luck the next time I try, now that the snow has melted.

  • When you have a team of PhDs working on a project essentially for free (paid for by the government, not Caltech), in a subsidized (nearly free) clean room, on a device where yield doesn't really matter, "cost" tends not to be realistically estimated.

    It is more realistic to estimate the cost by looking at the actual money spent by all sources on the project. That's likely a couple hundred thousand dollars (or so) on this one chip, but most of that is NRE.

    When someone tries to fit this into a commercial

  • Lidar isn't an acronym; it's a compound word formed from light and radar (radar isn't treated as an acronym anymore either).

  • I use one of those iPhone apps that lets me scan documents with the phone, producing a PDF that's just as good as the ones I used to need a flatbed scanner to get. When I need to scan a book chapter at the library or surreptitiously copy someone else's document, there's no more having to arrange to bring it home to be scanned.

    LIDAR would extend this concept to 3-D output. Most of us don't have 3-D printers at home, and would have no use for one that just works in cheesy plastic. But imagine being able to email a 3-D

  • Just imagine, there could be a phone app that displays an arrow to show the user which way to walk. Using the Lidar to detect obstacles, the app could enable a phone zombie to become almost self-driving, avoiding obstacles and other people. Almost like a real person.

  • For smartphones, I'd rather expect so-called "time of flight" cameras [csnej106.csem.ch] to catch up before LIDARs. Basically, you have an array of LEDs which illuminate the scene using sine or square wave intensity modulation. The imager works at a high framerate (or uses other windowing techniques) to extract the phase shift in each pixel, which gives you per-pixel range information. Of course, there is still the problem of phase unwrapping.
    So in this kind of system, you trade off dynamic range for accuracy and cost. As most measurements with smartphones will probably be performed at short distance, this system seems more suitable than regular LIDARs. (A short sketch of the per-pixel phase demodulation appears after this thread.)
    • For smartphones, I'd rather expect so-called "time of flight" cameras [csnej106.csem.ch] to catch up before LIDARs. Basically, you have an array of LEDs which illuminate the scene using sine or square wave intensity modulation.

      Unfortunately, emitted IR signals outdoors get too corrupted for ranges farther than 10 m or so.
      I'm not sure about indoors, but I can't find any ToF system that can go farther than 10 m.
      Light field cameras almost seem useful, but they have their own limitations.
      Lidar is the only reasonable way to obtain depth information over long distances. And it's accurate, too.
      ToF does work OK for short distances though, AFAIK.
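For the curious, here is a minimal sketch of the standard "four bucket" demodulation that continuous-wave time-of-flight cameras use to extract the per-pixel phase shift mentioned up-thread. The modulation frequency, reflectance, and distance are made-up illustrative values.

```python
import numpy as np

# Continuous-wave ToF demodulation sketch (illustrative values only).
# The scene is lit with intensity modulated at f_mod; each pixel correlates
# the returned light with the modulation at four phase offsets and recovers
# the phase shift, which is proportional to distance.

c = 3e8
f_mod = 20e6                     # assumed modulation frequency, Hz
d_true = 3.2                     # assumed target distance, m

phase_true = 4 * np.pi * f_mod * d_true / c   # round-trip phase shift

# Four correlation samples at offsets 0, 90, 180, 270 degrees
# (amplitude a and background offset b are arbitrary here).
a, b = 1.0, 0.5
offsets = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
A0, A1, A2, A3 = b + a * np.cos(phase_true - offsets)

phase_est = np.arctan2(A1 - A3, A0 - A2) % (2 * np.pi)
d_est = phase_est * c / (4 * np.pi * f_mod)

# Distances wrap around every c / (2 * f_mod); beyond that you need the
# phase unwrapping the parent comment mentions.
ambiguity_range = c / (2 * f_mod)
print(f"estimated distance: {d_est:.2f} m (unambiguous up to {ambiguity_range:.1f} m)")
```

The ~10 m figure above is about ambient-light corruption; separately, at a 20 MHz modulation the measurement itself wraps every 7.5 m, which is why phase unwrapping matters for longer ranges, and lowering the modulation frequency extends the range at the cost of depth precision.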
