
A New Lens Technology Is Primed To Jump-Start Phone Cameras (wired.com)

An anonymous reader quotes a report from Wired: A new company called Metalenz, which emerges from stealth mode today, is looking to disrupt smartphone cameras with a single, flat lens system that utilizes a technology called optical metasurfaces. A camera built around this new lens tech can produce an image of the same, if not better, quality than traditional lenses, collect more light for brighter photos, and can even enable new forms of sensing in phones, all while taking up less space.

Instead of using plastic and glass lens elements stacked over an image sensor, Metalenz's design uses a single lens built on a glass wafer between 1x1 and 3x3 millimeters in size. Look very closely under a microscope and you'll see nanostructures measuring one-thousandth the width of a human hair. Those nanostructures bend light rays in a way that corrects for many of the shortcomings of single-lens camera systems. The core technology grew out of a decade of research by cofounder and CEO Robert Devlin during his PhD at Harvard University with acclaimed physicist and Metalenz cofounder Federico Capasso. The company was spun out of the research group in 2017.

Light passes through these patterned nanostructures, which look like millions of circles with differing diameters at the microscopic level. The resulting image quality is just as sharp as what you'd get from a multilens system, and the nanostructures do the job of reducing or eliminating many of the image-degrading aberrations common to traditional cameras. And the design doesn't just conserve space. Devlin says a Metalenz camera can deliver more light back to the image sensor, allowing for brighter and sharper images than what you'd get with traditional lens elements. Another benefit? The company has formed partnerships with two semiconductor leaders (that can currently produce a million Metalenz "chips" a day), meaning the optics are made in the same foundries that manufacture consumer and industrial devices -- an important step in simplifying the supply chain. Metalenz will go into mass production toward the end of the year. Its first application will be to serve as the lens system of a 3D sensor in a smartphone. (The company did not give the name of the phone maker.)
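For a rough sense of how such a flat lens works: the standard textbook phase profile for a focusing metasurface (not necessarily Metalenz's proprietary design) can be sketched in a few lines of Python. The 940 nm wavelength, aperture and focal length below are illustrative assumptions, chosen because NIR 3D sensors typically operate around that band.

import numpy as np

def metalens_phase(r, f, lam):
    # Phase delay (radians) the nanostructures must impart at radius r so that
    # normally incident light of wavelength lam comes to a focus at distance f,
    # wrapped into [0, 2*pi) as on a real metasurface.
    phi = (2 * np.pi / lam) * (f - np.sqrt(r**2 + f**2))
    return np.mod(phi, 2 * np.pi)

# Assumed example: 1 mm aperture, 2 mm focal length, 940 nm near-infrared light.
r = np.linspace(0, 0.5e-3, 5)
print(metalens_phase(r, f=2e-3, lam=940e-9))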

Comments:
  • by CaptainLugnuts ( 2594663 ) on Thursday February 04, 2021 @07:25PM (#61029210)
    If they can actually get metamaterials to work well, they can get around the diffraction limit. That's great news for all optics.
    • I'd be rather surprised if they can get around the diffraction limit. If they did, they'd be totally revolutionizing the optics world, not just making a compact lens.

      There are obviously techniques that can be used to go beyond the diffraction limit (STED comes to mind) but they are extremely specialized...
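      For scale, the conventional (Rayleigh) diffraction limit under discussion is easy to put numbers on; a quick sketch, where the f-number and wavelength are assumed typical phone-camera values rather than figures from the article:

      def rayleigh_spot_diameter(wavelength_m, f_number):
          # Approximate Airy-disk diameter at focus: 2.44 * lambda * N.
          return 2.44 * wavelength_m * f_number

      # Assumed typical phone camera: f/1.8 aperture, green light at 550 nm.
      print(rayleigh_spot_diameter(550e-9, 1.8) * 1e6, "micrometres")  # about 2.4 um

      A conventional lens, metasurface or glass, can't focus a spot much smaller than that; beating it is what "superlens" schemes such as negative-index materials aim for.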

      • Isn't this just nano Fresnel lens?
        • nano fresnel lens is also what i thought.

          Light passes through these patterned nanostructures, which look like millions of circles with differing diameters at the microscopic level.

        • Kind of.

          A Fresnel lens technically has rings of steps that are used to decrease overall lens thickness by creating localized areas of very small radii. This means that the steps are effectively "bending" the light at very tight angles, much tighter than they would on a normal lens surface. This gives them their characteristic "paper thin" shape.

          This metamaterial lens uses some kind of superlens technique, although it's not specified. There are a number of techniques attempting to achieve a superlens,

      • That's one of the major selling points of negative-refractive-index optics: you can get around the diffraction limit.
  • by Pierre Pants ( 6554598 ) on Thursday February 04, 2021 @07:29PM (#61029224)
    That's not how you write an article. It's written completely like an ad from a marketing department. Sure you don't have some disclosure to make? Anyway, "The resulting image quality is just as sharp as what you'd get from a multilens system" - Yes? Which multi-lens system? The variation in quality is huge, to say the least, you know. "eliminating many of the image-degrading aberrations common to traditional cameras" - Like? Maybe list at least two? Whatever.
    • by spatley ( 191233 )

      but alas journalism was reduced to cutting and pasting corporate press releases some years ago.

      • That's because we (collectively) don't want to pay for it. So news organizations are forced to rely on ads. These revenues are not sufficient to fund good journalism, so we get low-quality journalism. The number of people employed in the newspaper industry has been cut in half since 2008 (https://www.pewresearch.org/fact-tank/2020/04/20/u-s-newsroom-employment-has-dropped-by-a-quarter-since-2008/). A well-researched article takes time. You get what you pay for.

        And yes, I know newspapers have made mistake

    • by dgatwood ( 11270 ) on Thursday February 04, 2021 @09:31PM (#61029466) Homepage Journal

      Let me see if I can explain it using my limited knowledge of the subject, and someone else will probably correct it and/or add on to it.

      Most cameras use multiple lenses because of the need to correct for distortion. One lens creates significant distortion because it bends the light. The greater the magnification, the more distorted the image is, and the more chromatic distortion you get, because different colors of light bend by different amounts at the transition between air and the lens. They compensate for that by adding multiple lens elements made of different materials with different refractive properties. The number of elements required depends on a lot of factors, including the amount of magnification, how precisely you want to correct the distortion (mostly a factor of sensor density relative to the lens size), etc.
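      As a minimal sketch of that correction: the classic two-element achromatic doublet combines two glasses with different dispersion so their chromatic focal shifts cancel. The Abbe numbers below are typical crown/flint values, assumed for illustration rather than taken from any particular lens:

      def achromat_powers(total_power, V1, V2):
          # Split the total optical power between two thin elements in contact so that
          # phi1/V1 + phi2/V2 = 0, the first-order condition for cancelling chromatic
          # focal shift (V is the Abbe number, a measure of dispersion).
          phi1 = total_power * V1 / (V1 - V2)
          phi2 = -total_power * V2 / (V1 - V2)
          return phi1, phi2

      # Assumed 50 mm doublet: crown glass (V ~ 60) plus flint glass (V ~ 36).
      phi1, phi2 = achromat_powers(1 / 0.050, V1=60.0, V2=36.0)
      print(1000 / phi1, "mm crown element,", 1000 / phi2, "mm flint element")

      Correcting more aberrations, across zoom ranges and wide apertures, keeps adding elements in the same spirit.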

      For example, a Canon 17–35 f/2.8 L III lens has 16 elements. Their 100–400 L II lens has 21 elements. That's a *lot* of glass. This adds a lot of weight. To reduce the weight — particularly for long focal lengths — companies have tried diffractive optics (Fresnel lenses), where they etch concentric rings onto the lens. But this tends to create moiré patterns to some extent, and results in strange looking bokeh (out-of-focus areas).

      Metamaterial lenses basically take the concept of Fresnel lenses and reduce the patterns down to a microscopic level, and also introduce more complex patterns (rather than just concentric circles) designed to avoid moiré patterns while still producing high magnification with a single, mostly flat lens or with a much smaller number of lens elements.

      • No, not really.

        Metamaterials use a different method altogether. Look up negative-index-of-refraction materials for more information.

      • Fresnel lenses do not use diffraction. They are normal refraction lenses, but divided up to allow some material to be removed.

      • by AmiMoJo ( 196126 )

        Phones seem to be using software to get around these issues.

        I have a DSLR but mostly use my phone now. The Pixel 5's images are very hard to beat, except for the slightly limited resolution when you want to crop or zoom. It does so much right though, and all automatically. I could replicate it on a DSLR but would need to spend time setting the shot up and deploying a tripod.

        I haven't looked at DSLRs for a few years now but I wish they had some of the computational photography features that phones did. I'd l

        • by dgatwood ( 11270 )

          Phones seem to be using software to get around these issues.

          I have a DSLR but mostly use my phone now. The Pixel 5's images are very hard to beat, except for the slightly limited resolution when you want to crop or zoom. It does so much right though, and all automatically. I could replicate it on a DSLR but would need to spend time setting the shot up and deploying a tripod.

          Where the DSLRs beat the phone hands down is when you're either A. not able to get close to the action, B. shooting in low light, or C. all of the above. If you've ever tried to use a long telephoto lens on a cell phone to get a 600mm-equivalent focal length or greater, you know what I'm talking about. The mount flex alone makes it unworkable, not to mention the pathetic light gathering.

          I haven't looked at DSLRs for a few years now but I wish they had some of the computational photography features that phones did. I'd love something like a micro 4/3 camera with a couple of lenses, but which uses Google's tech to make simple snapshots on auto look excellent.

          One problem with DSLRs is that they don't expose the sensor until you take the shot, which makes things like face detect

          • by AmiMoJo ( 196126 )

            I should have said mirrorless, but I agree with what you say for the most part. I'd be happy with just capturing enough data to build an HDR image on the desktop later, but I've not seen an easy way to do that.

            For that reason I find my Pixel 5 better for night shots. With a tripod you can get great results with a mirrorless camera, but with the phone I can take a hand held shot of a moving scene and it comes out great. That's the kind of thing I want, but with better optics, resolution and RAW support so I
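            As a minimal illustration of the frame-stacking idea behind those hand-held night shots (synthetic data, no alignment step; a real pipeline also aligns and merges the frames):

            import numpy as np

            rng = np.random.default_rng(1)
            scene = rng.uniform(0.0, 0.1, size=(64, 64))   # a dim synthetic scene

            def noisy_frame(scene, read_noise=0.05):
                # One short, noisy exposure of the scene.
                return scene + read_noise * rng.standard_normal(scene.shape)

            # Averaging a burst of 16 frames cuts the noise roughly by sqrt(16) = 4x.
            frames = np.stack([noisy_frame(scene) for _ in range(16)])
            stacked = frames.mean(axis=0)
            print("single-frame noise:", np.std(noisy_frame(scene) - scene))
            print("stacked noise:     ", np.std(stacked - scene))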

  • The main problem in this space is that the current cameras have reached the "good enough" point quite a few years back.
    Improving a good enough technology will not result in changes in consumer behavior. At most it will allow manufacturers to shave some pennies off production costs.

    • While you're right that people are generally pretty happy with the quality of their phone camera, it seems to me this will enable smaller cameras for the same or better quality.
    • by marcle ( 1575627 ) on Thursday February 04, 2021 @08:19PM (#61029358)

      It's not so much that this makes "better" lenses, it's that it makes them much smaller and more versatile, and combines them with sensors in a new way that opens up lots of interesting possibilities.

    • by tflf ( 4410717 )

      The main problem in this space is that the current cameras have reached the "good enough" point quite a few years back.
      Improving a good enough technology will not result in changes in consumer behavior. At most it will allow manufacturers to shave some pennies off production costs.

      Even if cameras with this lens structure are no lighter, no cheaper and do not take better pictures, never underestimate the appeal of "newer and better" for a significant percentage of the buying public. Industries rely on the certain knowledge people will replace "good enough" products long before the end of active life. For every person driving a sound, reliable 12 year old car, there are several who purchased one or more replacement vehicles over that same 12 year span.

      And if picture quality is discern

  • FTA:

    as light passes through each successive lens, the image gains sharpness and clarity

    Pretty sure that's not what's happening but one word: Wired.

  • by acdc_rules ( 519822 ) on Thursday February 04, 2021 @07:59PM (#61029304)
    so, where are the images? i see images of the lenses (i think). you'd think since they are producing them and will begin to sell towards the end of the year that someone would have a test pic somewhere, right? metalenses have been discussed for at least a decade and the promises are high. unless i've missed a research article, they are up to some forms of rough imaging within a narrow wavelength, not quite up to the promises of Metalenz
    • The only application going forward is apparently the 3d sensor for a phone - which might be rough imaging within a narrow wavelength all right.
  • by seoras ( 147590 ) on Thursday February 04, 2021 @08:00PM (#61029306)

    From the article

    Take spectroscopy as an example. A spectrometer is used to finely detect different wavelengths of light, and it's commonly employed in medical assays to identify particular molecules in the blood. As metasurfaces allow you to collapse “a tabletop of optics into a single surface,” Devlin claims you can pop the right sensors in a smartphone with Metalenz to do the same kind of work.

    “You can actually look at the chemical signature of fruit with a spectrometer and tell whether it's ripe,” Devlin says. “It's really not just an image anymore, you're actually accessing all sorts of different forms of sense, and seeing and interacting with the world, getting a whole new set of information into the cellphone.”

    In recent years spectroscopy has been combined with AI and machine learning to accurately identify, and quantify, the presence of substances in a wide range of organic and non-organic materials.
    We are on the cusp of phones starting to incorporate spectroscopy tech as just another camera lens, which should open up a whole new range of mass market applications.
    Here's one company [sagitto.com] that is working in this field and another one [basf.com] specialising in one particular area that I know of.
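    The workflow behind such "spectroscopy plus machine learning" claims is conceptually simple; here is a toy sketch on synthetic near-infrared spectra (the "ripe"/"unripe" labels, band positions and noise levels are invented for illustration, and real work needs calibrated instruments and reference samples):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(900, 1700, 200)   # a typical NIR range, in nm

    def fake_spectrum(peak_nm):
        # A noisy Gaussian absorption band centred at peak_nm.
        return np.exp(-((wavelengths - peak_nm) / 40) ** 2) + 0.02 * rng.standard_normal(200)

    # Two made-up classes (say, unripe vs ripe fruit) with slightly shifted bands.
    X = np.array([fake_spectrum(1200) for _ in range(50)] + [fake_spectrum(1250) for _ in range(50)])
    y = np.array([0] * 50 + [1] * 50)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print("training accuracy:", model.score(X, y))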

    • I foresee medical applications in third-world countries, especially coupled with the ubiquity of cellphones and the other sensors already included. Add in chemical sensors (aka noses) and you'll have citizen science. With all this said, data privacy laws will have to be beefed up for the influx of new data. I also wonder if these metalenses can be applied to regular cameras?

    • by DrYak ( 748999 ) on Friday February 05, 2021 @03:43AM (#61030058) Homepage

      In recent years spectroscopy has been combined with AI and machine learning to accurately identify, and quantify, the presence of substances in a wide range of organic and non-organic materials.

      As someone who has taught some organic chemistry(*) lab sessions to pharma students at the university:
      It's not just in recent years. Spectroscopy has been successfully used for decades "to accurately identify, and quantify, the presence of substances in a wide range of organic and non-organic materials". AI and machine learning are merely tools to alleviate the tedious and boring job of matching and interpreting the spectra yourself. But it's a task that even a 2nd-year student can perform.

      We are on the cusp of phones starting to incorporate spectroscopy tech as just another camera lens, which should open up a whole new range of mass market applications.

      There's a very hard limit on such tech: it only works with the light that bounces off the surface of the object you're targeting.
      Thus it can only work on homogeneous substances (e.g.: our students used spectroscopy to identify some "mystery white powder" in their exercises).

      All the serious applications mentioned in the links you gave are of this type: most are "composition of X" where X is something that can be observed from the surface (homogeneous liquids like all the extracts on these lists, or a homogeneous solid like the plastic example).

      It will definitely not work the way most average smartphone users would like it to work (e.g.: point at a product at the convenience store and get a precise composition and nutrient analysis - this is impossible. At best you'd get some information about the skin of the fruit, at worst you'd get information about the packaging if it is not IR transparent. The innards of the item are almost as invisible to the device as to you, as both of you work with light that is roughly around the visible domain).

      But it will enable "mobile lab" uses in the "get a sample with a tiny special holder, place the holder in a special chamber that you fix to your smartphone" sense. (Not too dissimilar to using a smartphone as part of a "fits in your pocket" microscope kit, e.g. to analyse blood samples for malaria - except this time, in the case of TFA's lens, with NIR-domain analysis.)

      ---

      (*): yup, I studied medicine, but at some point in my career I stumbled into TA-ing for pharma students, despite us medical doctors being notoriously bad at orgchem. In Leonard "Bones" McCoy's voice: "I'm a doctor, Jim! Not a chemist!"

      • Mystery white powder? Switzerland, country of banks and mountains, so for sure it was snow, now just to guess, the type that's measured in centimetres or the type measured in grams...
        • Mystery white powder? Switzerland, country of banks and mountains,

          Although that peculiar TA stint was during the short time of my career when I wasn't in Switzerland, but in Germany...

          so for sure it was snow, now just to guess, the type that's measured in centimetres or the type measured in grams...

          Spot on! It was the type of powder that rhymes with Lidocain and has a very close signature.
          (Cue in complex system of signatures that was in place to make sure that neither the master students nor the TA'ing PhDs managed to steal too much of it).

          Though I am sure a few other Swiss will joke how this week-end, the other powder wasn't white at all, but yellow, courtesy of a combo of southern winds [twitter.com]

    • Can we call them "tricorders" now?

  • > which look like millions of circles with differing diameters at the microscopic level.

    wow... they learned to etch a diffraction grating as a Fresnel lens. >thup thup thup ... sound of one hand clapping.

    I seriously hope the USPTO doesn't grant a patent to a 200+ year old technology... but they will...

    • by khchung ( 462899 )

      I seriously hope the USPTO doesn't grant a patent to a 200+ year old technology... but they will...

      But, it is ON A PHONE!

      That will probably work just as well as the many ".... on the Internet!" patents granted 20 years ago.

  • by Pinky's Brain ( 1158667 ) on Thursday February 04, 2021 @09:09PM (#61029426)

    You have to read between the lines, this is for monochromatic light. Diffractive optics have massive chromatic aberration.

    Might be useful for AR/VR too, if it can be combined with RGB filters without causing noticeable screen-door effects. For image projection you can use narrowband R/G/B sources (ie. laser diodes) and intensity loss from the low fill factor in any given color from the RGB filter isn't really a problem ... just use a little more light.
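    For a sense of why diffractive optics are usually treated as monochromatic devices: an ideal diffractive lens focuses each wavelength at f(lambda) = f0 * lambda0 / lambda, a much stronger dispersion than glass. A quick sketch, with assumed design values (f0 = 5 mm at 550 nm, not figures from the article):

    def diffractive_focal_length(design_focal_m, design_wavelength_m, wavelength_m):
        # Focal length of an ideal diffractive lens scales inversely with wavelength.
        return design_focal_m * design_wavelength_m / wavelength_m

    # Assumed design: 5 mm focal length at 550 nm; compare blue, green and red.
    for lam in (450e-9, 550e-9, 650e-9):
        print(int(lam * 1e9), "nm ->", round(diffractive_focal_length(5e-3, 550e-9, lam) * 1e3, 2), "mm")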

    • Well, they don't have "massive chromatic aberration". They have chromatic aberration which is dependent on the distance from the lens element to the focal plane, along with the distance from the edge of the image to the center.

      Diffractive optics have more chromatic aberration than traditional optics, but due to the very small distances involved, we're still talking a very small amount.

  • by backslashdot ( 95548 ) on Thursday February 04, 2021 @10:59PM (#61029616)

    So the article says it can be used for multi-spectral imaging. That means it can probably be used for finding out the temperature of objects. It could probably be used as a FLIR camera... able to detect bad insulation around the home, or do fever monitoring.

    • Possibly. Without an actual transmission curve, it's difficult to say whether or not these optics perform beyond the expected characteristics of the bulk material itself.

      Chalcogenide glass, Germanium, and Zinc Selenide have a corner on the market - almost entirely due to their high transmission in the LWIR range.

      Acrylic or Polyethylene fresnels have already been used as a low-cost alternative, but the problem with plastics is that there is no known design that compensates for the poor LWIR transmission of p

  • It's called GET A THERAPY FOR YOUR ANOREXIA!

    This whole thing is batshit insane.
    The entire premise (flat thin phones) and the premise's premise (Appleitis) are batshit insane.

    Stop shrugging and blabbering, wake up, and stop playing along!

    • Flat, thin lenses would be very welcome. Would be nice to have the lens(es) flush with the back of the phone again.

      Should I get off your lawn now?

  • Is this a plain ol' Fresnel lens?
  • Would this work for contact lenses?

"If there isn't a population problem, why is the government putting cancer in the cigarettes?" -- the elder Steptoe, c. 1970

Working...