MIT researchers have developed novel photography optics that capture images based on the timing of light reflecting inside the optics, instead of the traditional approach that relies on the arrangement of optical components. These new principles, the researchers say, open doors to new capabilities for time- or depth-sensitive cameras, which are not possible with conventional photography optics.
Specifically, the researchers designed new optics for an ultrafast sensor called a streak camera that resolves images from ultrashort pulses of light. Streak cameras and other ultrafast cameras have been used to make a trillion-frame-per-second video, scan through closed books, and produce a depth map of a 3D scene, among other applications. Such cameras have relied on conventional optics, which come with various design constraints. For example, a lens with a given focal length, measured in millimeters or centimeters, has to sit at a distance from an imaging sensor equal to or greater than that focal length to capture an image. This means that any optics built around a long-focal-length lens must themselves be long.
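That focal-length constraint follows from the standard thin-lens equation, which the article alludes to but does not state. The short sketch below, with illustrative numbers that are not from the paper, shows why the lens-to-sensor distance can never drop below the focal length for a real object:

```python
# Illustrative sketch (not from the paper): the thin-lens equation
# 1/f = 1/d_obj + 1/d_img implies the image distance d_img is always
# at least the focal length f for a real object beyond the focal
# plane, so the sensor must sit at least f behind the lens.

def image_distance(f_mm: float, d_obj_mm: float) -> float:
    """Image distance behind a thin lens of focal length f_mm
    for an object d_obj_mm in front of it (both in millimeters)."""
    if d_obj_mm <= f_mm:
        raise ValueError("object inside the focal length forms no real image")
    return f_mm * d_obj_mm / (d_obj_mm - f_mm)

if __name__ == "__main__":
    f = 50.0  # a 50 mm lens (illustrative value)
    for d_obj in (100.0, 1_000.0, 10_000.0, 1e9):
        print(f"object at {d_obj:>13.0f} mm -> image at {image_distance(f, d_obj):.2f} mm")
    # As the object recedes, the image distance approaches f from
    # above, so the lens-to-sensor gap can never be shorter than f.
```

Running it shows the image distance shrinking toward, but never below, 50 mm as the object moves away, which is exactly the constraint that forces long-focal-length optics to be physically long.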
In a paper published in this week's Nature Photonics, MIT Media Lab researchers describe a technique that makes a light signal reflect back and forth off carefully positioned mirrors inside the lens system. A fast imaging sensor captures a separate image at each reflection time. The result is a sequence of images, each corresponding to a different point in time and to a different distance from the lens, and each accessible at its specific moment of capture. The researchers have coined this technique "time-folded optics."
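To make the timing idea concrete, here is a rough numerical sketch based only on the description above: light that makes k extra round trips between two mirrors spaced a distance L apart travels an extra 2kL of path and so arrives 2kL/c later, giving the sensor one distinct image per "fold." The cavity length and fold count are illustrative assumptions, not values from the paper:

```python
# Rough sketch of the timing in a folded mirror cavity: each extra
# round trip between mirrors spaced L apart adds 2*L of path, so the
# k-th folded image arrives 2*k*L/c after the unfolded one. A fast
# time-resolving sensor can therefore separate the folds by arrival
# time. All dimensions below are illustrative, not from the paper.

C_MM_PER_PS = 0.299_792_458  # speed of light in mm per picosecond

def fold_delays_ps(cavity_length_mm: float, num_folds: int) -> list[float]:
    """Arrival delay (ps) of the k-th folded image relative to the
    unfolded (k = 0) path, for k = 0 .. num_folds."""
    return [2 * k * cavity_length_mm / C_MM_PER_PS for k in range(num_folds + 1)]

if __name__ == "__main__":
    L = 25.0  # mirror spacing in mm (assumed for illustration)
    for k, delay in enumerate(fold_delays_ps(L, 4)):
        print(f"fold {k}: extra path {2 * k * L:6.1f} mm, delay {delay:7.1f} ps")
    # Each delay slot corresponds to a distinct image, which is why a
    # streak-camera-style sensor that resolves arrival time can read
    # out the folds separately.
```

With a 25 mm spacing, successive folds arrive roughly 167 picoseconds apart, comfortably within the temporal resolution of the ultrafast sensors the article describes.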