MIT researchers have developed a biomedical imaging system that could ultimately replace a $100,000 piece of lab equipment with components that cost just hundreds of dollars.

The system uses a technique called fluorescence lifetime imaging, which has applications in DNA sequencing and cancer diagnosis, among other things. So the new work could have implications for both biological research and clinical practice.

“The theme of our work is to take the electronic and optical precision of this big expensive microscope and replace it with sophistication in mathematical modeling,” says Ayush Bhandari, a graduate student at the MIT Media Lab and one of the system’s developers. “We show that you can use something in consumer imaging, like the Microsoft Kinect, to do bioimaging in much the same way that the microscope is doing.”

The MIT researchers reported the new work in the Nov. 20 issue of the journal Optica. Bhandari is the first author on the paper, and he’s joined by associate professor of media arts and sciences Ramesh Raskar and Christopher Barsi, a former research scientist in Raskar’s group who now teaches physics at the Commonwealth School in Boston.

Fluorescence lifetime imaging, as its name implies, depends on fluorescence, or the tendency of materials known as fluorophores to absorb light and then re-emit it a short time later. For a given fluorophore, interactions with other chemicals will shorten the interval between the absorption and emission of light in a predictable way. Measuring that interval — the “lifetime” of the fluorescence — in a biological sample treated with a fluorescent dye can reveal information about the sample’s chemical composition.

In traditional fluorescence lifetime imaging, the imaging system emits a burst of light, much of which is absorbed by the sample, and then measures how long it takes for returning light particles, or photons, to strike an array of detectors. To make the measurement as precise as possible, the light bursts are extremely short.

The fluorescence lifetimes pertinent to biomedical imaging are in the nanosecond range. So traditional fluorescence lifetime imaging uses light bursts that last just picoseconds, or thousandths of nanoseconds.
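In the idealized time-domain picture described above, photon arrival delays after an instantaneous excitation pulse follow an exponential decay whose mean is the lifetime. The sketch below illustrates that relationship only; the 3 ns value and the assumption of a single-exponential decay are illustrative, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
true_tau = 3e-9  # an illustrative 3 ns lifetime (nanosecond range, per the article)

# Photon arrival delays after an idealized, instantaneous excitation pulse
# follow an exponential distribution whose mean equals the lifetime.
arrivals = rng.exponential(true_tau, size=100_000)

# For a single-exponential decay, the maximum-likelihood estimate of the
# lifetime is simply the sample mean of the arrival delays.
print(arrivals.mean())  # close to 3e-9
```

Real instruments must also deconvolve the finite width of the excitation pulse, which is why picosecond bursts matter in the traditional approach.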

Blunt instrument

Off-the-shelf depth sensors like the Kinect, however, use light bursts that last tens of nanoseconds. That’s fine for their intended purpose: gauging objects’ depth by measuring the time it takes light to reflect off of them and return to the sensor. But it would appear to be too coarse-grained for fluorescence lifetime imaging.
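The depth measurement a sensor like the Kinect performs is a standard time-of-flight relation: light makes a round trip to the object and back, so depth is half the round-trip time multiplied by the speed of light. A minimal sketch (the specific numbers are illustrative, not from the paper):

```python
# Time-of-flight ranging: light travels to the object and back,
# so depth = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    return C * t_seconds / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth,
# which shows why tens-of-nanosecond bursts suffice for room-scale ranging.
print(depth_from_round_trip(10e-9))
```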

The Media Lab researchers, however, extract additional information from the light signal by subjecting it to a Fourier transform. The Fourier transform is a technique for breaking signals — optical, electrical, or acoustical — into their constituent frequencies. A given signal, no matter how irregular, can be represented as the weighted sum of signals at many different frequencies, each of them perfectly regular.
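The decomposition the Fourier transform performs can be sketched in a few lines: an irregular signal built from two sine waves is broken back into its constituent frequencies. The sample rate, frequencies, and amplitudes here are arbitrary illustrations, not values from the paper:

```python
import numpy as np

# Build an "irregular" signal from two pure tones, then recover its
# constituent frequencies with a discrete Fourier transform.
fs = 1000                      # sample rate, Hz
t = np.arange(0, 1, 1 / fs)    # one second of samples
signal = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)            # complex weights per frequency
freqs = np.fft.rfftfreq(len(t), 1 / fs)   # the frequency of each bin

# The two dominant components sit at 50 Hz and 120 Hz, with weights
# proportional to the amplitudes used to build the signal.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))  # [50.0, 120.0]
```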

The Media Lab researchers represent the optical signal returning from the sample as the sum of 50 different frequencies. Some of those frequencies are higher than that of the signal itself, which is how they are able to recover information about fluorescence lifetimes shorter than the duration of the emitted burst of light.

For each of those 50 frequencies, the researchers measure the difference in phase between the emitted signal and the returning signal. If an electromagnetic wave can be thought of as a regular up-and-down squiggle, phase is the degree of alignment between the troughs and crests of one wave and those of another. In fluorescence imaging, phase shift also carries information about the fluorescence lifetime.
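In the standard frequency-domain treatment of a single-exponential fluorophore, the phase shift at modulation frequency f satisfies tan(phase) = 2*pi*f*tau, so each measured phase yields a lifetime estimate. This is the textbook relation, not necessarily the exact estimator the MIT team applies across their 50 frequencies; the 50 MHz and 0.5 rad figures below are illustrative:

```python
import math

def lifetime_from_phase(phase_rad: float, freq_hz: float) -> float:
    """Textbook frequency-domain relation for a single-exponential
    fluorophore: tan(phase) = 2*pi*f*tau, so tau = tan(phase) / (2*pi*f)."""
    return math.tan(phase_rad) / (2 * math.pi * freq_hz)

# At a 50 MHz modulation frequency, a phase shift of 0.5 rad
# corresponds to a lifetime of a few nanoseconds.
tau = lifetime_from_phase(0.5, 50e6)
print(tau)  # ~1.74e-9 s
```

Combining estimates from many modulation frequencies, as the researchers do, makes the recovered lifetime robust even when each individual measurement is coarse.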