
Optical Applications in Life Science

Color science, remote sensing, machine vision, and of course, some AI.

Color Science and Digital Imaging

I don't think many people pay much attention to the sort of miracle that produces color in a digital device, because we were so used to color film, itself a technological masterpiece: balancing various photosensitive pigments to produce images that, to our human visual system, look the same as the scene we photographed.

This week's conversation is with Jeff Carmichael, an expert on imaging systems, particularly the use of filter technology for looking at light from different parts of the spectrum.

Color sensors don't actually see color. They detect light, with a red, green, or blue filter over each pixel (a Bayer filter array), which blows my mind in terms of manufacturing, not to mention integrating all the data captured across millions of pixels.

Bayer filter array. Image: Cburnett (own work), CC BY-SA 3.0

Then all that data (voltages representing photon counts) is turned into numbers that can be delivered to a screen, which somehow reverses the whole process and presents us with an image that looks very much like the original scene.
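To make the mosaic idea concrete, here's a minimal NumPy sketch. It assumes an RGGB pixel layout (actual sensors vary, and real demosaicing algorithms are far more sophisticated); it just splits a raw readout into its color planes and averages each 2x2 cell into one RGB pixel.

```python
import numpy as np

def split_bayer_rggb(raw: np.ndarray):
    """Split a raw RGGB Bayer mosaic into its four color planes.

    Each pixel sits behind a single red, green, or blue filter;
    full-color images are interpolated (demosaiced) afterward.
    """
    r  = raw[0::2, 0::2]   # red-filtered pixels
    g1 = raw[0::2, 1::2]   # green pixels on red rows
    g2 = raw[1::2, 0::2]   # green pixels on blue rows
    b  = raw[1::2, 1::2]   # blue-filtered pixels
    return r, g1, g2, b

def naive_demosaic(raw: np.ndarray) -> np.ndarray:
    """Crude demosaic: collapse each 2x2 Bayer cell into one RGB pixel.

    Real cameras interpolate at full resolution; this half-resolution
    version only illustrates the principle.
    """
    r, g1, g2, b = split_bayer_rggb(raw)
    g = (g1.astype(float) + g2.astype(float)) / 2
    return np.dstack([r, g, b])

# Example: a fake 4x4 sensor readout (12-bit photon counts)
raw = np.random.randint(0, 4096, size=(4, 4))
rgb = naive_demosaic(raw)
print(rgb.shape)  # (2, 2, 3): one RGB pixel per 2x2 Bayer cell
```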

Jeff described for me how light of different wavelengths was associated with our actual perception of those colors, based on a couple of experiments involving 17 people in the early 20th century, standardized by the Commission Internationale de l'éclairage (CIE).

You may have seen one of these chromaticity diagrams in relation to calibrating a computer screen.

1931 CIE Chromaticity Diagram

Now imagine your camera collecting light in three different buckets, just red, blue and green, each of those somewhat arbitrary and imprecise, and still making an image you would hang on your wall, watch on Netflix or further manipulate before you post on Instagram. It’s all just math, right?
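For a taste of that math, here's a small sketch using the standard sRGB-to-XYZ conversion (D65 white point): undo the display gamma, apply the matrix, and normalize to land on a point (x, y) of the chromaticity diagram above.

```python
import numpy as np

# Standard linear sRGB -> CIE XYZ matrix (D65 white point)
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def srgb_to_xy(r, g, b):
    """Map an 8-bit sRGB color to CIE 1931 (x, y) chromaticity."""
    srgb = np.array([r, g, b]) / 255.0
    # Undo the sRGB gamma curve to recover linear light
    linear = np.where(srgb <= 0.04045,
                      srgb / 12.92,
                      ((srgb + 0.055) / 1.055) ** 2.4)
    X, Y, Z = M @ linear
    total = X + Y + Z
    return X / total, Y / total  # position on the chromaticity diagram

print(srgb_to_xy(255, 0, 0))  # sRGB red primary: roughly (0.64, 0.33)
```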

Microscopes with digital cameras are an obvious application. But what else can we do inside the imaging world in life science?

Remote Sensing

How about remote sensing of environmental features where by looking at specific parts of the spectrum through various filters, you can make some determinations as to what is going on? Think about things like assessing the growth or health of crops, or the biomass of plants in the rainforest. Pour yourself a glass of wine, look up at the night sky and appreciate the fact that the satellite you see whizzing by may have helped determine the maturation of grapes in your glass.

The Normalized Difference Vegetation Index (NDVI) is a measure of vegetation based on a comparison between the red and near-infrared light reflected from plants: NDVI = (NIR - R) / (NIR + R). Math again. I asked if that was a form of machine vision.
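The formula is simple enough to compute per pixel. A minimal NumPy version (with made-up reflectance values, purely for illustration) might look like this:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R), computed per pixel.

    Values near +1 suggest dense, healthy vegetation; bare soil and
    water fall near zero or below.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Avoid division by zero where both bands are dark
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Example with two tiny fake reflectance bands
nir = np.array([[0.6, 0.5], [0.1, 0.4]])
red = np.array([[0.1, 0.2], [0.1, 0.3]])
print(ndvi(nir, red))
```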


Machine Vision

It turns out machine vision is a little different from remote sensing.

While remote sensing is mostly at the mercy of ambient light (we talked about LIDAR as an exception), machine vision is more like studio photography: highly dependent on consistent light sources. It's used primarily for repetitive analyses like looking for defects or sorting pistachios.

In life science, machine vision may currently be limited to routine applications: ensuring that sample tubes have adequate sample, caps are on, the right tubes are in place, and so on, for high-volume automated analyses.
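For a flavor of how simple such a check can be, here's a rough OpenCV sketch. The regions of interest and brightness thresholds are invented for illustration, and it assumes a fixed camera with consistent backlighting, exactly the studio-photography conditions described above.

```python
import cv2

def tube_has_cap_and_sample(image_path: str,
                            cap_roi=(0, 40, 0, 100),       # rows 0-40: cap region
                            liquid_roi=(120, 200, 0, 100)  # rows 120-200: sample region
                            ) -> tuple[bool, bool]:
    """Very rough pass/fail checks on a single backlit tube image.

    With a fixed camera, the hand-picked ROIs always frame the cap
    and the liquid column, so mean brightness is enough to decide.
    """
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    r0, r1, c0, c1 = cap_roi
    cap_present = gray[r0:r1, c0:c1].mean() < 100   # a cap blocks the backlight
    r0, r1, c0, c1 = liquid_roi
    has_sample = gray[r0:r1, c0:c1].mean() < 180    # liquid darkens the column
    return cap_present, has_sample
```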

What Jeff is interested in is following a developmental or disease model, e.g. an embryology experiment, where one might be watching for a long time without interacting with the subject.

Now, perhaps someone out there is hearing this and thinking, "Jeff, you idiot. We do this all the time." And maybe someone does. But it's not widespread like it is in industrial automation.

If that someone is you, leave a comment. I’d love to chat with you. As would Jeff, I’m sure.

Two separate paths

One interesting thing Jeff has observed is that the machine vision space and the life science space seem to be on parallel tracks, largely invisible to each other.

For example, both sectors have companies developing similar light sources (LED light engines, he calls them) that can be controlled by software to get the lighting you want. But he's not aware of any interaction between companies in the two sectors.

LEDs have been for a long time now the dominant type of light source used for lighting in machine vision applications. They don't get hot, and they use little energy. They can pump out a lot of light in a very narrow wavelength range. You can overdrive them so that they cycle really fast and can do very fast imaging.
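As a back-of-the-envelope illustration of the overdrive idea: the shorter the flash relative to the frame time, the harder you can pulse the LED. The real limits come from the LED's datasheet; the inverse-duty-cycle rule of thumb and the 10x cap below are just assumptions for the sketch.

```python
def max_strobe_current(rated_current_a: float,
                       duty_cycle: float,
                       overdrive_limit: float = 10.0) -> float:
    """Estimate a safe pulsed ('overdrive') LED current.

    Rule of thumb: at low duty cycles an LED can be pulsed well above
    its continuous rating because the average heat stays low. The exact
    ceiling is datasheet-specific; the 10x cap here is illustrative.
    """
    if not 0 < duty_cycle <= 1:
        raise ValueError("duty_cycle must be in (0, 1]")
    # Scale inversely with duty cycle, capped at the pulse limit
    return rated_current_a * min(1 / duty_cycle, overdrive_limit)

# A 1 ms flash every 20 ms frame -> 5% duty cycle
print(max_strobe_current(rated_current_a=1.0, duty_cycle=0.05))  # 10.0 A (capped)
```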

The same seems to be true in artificial intelligence (we had to get there, right?).

For example, Landing AI and PathAI apply AI to machine vision and pathology, respectively. Jeff is curious whether each segment (machine vision and life science) has something to learn from the other.

Because usually things mix and sort of come out in the wash in the end. So one resolution could be that they borrow from each other's AI approaches. In machine vision they might be using a way of approaching AI that hadn't been thought of in life science, and vice versa. That could be the way they merge. Like in music, you borrow from each other.

We’ll be listening to see what happens.
