Cancer is responsible for one in three deaths in Canada, according to the Canadian Cancer Society. For patients, early detection can mean the difference between life and death. An AI-equipped microscope is being touted as a powerful new instrument in the diagnostic toolkit: one that snaps an astounding 36 million images per second to catch cancer cells and identify their characteristics.
The microscope was designed by a team at UCLA's California NanoSystems Institute, who say it can identify cancer cells in patients' blood samples faster and more accurately than current methods. In a new study published in the journal Scientific Reports, they describe how, using a patented microscope outfitted with a camera, they're able to photograph cells without destroying them. It can spot 16 different physical characteristics of cancer cells, like size, granularity, and biomass, with 95 percent accuracy.
The new microscope increases the clarity of its images while slowing the optical signals enough for them to be detected and digitized, at a rate of 36 million images per second.
Recognizing the value of this research requires an understanding of how doctors currently test for cancer. In one standard approach, doctors add biochemicals to blood samples, which fix biological "labels" to the cancer cells. These, in turn, highlight the cells so that instruments can detect and identify them. But the biochemicals can harm the cells, which makes them much harder for scientists to analyze afterward.
This new technology harnesses something called photonic time stretch. According to Ata Mahjoubfar, a UCLA postdoctoral fellow and an author of the new paper, this means that photons with different wavelengths carry different pieces of information about the cancer cells (for example, different image pixels). After traveling through a very long, specialized optical fiber, they arrive at the sensor separately, during the intervals between optical pulses, essentially slowing down the information carried by the signal. That lets the UCLA researchers detect important data in cells that they might have missed using other techniques.
Mahjoubfar explained it with an analogy. "Imagine a marathon where we can give different pieces of a detailed message to different people at the start line," he told me. "When they arrive at the end line, we get each person's portion and put them together to form the original message. But, since they are not coming to the end line at the same time, we will have much more time to scribble down the whole detailed message."
As for how the microscope slows down image capture, the researchers use subnanosecond laser pulses (flashes of light shorter than one billionth of a second) to catch a very fast snapshot of the cell, then run it through a photonic time stretch system. Mahjoubfar said the system "adds different delays to pixels within a pulse and feed[s] them one-by-one, serially, into the photodetector." The result is the ability to read out one pixel at a time, slowing down the stream of information.
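The wavelength-to-time mapping at the heart of this scheme can be sketched in a few lines. This is an illustrative toy, not the UCLA implementation: the pixel count, wavelength band, and dispersion figure below are assumptions chosen only to show how dispersion turns simultaneous spectral pixels into a serial stream.

```python
import numpy as np

# Toy model of photonic time stretch: each wavelength in a broadband pulse
# carries one image pixel, and chromatic dispersion in a long fiber delays
# each wavelength by a different amount, so the pixels arrive one by one.

# Assumed, illustrative parameters (not from the paper).
n_pixels = 8                                        # pixels across the spectrum
wavelengths_nm = np.linspace(1550, 1558, n_pixels)  # spectral band of the pulse
dispersion_ps_per_nm = 600.0                        # total fiber dispersion

# Delay grows with wavelength offset, like marathon runners who start
# together but reach the finish line at different times.
delays_ps = (wavelengths_nm - wavelengths_nm[0]) * dispersion_ps_per_nm

for wl, d in zip(wavelengths_nm, delays_ps):
    print(f"pixel at {wl:.2f} nm arrives {d:7.1f} ps after the first pixel")
```

The key property is that the delays are strictly increasing, so a single fast photodetector can read the pixels serially during the gap between laser pulses.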
The microscope can also capture cell data using low-level lighting via laser bursts, so as not to damage the cells. Riffing off Mahjoubfar's example, UCLA doctoral student and study author Claire Lifan Chen told me that photonic time stretch with low lighting "is similar to feeding the marathon runners along the way. You do not need to put all of the energy in the laser bursts at the beginning." She said you can first let the photons interact with the cells and collect their information, and then perform an optical amplification.
She noted, "In our system, the average power was a few milliwatts in the field of view, which is very safe for the cells. But, after passing through the cells, we amplify the signal by about 30 times in the photonic time stretch."
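The arithmetic behind the low-light-then-amplify scheme is simple but worth making explicit. The specific power figure below is a hypothetical stand-in for the paper's "a few milliwatts"; only the ~30x gain is quoted from Chen.

```python
# Hypothetical numbers illustrating the scheme described above: the cell
# only ever sees gentle illumination, and the signal is boosted AFTER it
# has passed through the cell, on its way to the photodetector.
power_at_cell_mW = 3.0      # assumed "few milliwatts" in the field of view
optical_gain = 30.0         # ~30x amplification in the photonic time stretch
power_at_detector_mW = power_at_cell_mW * optical_gain

print(f"cell sees {power_at_cell_mW} mW; detector sees {power_at_detector_mW} mW")
```

The design point is that the gain sits downstream of the sample, so detector sensitivity improves without the cell ever being exposed to the amplified power.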
Avoiding cell damage is critical, said Mahjoubfar. The cells can be damaged by high-power illumination, which essentially cooks them, or be altered by biomarkers, he said. The labeling process is usually done before imaging or identification. "By the time cells are being imaged or analyzed, their natural form might be gone due to the change induced by added biomarkers. This may affect the downstream analysis, like doing more studies on the disease-causing cells, and possibly mislead the diagnosis decision-making."
The microscope harnesses a form of AI called deep learning, which models high-level patterns and abstractions from multidimensional data. Chen said while "deep learning has been extensively used for image recognition and speech processing, this is the first time it is applied to label-free classification of cells."
Mahjoubfar added: "We have shown that our deep learning approach makes it possible to efficiently combine the information from all biophysical characteristics, and achieve a label-free cell classification technique that is more accurate and more robust."
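To make the classification step concrete, here is a minimal sketch of a small neural network trained on 16-dimensional feature vectors, standing in for the biophysical characteristics (size, granularity, biomass, and so on) the paper describes. Everything here is illustrative: the data is synthetic, and the architecture and training loop are a generic toy, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 16                       # 400 synthetic "cells", 16 features each

# Synthetic stand-in data: cancer-like cells are shifted slightly from
# normal-like cells in every feature dimension.
y = rng.integers(0, 2, n)            # 1 = cancer-like, 0 = normal-like
X = rng.normal(0, 1, (n, d)) + 0.8 * y[:, None]

# Tiny two-layer network: 16 -> 8 (tanh) -> 1 (sigmoid).
W1 = rng.normal(0, 0.1, (d, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                       # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # predicted probability
    return h, p.ravel()

lr = 0.1
for _ in range(1000):                # full-batch gradient descent
    h, p = forward(X)
    g = (p - y)[:, None] / n         # gradient of mean cross-entropy at output
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(0)
    gh = (g @ W2.T) * (1 - h**2)     # backpropagate through tanh
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(0)

_, p = forward(X)
accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy on synthetic cells: {accuracy:.2f}")
```

The point of combining all 16 features in one learned model, rather than thresholding each feature separately, is exactly the robustness Mahjoubfar describes: the network weighs weak, correlated signals jointly.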
As the study release states, such a system could enable data-driven diagnoses based on cells' physical characteristics. That could mean quicker and earlier cancer diagnoses, as well as a better understanding of tumor-specific gene expression in cells, which could inspire new treatments for the disease.
The cost of this technology isn't prohibitive. Mahjoubfar said the microscope's price is in line with other high-tech scopes of this nature. He believes the technology could be a fixture in hospitals within a decade.
The study was performed on cancer cell lines. The next step will be to try it out on patients.