Deep learning microscope targets malaria
Image: Machine learning microscope from Duke University.
US-based researchers have developed artificial intelligence that enables a microscope to adapt its lighting angles, colours and patterns, learning the best settings for a given diagnostic task.
As Professor Roarke Horstmeyer from Biomedical Engineering at Duke University points out in Biomedical Optics Express, the recent development of deep learning algorithms for automated image analysis has created a clear need to re-design microscope hardware for specific interpretation tasks.
With this in mind, he and colleagues devised a method to co-optimise how a sample is illuminated in a microscope, along with a pipeline to automatically classify the resulting image, using a deep neural network.
In the initial proof-of-concept study, the microscope simultaneously developed a lighting pattern and classification system that allowed it to quickly identify red blood cells infected by the malaria parasite more accurately than trained physicians and other machine learning approaches.
“A standard microscope illuminates a sample with the same amount of light coming from all directions, and that lighting has been optimised for human eyes over hundreds of years,” says Horstmeyer.
“But... we have redesigned the hardware to provide a diverse range of lighting options, and we’ve allowed the microscope to optimise the illumination for itself,” he adds.
Duke University microscope uses LEDs of various colours and lighting schemes determined by machine learning. [Duke University]
Rather than diffusing white light from below to evenly illuminate the slide, the engineers developed a programmable LED array: a bowl-shaped light source with LEDs embedded throughout its surface.
This allows samples to be illuminated from different angles up to nearly 90 degrees with different colours, which essentially casts shadows and highlights different features of the sample depending on the pattern of LEDs used.
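Under incoherent illumination, the image captured with several LEDs switched on is, to a good approximation, a weighted sum of the images formed under each LED individually, so a lighting pattern can be modelled as a weight vector over the LEDs. The sketch below illustrates that linearity with random stand-in data; the array size, image dimensions and weights are invented for illustration and are not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of single-LED images: one 32x32 greyscale capture
# per LED in a toy 25-LED array (stand-ins for real measurements).
n_leds, h, w = 25, 32, 32
single_led_images = rng.random((n_leds, h, w))

# A lighting "pattern" is a non-negative brightness per LED.
pattern = rng.random(n_leds)

# By linearity of incoherent imaging, the multiplexed capture is the
# weighted sum of the per-LED images.
multiplexed = np.tensordot(pattern, single_led_images, axes=1)

print(multiplexed.shape)  # (32, 32)
```

Because the captured image depends linearly on the pattern, the pattern itself can be treated as a set of trainable parameters and optimised by gradient descent alongside the classifier.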
The researchers then fed the microscope hundreds of samples of malaria-infected red blood cells prepared as thin smears, in which the cell bodies remain whole and are ideally spread out in a single layer on a microscope slide.
Using a convolutional neural network, the microscope learned which features of the sample were most important for diagnosing malaria and how best to highlight those features.
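The paper describes a convolutional network; the co-optimisation idea can be sketched more simply by making the LED weights the first learnable layer and training them jointly with a classifier by gradient descent. The toy model below uses logistic regression on synthetic per-LED image stacks in which "infected" samples carry extra signal under a few LEDs (all data, sizes and hyperparameters here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, d = 200, 9, 16  # samples, LEDs, flattened pixels (toy sizes)

# Synthetic per-LED image stacks: infected samples carry extra signal
# under LEDs 0-2, mimicking angle-dependent contrast.
stacks = rng.normal(0.0, 0.3, (n, k, d))
labels = rng.integers(0, 2, n).astype(float)
stacks[labels == 1, :3, :4] += 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Jointly learned parameters: LED pattern w, linear classifier (v, b).
w = np.full(k, 1.0 / k)
v = np.zeros(d)
b = 0.0
lr = 0.1

for _ in range(500):
    m = np.einsum("k,nkd->nd", w, stacks)  # multiplexed images
    p = sigmoid(m @ v + b)
    err = (p - labels) / n                 # d(cross-entropy)/d(logit)
    grad_v = m.T @ err
    grad_b = err.sum()
    grad_w = err @ (stacks @ v)            # chain rule through m
    v -= lr * grad_v
    b -= lr * grad_b
    w -= lr * grad_w

m = np.einsum("k,nkd->nd", w, stacks)
acc = ((sigmoid(m @ v + b) > 0.5) == labels.astype(bool)).mean()
print(acc)
```

In this toy run, the learned pattern concentrates weight on the informative LEDs: the illumination is shaped by the same loss that trains the classifier, which is the essence of the co-design described above.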
The algorithm eventually selected a ring-shaped LED pattern of different colours coming from relatively high angles.
According to Horstmeyer, while the resulting images are noisier than a regular microscope image, they highlight the malaria parasite in a bright spot and are correctly classified about 90 percent of the time.
The new microscope taught itself the best way to light up red blood cells to spot malaria parasites within. Compared to a traditional microscope (top), the red blood cell images created by the new microscope (bottom) contain more noise, but the malaria parasites are lit up by bright patches due to the lighting conditions. [Duke University]
Trained physicians and other machine learning algorithms typically perform with about 75 percent accuracy.
“The patterns it’s picking out are ring-like with different colours that are non-uniform and are not necessarily obvious,” says Horstmeyer. “Even though the images are dimmer and noisier than those that a clinician would create, the algorithm is saying it’ll live with the noise, it just really wants to get the parasite highlighted to help it make a diagnosis.”
The researchers also showed that the microscope works well with thick blood smear preparations, in which the red blood cells form a highly non-uniform background and may be broken apart.
For this preparation, the machine learning algorithm was successful 99 percent of the time.
Following on from these results, Duke engineering graduate students have formed a startup company, SafineAI, to miniaturise the reconfigurable LED microscope concept, which has already earned a $120,000 prize at a local pitch competition.
Meanwhile, Horstmeyer is working with a different machine learning algorithm to create a version of the microscope that can adjust its LED pattern to any specific slide it’s trying to read.
“We’re basically trying to impart some brains into the image acquisition process,” he says. “We want the microscope to use all of its degrees of freedom. So instead of just dumbly taking images, it can play around with the focus and illumination to try to get a better idea of what’s on the slide, just like a human would.”
The research is published in Biomedical Optics Express.