Life Science Solutions
Application Note

Perform Accurate and Efficient Microscopy Image Analysis Using TruAI based on Deep Learning


Introduction

Many experiments rely on quantitative data extracted from microscope images. Accurate image analysis depends on segmentation, the step that separates the target regions from the rest of the image. A common segmentation method is to apply thresholds to the image's intensity values or color.
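As a concrete illustration (a generic sketch, not part of the cellSens workflow), intensity thresholding can be expressed in a few lines of Python with NumPy; the image and threshold value here are synthetic:

```python
import numpy as np

def threshold_segment(image, thresh):
    """Label every pixel brighter than a fixed intensity threshold as foreground."""
    return image > thresh

# Synthetic 8-bit grayscale image: dim background with two bright "nuclei"
img = np.full((64, 64), 20, dtype=np.uint8)
img[10:20, 10:20] = 200   # 10 x 10 bright region
img[40:52, 35:50] = 180   # 12 x 15 bright region

mask = threshold_segment(img, thresh=100)
print(int(mask.sum()))  # 280 foreground pixels (100 + 180)
```

In practice the threshold is often chosen automatically (for example with Otsu's method) rather than hard-coded, but the principle is the same: every pixel is classified purely by its brightness.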

While effective, this approach often requires staining, which can be time-consuming and can affect the condition of the sample. Next-generation image analysis methods, such as our cellSens imaging software with deep-learning-based TruAI, reduce the risk of sample damage while achieving high efficiency and accuracy.

Application Examples Using TruAI

1)  Label-Free Nucleus Detection and Segmentation

To count cells, localize nuclei within cells and tissues, and evaluate cell area, researchers commonly stain the nuclei with a fluorescent dye and segment them based on fluorescence intensity.

In contrast, TruAI can perform nucleus segmentation from brightfield images alone. A neural network is trained on pairs of brightfield images and nucleus segmentations derived from the corresponding fluorescence images.

This self-learning microscopy approach eliminates the need to fluorescently stain the nucleus once the neural network is created. Other benefits include:

  • Minimize time spent on nuclear labeling
  • Exclude effects on cells due to labeling
  • Prevent phototoxicity and fading
  • Acquire extra sample information by adding another channel 
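The training data described above can be illustrated schematically. In this hypothetical sketch (the function name and threshold are illustrative, not the cellSens API), the ground-truth mask is derived by thresholding the fluorescence channel and paired with the matching brightfield image as the network input:

```python
import numpy as np

def make_training_pair(brightfield, fluorescence, thresh):
    """Pair a brightfield image (the network input) with a nucleus mask
    derived from the matching fluorescence channel (the training target)."""
    target = (fluorescence > thresh).astype(np.float32)
    return brightfield.astype(np.float32), target

# Synthetic example: low-contrast brightfield image plus one stained nucleus
rng = np.random.default_rng(0)
bf = rng.integers(80, 120, size=(32, 32))
fluo = np.zeros((32, 32))
fluo[8:16, 8:16] = 500.0  # 8 x 8 fluorescent nucleus

x, y = make_training_pair(bf, fluo, thresh=100.0)
print(int(y.sum()))  # 64 pixels in the target mask
```

Once the network has learned the mapping from brightfield input to nucleus mask, the fluorescence channel is no longer needed at inference time, which is what makes the workflow label free.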

Label-free nucleus detection by TruAI

Figure 1: While the brightfield image (left) shows minimal contrast because the cells are unstained, TruAI detects the nuclei with high accuracy (right).
Figure 2: Compared to the fluorescence image (left), Olympus’ TruAI clearly distinguishes adjacent nuclei from one another (right), demonstrating that high-accuracy detection is possible.


2) Quantitative Analysis of Fluorescently Labeled Cells with Ultra-Low Light Exposure

Fluorescent labels are an invaluable tool in modern microscopy-based cell studies. However, high exposure to excitation light can lead to photodamage or phototoxicity and have an observable impact on cell viability. Even when no direct effect is visible, strong light exposure can influence the cells’ natural behavior, leading to undesired effects.

In long-term live cell experiments, minimal light exposure during fluorescence observation is ideal. From a technical standpoint, ultra-low light exposure means analyzing images with very low signal levels and, consequently, low signal-to-noise ratios. Our TruAI enables you to analyze low-signal images with both robustness and precision.
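The link between light exposure and signal-to-noise ratio can be illustrated with a simple shot-noise simulation (a generic physical model, not a description of TruAI's internals): for Poisson-distributed photon counts, the SNR scales with the square root of the photon count, so cutting the excitation light sharply degrades the image.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_exposure(true_signal, photons_per_unit):
    """Model a fluorescence image as Poisson photon counts; fewer photons
    per unit of true signal corresponds to weaker excitation light."""
    return rng.poisson(true_signal * photons_per_unit)

def snr(img):
    """Mean over standard deviation of a uniform region."""
    return img.mean() / img.std()

signal = np.full(10_000, 4.0)          # uniform true fluorescence
high = simulate_exposure(signal, 100)  # strong excitation: ~400 photons/pixel
low = simulate_exposure(signal, 1)     # ultra-low light: ~4 photons/pixel

# SNR of Poisson counts with mean lambda is sqrt(lambda): ~20 vs. ~2 here
print(round(snr(high), 1), round(snr(low), 1))
```

A hundredfold reduction in excitation light therefore costs roughly a tenfold drop in SNR, which is why conventional threshold-based detection breaks down at ultra-low exposure, as the figures below show.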

Figure 3: Nuclei detected (right) from a fluorescence image with sufficient luminance (left) using the conventional method of applying a luminance threshold.
Figure 4: Nuclei detected (right) with the same conventional method from a fluorescence image with an extremely poor SNR due to weak excitation light (left). The detection accuracy is visibly low.
Figure 5: Nuclei detected (right) using TruAI from a fluorescence image with the same extremely poor SNR due to weak excitation light (left). The accuracy matches Figure 3 and is much higher than in Figure 4.


3) Segmentation Based on Morphological Features

Segmenting an image based on its morphological features is very difficult with the conventional approach of applying thresholds to intensity values and color. As a result, researchers had to count and measure their targets manually each time.

In contrast, TruAI enables highly efficient and accurate segmentation based on morphological features. After the neural network learns the segmentation results from hand-labeled images, it can apply the same methodology to additional data sets. For instance, neural networks trained from hand-labeled images can count mitotic cells, as depicted in the images below.

Figure 6: Prediction of mitotic cells using TruAI (green).
Figure 7: A high-magnification view (left) of the framed area in Figure 6; although many cells are visible, only the dividing cells are detected (right).

4) Tissue Specimen Segmentation

TruAI can also be used to segment tissue specimens. For example, kidney glomeruli are difficult to discriminate using conventional methods but can be segmented using TruAI.

Figure 8: Prediction of glomeruli positions on a mouse kidney section using TruAI (blue).
Figure 9: A high-magnification view (left) of the framed area in Figure 8; TruAI captures the features of the glomeruli and detects them (right).

Conclusion

Conventional segmentation methods can be difficult and cause damage to samples. Our cellSens imaging software with deep learning enables accurate and efficient segmentation in conditions that cause minimal damage to cells, such as label-free imaging or ultra-low light exposure. The software also makes it easier to perform segmentation of tissue specimens based on their morphological features. 

Products used for this application

Laser Scanning Confocal Microscope

FV3000

  • Available with galvanometer-only scanning (FV3000) or hybrid galvanometer/resonant scanning (FV3000RS)
  • New, highly efficient and accurate TruSpectral detection on all channels
  • Optimized for live cell imaging with high sensitivity and low phototoxicity

Precise Live Cell Imaging

IXplore Live

  • Use the Olympus real-time controller to obtain physiologically relevant data with minimal cell disturbance
  • Maintain cell viability during imaging with a range of environmental control options
  • Keep time-lapse experiments in accurate, reliable focus with the Olympus hardware autofocus (Z-drift compensation) system
  • Discover the real shape of your cells with Olympus silicone oil immersion optics

Motorized Fluorescence Microscope

BX63

  • Fully motorized system enables automation of complex multidimensional experiments
  • Highly precise motorized Z-drive
  • Fixed-stage design delivers improved stability

Life Science Solutions

cellSens

  • Modular imaging software platform
  • Intuitive, application-driven user interface
  • Broad feature set, from simple snapshots to advanced multidimensional real-time experiments
