Introduction

To make use of the detection cells produced by the Wet Lab, it is necessary to develop hardware that can accurately quantify specific fluorescence. To this end, we developed software that quantifies this fluorescence from image data captured by a webcam.

How the developed software works

To use the infection detection cells that the Wet Lab is creating, we need to implement a system that can quantify each of the two colors, red and green. However, since the mechanism for quantifying red and green is essentially the same, we implemented software that quantifies the green color.

The principle of operation of the developed software is shown in Figure 1.

The processing in Figure 1 was implemented entirely in Python, because OpenCV is a powerful image processing library and NumPy makes it easy to write matrix operations [1][2]. In particular, we used OpenCV's inRange function to create the mask and NumPy to count the non-black elements of the mask.
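
For illustration, this step can be sketched as follows; the file name and the Saturation/Value limits are placeholder assumptions, not the calibrated values that were actually used.

    import cv2
    import numpy as np

    # Load a captured frame and convert it to HSV (the conversion is explained below).
    frame = cv2.imread("sample.png")  # placeholder path
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    # Hue/Saturation/Value bounds for the target green; these numbers are
    # illustrative assumptions, not the calibrated values.
    lower = np.array([40, 50, 50])
    upper = np.array([80, 255, 255])

    # cv2.inRange produces a binary mask: 255 where a pixel lies inside the bounds, 0 elsewhere.
    mask = cv2.inRange(hsv, lower, upper)

    # The amount of green fluorescence is estimated by counting the non-black
    # elements of the mask with NumPy.
    green_pixel_count = np.count_nonzero(mask)
    print(green_pixel_count)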

The color space of the input image is converted first because it is easier to specify a range of colors in the HSV color space (Figure 2) than in the RGB color space (Figure 3). In the HSV color space, a color is specified by its Hue, Saturation, and Value; here, only the Hue value was varied to specify the range.
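
Since only the Hue value is varied, the bounds passed to inRange can be built with a small helper like the one below; the fixed Saturation and Value limits are assumptions for illustration (OpenCV stores Hue in the range 0-179 for 8-bit images).

    import numpy as np

    def hue_bounds(hue_low, hue_high, sat=(50, 255), val=(50, 255)):
        # Build lower/upper bounds for cv2.inRange where only the Hue range
        # is adjusted; Saturation and Value stay at fixed (assumed) defaults.
        lower = np.array([hue_low, sat[0], val[0]])
        upper = np.array([hue_high, sat[1], val[1]])
        return lower, upper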

Development Flow

Quantification of GH-7

First, to confirm the validity of this approach, the fluorescence of GH-7 was quantified from image data taken by microscopy. The images used for the quantification are images of GH-7 and of the negative control at exposures of 200 ms and 400 ms. The negative control at 200 ms exposure is shown in Figure 4 and fluorescing GH-7 in Figure 5; the negative control at 400 ms exposure is shown in Figure 6 and fluorescing GH-7 in Figure 7.

The clarity and high contrast of these images allowed for very good quantification. The generated masks and the quantification results are shown in Figure 8 and Figure 9, respectively.

Since the quantification of GH-7 was successful, we decided to process the information from the webcam using the same method.

Images from the webcam

In the early stages of software development, we compared the negative control image with the image with the highest density, assigned various values to Hue, and adjusted the range manually. However, this was not efficient, the result was expected to differ depending on who made the settings, and the values would change with the monitor environment. We therefore made it possible to determine the Hue value automatically from the negative control and the image with the highest density.
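
The exact selection rule is not spelled out here, but one way to realize this kind of automatic calibration is to compare the Hue histograms of the two reference images and keep the Hue values that are clearly more frequent in the highest-density image than in the negative control; the sketch below (function name, margin, and file handling are assumptions) illustrates the idea.

    import cv2
    import numpy as np

    def auto_hue_range(negative_path, positive_path, margin=2.0):
        # Load both reference images and convert them to HSV.
        neg = cv2.cvtColor(cv2.imread(negative_path), cv2.COLOR_BGR2HSV)
        pos = cv2.cvtColor(cv2.imread(positive_path), cv2.COLOR_BGR2HSV)

        # Hue histograms (OpenCV stores Hue as 0-179 for 8-bit images).
        neg_hist = cv2.calcHist([neg], [0], None, [180], [0, 180]).ravel()
        pos_hist = cv2.calcHist([pos], [0], None, [180], [0, 180]).ravel()

        # Keep Hue bins where the high-density image clearly dominates the control.
        candidate = np.where(pos_hist > margin * (neg_hist + 1))[0]
        if candidate.size == 0:
            raise ValueError("no Hue values separate the two reference images")
        return int(candidate.min()), int(candidate.max())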

The implementation of this method produced results (Figure 10) similar to those expected from the experiment using the plate reader (Figure 11).

This indicates that, at a minimum, the software and hardware are capable of quantifying fluorescence.

Future Outlook

In the present software, sample positioning was done manually. In actual use, however, this would affect the reliability of the results. One way to solve this problem is to write a unique pattern in the corner of the well and to crop the image automatically based on that pattern before starting the quantification.
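
One hypothetical way to implement this idea, assuming a reference image of the corner pattern is available, is template matching with OpenCV; the marker file and the well dimensions in pixels below are illustrative assumptions.

    import cv2

    def crop_by_marker(image_path, marker_path, well_width, well_height):
        # Load the captured frame and the known corner-marker pattern.
        image = cv2.imread(image_path)
        marker = cv2.imread(marker_path)

        # Slide the marker template over the frame and take the best-matching position.
        result = cv2.matchTemplate(image, marker, cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(result)

        # Crop a fixed-size well region anchored at the detected marker,
        # to be passed on to the quantification step.
        return image[y:y + well_height, x:x + well_width]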

References