Live-cell assays are used to study dynamic functional cellular processes in High-Content Screening for drug discovery. The large amount of image data created during screening requires automatic image-analysis procedures that can describe these dynamic processes. One class of tasks in this application is the tracking of cells and the description of events and changes in cell characteristics, so that the desired information can be extracted by data-mining and knowledge-discovery methods. In this paper we propose a similarity-based approach for detecting the motion of entire cells. Results are given for a test series from a real drug-discovery process.
The use of dynamic High-Content Analysis approaches during preclinical drug research permits a more specific and detailed insight into complex sub-cellular processes through the use of living cell-culture systems. This will effectively support future drug discovery, leading to highly specific and effective drugs with improved patient compliance. Live-cell assays are therefore used to study dynamic functional cellular processes in High-Content Screening. The large amount of image data created during screening requires image-analysis procedures that can automatically describe these dynamic processes. One class of tasks in this application is the tracking of cells and the description of events and of changes in cell characteristics, so that the desired information can be extracted by data-mining and knowledge-discovery methods.
Several tracking approaches are known that detect a cell in each single frame and associate the detected cells across frames by optimizing a probabilistic function. Other approaches track cells by model evolution. This approach seems able to handle touching and overlapping cells well, but is computationally expensive. Most approaches need some time to adjust their parameters to the specific images by calculating probabilities or other parameters from a collection of images.
Imaging & Microscopy, Issue 4, 2012
This is not preferable in a real-time process: it would leave a series of images that cannot be automatically inspected, because they are needed for the parameter calculation.
We propose a similarity-based approach for motion detection of cells. The software tracks cells in a sequence of images taken at intervals of thirty minutes; the whole experiment runs over two and a half days. The cells show no steady movement: they may suddenly jump from one direction to the opposite one, and they may rotate about their own axis, which changes their morphological appearance. They may also appear and disappear during the experiment, as well as re-appear after having gone out of the focus of the microscope. It is also highly likely that cells touch or overlap. The tracking algorithm should run fast enough to produce the result in little computation time.
The following conditions were set for the algorithm:
- Only cells that are in focus are tracked; background objects are not considered.
- Fragmented cells at the image borders are eliminated.
- Each detected cell gets an identity label and is tracked.
- Disappearing cells are tracked until they disappear.
- Newly appearing cells are tracked from their appearance on and get a new identity label.
- Cells overlapping or touching each other are eliminated; they are considered as disappearing cells.
- Dividing cells are tracked after splitting, and both daughter cells get an identity label that refers to the mother cell.
- Cell fragments are eliminated by a heuristic based on 2 × sigma of the cell size.
Note that we decided to exclude overlapping cells because of the resulting higher computation time. If this kind of cell is to be considered, our matching algorithm can be used to identify the portion of a touching cell that belongs to the cell under consideration.
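The fragment-elimination heuristic might be sketched as follows. The exact form of the "2 × sigma" rule is not spelled out in the text, so this sketch reads it as discarding objects whose area falls more than two standard deviations below a reference mean cell area; the function name and parameters are our own:

```python
import numpy as np

def remove_fragments(areas, mean_area, sigma, k=2.0):
    """Keep objects whose area is no more than k standard deviations
    below the reference mean cell area; smaller objects are treated as
    fragments. This is our reading of the '2 x sigma' heuristic."""
    areas = np.asarray(areas, dtype=float)
    return areas >= mean_area - k * sigma

# A 30-pixel fragment among ~400-pixel cells is rejected
# (mean 400, sigma 40 -> cutoff 320):
keep = remove_fragments([410, 30, 395], mean_area=400.0, sigma=40.0)
```

The reference mean and sigma would be estimated once from a population of intact cells, so no per-experiment tuning phase is needed.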
The Object-tracking Algorithm
The unsteady movement of the cells and their rotation about their own axis require special processing algorithms that can handle this. A successive estimation of the movement based on mean-shift or Kalman filters would only make sense if the cells could be expected to move steadily. Since a cell can jump from one side to another, we use a search window around each cell to check where the cell might be located. The size of the search window is estimated from the maximal distance a cell can move between two time frames. The algorithm searches inside the window for similar cells. Since we cannot expect a cell to have the same shape after turning around its own axis, and since the texture inside the cell may also change, we chose a similarity-based procedure for detecting the cell inside the window. The overall procedure is shown in figure 2.
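A minimal sketch of the search-window construction, assuming the maximum per-frame displacement is known as a parameter (`max_disp` and both helper names are ours); the direction estimate here simply averages the displacements of the last three positions, as a simple stand-in for the mean-shift step described below:

```python
import numpy as np

def search_window(center, max_disp, shape):
    """Clip a square window of half-size max_disp around center (row, col)
    to the image bounds; the cell is searched only inside this window."""
    r, c = center
    return (max(0, r - max_disp), min(shape[0], r + max_disp + 1),
            max(0, c - max_disp), min(shape[1], c + max_disp + 1))

def movement_direction(last_positions):
    """Rough direction estimate: mean displacement over the last
    tracked positions (a stand-in for the mean-shift filter)."""
    p = np.asarray(last_positions, dtype=float)
    return (p[1:] - p[:-1]).mean(axis=0)

win = search_window((10, 10), max_disp=15, shape=(516, 674))  # clipped at border
direction = movement_direction([(0, 0), (2, 1), (4, 2)])      # steady drift (2, 1)
```

The window is clipped at the image borders, so cells near the edge simply get a smaller search area.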
The image is thresholded by Otsu's well-known segmentation procedure. Afterwards a morphological opening with a 3 x 3 window is applied to close the contour and the inner holes. Fragmented cells at the image borders, as well as small remaining objects of about ten pixels in size, are deleted. Cells at the image borders are only tracked once they fully appear inside the image. The convex hull is drawn around each object, and remaining holes or open areas inside the cell area are closed by a flood-fill operation. The resulting images after these operations are shown in figure 3. This area is taken as the cell area, and the area with its grey levels is temporarily stored as a template in the database.
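These segmentation steps can be sketched roughly with NumPy and SciPy. This is our own minimal implementation, not the authors' code: Otsu's threshold is written out explicitly, the convex-hull step is omitted for brevity, and the size and border rules follow our reading of the text:

```python
import numpy as np
from scipy import ndimage as ndi

def otsu_threshold(img):
    """Otsu's threshold on an 8-bit image: maximize between-class variance."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    bins = np.arange(256)
    w0, m = np.cumsum(p), np.cumsum(p * bins)
    w1, mt = 1.0 - w0, m[-1]
    valid = (w0 > 0) & (w1 > 0)
    var = np.zeros(256)
    var[valid] = (mt * w0[valid] - m[valid]) ** 2 / (w0[valid] * w1[valid])
    return int(np.argmax(var))

def segment(img, min_size=10):
    """Threshold, 3x3 morphological opening, hole filling, then removal of
    small residual objects and of fragments touching the image border.
    (The convex-hull step from the text is omitted in this sketch.)"""
    mask = img > otsu_threshold(img)
    mask = ndi.binary_opening(mask, structure=np.ones((3, 3)))
    mask = ndi.binary_fill_holes(mask)
    lab, n = ndi.label(mask)
    for i in range(1, n + 1):
        obj = lab == i
        at_border = obj[0].any() or obj[-1].any() or obj[:, 0].any() or obj[:, -1].any()
        if obj.sum() <= min_size or at_border:
            lab[obj] = 0  # drop fragments and tiny residues
    return lab

# Synthetic check: one interior cell, a 1-pixel speck, a border fragment.
img = np.full((40, 40), 10, dtype=np.uint8)
img[10:20, 10:20] = 200   # interior cell -> kept
img[30, 30] = 200         # speck -> removed by the opening
img[0:5, 0:6] = 200       # border fragment -> removed
labels = segment(img)
```

Only the interior cell survives; its labeled area with the underlying grey levels would then be stored as the template.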
Then the center of gravity of the object is determined and the search window is tentatively spanned around the object. A rough estimate of the cell's movement direction is calculated by a mean-shift filter over three frames, and the search for similar cells starts in the resulting direction. Cells fragmented by the window border are not considered for further calculation. Each cell inside the window is compared by the similarity measure to the respective template of the cell under consideration. Before the similarity is determined, the cells are aligned so that they have the same orientation. The cell with the highest similarity to the template is labeled as the same cell, moved to position x, y in image t + 1. The template is updated with the detected cell area for the comparison in the next time frame. The position of the detected cell is stored in the database under the label of the template. Mitotic cells are detected by classifying each cell based on a texture descriptor and a decision-tree classifier.
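The per-window matching step can be sketched as follows; `dissim` stands for any pixel-based dissimilarity function, the names are ours, and the orientation alignment is assumed to have been applied to the candidate patches already:

```python
import numpy as np

def match_cell(template, candidates, dissim):
    """Compare every candidate cell patch in the search window with the
    stored template and return the best match; the winning patch then
    replaces the template for the next time frame (template update)."""
    scores = [dissim(template, c) for c in candidates]
    best = int(np.argmin(scores))          # lowest dissimilarity = best match
    return best, candidates[best]          # new template = best patch

# Example with mean absolute grey-level difference as the dissimilarity:
mad = lambda a, b: float(np.abs(a - b).mean())
template = np.full((4, 4), 0.9)
cands = [np.zeros((4, 4)), np.full((4, 4), 0.85)]
best_idx, new_template = match_cell(template, cands, mad)
```

Updating the template with the winning patch lets the tracker follow gradual changes in the cell's appearance from frame to frame.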
Let C_t be the cell at time point t and C_{t+1} the same cell at time point t + 1. Then the rule for labeling a cell as "disappeared" is: IF C_t has no C_{t+1} THEN Disappearing_Cell.
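The rule translates directly into code; `matches` maps each cell id at time t to its detected counterpart at t + 1, or to None if no match was found (the data layout is our assumption):

```python
def disappeared(cell_ids, matches):
    """IF C_t has no C_{t+1} THEN Disappearing_Cell."""
    return [cid for cid in cell_ids if matches.get(cid) is None]

gone = disappeared([1, 2, 3], {1: 7, 3: 9})   # cell 2 has no successor
```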
Similarity Determination Between Two Cells
The algorithm computes the similarity between two image areas A and B; in our case these are the bounding boxes around each cell (see fig. 4). According to the specified distance function, the proximity is calculated for the pixel at position r, s in image A against the pixel at the same position in image B and the surrounding pixels within a predefined window. Then the minimum distance among the compared pixels is taken. The same is done for the pixel at position r, s in image B. Afterwards, the average of the two minimal values is calculated. This process is repeated until all pixels of both images have been processed, and the final dissimilarity for the whole image is the average of these per-pixel minimum distances. The use of an appropriate window size should make this measure invariant to scaling, rotation and translation.
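A direct, unoptimized sketch of this measure (our implementation, not the authors' code; `w` is the half-size of the predefined window, and the absolute grey-level difference is assumed as the pixel distance):

```python
import numpy as np

def window_dissimilarity(A, B, w=1):
    """For each position (r, s), compare the grey value of one image with
    all grey values of the other image inside a (2w+1)x(2w+1) window,
    take the minimum distance in each direction, average the two minima,
    and finally average over all positions."""
    assert A.shape == B.shape
    H, W = A.shape
    total = 0.0
    for r in range(H):
        for s in range(W):
            r0, r1 = max(0, r - w), min(H, r + w + 1)
            s0, s1 = max(0, s - w), min(W, s + w + 1)
            d_ab = np.abs(B[r0:r1, s0:s1] - A[r, s]).min()  # A pixel vs. B window
            d_ba = np.abs(A[r0:r1, s0:s1] - B[r, s]).min()  # B pixel vs. A window
            total += 0.5 * (d_ab + d_ba)
    return total / (H * W)

# A one-pixel shift within the window yields zero dissimilarity,
# illustrating the intended translation tolerance:
A = np.zeros((3, 4)); A[1, 1] = 100.0
B = np.zeros((3, 4)); B[1, 2] = 100.0
```

With w = 0 the measure reduces to the mean absolute pixel difference, and the shifted pair above would score far worse.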
The resulting values of this measure for the pairwise comparison of cell_1 to cell_1, cell_1 to cell_2, and cell_1 to cell_3 in the preceding time frame are given in figure 5. The lowest dissimilarity value, i.e. the best match, is obtained when comparing cell_1 to cell_1 in the preceding time frame.
Results in figure 6 A-E show the tracking paths of five different cells. We compared the path of a cell determined manually by an operator of the High-Content Screening process line with the path determined automatically by the tool IBaI-Track, for ten videos with 117 frames each. If both methods gave the same path we evaluated it as positive, otherwise as negative. We observed a correspondence of 98.2% between the two descriptions. The computation time for a sequence of 117 images of 674 x 516 pixels, with on average 80 cells per image, is 11 minutes 42 seconds on a PC with a 1.8 GHz processor.
We have presented our novel cell-tracking algorithm for dynamic drug-discovery experiments. The algorithm searches for the next position of a cell within a window around its position in the preceding time frame; this is necessary because the movement of the cells cannot be assumed to be steady. The search inside the window starts in the expected direction of the cell, which is calculated with a mean-shift filter over three time frames. The cell is detected by determining the similarity of the grey-level profiles of two cells; the candidate with the best similarity value is selected as the same cell as in the preceding time frame. A similarity measure that is invariant to translation is used for this reason. The resulting data are stored in a data file and can be used for further data-mining analysis.
The software CellTrack is available as a commercial program.
Debeir O., et al.: IEEE Trans. Med. Imaging 24, 697-711 (2005)
Padfield D., et al.: Spatiotemporal cell segmentation and tracking in automated screening. In: Proc. IEEE Int. Symp. Biomedical Imaging, pp. 376-379 (2008)
Li K., et al.: Medical Image Analysis 12, 546-566 (2008)
Perner P., et al.: Artificial Intelligence in Medicine 36 (2), 137-157 (2006)
Comaniciu D., et al.: Real-time tracking of non-rigid objects using mean shift. In: Proc. IEEE Conf. Computer Vision and Pattern Recognition, vol. 2, pp. 142-149 (2000)
Perner P., et al.: Texture classification based on random sets and its application to HEp-2 cells. In: Kasturi R., Laurendeau D., Suen C. (eds.), ICPR 2002, IEEE Computer Society, vol. II, pp. 406-411 (2002)
Prof. Dr. Petra Perner
Inst. of Computer Vision and Applied Computer Sciences IBai
Tel: +49 341 8612 273
Fax: +49 341 8612 275