Deconvolution of Very Large Data Sets

Web-Based Batch Restoration with the Huygens Remote Manager

Fig. 1: The HRM welcome page.

Fig. 2: Deconvolution result for spinning disk. Comparison of raw (left) and deconvolved (right) images. (A, B) Maximum intensity projection. (C, D) Single XZ plane. Image courtesy of Christian Schuberth.

Fig. 3: Deconvolution result for SPIM. Comparison of raw (top) and deconvolved (bottom) images. (A, B) Maximum intensity projection. (C, D) Surface rendering (Imaris, Bitplane AG) of the anterior side of the embryo. Image courtesy of Stefan Guenther.

The Huygens Remote Manager (HRM) is an open-source, multi-user, web-based batch processor for the large-scale restoration of microscopy images. By enabling mass deconvolution, HRM aims to maximize the volume of data suitable for segmentation, quantification and analysis. Recent deconvolution runs of large spinning disk and SPIM images show that, while these images present a true challenge for current desktop computers, they can easily be processed automatically on a small HRM server.


Image deconvolution reassigns the out-of-focus signal introduced by the microscope optics, thereby improving resolution, contrast and the signal-to-noise ratio (SNR) [1]. Deconvolution is therefore recognized as a fundamental processing technique for wide-field, confocal, spinning disk, multiphoton and STED microscopy images. As we show here, image deconvolution also improves the quality of images from Selective Plane Illumination Microscopy (SPIM), despite the challenge that their large sizes represent. Moreover, deconvolution is cost-effective, as it only requires conventional computational infrastructure and no additional microscopy equipment.

The steady growth of data set sizes is raising technical concerns in all segments of standard imaging pipelines. Data management solutions (e.g. OMERO [2], openBIS [3]), image restoration (deconvolution, stabilization, chromatic corrections) and analysis algorithms (tracking, segmentation) all face this new challenge. Recent microscopy modalities, such as spinning disk confocal and SPIM [4], are pushing these boundaries even further. In particular, SPIM allows for the acquisition of extremely large data sets and is becoming the tool of choice in studies of embryonic development [5] and functional whole-brain imaging [6]. To facilitate the restoration of ever-increasing volumes of microscopy data, the development of the Huygens Remote Manager (HRM) for the Huygens deconvolution software (Scientific Volume Imaging B.V.) was started in 2004 at Montpellier Rio Imaging. Nowadays, HRM is a collaborative effort that has proven useful in the large-scale automation and optimization of image processing pipelines at a growing number of facilities worldwide (fig. 1).

To illustrate the HRM processing rate, we present the results of the deconvolution of two biological data sets acquired with spinning disk and SPIM microscopes.

Deconvolution Examples of Spinning Disk and SPIM Data

The Advanced Light Microscopy Facility (ALMF) at EMBL Heidelberg has been successfully using HRM for several years on a server with the relatively modest specifications of 8 cores, 96 GB of RAM and 2 TB of disk space. Disk space and RAM limit the maximum size of images that can be deconvolved.

Study of Golgi Organization with Spinning Disk Microscopy

In the study of Golgi organization in mammalian cells, Golgi fragments, generated for example during experimentally induced Golgi breakdown and biogenesis [7, 8] or during cell division, are tracked over time. An analysis strategy based on 2D image sequences has worked well for the initial characterization of Golgi fragment dynamics [7, 8]. A more comprehensive investigation, however, requires 3D imaging, which is limited by axial resolution and, in long live-cell experiments, by SNR. To track the changes in Golgi morphology, HeLa cells with GFP-tagged Golgi (green) and mCherry-tagged nuclei (red) were imaged on an UltraView VoX spinning disk microscope with a 63x/1.4 oil immersion objective; a stack of 40 slices was recorded every 3 min for 10 h. The final image size was 30 GB. The results of the acquisition and deconvolution are presented in figure 2, which visualizes the reassignment of out-of-focus light and the SNR increase achieved by deconvolution. The higher quality of the deconvolved images permits more precise segmentation and the distinction of smaller structures, allowing a more accurate analysis of Golgi dynamics. The deconvolution time for the entire data set was about 4 h on the HRM server (7.5 GB/h).
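The quoted throughput figures follow directly from data-set size and wall-clock time; a minimal Python helper (the function name is illustrative, not part of HRM) makes the arithmetic explicit:

```python
def throughput_gb_per_h(size_gb: float, hours: float) -> float:
    """Return the sustained deconvolution throughput in GB/h."""
    return size_gb / hours

# Spinning disk example from the text: 30 GB processed in about 4 h.
print(f"{throughput_gb_per_h(30, 4):.1f} GB/h")  # 7.5 GB/h
```

The same calculation applied to a 1 TB set processed in 120 h gives roughly 8.3 GB/h, matching the SPIM example below.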

Studies of Drosophila Embryo Development with SPIM Microscopy

One goal of developmental research is to understand cell fate decisions at the single-cell level. Using Drosophila melanogaster embryos as a model, imaging technology can monitor development from a single cell to the thousands of cells of the larval stage within 24 hours. While SPIM allows for fast 3D imaging of the entire embryo [9], fluorescence blur and signal scattering by yolk and tissues deteriorate signal quality. Successful lineage tracing, however, depends strongly on image quality, since it requires precise segmentation and tracking of the individual cells that are densely packed within the embryo. To follow the development of these embryos, live Drosophila at the syncytial blastoderm stage (2 hpf), labeled with histone-mCherry, were imaged with a 25x/1.1 water dipping objective on a SPIM microscope. A stack of 400 slices was taken every 30 s for 20 h to capture the changing positions of the nuclei during embryo development. The total data set size was 1 TB. A comparison of the raw and deconvolved data is presented in figure 3, where deconvolution shows a clear SNR improvement and the elimination of background blur. The more accurate detection of individual cells and nuclei, and the more reliable, easier tracking it enables, allow a more precise quantification of embryo development. The HRM server deconvolved the entire data set in about 120 h (8.3 GB/h).

Discussion and Outlook

HRM allows for efficient batch deconvolution of very large data sets in a multi-user environment via a user-friendly web application [10]. With the server specifications at ALMF, processing rates in the range of 5 to 10 GB/h are routinely reached. Raw data is usually made immediately available to the processing server by mounting the HRM disk [10] on the acquisition machines. Alternatively, HRM offers a direct bridge to the OMERO data management system, allowing two-way exchange of raw and deconvolved images between HRM and OMERO instances on a network. To speed up the processing of large data volumes, HRM can distribute its workload across an array of processing machines. A new HRM architecture based on the GC3Pie library will soon enable better parallelization of very large deconvolution tasks over clusters, grids and cloud-based virtual machines.

Future developments of HRM will offer the same large-scale support for additional tools such as chromatic aberration correction, image stitching, image stabilization, cross-talk correction, and distillation of point spread functions. As data sets grow, careful consideration must also be given to strategies for optimally storing, accessing and modifying images. The Hierarchical Data Format (HDF5) is designed to efficiently store and organize large amounts of numerical data; we encourage its use in HRM and promote its wider adoption. Finally, it is worth remembering that data sets should not be larger than they strictly need to be to address the biological question of interest.
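As a sketch of the HDF5 approach promoted above, the snippet below stores a time series of image stacks as a chunked, compressed dataset using the h5py library. The dataset name, dimensions and chunk shape are illustrative choices for this example, not HRM conventions:

```python
import numpy as np
import h5py

# Illustrative dimensions: 10 time points of a small 3D stack (Z, Y, X).
t, z, y, x = 10, 8, 64, 64
stack = np.random.randint(0, 4096, size=(t, z, y, x), dtype=np.uint16)

with h5py.File("timeseries.h5", "w") as f:
    # Chunking by single (t, z) planes lets a reader fetch one slice
    # without decompressing the whole series; gzip keeps the file small.
    dset = f.create_dataset(
        "raw", data=stack,
        chunks=(1, 1, y, x), compression="gzip",
    )
    dset.attrs["axis_order"] = "TZYX"  # self-describing metadata

with h5py.File("timeseries.h5", "r") as f:
    plane = f["raw"][3, 2]  # reads only the chunks covering this plane
    assert plane.shape == (64, 64)
```

Chunked layout is what makes random access into terabyte-scale series practical: an analysis step can pull a single plane or subvolume without streaming the entire file.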


Conclusion

Image deconvolution increases resolution and SNR while decreasing background and blur, thus allowing for easier, more reliable segmentation, tracking and analysis. Very large images are no exception and should also be deconvolved before drawing any conclusions from them. However, large images quickly become a real challenge for desktop computers. To relieve local computing resources and centralize image processing, we showed that HRM can run large deconvolution jobs at a good rate on moderately sized servers, directly from a user's web browser.


Acknowledgements

We would like to thank all members of the ALMF, Stefan Guenther and Christian Schuberth from EMBL; Carl Zeiss AG, Leica, Olympus and PerkinElmer; and the HRM developers.

[1] Van der Voort et al.: Journal of Microscopy 178, 165-181 (1995)
[2] Allan C. et al.: Nat. Methods 9(3), 245-253 (2012)
[3] Bauch A. et al.: BMC Bioinformatics 12(1), 468-486 (2011)
[4] Stelzer E.H.K.: Nat. Methods 12(1), 23-26 (2015)
[5] Huisken J. et al.: Science 305, 1007-1009 (2004)
[6] Ahrens M.B. et al.: Nat. Methods 10(5), 413-420 (2013)
[7] Ronchi P. et al.: J. Cell Sci. 127(21), 4620-4633 (2014)
[8] Schuberth C.E. et al.: J. Cell Sci. 128(7), 1279-1293 (2015)
[9] Krzic U. et al.: Nat. Methods 9, 730-733 (2012)
[10] Ponti A. et al.: Imaging & Microscopy 9(2), 57-58 (2007)

Dr. Aaron Ponti

ETH Zurich
Department of Biosystems Science and Engineering
Basel, Switzerland

MSc. Daniel Sevilla Sanchez
Scientific Volume Imaging B.V.
Hilversum, The Netherlands

Dr. Yury Belyaev
Heidelberg, Germany

Niko Ehrenfeuchter
Image Analysis Specialist
Biozentrum, University of Basel, Switzerland



ETH Zurich
Mattenstrasse 26
4058 Basel
Phone: +41 61 387 33 74
Telefax: +41 61 387 39 93
