Software tools for advanced image processing
High-throughput microscopy techniques, such as light-sheet imaging, generate huge amounts of data, often in the terabyte range. The challenge then becomes managing these images and extracting semantically relevant information from the raw data (a large matrix of grayscale values).
On the data management side, we developed ZetaStitcher, a tool for fast alignment of multiple adjacent tiles and for efficient access to arbitrary portions of the complete volume. To orchestrate the operation of our laboratory equipment, we are developing data acquisition and control software, in the form of hardware libraries and graphical user interfaces, that can sustain a high data rate. We are also exploring different compression strategies to find the best compromise between data size, image quality preservation, and I/O speed. The tools we develop are available from our GitHub page.
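The snippet below is a minimal sketch of how a stitched dataset can be accessed through ZetaStitcher's Python API. The VirtualFusedVolume class and the stitch.yml input reflect the usage documented in the project README; the GitHub page remains the authoritative reference for the current interface.

```python
# Minimal sketch: reading an arbitrary portion of a stitched volume.
# Assumes a stitch.yml file produced beforehand by ZetaStitcher's
# alignment step; see the GitHub page for the up-to-date API.
from zetastitcher import VirtualFusedVolume

vfv = VirtualFusedVolume('stitch.yml')   # lazy view over the fused dataset
print(vfv.shape)                         # shape of the complete volume

# Only the requested region is read from disk, so even terabyte-sized
# volumes can be processed piece by piece.
sub = vfv[..., 512:1024, 512:1024]
print(sub.shape)
```

As an illustration of the kind of compression benchmarking mentioned above, the following generic sketch compares lossless HDF5 filters on a 16-bit stack using h5py. The chunk shape, filters, and synthetic data are arbitrary choices made for the example and do not reflect our actual pipeline.

```python
# Illustrative comparison of lossless compression filters (h5py/HDF5).
import os
import time

import h5py
import numpy as np

# Synthetic stack standing in for real microscopy data, which usually
# compresses much better than Poisson noise.
rng = np.random.default_rng(0)
stack = rng.poisson(50, size=(64, 1024, 1024)).astype(np.uint16)

for compression in (None, 'lzf', 'gzip'):
    fname = f'test_{compression}.h5'

    t0 = time.perf_counter()
    with h5py.File(fname, 'w') as f:
        f.create_dataset('stack', data=stack, chunks=(16, 256, 256),
                         compression=compression)
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    with h5py.File(fname, 'r') as f:
        _ = f['stack'][:]
    read_s = time.perf_counter() - t0

    size_mib = os.path.getsize(fname) / 2**20
    print(f'{str(compression):>5}: {size_mib:7.1f} MiB, '
          f'write {write_s:.2f} s, read {read_s:.2f} s')
```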
Concerning image analysis, we are exploiting different machine learning strategies to quantify the position, shape, and number of fluorescently labeled cells. Semantic deconvolution is used to restore image quality, enabling easier cell detection with standard methods, while deep convolutional neural networks segment neurons precisely, allowing further classification based on cell shape. All methods are carefully designed to sustain the high data flux of our applications while requiring only a limited amount of manually annotated ground truth.
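As a concrete example of "cell detection with standard methods", the sketch below runs a Laplacian-of-Gaussian blob detector (scikit-image) on a synthetic fluorescence-like image. The detector and every parameter are just one possible choice shown for illustration, not a description of our actual pipeline.

```python
# Illustrative 'standard method' for cell detection: a Laplacian-of-Gaussian
# blob detector applied to a synthetic fluorescence-like image. In practice
# the detector would run on semantically deconvolved data; every parameter
# below is an arbitrary choice made for the sake of the example.
import numpy as np
from skimage.feature import blob_log
from skimage.filters import gaussian

# Synthetic image: bright, blurred spots standing in for labeled somata.
rng = np.random.default_rng(1)
img = np.zeros((512, 512))
for y, x in rng.integers(20, 492, size=(30, 2)):
    img[y, x] = 1.0
img = gaussian(img, sigma=4)
img /= img.max()                        # normalize peaks to ~1
img += rng.normal(0, 0.02, img.shape)   # mild sensor-like noise

# Each detected blob is reported as (y, x, sigma).
blobs = blob_log(img, min_sigma=3, max_sigma=8, num_sigma=6, threshold=0.1)
print(f'detected {len(blobs)} candidate cells')
```

For the segmentation stage, a common pattern for sustaining a high data flux is tile-wise inference with a 3-D convolutional network, so that arbitrarily large volumes are processed within bounded memory. The toy network below only illustrates this pattern; the actual architecture, training procedure, and tile size are not specified here.

```python
# Hypothetical tile-wise inference sketch with a tiny 3-D CNN (PyTorch).
# The architecture and weights are placeholders, not our actual model.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv3d(1, 8, 3, padding=1), nn.ReLU(),
    nn.Conv3d(8, 8, 3, padding=1), nn.ReLU(),
    nn.Conv3d(8, 2, 1),                 # background / neuron logits
)
net.eval()

tile = torch.rand(1, 1, 64, 96, 96)     # one sub-volume of the dataset
with torch.no_grad():
    probs = torch.softmax(net(tile), dim=1)
mask = probs[:, 1] > 0.5                # binary neuron mask for this tile
print(mask.shape, mask.float().mean())
```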