

High-throughput microscopy techniques generate an ever-growing amount of data that is fundamental to gaining scientifically, biologically, and medically relevant insights. This growing volume of data dramatically affects the scientific workflow at every step: visualization and analysis tasks are performed with limited interactivity, and their implementations often require HPC skills while lacking portability, usability, and maintainability.
In this work we explore a software infrastructure that simplifies end-to-end visualization and analysis of massive data. Data management and movement are performed using a hierarchical streaming data access layer that enables interactive exploration of remote data. Analysis tasks are expressed and executed using a library for rapid prototyping of algorithms built on an Embedded Domain Specific Language, which enables portable deployment in both desktop and HPC environments. Finally, we use a scalable runtime system (Charm++) to automate the mapping of analysis algorithms to the available computational resources, reducing the complexity of developing scalable algorithms. We present large-scale experiments on tera-scale microscopy data executing some of the most common neuroscience use cases: data filtering, visualization using two different image compositing algorithms, and image registration.