STED imaging has developed into a powerful nanoscopy technique, nowadays applied to image a variety of structures, even dynamically in living cells. How fast one can image dynamics often depends on the size of the region, but imaging at up to 30 Hz is possible in micrometer-sized regions of interest. Knowing where and when to apply this type of imaging in cells is, however, difficult, and currently unfeasible at any throughput of interest. Additionally, photobleaching and photodamage complicate the imaging for the user.
We set out to develop a smart acquisition method that alleviates some of these issues and enables automated STED nanoscopy imaging of cellular processes of interest at up to 25 Hz. The key is to observe the cell and, when a triggering event of interest is observed, to automatically and near-instantly perform fast STED imaging in a small region of interest surrounding the triggering event (Figure 1). As just published in Nature Methods (Alvelid et al., 2022), the acquisition method is named event-triggered STED (etSTED) imaging and allows timelapse STED imaging, both in 2D and 3D, within 40–70 ms of the triggering event. It is applied to shed new light on cellular processes such as synaptic vesicle dynamics, endo- and exocytosis, and endosomal vesicle interactions.
EtSTED expands the family of smart microscopy methods, which all attempt to provide gentler or more powerful imaging by adapting illumination and acquisition settings to sample characteristics and feedback, and sometimes, as here, by switching between different imaging modalities. In etSTED, widefield imaging together with fluorescent sensors or protein labels is used to monitor processes such as local calcium activity, local pH changes, or endosomal vesicles approaching one another. An image analysis pipeline runs in real time on every recorded widefield frame, returning the coordinates of any detected event. Those coordinates are then used to perform STED imaging in the sample area of interest. In this work we optimized various image analysis pipelines to detect different events of interest, the fastest of which run in merely 5–6 ms.
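To illustrate the idea of such a real-time pipeline, here is a minimal sketch of an event detector that flags local intensity increases between consecutive widefield frames and returns their coordinates. This is an illustrative example, not the actual pipelines published with the article; the function name, parameters, and detection logic (frame differencing plus local-maximum search) are our assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_events(curr_frame, prev_frame, sigma=2.0, threshold=5.0):
    """Hypothetical minimal event-detection pipeline: find local intensity
    increases between two consecutive widefield frames and return their
    (x, y) pixel coordinates."""
    # Smooth both frames to suppress pixel noise before differencing
    curr = gaussian_filter(curr_frame.astype(float), sigma)
    prev = gaussian_filter(prev_frame.astype(float), sigma)
    diff = curr - prev
    # Candidate events: local maxima of the difference image above threshold
    peaks = (diff == maximum_filter(diff, size=9)) & (diff > threshold)
    ys, xs = np.nonzero(peaks)
    return list(zip(xs.tolist(), ys.tolist()))
```

A pipeline of this shape, operating on small frames with vectorized NumPy/SciPy calls, is cheap enough to keep up with millisecond-scale per-frame budgets like those reported in the article.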
The combination of imaging methods is performed on a custom-built STED setup, as previously published (Alvelid et al., 2019), where a widefield arm has been added and spectrally coupled to the STED path. The microscope is controlled using the open-source microscope control software ImSwitch (https://github.com/kasasxav/ImSwitch), previously developed in the lab (Casas Moreno et al., 2021). For etSTED acquisitions, a custom control widget has been developed in which the user can load custom-written analysis pipelines for event detection, control the settings of the acquisition method as well as the pipeline parameters (Figure 2), and optimize them with live visual feedback on which events are detected. Once the user is ready to launch the real experiment, a button press gives the control widget full control of the microscope: it starts the widefield imaging, runs the analysis pipeline in real time, translates the coordinates of detected events to scanning space, and initiates STED imaging in the area and at the time of interest. The cycle can be repeated endlessly, allowing the microscope to catch many events in the same area without any user input.
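The acquisition cycle just described can be sketched as a small controller: grab a widefield frame, run the pipeline, map any detected event from camera pixels to scanner coordinates, and trigger the STED scan there. The class and method names below, and the hardware interfaces they assume, are illustrative placeholders, not the actual ImSwitch or etSTED widget API.

```python
import numpy as np

class EventTriggeredController:
    """Hypothetical sketch of one etSTED-style acquisition cycle; the
    camera, scanner, and pipeline objects are assumed interfaces."""

    def __init__(self, camera, scanner, pipeline, affine):
        self.camera = camera      # widefield camera: .get_frame() -> 2D array
        self.scanner = scanner    # STED scanner: .scan_roi(x, y)
        self.pipeline = pipeline  # analysis: frame -> [(x_px, y_px), ...]
        self.affine = affine      # 3x3 camera-to-scan-space homogeneous transform

    def to_scan_space(self, x_px, y_px):
        # Map camera pixel coordinates into scanner coordinates
        x, y, w = self.affine @ np.array([x_px, y_px, 1.0])
        return x / w, y / w

    def run_cycle(self):
        # One cycle: widefield frame -> pipeline -> transform -> STED scan
        frame = self.camera.get_frame()
        events = self.pipeline(frame)
        if events:
            x_scan, y_scan = self.to_scan_space(*events[0])
            self.scanner.scan_roi(x_scan, y_scan)  # fast STED in a small ROI
            return (x_scan, y_scan)
        return None
```

Looping `run_cycle` continuously reproduces the endlessly repeating detect-and-image behavior described above, with the coordinate transform calibrated once between the widefield and scanning paths.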
All software and real-time image analysis pipelines (https://github.com/kasasxav/ImSwitch, https://github.com/jonatanalvelid/etSTED-widget, https://github.com/jonatanalvelid/etSTED-widget-base) as well as imaging data (https://doi.org/10.5281/zenodo.5593270) are published open-source through GitHub and Zenodo. The widget controlling the acquisition method was written for ImSwitch; however, as our aim is to allow the method to be used by as many users as possible, even with other combinations of imaging methods, we also developed a standalone widget that is as software-agnostic as possible (https://github.com/jonatanalvelid/etSTED-widget-base). It is written in Python and can readily be connected to other Python-based microscope control software. If an implementation for control software written in other programming languages is wanted, the general structure of the standalone widget together with the pseudocode in the supplementary material of the article can act as a guide.
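Connecting a detection widget to other control software essentially requires two hooks: a way for the widget to receive each widefield frame, and a callback to trigger a scan at detected coordinates. The adapter below sketches such an interface under our own assumptions; the class and method names are illustrative and not the actual etSTED-widget-base API.

```python
class MicroscopeAdapter:
    """Hypothetical adapter sketching the minimal interface a Python-based
    microscope control software could expose to a detection widget."""

    def __init__(self):
        self._frame_subscribers = []
        self.triggered_scans = []  # record of requested (x, y) scan positions

    def subscribe_frames(self, callback):
        # The widget registers here to receive every new widefield frame
        self._frame_subscribers.append(callback)

    def publish_frame(self, frame):
        # The control software pushes each camera frame to all subscribers
        for callback in self._frame_subscribers:
            callback(frame)

    def trigger_scan(self, x, y):
        # The widget calls back here to start a scan at the detected event
        self.triggered_scans.append((x, y))
```

With hooks of this shape, the widget's frame callback can run its analysis pipeline and call `trigger_scan` on detection, keeping the detection logic decoupled from any particular control software.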
In the article we show a range of applications where we believe the method can give novel insight and higher throughput recordings. We use etSTED to: (1) detect and trigger on local calcium activity (using the calcium indicator BAPTA-1) and super-resolve local synaptic vesicle activity (Figure 3); (2) detect dynamin-mediated endocytosis, trigger on an increase in local dynamin intensity, and super-resolve the plasma membrane dynamics; (3) detect exocytosis, trigger on local pH changes, and super-resolve the plasma membrane dynamics; and (4) detect and trigger on endosomal vesicle proximity and super-resolve the interaction dynamics and movement.
The benefits of etSTED are manifold: previously unfeasible experiments become possible thanks to the high speed and automation; data can be recorded just when and where it is wanted, reducing photobleaching and photodamage; and a significantly higher throughput of valuable data can be recorded. We therefore hope that etSTED will prove a powerful and versatile tool in the smart microscopy toolbox, and that it can be used to explore the likes of synaptic activity and vesicle interactions in greater detail than ever before. Overall, we believe that smart acquisition methods such as this one, where image acquisition is directly driven by what is taking place in the observed sample, will become increasingly important.