Scientists have always been intrigued by the extremes of nature and have made significant efforts to study them: microscopes allow us to observe the smallest objects, while telescopes permit us to explore the largest and most distant ones. The work proposed herein will provide new means, and generate new insights into, phenomena occurring on the shortest timescales in nature.
Past methods to probe ultrafast events – those occurring on picosecond timescales or faster – have mostly relied on pump/probe scanning, yet such scanning can only capture the dynamics of processes that are repetitive. Understanding all spatiotemporal aspects of ultrafast phenomena, however, requires experimental means to resolve them spatially, spectrally and temporally. Recently the PI invented a "coding" imaging concept called Frequency Recognition Algorithm for Multiple Exposures (FRAME) that can film at up to 5 trillion frames per second. To date, FRAME is the only videography method that unifies femtosecond temporal resolution with spectroscopic compatibility, making it a powerful tool with high potential for new scientific discoveries. This project aims to (i) develop novel diagnostic tools based on FRAME and (ii) apply FRAME videography to study ultrafast events whose temporal evolution could not be visualized in the past.
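As a rough illustration of the frequency-coding idea behind concepts such as FRAME (a minimal numpy sketch with made-up scenes, carrier frequencies and filter settings, not the actual instrument pipeline): each exposure is amplitude-modulated with a distinct spatial carrier, the modulated exposures are superimposed onto a single recorded image, and each frame is later recovered by lock-in-style demodulation with a low-pass filter in the Fourier domain.

```python
import numpy as np

def lowpass(img, cutoff):
    """Keep only spatial frequencies below `cutoff` (in cycles per image)."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0]) * img.shape[0]
    fx = np.fft.fftfreq(img.shape[1]) * img.shape[1]
    FY, FX = np.meshgrid(fy, fx, indexing="ij")
    F[FX**2 + FY**2 > cutoff**2] = 0.0
    return np.fft.ifft2(F).real

N = 128
y, x = np.mgrid[0:N, 0:N]

# Two slowly varying "scenes" standing in for two temporal frames.
scene_a = np.exp(-((x - 40)**2 + (y - 60)**2) / 400.0)
scene_b = np.exp(-((x - 90)**2 + (y - 50)**2) / 400.0)

# Distinct spatial carriers (the "codes"): same frequency, orthogonal orientations.
carrier_a = np.cos(2 * np.pi * 30 * x / N)
carrier_b = np.cos(2 * np.pi * 30 * y / N)

# Single multiplexed exposure: each scene rides its own carrier.
exposure = scene_a * carrier_a + scene_b * carrier_b

# Lock-in-style demodulation: multiplying by the matching carrier shifts the
# encoded scene back to baseband (cos^2 = 1/2 + high-frequency term), while the
# other scene stays at high spatial frequencies and is removed by the low-pass.
rec_a = 2 * lowpass(exposure * carrier_a, 10)
rec_b = 2 * lowpass(exposure * carrier_b, 10)
```

Because the scenes are band-limited well below the carrier frequency, the cross-talk between the two recovered frames is negligible in this toy example; in a real system, the number of frames is limited by how many carriers fit into the sensor's spatial-frequency budget.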
Ultrafast science is a wide field, making the project highly interdisciplinary. For example, within photo-physics, systems will be developed to film plasmas and laser filaments. Diagnostics will be developed to image the lifetime of coherent states, as well as the fluorescence decays of two fluorophores in parallel, which holds potential within biology, physics and chemistry. A two-color FRAME setup will be developed to simultaneously track, in time, the creation and consumption of two species in a chemical reaction. The ensemble of work packages proposed herein constitutes a significant step forward in the research area of ultrafast imaging and videography.
Funding Scheme: ERC-STG - Starting Grant