A key scientific task is distinguishing signal from noise. Observed data are often shaped by two random variables: noise, typically characterized through control experiments, and the signal of interest, in which discoveries are made.
Traditional approaches, such as Fourier-based deconvolution, are popular but can be at odds with probability theory (for example, yielding density estimates that are not guaranteed to be non-negative), especially for complex or small datasets. Bayesian methods provide rigor but require precise prior knowledge of the signal's distribution. What if this knowledge is lacking?
This talk introduces NFdeconvolute, a tool that applies normalizing flows to deconvolution without assuming a functional form for the signal's distribution. By combining observed data with known noise characteristics, NFdeconvolute enables robust, data-driven probability density estimation in noisy conditions.
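The underlying idea can be sketched in a toy one-dimensional setting. The key relation is that the observed density is the convolution of the signal density with the known noise density, so the signal model can be fit by maximizing a Monte Carlo estimate of the observed-data likelihood. In this sketch (which is not NFdeconvolute's actual API; all names are illustrative), an affine map with a standard-normal base stands in for a learned flow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: true signal ~ N(2, 0.5); additive noise ~ N(0, 1), assumed known.
signal = rng.normal(2.0, 0.5, size=400)
noise_sigma = 1.0
observed = signal + rng.normal(0.0, noise_sigma, size=signal.size)

# Stand-in "flow": an affine map z = (x - mu) / s with a standard-normal
# base, so the modeled signal density is simply N(mu, s). A real normalizing
# flow would replace this with a learned invertible transform.
def signal_logpdf(x, mu, s):
    return -0.5 * ((x - mu) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))

# Deconvolution objective: the observed density is the convolution
#   p_obs(x) = E_{n ~ noise}[p_signal(x - n)],
# estimated by Monte Carlo over samples drawn from the known noise model.
noise_samples = rng.normal(0.0, noise_sigma, size=256)

def loglik(mu, s):
    lp = signal_logpdf(observed[:, None] - noise_samples[None, :], mu, s)
    m = lp.max(axis=1, keepdims=True)  # log-sum-exp trick for stability
    return np.mean(m[:, 0] + np.log(np.mean(np.exp(lp - m), axis=1)))

# Maximize the likelihood over a small grid of the flow's parameters.
mus = np.linspace(0.0, 4.0, 21)
sigmas = np.arange(0.1, 1.55, 0.05)
scores = np.array([[loglik(m, s) for s in sigmas] for m in mus])
i, j = np.unravel_index(scores.argmax(), scores.shape)
mu_hat, s_hat = mus[i], sigmas[j]
print(f"recovered signal: mean ~ {mu_hat:.2f}, std ~ {s_hat:.2f}")
```

In practice the affine map would be replaced by an expressive invertible network trained by gradient ascent on the same Monte Carlo likelihood, which is what lets the method avoid assuming any particular form for the signal's distribution.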