Describing statistical dependencies is foundational to empirical scientific research. For uncovering intricate, possibly nonlinear dependencies between a single target variable and several source variables within a system, the theory of partial information decomposition (PID) offers a principled and versatile framework. However, the majority of existing PID measures are restricted to categorical variables, whereas many systems of interest in science are continuous. In this paper, we present a novel analytic formulation of continuous redundancy, a generalization of mutual information, inspired by the concept of shared exclusions in probability space that underlies the discrete PID measure I_{∩}^{sx}. Furthermore, we introduce a nearest-neighbor-based estimator for continuous PID and demonstrate its effectiveness by applying it to a simulated energy management system provided by the Honda Research Institute Europe GmbH. This work bridges the gap between the measure-theoretic existence proofs for a continuous I_{∩}^{sx} and its practical application to real-world scientific problems.
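
To make the nearest-neighbor estimation idea concrete, the following is a minimal, illustrative sketch of a Kraskov-Stoegbauer-Grassberger (KSG) k-nearest-neighbor estimator of mutual information for continuous variables, i.e. the classic nearest-neighbor machinery that continuous information estimators of this kind build on. It is not the PID estimator introduced in this work; the function name ksg_mutual_information, the default k=4, and the SciPy-based implementation are assumptions made purely for illustration.

    # Illustrative sketch only (not the paper's PID estimator): KSG k-nearest-neighbor
    # estimate of the mutual information I(X;Y) between continuous variables.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma

    def ksg_mutual_information(x, y, k=4):
        """Estimate I(X;Y) in nats from n paired samples x (n, d_x) and y (n, d_y)."""
        x = np.asarray(x, dtype=float).reshape(len(x), -1)
        y = np.asarray(y, dtype=float).reshape(len(y), -1)
        n = len(x)
        joint = np.hstack([x, y])
        # Distance from each point to its k-th nearest neighbor in the joint space
        # (Chebyshev/max norm; index 0 of the query result is the point itself).
        eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
        # Count, per point, the marginal-space neighbors strictly closer than eps
        # (subtracting 1 to exclude the point itself).
        nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
        ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
        # KSG estimator: I ~ psi(k) + psi(n) - <psi(nx + 1) + psi(ny + 1)>
        return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

    # Usage example: correlated Gaussians, whose true MI is -0.5 * log(1 - rho**2).
    rng = np.random.default_rng(0)
    rho = 0.8
    x = rng.standard_normal(5000)
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(5000)
    print(ksg_mutual_information(x, y))  # should be close to 0.51 nats

For a single source, PID redundancy reduces to such a mutual information, which is why nearest-neighbor estimators of this type are a natural starting point for estimating continuous PID quantities.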