Expectation through CDF

For a non-negative random variable $X \colon \Omega \to [0,+\infty)$, there are explicit formulae for the mean and variance: \begin{align*} \newcommand{\E}{\mathbb{E}} \newcommand{\V}{\mathbb{V}} \E X &= \int_0^\infty (1-F_X(x)) \, dx, \newline \V X &= \int_0^\infty 2x (1-F_X(x)) \, dx - \left( \int_0^\infty (1-F_X(x)) \, dx \right)^2, \end{align*} discussed in some detail on Stack Exchange here and here. This note generalizes both by providing the formula $$\label{expectation_through_cdf} \E g = \int_0^\infty (1 - F_X(x)) \, dg(x)$$ for the expectation of any differentiable function $g \colon [0,+\infty) \to \mathbb{R}$ satisfying $g(0) = 0$. I am pretty sure I am not the first to stumble upon this formula, but I couldn’t find a reference for it.
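As a quick sanity check (not part of the note itself), the sketch below evaluates the mean and variance formulae numerically for an Exponential(1) variable, whose survival function $1 - F_X(x) = e^{-x}$ and whose mean and variance are both exactly 1. The function names and the midpoint-rule integrator are my own choices for illustration.

```python
import math

def survival(x):
    # Survival function 1 - F_X(x) of an Exponential(1) variable,
    # chosen because its mean (1) and variance (1) are known exactly.
    return math.exp(-x)

def integrate(f, a, b, n=200_000):
    # Plain midpoint rule; accurate enough for these smooth, decaying integrands.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

upper = 50.0  # the exponential tail is negligible beyond this point

mean = integrate(survival, 0.0, upper)
second_moment = integrate(lambda x: 2 * x * survival(x), 0.0, upper)
variance = second_moment - mean ** 2

print(round(mean, 4), round(variance, 4))  # both should be close to 1.0
```

The same check works for any distribution with a tractable survival function; only `survival` needs to change.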

Informal derivation

We want to compute $$\label{expectation} \E g := \int_0^\infty g(t) \, f_X(t) \, dt.$$ Using Newton’s fundamental insight, the fundamental theorem of calculus (supplemented by the condition $g(0)=0$), $$\label{integral} g(t) = \int_0^t g^\prime(x) \, dx,$$ we can substitute \eqref{integral} into \eqref{expectation} and switch the order of integration over the region $0 \le x \le t < \infty$ to obtain \begin{equation*} \E g = \int_0^\infty \int_0^t g^\prime(x) \, dx \, f_X(t) \, dt = \int_0^\infty g^\prime(x) \int_x^\infty f_X(t)\,dt\,dx = \int_0^\infty g^\prime(x) (1-F_X(x)) \, dx; \end{equation*} finally, by putting $g^\prime(x) \, dx$ inside the differential as $dg(x)$, we arrive at the desired formula \eqref{expectation_through_cdf}.
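The derivation can also be checked numerically (a sketch of my own, not from the note): take $X \sim$ Exponential(1), so $f_X(t) = e^{-t}$ and $1 - F_X(x) = e^{-x}$, and $g(t) = t^2$, which is differentiable with $g(0) = 0$. Both sides should equal $\E X^2 = 2$.

```python
import math

def integrate(f, a, b, n=200_000):
    # Midpoint rule; accurate enough for these smooth, decaying integrands.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

upper = 50.0  # integrands are negligible beyond this point

# Direct definition: E g = ∫ g(t) f_X(t) dt, with g(t) = t² and f_X(t) = e^{-t}.
direct = integrate(lambda t: t**2 * math.exp(-t), 0.0, upper)

# Derived formula: E g = ∫ g'(x) (1 - F_X(x)) dx, with g'(x) = 2x.
via_cdf = integrate(lambda x: 2 * x * math.exp(-x), 0.0, upper)

print(round(direct, 4), round(via_cdf, 4))  # both ≈ 2.0, i.e. E X² for Exponential(1)
```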

Example application

Where can this be useful? I came across this formula when working with a ramp-transformed logistic random variable.

Let $X \sim \mathcal{L}(0, 1)$ be a logistic random variable with $\mu = 0$ and $s = 1$, i.e., the CDF of $X$ is the standard logistic function \begin{equation*} F_X(x) = \frac{1}{1 + e^{-x}}. \end{equation*} The standard logistic function has the remarkable property $$\label{logistic_derivative} (\ln F_X(x))^\prime = 1 - F_X(x),$$ which, as you can imagine, plays well with Formula \eqref{expectation_through_cdf}.

The ramp (or ReLU) transformation of $X$, \begin{equation*} Y = \varphi(X) := X \mathbf{1} (X \geq 0), \end{equation*} has the CDF \begin{equation*} F_Y(y) = F_X(y) \mathbf{1} (y \geq 0); \end{equation*} therefore, the expectation of $g(Y)$ is given by \begin{equation*} \E g = \int_0^\infty (1 - F_X(y)) \, dg(y) \end{equation*} (the atom of $Y$ at zero does not contribute because $g(0) = 0$).

Using Property \eqref{logistic_derivative} and integrating by parts, we can write \begin{equation*} \E g = \int_0^\infty (\ln F_X(x))^\prime \, dg(x) = g^\prime(x) \ln F_X(x) \Big\rvert_0^\infty - \int_0^\infty \ln F_X(x) \, dg^\prime(x). \end{equation*} The boundary term at infinity vanishes because $\ln F_X(x) \to 0$ (assuming $g^\prime$ grows subexponentially), and $\ln F_X(x) = -\ln (1 + e^{-x})$; so, more explicitly, \begin{equation*} \E g = g^\prime(0) \ln (1 + e^{-0}) + \int_0^\infty \ln (1 + e^{-x}) \, dg^\prime(x). \end{equation*}

This formula can be generalized to arbitrary logistic distributions $\mathcal{L}(\mu, s)$. For example, the mean of $Y = \varphi(X)$ with $X \sim \mathcal{L}(\mu, s)$ is given by \begin{equation*} \E Y = s \ln (1 + e^{\frac{\mu}{s}}), \end{equation*} which reduces to $\E Y = \ln 2$ in the standard case $\mu = 0$, $s = 1$.
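To sanity-check the closed form $\E Y = s \ln (1 + e^{\mu/s})$, the sketch below (my own; the function names are not from the note) compares it against a direct numerical evaluation of $\int_0^\infty (1 - F_X(y)) \, dy$ for a few parameter pairs, using $1 - F_X(y) = 1 / (1 + e^{(y - \mu)/s})$.

```python
import math

def ramp_logistic_mean_numeric(mu, s, upper=80.0, n=200_000):
    # E[Y] = ∫_0^∞ (1 - F_X(y)) dy for X ~ Logistic(mu, s), Y = ramp(X),
    # with 1 - F_X(y) = 1 / (1 + e^{(y - mu)/s}); midpoint rule on [0, upper].
    h = upper / n
    return sum(1.0 / (1.0 + math.exp(((i + 0.5) * h - mu) / s))
               for i in range(n)) * h

def ramp_logistic_mean_closed(mu, s):
    # Closed form from the note: E[Y] = s * ln(1 + e^{mu/s}).
    return s * math.log(1.0 + math.exp(mu / s))

for mu, s in [(0.0, 1.0), (1.0, 0.5), (-2.0, 2.0)]:
    numeric = ramp_logistic_mean_numeric(mu, s)
    closed = ramp_logistic_mean_closed(mu, s)
    print(f"mu={mu}, s={s}: numeric={numeric:.4f}, closed={closed:.4f}")
```

The two columns should agree to the precision of the quadrature; the standard case $\mu = 0$, $s = 1$ gives $\ln 2 \approx 0.6931$.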