[Maturing]

A transformed distribution applies a monotonic transformation to an existing distribution. This is useful for creating derived distributions such as the log-normal (the exponential transformation of a normal), or other custom transformations of a base distribution.

The density(), mean(), and variance() methods are approximate as they are based on numerical derivatives.

dist_transformed(dist, transform, inverse)

Arguments

dist

A univariate distribution vector.

transform

A function used to transform the distribution. This transformation should be monotonic over the appropriate domain.

inverse

The inverse of the transform function.

Details

We recommend reading this documentation on pkgdown, which renders the math nicely. https://pkg.mitchelloharawild.com/distributional/reference/dist_transformed.html

Let \(Y = g(X)\) where \(X\) is the base distribution with transformation function transform = \(g\) and inverse = \(g^{-1}\). The transformation \(g\) must be monotonic over the support of \(X\).
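Any monotonic transform/inverse pair can be used. As a minimal sketch (the logit-normal choice here is purely illustrative and not taken from this page), base R's plogis() and qlogis() give a distribution on (0, 1):

dist_logit <- dist_transformed(dist_normal(0, 1), plogis, qlogis)  # logit-normal on (0, 1)
cdf(dist_logit, 0.5)  # should equal pnorm(qlogis(0.5), 0, 1) = 0.5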

Support: \(g(S_X)\) where \(S_X\) is the support of \(X\)

Mean: Approximated numerically using a second-order Taylor expansion:

$$ E(Y) \approx g(\mu_X) + \frac{1}{2}g''(\mu_X)\sigma_X^2 $$

where \(\mu_X\) and \(\sigma_X^2\) are the mean and variance of the base distribution \(X\), and \(g''\) is the second derivative of the transformation. The second derivative is computed numerically using numDeriv::hessian().
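
For illustration, a minimal sketch of this approximation (not the package internals) for the log-normal example \(Y = \exp(X)\) with \(X \sim N(0, 0.5^2)\):

g <- exp
mu_x <- 0          # mean of the base distribution
sigma2_x <- 0.5^2  # variance of the base distribution
g2 <- numDeriv::hessian(g, mu_x)[1, 1]  # numerical second derivative g''(mu_X)
g(mu_x) + g2 * sigma2_x / 2             # approx 1.125; the exact log-normal mean is exp(0.125), approx 1.133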

Variance: Approximated numerically using the delta method:

$$ \mathrm{Var}(Y) \approx [g'(\mu_X)]^2\sigma_X^2 + \frac{1}{2}[g''(\mu_X)\sigma_X^2]^2 $$

where \(g'\) is the first derivative (Jacobian) of the transformation, computed numerically using numDeriv::jacobian().
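
A matching sketch of the variance approximation for the same log-normal example (again illustrative, not the package source):

g <- exp; mu_x <- 0; sigma2_x <- 0.5^2
g1 <- numDeriv::jacobian(g, mu_x)[1, 1]  # numerical first derivative g'(mu_X)
g2 <- numDeriv::hessian(g, mu_x)[1, 1]   # numerical second derivative g''(mu_X)
g1^2 * sigma2_x + (g2 * sigma2_x)^2 / 2  # approx 0.281; the exact log-normal variance is approx 0.365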

Probability density function (p.d.f): Using the change of variables formula:

$$ f_Y(y) = f_X(g^{-1}(y)) \left|\frac{d}{dy}g^{-1}(y)\right| $$

where \(f_X\) is the p.d.f. of the base distribution and the Jacobian \(|d/dy \, g^{-1}(y)|\) is computed numerically using numDeriv::jacobian().
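
As a check, the change-of-variables density of the log-normal example at \(y = 1\) can be reproduced with base R and numDeriv (a sketch of the same calculation, not the package source):

y <- 1
jac <- abs(numDeriv::jacobian(log, y)[1, 1])  # |d/dy g^{-1}(y)| = 1/y
dnorm(log(y), 0, 0.5) * jac                   # 0.7978846, matching dlnorm(1, 0, 0.5)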

Cumulative distribution function (c.d.f):

For monotonically increasing \(g\): $$ F_Y(y) = F_X(g^{-1}(y)) $$

For monotonically decreasing \(g\): $$ F_Y(y) = 1 - F_X(g^{-1}(y)) $$

where \(F_X\) is the c.d.f. of the base distribution.
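
For the log-normal example the increasing case gives \(F_Y(4) = F_X(\log 4)\); a decreasing transform such as \(g(x) = e^{-x}\) (chosen here only for illustration) uses the reflected form:

pnorm(log(4), 0, 0.5)  # increasing g = exp: same as plnorm(4, 0, 0.5)
dist_dec <- dist_transformed(dist_normal(0, 0.5), function(x) exp(-x), function(y) -log(y))
cdf(dist_dec, 4)       # should match 1 - pnorm(-log(4), 0, 0.5)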

Quantile function: The inverse of the c.d.f.

For monotonically increasing \(g\): $$ Q_Y(p) = g(Q_X(p)) $$

For monotonically decreasing \(g\): $$ Q_Y(p) = g(Q_X(1-p)) $$

where \(Q_X\) is the quantile function of the base distribution.
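
Both cases can be checked for the same example in base R (the decreasing transform is again the illustrative \(g(x) = e^{-x}\)):

exp(qnorm(0.1, 0, 0.5))       # increasing g: same as qlnorm(0.1, 0, 0.5)
exp(-qnorm(1 - 0.1, 0, 0.5))  # decreasing g(x) = exp(-x): Q_Y(p) = g(Q_X(1 - p))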

Examples

# Create a log-normal distribution
dist <- dist_transformed(dist_normal(0, 0.5), exp, log)
density(dist, 1) # dlnorm(1, 0, 0.5)
#> [1] 0.7978846
cdf(dist, 4) # plnorm(4, 0, 0.5)
#> [1] 0.9972194
quantile(dist, 0.1) # qlnorm(0.1, 0, 0.5)
#> [1] 0.5268835
generate(dist, 10) # rlnorm(10, 0, 0.5)
#> [[1]]
#>  [1] 1.7645954 1.0285990 1.0192352 0.8064466 0.7581369 0.4255272 0.7577866
#>  [8] 1.4767394 1.2341949 0.7469440
#>
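
# The mean and variance of a transformed distribution are numerical
# approximations (see Details). Comparing them with the analytic log-normal
# moments illustrates the approximation error; the values noted below are approximate.
mean(dist)                   # roughly 1.125 from the Taylor approximation
exp(0.125)                   # exact log-normal mean, approx 1.133
variance(dist)               # roughly 0.281 from the delta method
(exp(0.25) - 1) * exp(0.25)  # exact log-normal variance, approx 0.365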