darts_segmentation.training.reversed_loguniform
reversed_loguniform (module-attribute)
reversed_loguniform = darts_segmentation.training.reversed_loguniform.reversed_loguniform_gen(
name="reversed_loguniform", shapes="a, b, n"
)
reversed_loguniform_gen
Bases: scipy.stats._distn_infrastructure.rv_continuous
A reversed log-uniform continuous random variable.
This distribution places equal probability mass on logarithmically shrinking intervals approaching an upper bound n (default 1), which it never reaches.
For example, the intervals [0.9, 0.99), [0.99, 0.999), and [0.999, 0.9999) all carry equal probability.
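This equal-mass behaviour can be checked directly through the cdf inherited from rv_continuous (a minimal sketch; the shape values a=0.9 and b=0.9999 are chosen for illustration):

import numpy as np
from darts_segmentation.training.reversed_loguniform import reversed_loguniform

# Frozen distribution with a=0.9, b=0.9999; the upper bound n defaults to 1.
dist = reversed_loguniform(0.9, 0.9999)

# Each decade towards the upper bound carries the same probability mass (1/3 here).
edges = np.array([0.9, 0.99, 0.999, 0.9999])
print(np.diff(dist.cdf(edges)))  # approximately [0.3333, 0.3333, 0.3333]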
Notes
The probability density function for this class is:
.. math::

    f(x, a, b, n) = \frac{1}{(n - x) \log((n - a) / (n - b))}

for :math:`a \le x \le b < n`, where :math:`n` is the upper bound
(default 1). This class takes :math:`a`, :math:`b`, and optionally
:math:`n` as shape parameters.
The distribution is created by transforming a loguniform distribution: if Y ~ loguniform(n-b, n-a), then X = n - Y ~ reversed_loguniform(a, b, n).
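A quick numerical check of this relationship, comparing the density formula above against the transformed scipy.stats.loguniform density (a sketch with illustrative values for a, b, and n):

import numpy as np
from scipy.stats import loguniform

a, b, n = 0.5, 0.9999, 1.0
x = np.linspace(0.6, 0.99, 5)

# Density from the formula above ...
pdf_direct = 1.0 / ((n - x) * np.log((n - a) / (n - b)))

# ... matches the density of X = n - Y with Y ~ loguniform(n - b, n - a)
# (the transform has unit Jacobian, so f_X(x) = f_Y(n - x)).
pdf_transformed = loguniform(n - b, n - a).pdf(n - x)

print(np.allclose(pdf_direct, pdf_transformed))  # True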
Examples
from darts_segmentation.training.reversed_loguniform import reversed_loguniform
import matplotlib.pyplot as plt
import numpy as np

fig, ax = plt.subplots(1, 1)
Generate random variates:
r = reversed_loguniform(0.5, 0.9999).rvs(size=1000)
Display histogram on transformed scale to show equal probability:
ax.hist(-np.log10(1 - r))
ax.set_ylabel("Frequency")
ax.set_xlabel("Transformed value (-log10(1-x))")
plt.show()
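The signature above also declares an optional third shape parameter n. A sketch of using a different upper bound, assuming n is passed positionally after a and b as shapes="a, b, n" suggests (the values 2, 9.99, and 10 are illustrative):

from darts_segmentation.training.reversed_loguniform import reversed_loguniform

# Upper bound n=10 instead of the default 1; samples should fall within [2, 10).
samples = reversed_loguniform(2, 9.99, 10).rvs(size=1000)
print(samples.min() >= 2, samples.max() < 10)  # expected: True True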