Counterexamples in Probability


Let us define a random variable (RV) Z = X + Y, where X and Y are two independent RVs. If f_X(x) and f_Y(y) are the probability density functions of X and Y, what can we say about f_Z(z), the pdf of Z? A rigorous double “E” graduate course in stochastic processes is usually sufficient to answer this question. It turns out that f_Z(z) is the convolution of the densities f_X(x) and f_Y(y). See this (p. 291) for more.
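To make the convolution claim concrete, here is a minimal numerical sketch (my own, not from the reference above; it assumes NumPy is available) using two independent Uniform(0, 1) variables, whose sum has the familiar triangular density.

```python
# Minimal sketch (assumes NumPy): for independent X, Y ~ Uniform(0, 1),
# the density of Z = X + Y should match the convolution of the two uniform pdfs,
# i.e. the triangular density on [0, 2].
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
z = rng.uniform(size=n) + rng.uniform(size=n)   # samples of Z = X + Y

# Numerical convolution of the two uniform pdfs on a fine grid.
dx = 0.001
f_x = np.ones(1000)                  # f_X = f_Y = 1 on [0, 1)
f_z = np.convolve(f_x, f_x) * dx     # convolution, supported on [0, 2)
grid = np.arange(len(f_z)) * dx

# Compare the convolution against a histogram of the simulated Z.
hist, edges = np.histogram(z, bins=50, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - np.interp(centers, grid, f_z))))  # small, ~1e-2
```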

It is tempting to ask the converse question: if f_Z(z) is the convolution of f_X(x) and f_Y(y), does it follow that Z = X + Y where X and Y are independent RVs? I found the answer in this amazing text:
Counterexamples in Probability by Jordan Stoyanov. The book is a delightful compilation of some 300 counterexamples to the probability questions that might be bothering you during a good night’s sleep.

So, the answer to my question is no. The counterexample uses Cauchy distributions: the sum of two Cauchy RVs can have a density equal to the convolution of their individual densities even when the two variables are not independent. For instance, if X is standard Cauchy and Y = X, then Z = X + Y = 2X is again Cauchy, and its density is exactly the convolution of the two standard Cauchy densities.
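Here is a quick numerical sketch of that counterexample (my own illustration, not Stoyanov’s; it assumes NumPy and SciPy are available): the sum of two independent standard Cauchy variables and the completely dependent sum X + X both follow the Cauchy distribution with scale 2, which is exactly what convolving two standard Cauchy densities yields.

```python
# Sketch (assumes NumPy/SciPy): compare Z1 = X + Y (independent Cauchy summands)
# with Z2 = X + X (completely dependent summands). Both match the Cauchy(0, 2)
# distribution, i.e. the convolution of two standard Cauchy densities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000
x = stats.cauchy.rvs(size=n, random_state=rng)
y = stats.cauchy.rvs(size=n, random_state=rng)

z1 = x + y     # independent summands
z2 = 2.0 * x   # dependent "summands": Y = X

grid = np.linspace(-10, 10, 41)
cdf_theory = stats.cauchy.cdf(grid, loc=0, scale=2)
for z in (z1, z2):
    ecdf = np.searchsorted(np.sort(z), grid) / n
    print(np.max(np.abs(ecdf - cdf_theory)))   # both small, on the order of 1e-3
```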

Speaking of convolutions of pdfs, our favorite website has a list of convolutions of common pdfs.


A few good problems from an old book


In Fall 2000, I was introduced to signals and systems through a less popular textbook: An Introduction to the Principles of Communication Theory by J. C. Hancock (1961). My fellow undergraduate students used to tremble at the very sight of it. The book was laconic in its explanations and parsimonious with examples. However, it was (and continues to be) used as the textbook for undergraduate signals and systems courses at several universities. I personally believe that it should be replaced by more recent classics in this area, since an uninitiated student is more likely to cloud than clear his understanding on a first reading. Indeed, the review of the book that appeared in 1962 in the IRE Transactions on Information Theory was an unfavorable one.

That said, the book is very useful as an interesting handbook on communication theory. It packs signals and systems, communication theory, analog electronics, random variables, probability, detection theory, and more into 253 pages, earning the award for ingenious technical brevity. It also contains some of the most interesting exercises at the end of each chapter, and I have revisited them time and again to verify my evolving comprehension of the subject. In one of my more recent regurgitations of this text, I came across two interesting problems, both from Chapter III: Random Signal Theory. The first problem [1] deals with the probability that one random variable exceeds another. It gives the probability density functions of two statistically independent random variables X and Y and asks for the probability that a sample value of x(t) exceeds a sample value of y(t). We are given (the notation is borrowed from Hancock’s book),

p(x) = 2ae^{-bx}, 0 \leq x \leq \infty, and

p(y) = ae^{-b|y|}, -\infty \leq y \leq \infty

Since X and Y are statistically independent, we have,

p(x, y) = p(x)p(y), \quad 0 \leq x < \infty, \; 0 \leq y < \infty,
where the restriction to y \geq 0 is harmless here: since p(x) = 0 for x < 0, the event \{X \leq Y\} cannot occur with Y < 0, so the integration below may be confined to y \geq 0.

Now, P(X>Y) = P(X-Y>0) = 1 - P(X-Y \leq 0)

\Rightarrow P(X>Y) = 1- {\iint_{x-y \leq 0} p(x,y)dxdy}

= 1- {\int_0^{\infty}\int_{0}^{y} p(x,y)dxdy}

= 1- {\int_0^{\infty}\int_{0}^{y} p(x)p(y)dxdy}

= 1- {\int_0^{\infty}(\int_{0}^{y} 2ae^{-bx}dx) p(y)dy}

= 1- {\int_0^{\infty}( \frac{2a}{-b}e^{-bx}|_{0}^{y}) p(y)dy}

= 1 - \frac{2a}{-b}{\int_0^{\infty}  (e^{-by} - 1) . ae^{-by} dy}

= 1 + \frac{2a^2}{b}{\int_0^{\infty}  (e^{-2by} - e^{-by}) dy}

= 1 + \frac{2a^2}{b} {\frac{1}{-2b} (e^{-2by} - 2e^{-by})|_{0}^{\infty}}

= 1 - \frac{a^2}{b^2} (0 - 1 - (0 - 2))

\Rightarrow P(X>Y) = 1 - \frac{a^2}{b^2}
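As a sanity check, here is a small Monte Carlo sketch of my own (assuming NumPy; the values of a and b are illustrative, not from the book). Note that normalization of p(x) and p(y) forces b = 2a, so 1 - \frac{a^2}{b^2} evaluates to 3/4.

```python
# Monte Carlo check (assumes NumPy) of P(X > Y) = 1 - a^2/b^2.
# Normalization of p(x) and p(y) requires b = 2a, so the result should be 3/4.
# The choice a = 1, b = 2 below is illustrative, not taken from the book.
import numpy as np

rng = np.random.default_rng(0)
a, b = 1.0, 2.0
n = 1_000_000

x = rng.exponential(scale=1.0 / b, size=n)       # p(x) = b e^{-bx}, x >= 0
y = rng.laplace(loc=0.0, scale=1.0 / b, size=n)  # p(y) = (b/2) e^{-b|y|}

print(np.mean(x > y))      # ~0.75
print(1.0 - a**2 / b**2)   # 0.75
```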

The second problem [2] deals with finding the spectral density of a signal from its time-domain representation. Although the equation of the time-domain function is not given, it can be deduced from the diagram that the function is a rectified sine wave. If the period of the underlying sine wave is T, then that of the rectified sine wave is \frac{T}{2}. So,

f(t) = |\sin(\frac{2\pi t}{T})|, which repeats with period \frac{T}{2}.

For a deterministic periodic function f(t) with period \frac{T}{2}, the spectral density G(f) is given by,

G(f) = \lim_{T_0 \rightarrow \infty}\frac{|F_{T_0}(f)|^2}{T_0}

where F_{T_0}(f) is the Fourier Transform of f(t) truncated to an observation interval of length T_0 (written T_0 here so as not to clash with the period T of the sine wave). Here,

… and I am still working on posting the entire solution.
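In the meantime, here is a small numerical sketch of my own (not the book’s solution; it assumes NumPy). Since f(t) is periodic with period T/2, its spectrum is a line spectrum at multiples of 2/T, and the spectral density consists of impulses whose weights are |c_k|^2, where c_k are the exponential Fourier series coefficients of f(t). For the rectified sine these are c_0 = 2/\pi and c_k = -(2/\pi)/(4k^2 - 1), which the sketch recovers numerically.

```python
# Sketch (mine, not Hancock's solution; assumes NumPy): compute the exponential
# Fourier series coefficients c_k of f(t) = |sin(2*pi*t/T)| over one period
# P = T/2 and compare with the closed form c_0 = 2/pi, c_k = -(2/pi)/(4k^2 - 1).
# The spectral density is then a line spectrum with impulses of weight |c_k|^2
# located at frequencies f = 2k/T.
import numpy as np

T = 1.0                       # illustrative period of the underlying sine wave
P = T / 2                     # period of the rectified sine
t = np.linspace(0.0, P, 10_000, endpoint=False)
f = np.abs(np.sin(2 * np.pi * t / T))

for k in range(4):
    # c_k = (1/P) * integral over one period of f(t) * exp(-j*2*pi*k*t/P) dt,
    # approximated by a Riemann sum (the mean over the uniform grid).
    c_k = np.mean(f * np.exp(-2j * np.pi * k * t / P))
    closed = 2 / np.pi if k == 0 else -(2 / np.pi) / (4 * k**2 - 1)
    print(k, round(c_k.real, 6), round(closed, 6))
```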

References:
[1] Hancock J. C., “An introduction to the principles of communication theory,” McGraw-Hill Book Company, 1961, Problem 3-16.
[2] Hancock J. C., “An introduction to the principles of communication theory,” McGraw-Hill Book Company, 1961, Problem 3-27.