During my Fall 2010 ECE 516 (Information Theory) class, the instructor asked us to prove the seemingly trivial logarithmic inequality [1]

$$\ln x \;\ge\; 1 - \frac{1}{x}, \qquad x > 0,$$

with equality if and only if $x = 1$, by as many different methods as possible and, ideally, using an information-theoretic result. This inequality comes in handy for proving the Gibbs Inequality [2], [3]. I could think of the following ways, though the desired information-theoretic proof still eludes me. Here are my approaches:
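Before the proofs, a quick numerical spot-check of the inequality $\ln x \ge 1 - \frac{1}{x}$ from [1] never hurts; this is just a sanity sketch, with the sample points chosen arbitrarily:

```python
import math

# Spot-check ln(x) >= 1 - 1/x for a spread of x > 0.
# The gap should be nonnegative everywhere and zero only at x = 1.
xs = [0.01, 0.1, 0.5, 1.0, 2.0, 10.0, 1000.0]
gaps = [math.log(x) - (1 - 1 / x) for x in xs]
all_nonnegative = all(g >= 0 for g in gaps)
tight_only_at_one = min(gaps) == gaps[xs.index(1.0)] == 0.0
```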

**Using series expansion:**

For $0 \le t < 1$, the power series expansion of the function $e^t$ gives

$$1 + t \;\le\; e^t \;=\; \sum_{n=0}^{\infty} \frac{t^n}{n!} \;\le\; \sum_{n=0}^{\infty} t^n \;=\; \frac{1}{1-t},$$

where the lower bound drops all terms beyond $n = 1$ (and in fact holds for every $t \ge 0$), and the upper bound replaces each $\frac{1}{n!}$ by $1$, leaving a geometric series. For $0 < x \le 1$, putting $t = \frac{1}{x} - 1 \ge 0$ in the lower bound gives $e^{1/x - 1} \ge \frac{1}{x}$, i.e. $\frac{1}{x} - 1 \ge -\ln x$, which is $\ln x \ge 1 - \frac{1}{x}$. For $x \ge 1$, putting $t = 1 - \frac{1}{x} \in [0, 1)$ in the upper bound gives $e^{1 - 1/x} \le x$, i.e. $1 - \frac{1}{x} \le \ln x$. Equality holds only at $t = 0$, i.e. $x = 1$.

Q.E.D.
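A series-based argument rests on the elementary bounds $1 + t \le e^t \le \frac{1}{1-t}$ for $0 \le t < 1$, both readable off the exponential series; a short numerical check of those two bounds (test points are arbitrary):

```python
import math

# Check 1 + t <= e^t <= 1/(1 - t) on [0, 1): the lower bound keeps
# only the n = 0, 1 terms of the exponential series; the upper bound
# replaces each 1/n! by 1, which sums as a geometric series.
ts = [0.0, 0.1, 0.3, 0.5, 0.9, 0.99]
lower_ok = all(1 + t <= math.exp(t) for t in ts)
upper_ok = all(math.exp(t) <= 1 / (1 - t) for t in ts)
```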

**Using definite integral:**

For $x \ge 1$, we have the area under the monotonically increasing logarithmic curve, lying above the x-axis, given by

$$\int_1^x \ln t \, dt \;=\; \big[\, t \ln t - t \,\big]_1^x \;=\; x \ln x - x + 1 \;\ge\; 0,$$

since $\ln t \ge 0$ on $[1, x]$. Dividing by $x > 0$,

$$\ln x \;\ge\; 1 - \frac{1}{x}. \qquad \text{… (1)}$$

For $0 < x \le 1$, we have the area under the monotonically increasing logarithmic curve, lying below the x-axis, given by

$$\int_x^1 \ln t \, dt \;=\; \big[\, t \ln t - t \,\big]_x^1 \;=\; x - 1 - x \ln x \;\le\; 0,$$

since $\ln t \le 0$ on $[x, 1]$. Following similar steps as above (dividing by $x > 0$), we get

$$\ln x \;\ge\; 1 - \frac{1}{x}. \qquad \text{… (2)}$$

The result for all $x > 0$ follows from (1) and (2).

Q.E.D.
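The two area facts behind this argument — $\int_1^x \ln t \, dt = x \ln x - x + 1 \ge 0$ for $x \ge 1$, and $\int_x^1 \ln t \, dt \le 0$ for $0 < x < 1$ — can be verified numerically; a minimal sketch using a midpoint rule (the helper function and tolerances are mine):

```python
import math

def integral_of_ln(a, b, n=20000):
    # Midpoint-rule approximation of the area under ln(t) over [a, b].
    h = (b - a) / n
    return h * sum(math.log(a + (i + 0.5) * h) for i in range(n))

# For x >= 1 the area over [1, x] is nonnegative and matches the
# closed form x*ln(x) - x + 1; for 0 < x < 1 the area over [x, 1]
# is nonpositive, since ln(t) <= 0 there.
above_ok = all(integral_of_ln(1.0, x) >= 0 for x in [1.5, 2.0, 5.0])
closed_form_ok = all(
    abs(integral_of_ln(1.0, x) - (x * math.log(x) - x + 1)) < 1e-6
    for x in [1.5, 2.0, 5.0]
)
below_ok = all(integral_of_ln(x, 1.0) <= 0 for x in [0.2, 0.5, 0.9])
```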

**Using the definition of convexity:**

Let’s assume $f(x) = -\ln x$, where $x > 0$, and then check the convexity of this function:

$$f''(x) \;=\; \frac{1}{x^2} \;>\; 0,$$

so $f$ is convex and therefore lies above its tangent line at $x = 1$, namely $y = f(1) + f'(1)(x - 1) = 1 - x$. Hence $-\ln x \ge 1 - x$, i.e. $\ln x \le x - 1$, and replacing $x$ by $\frac{1}{x}$ yields $\ln x \ge 1 - \frac{1}{x}$, with equality only at the point of tangency $x = 1$.

Q.E.D.
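Convexity of $-\ln x$ and its tangent-line bound at $x = 1$ (the line $y = 1 - x$, which rearranges to $\ln x \le x - 1$) can both be spot-checked numerically; the test points below are arbitrary:

```python
import math

# Midpoint-convexity check for f(x) = -ln(x) on (0, inf):
# f((a+b)/2) <= (f(a)+f(b))/2 for each sampled pair, and the
# tangent line at x = 1, y = 1 - x, stays below f everywhere.
def f(x):
    return -math.log(x)

pairs = [(0.2, 0.9), (0.5, 3.0), (1.0, 10.0), (0.1, 0.2)]
convex_ok = all(f((a + b) / 2) <= (f(a) + f(b)) / 2 for a, b in pairs)
tangent_ok = all(f(x) >= 1 - x for x in [0.1, 0.5, 1.0, 2.0, 8.0])
```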

**Using an information theoretic result:**

This is just vaporware [4] as of now. However, as shown in [2], this logarithmic inequality can be used to prove Jensen’s Inequality [5] and properties of the Kullback-Leibler distance [6]. So is it possible that one of these results can be employed in the inverse direction, i.e. to prove the above-mentioned logarithmic inequality?
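While that inverse proof remains open here, the nonnegativity of the Kullback-Leibler distance [6] — which is exactly Gibbs’ inequality — can at least be sanity-checked numerically; a minimal sketch over binary distributions (the helper name and grid are mine):

```python
import math

def kl_divergence(p, q):
    # D(p||q) in nats for discrete distributions with matching support.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Gibbs' inequality: D(p||q) >= 0, with equality iff p == q.
# Spot-check it over a grid of binary distributions.
grid = [i / 10 for i in range(1, 10)]
divs = [kl_divergence([a, 1 - a], [b, 1 - b]) for a in grid for b in grid]
all_nonneg = all(d >= -1e-12 for d in divs)
equal_case_zero = kl_divergence([0.3, 0.7], [0.3, 0.7]) == 0.0
```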

References:

[1] Cover T. M. and Thomas J. A., “Elements of Information Theory,” Wiley-Interscience, 2nd Edition, 2006, Problem 2.13.

[2] Gibbs’ Inequality.

[3] Proof of Gibbs’ Inequality. Please note that this document misspells the inequality as Gib**b’s**.

[4] Vaporware.

[5] Jensen’s Inequality.

[6] Kullback-Leibler Distance.

Satyaanveshi

Jul 19, 2013 @ 06:38:10

How about this, sir: put $x = \frac{1}{x}$ and add the two inequalities, which would yield $x + \frac{1}{x} \ge 2$ — which I remember as a standard result for $x > 0$. 😀

vizziee

Nov 19, 2013 @ 02:55:24

Hi Satyaanveshi,

Yes, it is. But where is the K-L distance in your proof?