Let us define a random variable (RV) $Z = X + Y$, where $X$ and $Y$ are two independent RVs. If $f_X(x)$ and $f_Y(y)$ are the probability density functions (pdfs) of $X$ and $Y$, what can we say about $f_Z(z)$, the pdf of $Z$? A rigorous double "E" graduate course in stochastic processes is usually sufficient to answer this question. It turns out that $f_Z$ is the convolution of the densities $f_X$ and $f_Y$:
$$f_Z(z) = (f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z - x)\, dx.$$
See this (p. 291) for more.
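Here is a quick numerical sanity check (a sketch, assuming NumPy): convolve two Exp(1) densities on a grid and compare against the Gamma(2, 1) density, which is the known pdf of the sum of two independent Exp(1) RVs.

```python
import numpy as np

# Sketch: numerically verify f_Z = f_X * f_Y for independent X, Y ~ Exp(1).
# Their sum Z = X + Y is Gamma(2, 1), whose pdf is z * exp(-z).
dx = 0.001
z = np.arange(0.0, 20.0, dx)

f_x = np.exp(-z)  # Exp(1) density sampled on the grid
f_y = np.exp(-z)

# Riemann-sum approximation of (f_X * f_Y)(z) = ∫ f_X(t) f_Y(z - t) dt.
f_z = np.convolve(f_x, f_y)[: len(z)] * dx

gamma_pdf = z * np.exp(-z)
print(np.max(np.abs(f_z - gamma_pdf)))  # small discretization error, on the order of dx
```

The grid spacing `dx` controls the accuracy: halving it roughly halves the Riemann-sum error here.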

It is tempting to ask the converse question: if $Z = X + Y$ and $f_Z = f_X * f_Y$, does that imply $X$ and $Y$ are independent? I found the answer in this amazing text:
Counterexamples in Probability by Jordan Stoyanov. The book is an adorable compilation of some 300 counterexamples to the probability questions that might be bothering you during a good night's sleep.

So, the answer to my question is no. The counterexample uses Cauchy distributions: the sum of two standard Cauchy RVs can have exactly the Cauchy density $f_X * f_Y$ even when $X$ and $Y$ are dependent, so the convolution identity does not force independence.
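The Cauchy fact behind this is easy to check numerically (a sketch, assuming NumPy): the convolution of two standard Cauchy densities is the Cauchy density with scale 2, i.e. $2/(\pi(4 + z^2))$. Note the code only verifies this convolution identity; Stoyanov's point is that a dependent pair can still produce this same density for the sum.

```python
import numpy as np

def cauchy_pdf(x, scale=1.0):
    # Cauchy density centered at 0 with the given scale parameter.
    return scale / (np.pi * (scale**2 + x**2))

# Wide grid: Cauchy tails decay like 1/t^2, so truncating at ±500
# leaves only a tiny error in the convolution integral.
dt = 0.01
t = np.arange(-500.0, 500.0, dt)

def conv_at(z):
    # Riemann-sum approximation of (f * f)(z) = ∫ f(t) f(z - t) dt.
    return np.sum(cauchy_pdf(t) * cauchy_pdf(z - t)) * dt

# Compare the numerical convolution with the scale-2 Cauchy density.
for z in [0.0, 1.0, 5.0]:
    print(z, conv_at(z), cauchy_pdf(z, scale=2.0))
```

At $z = 0$ both sides equal $1/(2\pi)$, which makes a convenient spot check.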

Come to think of the convolution of pdfs, our favorite website has a list of convolutions of common pdfs.