Probably worth mentioning the moment-generating function as well, since it's a bit more elementary than the characteristic function and shares many of its properties. It's also more simply related to the probability generating function: you can go from one to the other with a basic change of coordinates (t -> log(x), i.e. G(x) = M(log x)). I'd also expect the moment generating function to be easier to calculate in most cases.
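A minimal sympy sketch of that change of coordinates, using a Poisson(lam) variable purely as a convenient example (the variable names are mine):

    import sympy as sp

    t, x, lam = sp.symbols('t x lam', positive=True)
    k = sp.symbols('k', integer=True, nonnegative=True)

    # Poisson(lam) PMF, chosen only because its PGF/MGF are easy to sum
    pmf = sp.exp(-lam) * lam**k / sp.factorial(k)

    pgf = sp.simplify(sp.summation(x**k * pmf, (k, 0, sp.oo)))           # G(x) = E[x^X]
    mgf = sp.simplify(sp.summation(sp.exp(t*k) * pmf, (k, 0, sp.oo)))    # M(t) = E[e^{tX}]

    # M(t) = G(e^t), equivalently G(x) = M(log x)
    print(sp.simplify(pgf.subs(x, sp.exp(t)) - mgf))   # -> 0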
In fact most properties of the PGF come from the moment-generating/characteristic function, including why the second derivative is related to the variance: the second derivative of the moment generating function at 0 is the second moment E[X^2], and the second derivative of the logarithm of the MGF (the cumulant generating function) at 0 is the variance.
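As a concrete check, here's a sympy sketch using the Normal(mu, sigma) MGF (written down in closed form rather than derived):

    import sympy as sp

    t, mu = sp.symbols('t mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    M = sp.exp(mu*t + sigma**2 * t**2 / 2)    # MGF of N(mu, sigma^2)

    second_moment = sp.simplify(sp.diff(M, t, 2).subs(t, 0))          # E[X^2]
    variance      = sp.simplify(sp.diff(sp.log(M), t, 2).subs(t, 0))  # Var[X]

    print(second_moment)   # mu**2 + sigma**2
    print(variance)        # sigma**2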
The one property that's somewhat unique to the PGF is how composition relates to drawing a randomly-sized sample, which I can see could be useful.
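That property says: if S = X_1 + ... + X_N with N independent of the i.i.d. X_i, then G_S(x) = G_N(G_X(x)). A rough Monte Carlo check of that (the particular distributions and parameters are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    lam, p, x = 2.0, 0.3, 0.7            # arbitrary parameters and evaluation point

    N = rng.poisson(lam, size=200_000)   # random sample size N ~ Poisson(lam)
    S = rng.binomial(N, p)               # S = sum of N Bernoulli(p) draws

    G_X = (1 - p) + p * x                # PGF of Bernoulli(p) at x
    G_NX = np.exp(lam * (G_X - 1))       # Poisson PGF G_N evaluated at G_X(x)

    print(np.mean(x ** S))               # Monte Carlo estimate of G_S(x) = E[x^S]
    print(G_NX)                          # composition G_N(G_X(x)) -- should be close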
One can also think of probability generating functions as (flipped) Z-transforms, moment generating functions as (flipped) Laplace transforms, and characteristic functions as Fourier transforms of the respective PMF/PDF. A lot of their properties then follow from simple properties familiar from signals and systems.
I don't have a reference off the top of my head, but the main idea is as follows:
The MGF of a random variable X with PDF f(x) is defined as
M_X(s) = E[e^{sX}] = int_{-inf}^{inf} f(x) e^{sx} dx
The (bilateral) Laplace transform of a signal f(t) is defined as
F(s) = int_{-inf}^{inf} f(t) e^{-st} dt
Hence the MGF is a 'flipped' Laplace transform: M_X(s) = F(-s).
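A small sympy sketch of that relationship with an Exponential(1) density (conds='none' just suppresses the convergence conditions; assume -1 < s < 1 so both integrals converge):

    import sympy as sp

    x, s = sp.symbols('x s', real=True)
    f = sp.exp(-x)    # Exp(1) density on [0, oo), an arbitrary example

    # MGF: E[e^{sX}] -- converges for s < 1
    M = sp.integrate(f * sp.exp(s*x), (x, 0, sp.oo), conds='none')
    # Laplace transform of f (supported on [0, oo)) -- converges for s > -1
    F = sp.integrate(f * sp.exp(-s*x), (x, 0, sp.oo), conds='none')

    print(sp.simplify(M))                   # 1/(1 - s)
    print(sp.simplify(M - F.subs(s, -s)))   # 0, i.e. M(s) = F(-s)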
Now, we know that the MGF of a sum of independent RVs is the product of their MGFs. So if we take the inverse Laplace transform, the density of the sum is the convolution of the individual densities.
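Here's a rough numerical illustration of the convolution side, using two standard normals only because the density of their sum is known in closed form:

    import numpy as np
    from scipy.stats import norm

    x = np.linspace(-10, 10, 2001)        # grid symmetric about 0
    dx = x[1] - x[0]

    f = norm.pdf(x)                       # density of X ~ N(0, 1)
    g = norm.pdf(x)                       # density of Y ~ N(0, 1)

    conv = np.convolve(f, g, mode='same') * dx    # discretised (f * g)(x)
    exact = norm.pdf(x, scale=np.sqrt(2))         # density of X + Y ~ N(0, 2)

    print(np.max(np.abs(conv - exact)))   # small discretisation error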
Similarly, taking a derivative in the frequency domain is the same as multiplying by the time variable in the time domain: so M'_X(s) is the 'flipped Laplace transform' of x f(x), and its value at s=0 is the 'DC gain' of that signal, i.e. E[X].
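A quick sympy check of that, with a Uniform(0, 1) variable as a throwaway example:

    import sympy as sp

    x = sp.symbols('x', real=True)
    s = sp.symbols('s', positive=True)   # positive assumption just avoids a Piecewise result

    f = sp.Integer(1)                                    # Uniform(0, 1) density on [0, 1]
    M = sp.integrate(f * sp.exp(s*x), (x, 0, 1))         # MGF: (e^s - 1)/s

    mean_via_mgf = sp.limit(sp.diff(M, s), s, 0)         # M'(0) = E[X]
    mean_direct  = sp.integrate(x * f, (x, 0, 1))        # E[X] computed directly

    print(mean_via_mgf, mean_direct)     # 1/2 1/2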
And so on... the properties are all immediate consequences of the definition of the MGF, and since the definition is essentially the same as that of a Laplace transform, there is an equivalent property in signals and systems as well.