I was reviewing an electronics textbook the other day, and it made an offhand comment that "sinusoidal signals of the same frequency always add up to a sinusoid, even if their magnitudes and phases are different". This gave me pause; is that really so? Even with different phases?

Using EE notation, a sinusoidal signal with magnitude A_1, angular frequency w, and phase \phi_1 is written A_1 sin(wt+\phi_1) [1]. The book's statement amounts to the claim that:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)=A_3 \sin(wt+\phi_3)\]

The sum is also a sinusoid with the same frequency, but potentially different magnitude and phase. I couldn't find this equality in any of my reference books, so why is it true?

Empirical probing

Let's start by asking whether this is true at all. It's not at all obvious that this should work. Armed with Python, NumPy, and matplotlib, I plotted two sinusoidal signals with the same frequency but different magnitudes and phases:
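The plots below came from something like the following sketch; the specific magnitudes, phases, and frequency are arbitrary choices for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; remove this line to show plots interactively
import matplotlib.pyplot as plt

# Arbitrary example parameters: same frequency w, different magnitudes and phases.
w = 2.0                  # angular frequency (rad/s)
A1, phi1 = 1.0, 0.0
A2, phi2 = 0.6, np.pi / 3

t = np.linspace(0, 2 * np.pi, 500)
s1 = A1 * np.sin(w * t + phi1)
s2 = A2 * np.sin(w * t + phi2)

plt.plot(t, s1, label="$A_1 \\sin(wt+\\phi_1)$")
plt.plot(t, s2, label="$A_2 \\sin(wt+\\phi_2)$")
plt.plot(t, s1 + s2, "g", label="sum")
plt.legend()
plt.savefig("sinusoids.png")
```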

Two sinusoidal signals plotted together

Now, plotting their sum in green on the same chart:

Two sinusoidal signals plotted together with their sum signal

Well, look at that. It seems to be working. I guess it's time to prove it.

Proof using trig identities

The first proof I want to demonstrate doesn't use any fancy math beyond some basic trigonometric identities. One of the best-known ones is:

\[\sin(a+b)=\sin(a)\cos(b)+\cos(a)\sin(b) \hspace{2cm} (id. 1)\]

Taking our sum of sinusoids:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)\]

Applying (id.1) to each of the terms, and then regrouping, we get:

\[\begin{align*} A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)&=A_1\left [\sin(wt)\cos(\phi_1)+\cos(wt)\sin(\phi_1) \right ]+A_2\left [\sin(wt)\cos(\phi_2)+\cos(wt)\sin(\phi_2) \right ]\\ &=\left [A_1 \cos(\phi_1) + A_2 \cos(\phi_2) \right ]\sin(wt)+\left [ A_1 \sin(\phi_1) + A_2 \sin(\phi_2)\right ]\cos(wt)\\ \end{align*}\]

Now, a change of variables trick: we'll assume we can solve the following set of equations for some B and \theta [2]:

\[\begin{align*} B\cos(\theta)&=A_1 \cos(\phi_1)+A_2 \cos(\phi_2) \hspace{2cm} (1)\\ B\sin(\theta)&=A_1 \sin(\phi_1)+A_2 \sin(\phi_2) \hspace{2cm} (2)\\ \end{align*}\]

To find B, we can square each of (1) and (2) and then add the squares together:

\[B^2 \cos^2 (\theta)+B^2 \sin^2 (\theta)=(A_1 \cos(\phi_1)+A_2 \cos(\phi_2))^2 + (A_1 \sin(\phi_1)+A_2 \sin(\phi_2))^2\]

Using the fact that cos^2(a)+sin^2(a)=1 on the left-hand side and then taking the square root, we get:

\[B=\sqrt{(A_1 \cos(\phi_1)+A_2 \cos(\phi_2))^2 + (A_1 \sin(\phi_1)+A_2 \sin(\phi_2))^2}\]
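As a quick sanity check, the B given by this formula should match the peak magnitude of the numerically computed sum, since the peak of B sin(wt+\theta) over a full period is B itself. A sketch, with arbitrary parameter values:

```python
import numpy as np

# Arbitrary example parameters.
A1, phi1 = 1.0, 0.2
A2, phi2 = 0.7, 1.5
w = 3.0

# B from the formula derived above.
B = np.sqrt((A1 * np.cos(phi1) + A2 * np.cos(phi2)) ** 2
            + (A1 * np.sin(phi1) + A2 * np.sin(phi2)) ** 2)

# Peak magnitude of the sum, sampled densely over one full period.
t = np.linspace(0, 2 * np.pi / w, 100001)
peak = np.max(np.abs(A1 * np.sin(w * t + phi1) + A2 * np.sin(w * t + phi2)))
print(B, peak)  # the two agree to several decimal places
```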

To solve for \theta, we can divide equation (2) by (1), getting:

\[\frac{\sin(\theta)}{\cos(\theta)}=\tan(\theta)=\frac{A_1 \sin(\phi_1)+A_2 \sin(\phi_2)}{A_1 \cos(\phi_1)+A_2 \cos(\phi_2)}\]

Meaning that:

\[\theta = \arctan\left(\frac{A_1 \sin(\phi_1)+A_2 \sin(\phi_2)}{A_1 \cos(\phi_1)+A_2 \cos(\phi_2)}\right)\]

Strictly speaking, arctan only determines \theta up to an added \pi; the signs of (1) and (2) pick out the correct quadrant, which is exactly what the two-argument atan2 function computes.

Now that we have the values of B and \theta, let's put them aside for a bit and get back to the final line of our sum of sinusoids equation:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)=\left [A_1 \cos(\phi_1) + A_2 \cos(\phi_2) \right ]\sin(wt)+\left [ A_1 \sin(\phi_1) + A_2 \sin(\phi_2)\right ]\cos(wt)\]

On the right-hand side, we can apply equations (1) and (2) to get:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)=B \cos(\theta) \sin(wt)+ B \sin(\theta) \cos(wt)\]

Applying (id.1) again, we get:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)=B \sin(wt + \theta)\]

We've just shown that the sum of two sinusoids with the same frequency w is another sinusoid with frequency w, and we've expressed its magnitude B and phase \theta in terms of the original parameters (A_1, A_2, \phi_1 and \phi_2) \blacksquare
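The whole derivation is easy to verify numerically. A minimal sketch with arbitrary parameters, using NumPy's arctan2 to pick the right quadrant for \theta:

```python
import numpy as np

# Arbitrary example parameters.
A1, phi1 = 1.2, 0.8
A2, phi2 = 0.5, -2.0
w = 5.0

# B and theta from the derived formulas.
x = A1 * np.cos(phi1) + A2 * np.cos(phi2)
y = A1 * np.sin(phi1) + A2 * np.sin(phi2)
B = np.hypot(x, y)        # sqrt(x**2 + y**2)
theta = np.arctan2(y, x)  # quadrant-aware arctan(y / x)

# The two sides of the identity should agree at every sample point.
t = np.linspace(0, 1, 1000)
lhs = A1 * np.sin(w * t + phi1) + A2 * np.sin(w * t + phi2)
rhs = B * np.sin(w * t + theta)
print(np.max(np.abs(lhs - rhs)))  # on the order of machine epsilon
```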

Proof using complex numbers

The second proof uses a bit more advanced math, but overall feels more elegant to me. The plan is to use Euler's equation and prove a more general statement on the complex plane.

Instead of looking at the sum of real sinusoids, we'll first look at the sum of two complex exponential functions:

\[A_1 e^{j(wt + \phi_1)} + A_2 e^{j(wt + \phi_2)}\]

Reminder: Euler's equation for a complex exponential is

\[e^{jx}=\cos x+j\sin x\]

Regrouping our sum of exponentials a bit and then applying this equation:

\[\begin{align*} A_1 e^{j(wt + \phi_1)} + A_2 e^{j(wt + \phi_2)}&=e^{jwt}\left (A_1 e^{j\phi_1} + A_2 e^{j\phi_2}\right )\\ &=e^{jwt}\left ( A_1 \cos(\phi_1) + jA_1 \sin(\phi_1) + A_2 \cos(\phi_2) + jA_2 \sin(\phi_2)\right )\\ &=e^{jwt}\left [\left (A_1 \cos(\phi_1) + A_2 \cos(\phi_2) \right ) + j\left(A_1 \sin(\phi_1) + A_2 \sin(\phi_2) \right ) \right ] \end{align*}\]

The value inside the square brackets can be viewed as a complex number in rectangular form: x + jy. We can convert it to its polar form re^{j\theta} by calculating:

\[\begin{align*} r&=\sqrt{x^2+y^2}\\ \theta&=\arctan\left(\frac{y}{x}\right) \end{align*}\]
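As an aside, this rectangular-to-polar conversion is built into Python's standard library as cmath.polar, which computes the angle with the quadrant-aware atan2 rather than a bare arctan. A quick illustration with an arbitrary value:

```python
import cmath

z = 3 + 4j                 # rectangular form x + jy
r, theta = cmath.polar(z)  # polar form r * e^{j*theta}
print(r, theta)            # 5.0, ~0.927 rad

# Round-tripping back to rectangular form recovers z.
print(cmath.rect(r, theta))
```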

In our case:

\[r=\sqrt{(A_1 \cos(\phi_1)+A_2 \cos(\phi_2))^2 + (A_1 \sin(\phi_1)+A_2 \sin(\phi_2))^2}\]

And:

\[\theta = \arctan\left(\frac{A_1 \sin(\phi_1)+A_2 \sin(\phi_2)}{A_1 \cos(\phi_1)+A_2 \cos(\phi_2)}\right)\]

Therefore, the sum of complex exponentials is another complex exponential with the same frequency, but a different magnitude and phase:

\[A_1 e^{j(wt + \phi_1)} + A_2 e^{j(wt + \phi_2)}= e^{jwt} r e^{j \theta}=r e^{j(wt + \theta)}\]

From here, we can use Euler's equation again to see the equivalence in terms of sinusoidal functions:

\[\begin{align*} A_1 \cos(wt+\phi_1)+jA_1 \sin(wt+\phi_1)&+\\ A_2 \cos(wt+\phi_2)+jA_2 \sin(wt+\phi_2)&=r \cos(wt+\theta) + jr \sin(wt+\theta) \end{align*}\]

If we only compare the imaginary parts of this equation, we get:

\[A_1 \sin(wt+\phi_1)+A_2 \sin(wt+\phi_2)=r \sin(wt+\theta)\]

With r and \theta as calculated earlier from the other constants \blacksquare
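This proof, too, can be checked numerically with NumPy's complex arithmetic: r and \theta come straight out of abs and angle applied to the sum of the two time-independent phasors. A sketch, with arbitrary parameter values:

```python
import numpy as np

# Arbitrary example parameters.
A1, phi1 = 0.9, 1.1
A2, phi2 = 1.4, -0.4
w = 2.0

# The time-independent part: a sum of two phasors is itself one phasor.
phasor = A1 * np.exp(1j * phi1) + A2 * np.exp(1j * phi2)
r, theta = np.abs(phasor), np.angle(phasor)

# Comparing imaginary parts: the sum of sines equals r*sin(wt+theta).
t = np.linspace(0, 3, 1000)
lhs = A1 * np.sin(w * t + phi1) + A2 * np.sin(w * t + phi2)
rhs = r * np.sin(w * t + theta)
print(np.max(np.abs(lhs - rhs)))  # on the order of machine epsilon
```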

Note that by comparing the real parts of the equation, we can trivially prove a similar statement about the sum of cosines (which should surprise no one, since a cosine is just a phase-shifted sine).


[1]Electrical engineers prefer their signal frequencies in units of radians per second. We also like calling the imaginary unit j instead of i, because the latter is used for electrical current.

[2]If you're wondering "hold on, why would this work?", recall that any point (x,y) on the Cartesian plane can be represented using polar coordinates with magnitude B and angle \theta.