A lot of the time, math can be deceptive and surprising.

A few examples are the Monty Hall problem, the Banach-Tarski paradox, and the Weierstrass function. Today we will look at one of the most famous classic counterexamples: the fact that even if a sequence converges to zero, its series does not necessarily converge.

What does this mean? If we have a sequence tending to zero, then when we sum up its elements in order, the more we sum the less we add at each step. So it is natural to expect that, because the growth of the sums so far, called the **partial sums**, constantly slows down, the sum of the entire series will be finite.

In fact, perhaps the best counterexample uses one of the simplest sequences tending to zero: the sequence 1/n, for positive integers n.

This series is known as the **harmonic series** (big surprise, I know, it’s not like it is in the title or something). The name comes from music. When you make a string vibrate at a random point along it, almost surely (literally) the sound won’t be that pleasant. However, if you do it at a point that belongs to the division of the string into n equal-sized parts, the sound will be pleasant, harmonic. Those points lie at distances of 1/n of the string’s length, where n is a positive integer. Thus the name.

So why exactly is the total sum infinite, that is, why does the series **diverge**? Well, that is actually surprisingly simple to prove.

Let’s look at the first element of the sum, that is, 1. It is obviously greater than 1/2.

Well, that was a bit pointless. Anyway, let’s now look at the second and third elements of the sum, 1/2 + 1/3. As 2 and 3 are less than 4, both elements are greater than 1/4. Hence, the sum itself is greater than 1/4 + 1/4 = 1/2.

Now we continue, using the next four elements: 1/4+1/5+1/6+1/7. As before, every element of the sum is greater than 1/8, so the total is greater than 1/8+1/8+1/8+1/8 = 1/2.

We repeat this process, looking at the next eight elements, then the next sixteen, and so on, doubling each time. We have now grouped the series into blocks, each with a sum greater than 1/2.

Therefore, the total sum of the harmonic series is greater than 1/2 added to itself infinitely many times. But that is obviously infinite! Therefore, the harmonic series diverges. **Q.E.D.**
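As a quick sanity check (a small Python sketch, not part of the proof), we can verify in exact arithmetic that each block of terms really does exceed 1/2:

```python
from fractions import Fraction

# Sanity-check the grouping argument: the k-th block of the harmonic series,
# 1/2^k + 1/(2^k + 1) + ... + 1/(2^(k+1) - 1), has 2^k terms, each greater
# than 1/2^(k+1), so each block sum exceeds 1/2. Exact rational arithmetic
# avoids any floating-point doubt.
def block_sum(k):
    return sum(Fraction(1, n) for n in range(2**k, 2**(k + 1)))

for k in range(1, 9):
    s = block_sum(k)
    print(f"block {k}: {float(s):.4f} > 1/2 is {s > Fraction(1, 2)}")
```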

**How fast though?**

As stated before, 1/n tends to zero as n gets large, so the rate of growth of the partial sums decreases the more we sum. But what exactly is the rate of growth of the partial sums, S_{n}? *(S_{n} is the sum of the first n elements of the series.)* Can you describe it, say, as a power of n? Some other known function? Or can it not be described using any known function?

Well, it turns out you can: the harmonic series has **logarithmic growth**. That is, the partial sum S_{n} is approximately ln(n), where ln is the natural logarithm:

$S_n = 1 + 1/2 + 1/3 + \dots + 1/n \approx \ln(n)$

A nice intuition for why this is the case comes from looking at the difference between adjacent terms in both sequences. The difference between S_{n} and S_{n+1} is 1/(n+1): the latter is the sum of the first n+1 terms and the former of the first n, so the difference must be the (n+1)th element, 1/(n+1).

For ln(n), the difference is ln(n+1) − ln(n). Using the logarithm laws, we can rewrite this as ln((n+1)/n), or ln(1 + 1/n). A useful fact about the function ln(1+x) is that the closer x is to 0, the better ln(1+x) is approximated by x itself. So, for large enough n, ln(1 + 1/n) is approximately 1/n.

So, since the differences between adjacent terms of the two sequences are very close, we can expect the sequences themselves to stay very close to each other. **But not too close**: a persistent gap remains between S_{n} and ln(n).

The size of that gap has a name: it is called the **Euler-Mascheroni constant**, *γ*. It is given by the difference between S_{n} and ln(n) as n gets large, that is:

$\gamma = \lim_{n \to \infty} \left( S_n - \ln(n) \right) \approx 0.5772\dots$
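A few lines of Python (an illustration, not a proof) show the gap settling down:

```python
import math

# Estimate the Euler-Mascheroni constant: the gap S_n - ln(n) between the
# harmonic partial sums and the natural logarithm settles toward
# gamma ≈ 0.577216 as n grows.
def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(f"n = {n:>6}: S_n - ln(n) = {harmonic(n) - math.log(n):.6f}")
```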

One interesting fact about *γ* is that, unlike other mathematical constants such as π and e, not much is known about it. Most famously, it is unknown whether it is rational (a quotient of two whole numbers) or not, unlike constants such as π, e, and the square root of two, which are all known to be irrational.

**The Harmonic Series’ Friends**

Let’s modify the harmonic series a bit, shall we?

One possible way to modify it is to change the signs of some terms. That is, for certain terms, instead of adding 1/n we subtract it. For example, 1 − 1/2 + 1/3 − 1/4 + … As it turns out, the value of this alternating sum is ln(2). Although those modifications are very interesting as well, we will not focus on them today.
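A quick numerical sketch of that alternating sum (for illustration only):

```python
import math

# Partial sums of the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ...
# close in on ln(2) ≈ 0.693147.
def alternating_harmonic(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(f"n = {n:>6}: {alternating_harmonic(n):.6f} (ln 2 = {math.log(2):.6f})")
```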

Instead, let’s look at modifications obtained by summing 1/n over only **certain values of n** (n, as before, is a positive integer). There are lots of such modifications (you could say, *uncountably many*).

**Only Odd/Even numbers**

First, let’s see what happens when we sum over only the even n:

$\sum_{n=1}^{\infty} \frac{1}{2n} = \frac{1}{2}\left(1 + \frac{1}{2} + \frac{1}{3} + \dots\right)$

We factored out 1/2 and got half of the original harmonic series! Therefore, summing over only the even numbers again leads to an infinite sum. Now what if we sum only over the odd numbers? That sum is also infinite, as an elegant corollary of what we just showed: each odd reciprocal 1/(2n−1) is greater than the even reciprocal 1/(2n) that follows it, so the sum over the odd numbers is even larger than the sum over the even numbers.
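The factor of exactly one half can be checked in exact arithmetic (a small sketch, nothing deep):

```python
from fractions import Fraction

# Verify, in exact rational arithmetic, that the reciprocals of the even
# numbers up to 2N sum to exactly half of the harmonic partial sum H_N.
N = 200
even_sum = sum(Fraction(1, n) for n in range(2, 2 * N + 1, 2))
H_N = sum(Fraction(1, k) for k in range(1, N + 1))
print(even_sum == H_N / 2)  # True
```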

Similar to those are sums of 1/n with n running over an **arithmetic progression**. That is, for a starting number a and a step length d (both positive integers), we sum the reciprocals of a, a+d, a+2d, a+3d, and so on. In a way similar to the proof that the harmonic series diverges, you can prove that these series diverge as well.
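Numerically, such a sum grows like ln(N)/d (a rough illustration; the choice a = 3, d = 4 here is arbitrary):

```python
import math

# Sketch: the sum of 1/n over the arithmetic progression a, a+d, a+2d, ...
# up to N grows like ln(N)/d, so it diverges, just more slowly for larger
# step lengths d.
def ap_reciprocal_sum(a, d, N):
    return sum(1.0 / n for n in range(a, N + 1, d))

for N in (10**3, 10**5, 10**6):
    s = ap_reciprocal_sum(3, 4, N)  # a = 3, d = 4: 1/3 + 1/7 + 1/11 + ...
    print(f"N = {N:>7}: sum = {s:.4f}, ln(N)/4 = {math.log(N) / 4:.4f}")
```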

**Only Square Numbers**

Now things start to get interesting.

The problem of finding the sum of the reciprocals of the square numbers, that is, the sum of 1/n^{2} from n=1 to infinity, is known as the **Basel Problem**.

First proposed in 1650, the problem was solved nearly 100 years later by Leonhard Euler *(unsurprisingly)*. The name Basel problem thus comes from Euler’s home town, Basel.

So what is the sum equal to? The answer is, surprisingly, **pi squared divided by six**.

It is weird to see pi suddenly show up, as there is no clear connection between it and the original problem. Nevertheless, of the countless ways to solve the problem, some make it a bit clearer why pi appears by exposing a connection to a circle. I’ve linked a great video by 3Blue1Brown presenting one of these proofs at the end of the post.
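You can watch the partial sums creep up toward π²/6 (a sketch; note how slow the convergence is):

```python
import math

# Partial sums of the Basel series 1/1 + 1/4 + 1/9 + ... approach
# pi^2/6 ≈ 1.644934. The leftover tail beyond n is roughly 1/n, so the
# convergence is real but slow.
def basel_partial(n):
    return sum(1.0 / k**2 for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(f"n = {n:>6}: {basel_partial(n):.6f} (pi^2/6 = {math.pi**2 / 6:.6f})")
```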

This sum is related to a much greater concept in mathematics named the **Riemann Zeta Function**. For a real number s greater than 1, it is defined as the sum of 1/n^{s} from n=1 to infinity. The function can tell us a lot about the distribution of prime numbers, among other things. Most (in)famously, the problem of determining where the zeroes of the function lie, known as **The Riemann Hypothesis**, is yet unsolved. In fact, the Clay Mathematics Institute lists it as one of the Millennium Prize Problems. Should you somehow solve it, you will be rewarded with a small prize of a million dollars, along with eternal fame.

Some other values of the Riemann zeta function, relating back to the type of series we were talking about earlier:

- For s=3, the sum of 1/n^{3} from n=1 to infinity is known as **Apéry’s constant**, and is equal to approximately 1.20205… Unfortunately, there is no nicer closed form for this constant, for example one that uses pi like before
- For s=4, the sum of 1/n^{4} from n=1 to infinity is equal to π^{4}/90
- For s=6, the sum of 1/n^{6} from n=1 to infinity is equal to π^{6}/945
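These values are easy to check numerically with partial sums (a quick sketch, truncating at N = 10^5):

```python
import math

# Check the zeta values quoted above against partial sums of 1/n^s.
# The tail beyond N = 10^5 is far below the precision shown here.
def zeta_partial(s, N=100000):
    return sum(1.0 / n**s for n in range(1, N + 1))

print(f"zeta(3) ≈ {zeta_partial(3):.6f} (Apéry's constant, 1.202056...)")
print(f"zeta(4) ≈ {zeta_partial(4):.8f} vs pi^4/90  = {math.pi**4 / 90:.8f}")
print(f"zeta(6) ≈ {zeta_partial(6):.8f} vs pi^6/945 = {math.pi**6 / 945:.8f}")
```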

**Only Prime Numbers**

And finally, what happens when you sum 1/p over all prime numbers p? *(That is, all natural numbers greater than 1 that are divisible only by 1 and themselves.)*

Well, surprise surprise, this sum diverges!

It feels like prime numbers are very rare, so this sum ought to be finite (the same intuition as, say, for the sum over only the square numbers). It turns out, however, that this intuition is false: despite their apparent scarcity, prime numbers are frequent enough to make this sum diverge.

Even more surprising is how this sum grows. It turns out that the sum not only diverges, but has double-logarithmic growth, that is, growth like ln(ln(x)). If you sum 1/p over all prime numbers p below some number n, the sum will be approximately ln(ln(n)). As before with the harmonic series, the gap between this sum and ln(ln(n)) also approaches a certain value, named the **Meissel-Mertens constant** (equal to approximately 0.2615).
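Here is a sketch of that double-logarithmic growth, using a simple sieve of Eratosthenes to find the primes:

```python
import math

# Sum 1/p over all primes p <= N (via a simple sieve of Eratosthenes) and
# compare with ln(ln(N)) + 0.2615, where 0.2615 approximates the
# Meissel-Mertens constant; this is Mertens' second theorem in action.
def prime_reciprocal_sum(N):
    sieve = bytearray([1]) * (N + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(N**0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, N + 1, p)))
    return sum(1.0 / p for p in range(2, N + 1) if sieve[p])

N = 10**6
s = prime_reciprocal_sum(N)
print(f"sum of 1/p, p <= {N}: {s:.4f}")
print(f"ln(ln({N})) + 0.2615 : {math.log(math.log(N)) + 0.2615:.4f}")
```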

One final surprising fact: we call a pair of prime numbers **twin primes** if they differ by two. For example, 3 and 5 are twin primes, and so are 17 and 19. Now what happens when you sum 1/p over all twin primes (including duplicates, so 5 is counted for both the pair (3, 5) and the pair (5, 7))? Well, it turns out that this sum is finite (this fact is called **Brun’s Theorem**), unlike the previous sum. What’s interesting is that we don’t know a lot about twin primes, most importantly whether there are even infinitely many of them, a question dating back to ancient times. Despite this, we do know that the sum converges! It would be a bit funny if it turned out that there are only finitely many twin primes (not very likely): Brun’s theorem would then be kind of useless, as a sum with only finitely many terms is obviously finite.
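We can watch the partial sums of Brun’s series stay bounded (a sketch only; the full series converges to Brun’s constant, approximately 1.902, and it gets there extremely slowly):

```python
# Partial sums of Brun's series, 1/p + 1/(p+2) over twin prime pairs
# (p, p+2), counting shared members like 5 twice. Unlike the sum over all
# primes, these partial sums stay bounded below Brun's constant ≈ 1.902.
def twin_prime_sum(N):
    limit = N + 2                      # need primality of p + 2 as well
    sieve = bytearray([1]) * (limit + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(limit**0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, limit + 1, p)))
    return sum(1.0 / p + 1.0 / (p + 2)
               for p in range(3, N + 1) if sieve[p] and sieve[p + 2])

for N in (10**3, 10**5):
    print(f"twin prime sum up to {N}: {twin_prime_sum(N):.4f}")
```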

Well I guess at least we know a lot about primes that differ by 3…

**The Erdős Conjecture**

So here is a question: is there a way to tell whether the sums from before converge or not, without looking directly at the sum? That is, for which sets S of positive integers does the sum of 1/n, where n is in S, diverge, and for which does it converge?

The great mathematician Paul Erdős conjectured the following criterion: if the sum diverges for a set S, then S contains arbitrarily long arithmetic progressions. That is, for each number L, you can find an arithmetic progression of length L inside the set S.

*The converse is not necessarily true; consider for example the sum:*

$1+1/10+1/11+1/100+1/101+1/102+1/1000+\dots$

*That is, we go over the powers of ten $10^n$ and start an arithmetic progression of length n+1 there. The resulting set contains arbitrarily long arithmetic progressions, but the series still converges*.
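*A quick bound (my own back-of-the-envelope check) makes the convergence clear: the block starting at $10^n$ contributes at most $n+1$ terms, each of size at most $10^{-n}$, so the whole series is at most*

$$\sum_{n=0}^{\infty} \frac{n+1}{10^n} = \frac{1}{(1 - 1/10)^2} = \frac{100}{81} < \infty.$$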

This is a very beautiful way to characterize such sets, in my view, as it relates a complex criterion to a simpler one (although neither is very easy to verify). Unfortunately, as of the writing of this post, the conjecture remains unproven. It has been proven, however, in the case where S is the set of prime numbers: the primes do contain arbitrarily long arithmetic progressions, a result called the **Green-Tao** theorem.

In fact, going back to twin primes from before, big progress has been made toward proving that there are infinitely many of them. Thanks to work started by Yitang Zhang and refined with help from the Polymath project, it has been proven that there are infinitely many pairs of primes that differ by at most 246! *(this isn’t a factorial)*. Hopefully, both the twin prime conjecture and the Erdős conjecture will be proven soon too, or at least within the next 30-50 years.

I will end this post with a question for you. Going back to the growth of the harmonic series, which has ln(n) growth, and the sum of the reciprocals of the primes, which has ln(ln(n)) growth, can you go further and think of a series of reciprocals with ln(ln(ln(n))) growth?
