OOMC Season 2 Episode 8. Dotty Maths

The dot product, dots for time derivatives, dots of dotted dotty vectors, dots on dominoes, and dots on dice. Has James finally gone completely dotty? Find out in this episode.


Further Reading

Dot product

In the livestream I hinted that the scalar product (or dot product) between two vectors has some nice properties. I didn’t describe any of them, so here’s one in the further reading for you. $$|\mathbf{u} \cdot \mathbf{v}|^2\leq (\mathbf{u}\cdot \mathbf{u})(\mathbf{v} \cdot \mathbf{v})$$

This is called the Cauchy-Schwarz inequality, and it’s important because it’s true for other sorts of dot product. Other sorts of dot product? That’s right, if you do mathematics at university then we’ll teach you how to dot other things together, in a general theory of “inner product spaces”. For example, if $f(x)$ and $g(x)$ are square-integrable functions on $(a,b)$ then one possible dot product on those functions is defined as $f \cdot g = \int_a^b f(x) g(x) \,\mathrm{d}x$. Operations like this have to satisfy a couple of basic rules in order to be “inner products”. Amazingly, the basic rules are enough to prove the Cauchy-Schwarz inequality above.
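
If you’d like to see the vector version of the inequality in action before trying to prove it, here’s a quick numerical check; a minimal Python sketch, where the two vectors are just examples I’ve picked.

```python
import numpy as np

# Two arbitrary example vectors in R^3.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])

lhs = np.dot(u, v) ** 2            # |u . v|^2
rhs = np.dot(u, u) * np.dot(v, v)  # (u . u)(v . v)

print(lhs, "<=", rhs, lhs <= rhs)  # 64.0 <= 294.0 True
```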

Why is this cool? Well, it lets us define an “angle between two functions”. For vectors, we’ve got this familiar statement that $\mathbf{u} \cdot\mathbf{ v} = |\mathbf{u}| |\mathbf{v}| \cos \theta$ where  $\theta$ is the angle between the vectors. If we take that same equation but think of it as the definition of $\theta$ (rather than some sort of geometry fact), and we replace the dot product with our new integration-for-functions dot product above, then we’ve got an equation for $\cos \theta$ given two functions. We can therefore talk about functions being “orthogonal” (at right angles to each other). This isn’t just a meaningless distraction; the idea of particular functions being orthogonal is directly useful for solving partial differential equations, using the sort of mathematics that Jonah talked about in Season 2 Episode 2.
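
Here’s a small Python sketch of that idea (the functions $\sin$ and $\cos$ and the interval $(0, 2\pi)$ are just example choices of mine, and I’m using scipy to do the integrals numerically). It defines the integration-for-functions dot product, uses it to define an angle, and confirms that $\sin$ and $\cos$ are orthogonal on this interval.

```python
import numpy as np
from scipy.integrate import quad

def inner(f, g, a, b):
    """The integration-for-functions dot product on (a, b)."""
    value, _error = quad(lambda x: f(x) * g(x), a, b)
    return value

def angle(f, g, a, b):
    """Define theta by  f.g = |f| |g| cos(theta), just as for vectors."""
    cos_theta = inner(f, g, a, b) / np.sqrt(inner(f, f, a, b) * inner(g, g, a, b))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against rounding error

a, b = 0.0, 2.0 * np.pi
print(inner(np.sin, np.cos, a, b))  # ~0, so sin and cos are orthogonal here
print(angle(np.sin, np.cos, a, b))  # ~pi/2, a "right angle" between functions
```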


Dots for time derivatives

Here’s a slow-motion replay of the last thing we did with dotting a dot product of dotted vectors.
An object moving under gravity obeys $\ddot{\mathbf{x}}=\mathbf{g}$, and dotting both sides with $\dot{\mathbf{x}}$ gives $\dot{\mathbf{x}}\cdot\ddot{\mathbf{x}}=\dot{\mathbf{x}}\cdot \mathbf{g}$.

Separately, I know how to differentiate $\dot{\mathbf{x}}\cdot \dot{\mathbf{x}}$ with respect to time. The product rule says that this derivative is $\ddot{\mathbf{x}}\cdot\dot{\mathbf{x}}+\dot{\mathbf{x}}\cdot\ddot{\mathbf{x}}$, but these two terms are equal to each other, so this is $2\dot{\mathbf{x}}\cdot\ddot{\mathbf{x}}$. This looks a lot like the left-hand side of my equation (up to a factor of 2).

On the right, I know that $\mathbf{g}$ is a constant vector, so the derivative of $\mathbf{x}\cdot \mathbf{g}$ with respect to time is just $\dot{\mathbf{x}}\cdot \mathbf{g}$.

So I can recognise each side as a derivative, and integrate each side with respect to time to get $$\frac{1}{2}\dot{\mathbf{x}}\cdot\dot{\mathbf{x}}=\mathbf{x}\cdot \mathbf{g} +C$$

where $C$ is a constant. This is the conservation of energy equation: multiply both sides by the mass $m$ and the left-hand side becomes the kinetic energy, while $-m\,\mathbf{x}\cdot\mathbf{g}$ is the gravitational potential energy.
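
You can also watch this conservation law happen numerically. Here’s a Python sketch (the initial position, velocity, and time step are arbitrary choices of mine): it steps a projectile forward under constant $\mathbf{g}$ and prints $\frac{1}{2}\dot{\mathbf{x}}\cdot\dot{\mathbf{x}}-\mathbf{x}\cdot \mathbf{g}$, which stays fixed.

```python
import numpy as np

g = np.array([0.0, -9.81])   # constant gravitational acceleration
x = np.array([0.0, 0.0])     # initial position (an arbitrary choice)
v = np.array([3.0, 4.0])     # initial velocity (an arbitrary choice)
dt = 0.01                    # time step

for step in range(5):
    # The conserved quantity: (1/2) x-dot . x-dot  -  x . g
    energy = 0.5 * np.dot(v, v) - np.dot(x, g)
    print(f"t = {step * dt:.2f}, energy = {energy:.6f}")
    # Exact update for constant acceleration, so there is no integration error.
    x = x + v * dt + 0.5 * g * dt ** 2
    v = v + g * dt
```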


Sicherman Dice

You can read more about Sicherman dice on Wikipedia. That Wikipedia page talks about generating functions. I can’t remember when I first learned about these, so here’s a quick introduction.

The idea is that you make a polynomial where the powers are the possible outcomes of your discrete random variable, and the coefficients are the corresponding probabilities. That polynomial is called the probability generating function (PGF). Why would you do that? Because, brilliantly, the PGF of a sum of two independent random variables is the product of their PGFs: you just multiply the polynomials together and you get the probabilities for the sum of the variables! That works because when we multiply polynomials, collecting up terms with the same power is exactly the same as adding up the probabilities of all the ways to reach each total.

For a six-sided die, the PGF is $$\frac{1}{6}\left(x+x^2+x^3+x^4+x^5+x^6\right)=\frac{x}{6}(1+x+x^2)(1+x^3)$$

so the PGF for the sum of two dice is $$\frac{x^2}{36}(1+x+x^2)^2(1+x^3)^2.$$
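
Multiplying polynomials is the same as convolving their coefficient lists, so a couple of lines of Python will carry out the PGF multiplication for us (a sketch; I’m storing the PGF of one die as the list of coefficients of $x^0, x^1, \dots, x^6$):

```python
import numpy as np

# PGF of one fair six-sided die: coefficients of x^0, x^1, ..., x^6.
die = np.array([0, 1, 1, 1, 1, 1, 1]) / 6

# Multiplying PGFs is the same as convolving coefficient lists.
two_dice = np.convolve(die, die)

for total, prob in enumerate(two_dice):
    if prob > 0:
        print(f"P(total = {total}) = {prob:.4f}")  # peaks at P(total = 7) = 6/36
```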

If we can find another, different, way to factorise this polynomial, then we can find Sicherman dice without checking lots of cases for the faces. Now $1+x^3=(1+x)(1-x+x^2)$ (using the “sum of two cubes” identity, a sort of less-famous cousin of the “difference of two squares” identity), and that gives us some factors to play with in our PGF for the sum of two dice. There’s still some exploring to do, but there are fewer things to check. And this approach isn’t much harder with larger dice, which is nice.
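
(Spoiler warning if you want to do the exploring yourself: the faces that come out, listed on that Wikipedia page, are $1,2,2,3,3,4$ and $1,3,4,5,6,8$. Here’s a short Python check, reusing the convolution idea above, that this pair really does give the same distribution for the total as two standard dice.)

```python
import numpy as np

def pgf(faces):
    """Coefficient list of the PGF of a fair die with the given faces."""
    coeffs = np.zeros(max(faces) + 1)
    for face in faces:
        coeffs[face] += 1 / len(faces)
    return coeffs

standard = np.convolve(pgf([1, 2, 3, 4, 5, 6]), pgf([1, 2, 3, 4, 5, 6]))
sicherman = np.convolve(pgf([1, 2, 2, 3, 3, 4]), pgf([1, 3, 4, 5, 6, 8]))

print(np.allclose(standard, sicherman))  # True: the totals have the same distribution
```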

Curve sketching

I tried to draw a bar chart in the livestream and it didn’t go that well. Now it’s your turn! If I roll three normal six-sided dice and find the total, what does the bar chart of the probability distribution look like? What if I roll four six-sided dice and find the total? What if I roll a hundred dice?
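
Once you’ve sketched your answers, here’s a Python sketch (using numpy and matplotlib; the values of $n$ match the questions above) to check them: convolving the single-die distribution with itself repeatedly gives the distribution of the total of $n$ dice.

```python
import numpy as np
import matplotlib.pyplot as plt

def total_distribution(n):
    """Probability distribution of the total of n fair six-sided dice."""
    die = np.array([0, 1, 1, 1, 1, 1, 1]) / 6  # coefficients of x^0 ... x^6
    dist = die
    for _ in range(n - 1):
        dist = np.convolve(dist, die)  # adding one more die to the total
    return dist

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, n in zip(axes, (3, 4, 100)):
    dist = total_distribution(n)
    totals = np.arange(len(dist))
    ax.bar(totals[dist > 0], dist[dist > 0])
    ax.set_title(f"total of {n} dice")
plt.show()
```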



If you want to get in touch with us about any of the mathematics in the video or the further reading, feel free to email us on oomc [at] maths.ox.ac.uk.
