Tuesday, October 20, 2020
10/20/2020 - 9:00am
Abstract: We demonstrate a rather universal frequency principle, namely that deep neural networks learn low-frequency components faster, on high-dimensional benchmark datasets such as MNIST/CIFAR10 and on deep architectures such as VGG16. We then use the frequency principle to propose a promising empirical mechanism for why deeper networks can learn faster: a deep frequency principle, which states that the effective target function for a deeper hidden layer biases toward lower frequencies during training.
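Not part of the abstract, but the frequency principle it describes can be illustrated on a toy problem. The sketch below (all hyperparameters and the target function are my own illustrative choices, not the speaker's setup) trains a small tanh network on a 1D target mixing a low-frequency (k=1) and a high-frequency (k=5) component, then compares how much of each component remains in the residual; under the frequency principle, the low-frequency component should be fit first.

```python
import numpy as np

# Toy 1D target with one low-frequency (k=1) and one high-frequency (k=5) component.
rng = np.random.default_rng(0)
n = 128
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
y = np.sin(x) + 0.5 * np.sin(5 * x)

# Small two-layer tanh network trained by full-batch gradient descent on MSE.
h = 64
W1 = rng.normal(0.0, 1.0, (h, 1)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.1, (1, h)); b2 = np.zeros(1)

def forward(xs):
    a = np.tanh(W1 @ xs[None, :] + b1[:, None])   # hidden activations, shape (h, n)
    return (W2 @ a + b2[:, None])[0], a           # prediction, shape (n,)

def rel_amp(res, k):
    # Residual Fourier magnitude at integer frequency k, relative to the target's.
    return np.abs(np.fft.rfft(res)[k]) / np.abs(np.fft.rfft(y)[k])

lr = 0.05
for _ in range(2000):
    pred, a = forward(x)
    err = pred - y
    gW2 = (err[None, :] @ a.T) / n                # output-layer weight gradient
    gb2 = np.array([err.mean()])
    da = (W2.T @ err[None, :]) * (1.0 - a**2)     # backprop through tanh
    gW1 = (da @ x[:, None]) / n
    gb1 = da.mean(axis=1)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

residual = forward(x)[0] - y
low_rel, high_rel = rel_amp(residual, 1), rel_amp(residual, 5)
# Under the frequency principle, the k=1 component is fit first, so its
# remaining relative error should be the smaller of the two.
print(low_rel, high_rel)
```

The comparison is relative (each residual amplitude is normalized by the target's amplitude at that frequency), so the effect is not just the high-frequency component starting from a smaller amplitude.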
10/20/2020 - 4:00pm
The Burau representation is a classical linear representation of the braid group that can be used to define the Alexander polynomial invariant for knots and links.
The question of whether the Burau representation of the braid group $B_4$ is faithful has been an open problem since the 1930s. Faithfulness of this representation is necessary for the Jones polynomial of a knot to detect the unknot.
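For readers unfamiliar with the objects above, the following is the standard textbook definition (not specific to this talk). The unreduced Burau representation of $B_n$ sends each generator $\sigma_i$ to a matrix over $\mathbb{Z}[t^{\pm 1}]$ that acts nontrivially only on coordinates $i$ and $i+1$:

$$
\sigma_i \;\longmapsto\; I_{i-1} \,\oplus\, \begin{pmatrix} 1-t & t \\ 1 & 0 \end{pmatrix} \,\oplus\, I_{n-i-1}, \qquad 1 \le i \le n-1,
$$

and the Alexander polynomial of the closure $\hat\beta$ of a braid $\beta \in B_n$ is recovered, up to multiplication by units $\pm t^k$, from the $(n-1)$-dimensional reduced Burau representation $\bar\rho$ via

$$
\Delta_{\hat\beta}(t)\,\frac{1-t^{\,n}}{1-t} \;\doteq\; \det\!\bigl(\bar\rho(\beta) - I_{n-1}\bigr).
$$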
In this talk, I will present my work on this problem, which includes strong constraints on the kernel of this representation. The key techniques include a new interpretation of the Burau matrix of a positive braid and a new decomposition of positive braids into subproducts.
I will discuss all of the relevant background for the problem from scratch and illustrate my techniques through simple examples. I will also highlight the beautiful and elegant connections to bowling balls and quantum intersection numbers of simple closed curves on punctured disks.