Stella Yu : Publications / Google Scholar

SurReal: Complex-Valued Learning as Principled Transformations on a Scaling and Rotation Manifold
Rudrasis Chakraborty and Yifei Xing and Stella X. Yu
IEEE Transactions on Neural Networks and Learning Systems, 2020
Paper | Code | arXiv

Complex-valued data are ubiquitous in signal and image processing applications, and complex-valued representations in deep learning have appealing theoretical properties. While these aspects have long been recognized, complex-valued deep learning continues to lag far behind its real-valued counterpart. We propose a principled geometric approach to complex-valued deep learning. Complex-valued data are often subject to arbitrary complex-valued scaling; as a result, the real and imaginary components covary. Instead of treating complex values as two independent channels of real values, we recognize their underlying geometry: we model the space of complex numbers as a product manifold of nonzero scalings and planar rotations. Arbitrary complex-valued scaling naturally becomes a group of transitive actions on this manifold. We propose to extend the properties, rather than the form, of real-valued functions to the complex domain. We define convolution as the weighted Fréchet mean on the manifold, which is equivariant to the group of scaling/rotation actions, and define a distance transform on the manifold that is invariant to the action group. The manifold perspective also allows us to define nonlinear activation functions, such as tangent ReLU and G-transport, as well as residual connections on manifold-valued data. We dub our model SurReal because our experiments on MSTAR and RadioML deliver high performance with only a fraction of the size of real- and complex-valued baseline models.
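To make the key idea concrete, the sketch below illustrates a weighted Fréchet mean of complex samples on the product manifold of positive scalings (with the log-Euclidean metric) and planar rotations (via the circular mean), and verifies its equivariance to arbitrary complex-valued scaling. This is a hypothetical illustration of the abstract's construction, not the authors' released code; the function name `wfm` and the specific metric choices are assumptions.

```python
import numpy as np

def wfm(z, w):
    """Weighted Frechet mean of complex samples z under the product-manifold
    geometry: positive magnitudes averaged in log-space (log-Euclidean metric)
    and phases averaged with the circular mean. Weights w are positive and
    sum to 1. A hedged sketch, not the paper's implementation."""
    z = np.asarray(z, dtype=complex)
    w = np.asarray(w, dtype=float)
    # Scaling component: geodesic (log-Euclidean) mean of the magnitudes.
    mean_r = np.exp(np.sum(w * np.log(np.abs(z))))
    # Rotation component: circular mean of the phases.
    mean_theta = np.angle(np.sum(w * np.exp(1j * np.angle(z))))
    return mean_r * np.exp(1j * mean_theta)

# Equivariance check: scaling every input by a complex constant c
# scales the weighted Frechet mean by the same c.
z = np.array([1 + 1j, 2 - 0.5j, -0.3 + 0.7j])
w = np.array([0.5, 0.3, 0.2])
c = 2.0 * np.exp(1j * 0.8)
print(np.allclose(wfm(c * z, w), c * wfm(z, w)))  # True
```

A convolution built from this mean inherits the equivariance, which is why complex-valued scaling of the input propagates through the network in a controlled way rather than corrupting the learned features.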

complex value, equivariance, Fréchet mean, invariance, Riemannian manifold