Widely Linear Kernels for Complex-valued Kernel Activation Functions

@inproceedings{Scardapane2019WidelyLK,
  title={Widely Linear Kernels for Complex-valued Kernel Activation Functions},
  author={Simone Scardapane and Steven Van Vaerenbergh and Danilo Comminiello and Aurelio Uncini},
  booktitle={ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2019},
  pages={8528--8532}
}
Complex-valued neural networks (CVNNs) have been shown to be powerful nonlinear approximators when the input data can be properly modeled in the complex domain. One of the major challenges in scaling up CVNNs in practice is the design of complex activation functions. Recently, we proposed a novel framework for learning these activation functions neuron-wise in a data-dependent fashion, based on a cheap one-dimensional kernel expansion and the idea of kernel activation functions (KAFs). In this… 
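The abstract above describes activation functions learned neuron-wise through a cheap one-dimensional kernel expansion. A minimal sketch of that idea, assuming a Gaussian kernel over a small fixed dictionary of centers (the function name `kaf`, the bandwidth `gamma`, and the dictionary range are illustrative choices, not the paper's):

```python
import numpy as np

def kaf(s, alpha, dictionary, gamma=1.0):
    """Kernel activation function: f(s) = sum_i alpha_i * exp(-gamma * (s - d_i)^2).

    s          : pre-activations, any shape
    alpha      : trainable mixing coefficients, shape (D,) -- the only learned part
    dictionary : fixed 1-D grid of D kernel centers (shared, not trained)
    """
    # Gaussian kernel between each pre-activation and each dictionary element
    K = np.exp(-gamma * (s[..., None] - dictionary) ** 2)  # shape (..., D)
    return K @ alpha

# Example: a KAF with 20 centers on [-2, 2] applied to a batch of pre-activations
D = 20
dictionary = np.linspace(-2.0, 2.0, D)
alpha = 0.1 * np.random.randn(D)   # in a network, learned per neuron by backprop
s = np.array([-1.5, 0.0, 0.7])
out = kaf(s, alpha, dictionary)
print(out.shape)  # (3,)
```

Because the dictionary is fixed and shared, each neuron adds only the D mixing coefficients as parameters, which is what keeps the expansion cheap.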


Citations

A Comprehensive Survey and Performance Analysis of Activation Functions in Deep Learning
TLDR: A comprehensive overview and survey of AFs in neural networks for deep learning is presented, covering different classes of AFs such as Logistic Sigmoid- and Tanh-based, ReLU-based, ELU-based, and learning-based.
Complex and Hypercomplex-Valued Support Vector Machines: A Survey
TLDR: The importance, recent progress, prospective applications, and future directions of complex, hypercomplex-valued, and geometric Support Vector Machines are presented and discussed.
Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
Neural networks have shown tremendous growth in recent years to solve numerous problems. Various types of neural networks have been introduced to deal with different types of problems. However, the…

References

Showing 1-10 of 23 references
Complex-Valued Neural Networks With Nonparametric Activation Functions
TLDR: This paper proposes the first fully complex, nonparametric activation function for CVNNs, based on a kernel expansion with a fixed dictionary that can be implemented efficiently on vectorized hardware.
Kafnets: kernel-based non-parametric activation functions for neural networks
Widely Linear Complex-Valued Kernel Methods for Regression
TLDR: It is shown that in the relevant nonlinear equalization problem the pseudo-kernel plays a significant role, and that previous approaches which do not rely on this kernel clearly underperform.
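The role of the conjugate (pseudo-kernel) term can be illustrated with a toy widely linear least-squares fit (my own sketch, not the cited paper's method): for an improper complex signal, a model using both x and conj(x) fits targets that a strictly linear model cannot.

```python
import numpy as np

rng = np.random.default_rng(0)

# Improper complex signal: real and imaginary parts have unequal variance,
# so E[x^2] != 0 and the conjugate carries extra information.
N = 200
x = rng.standard_normal(N) + 1j * 0.3 * rng.standard_normal(N)
y = (1 - 2j) * x + (0.5 + 1j) * np.conj(x)  # target depends on x AND conj(x)

# Strictly linear least squares: y ~ w * x
w_sl = np.vdot(x, y) / np.vdot(x, x)
err_sl = np.mean(np.abs(y - w_sl * x) ** 2)

# Widely linear least squares: y ~ w * x + v * conj(x)
A = np.column_stack([x, np.conj(x)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
err_wl = np.mean(np.abs(y - A @ coef) ** 2)

print(err_sl, err_wl)  # widely linear error is near zero, strictly linear is not
```

The widely linear kernel methods above play the same game in a reproducing kernel Hilbert space, with the pseudo-kernel taking the role of the conj(x) column.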
Deep Complex Networks
TLDR: This work relies on complex convolutions and presents algorithms for complex batch normalization and complex weight initialization strategies for complex-valued neural nets, using them in experiments with end-to-end training schemes and demonstrating that such complex-valued models are competitive with their real-valued counterparts.
Kernels for Vector-Valued Functions: a Review
TLDR: This monograph reviews different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
Extension of Wirtinger's Calculus to Reproducing Kernel Hilbert Spaces and the Complex Kernel LMS
TLDR: The notion of Wirtinger's calculus is extended, for the first time, to complex RKHSs and used to derive several realizations of the complex kernel least-mean-square (CKLMS) algorithm; experiments verify that CKLMS offers significant performance improvements over several linear and nonlinear algorithms when dealing with nonlinearities.
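The Wirtinger derivatives underlying CKLMS can be sanity-checked numerically. A small sketch (the helper name `wirtinger_grads` is mine), verifying that for the real-valued cost |z|² the conjugate Wirtinger derivative equals z, which is why complex gradient updates move along it:

```python
import numpy as np

def wirtinger_grads(f, z, h=1e-6):
    """Numerically estimate the Wirtinger derivatives of f at z = x + iy:

        df/dz  = 0.5 * (df/dx - 1j * df/dy)
        df/dz* = 0.5 * (df/dx + 1j * df/dy)
    """
    dfdx = (f(z + h) - f(z - h)) / (2 * h)          # central difference in x
    dfdy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)  # central difference in y
    return 0.5 * (dfdx - 1j * dfdy), 0.5 * (dfdx + 1j * dfdy)

# Real cost |z|^2 = z * conj(z): df/dz = conj(z) and df/dz* = z,
# so steepest descent on |z|^2 follows the conjugate derivative.
z0 = 1.0 - 2.0j
d_z, d_zbar = wirtinger_grads(lambda z: np.abs(z) ** 2, z0)
print(d_z, d_zbar)  # ~ (1+2j), (1-2j)
```

Treating z and conj(z) as independent variables in this way is exactly what lets these papers differentiate real costs of complex kernel expansions without requiring holomorphy.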
Complex Gaussian Processes for Regression
TLDR: A novel Bayesian solution for nonlinear regression in complex fields is presented and applied to the nonlinear channel equalization problem, developing a recursive solution with basis removal that shows remarkable improvements over previous solutions.
Approximation by Fully Complex Multilayer Perceptrons
TLDR: Three proofs of the approximation capability of the fully complex MLP are provided, based on the singularity characteristics of ETFs, showing that the output of complex MLPs using ETFs with isolated and essential singularities uniformly converges to any nonlinear mapping in the deleted annulus of singularities nearest to the origin.
On Complex Valued Convolutional Neural Networks
TLDR: A variation of the CNN model with complex-valued input and weights is presented, and it is demonstrated that the complex model is significantly less vulnerable to overfitting and detects meaningful phase structure in the data.
Complex Support Vector Machines for Regression and Quaternary Classification
TLDR: A new framework for complex support vector regression (SVR) as well as support vector machines (SVM) for quaternary classification is presented, and it is proved that any complex SVM/SVR task is equivalent to solving two real SVM/SVR tasks exploiting a specific real kernel generated by the chosen complex kernel.
...