Support Vector Clustering
TLDR
A novel clustering method based on the approach of support vector machines: data points are mapped by means of a Gaussian kernel into a high-dimensional feature space, in which the minimal enclosing sphere is sought.
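The idea summarized above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: with a Gaussian kernel, every mapped point lies on the unit sphere in feature space, and finding the minimal enclosing sphere reduces (in the hard-margin case) to minimizing b'Kb over the probability simplex. The function names, the projected-gradient solver, and the step-size choices here are all illustrative assumptions; the soft-margin constant and the cluster-labeling step of the full method are omitted.

```python
import numpy as np

def gaussian_kernel(X, Y, q=1.0):
    # K(x, y) = exp(-q * ||x - y||^2); note K(x, x) = 1 for all x.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-q * d2)

def project_simplex(v):
    # Euclidean projection onto {b : b >= 0, sum(b) = 1}.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def svdd_radius(X, q=1.0, steps=1000, lr=0.01):
    # Hard-margin sphere in feature space: since K(x, x) = 1, the dual
    # amounts to minimizing b^T K b on the simplex (projected gradient,
    # an illustrative choice of solver).
    K = gaussian_kernel(X, X, q)
    b = np.full(len(X), 1.0 / len(X))
    for _ in range(steps):
        b = project_simplex(b - lr * 2.0 * (K @ b))
    # Squared feature-space distance of each point to the sphere centre.
    r2 = 1.0 - 2.0 * (K @ b) + b @ K @ b
    return b, r2

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
b, r2 = svdd_radius(X)
```

Points whose distance `r2` reaches the sphere radius are the support vectors that trace out the cluster boundaries back in data space.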
On the Computational Power of Neural Nets
TLDR
It is proved that such nets can simulate all Turing machines, and any multi-stack Turing machine in real time, and that there is a net of 886 processors which computes a universal partial recursive function.
Neural networks and analog computation - beyond the Turing limit
  • H. Siegelmann
  • Mathematics, Computer Science
  • Progress in theoretical computer science
  • 1 March 1999
TLDR
This chapter discusses neural networks and Turing machines, focusing on the construction of neural networks from the explicit specification of a discrete-time Turing machine.
Posttranscriptional Regulation of BK Channel Splice Variant Stability by miR-9 Underlies Neuroadaptation to Alcohol
TLDR
Computational modeling indicates that in adult mammalian brain, alcohol upregulates microRNA miR-9 and mediates posttranscriptional reorganization of BK mRNA splice variants through miR-9-dependent destabilization of BK mRNAs containing 3'UTRs with a miR-9 Recognition Element (MRE).
Analog Computation via Neural Networks
TLDR
It is noted that these networks are not likely to solve NP-hard problems in polynomial time, as the equality "P = NP" in the model implies the almost complete collapse of the standard polynomial hierarchy.
Computation Beyond the Turing Limit
TLDR
A simply described but highly chaotic dynamical system, the analog shift map, is presented here; it has computational power beyond the Turing limit (super-Turing) and computes exactly like neural networks and analog machines.
Turing computability with neural nets
Abstract: This paper shows the existence of a finite neural network, made up of sigmoidal neurons, which simulates a universal Turing machine. It is composed of fewer than 10^5 synchronously evolving processors.
Computational capabilities of recurrent NARX neural networks
TLDR
It is constructively proved that NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks, and thus as Turing machines. This raises the question of how much feedback or recurrence is necessary for a network to be Turing equivalent, and of which restrictions on feedback limit computational power.
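The restricted recurrence at issue can be made concrete with a toy sketch. In a NARX model the only feedback is through tapped delay lines of past outputs and inputs; there is no internal hidden state. Everything below (the function names, the random one-layer nonlinearity, the delay orders) is an illustrative assumption, not the paper's construction:

```python
import numpy as np

def narx_step(psi, y_hist, u_hist):
    # One NARX update: y(t) = psi(y(t-1..t-n), u(t..t-m+1)).
    # All recurrence flows through the delayed outputs y_hist.
    return psi(np.concatenate([y_hist, u_hist]))

# Hypothetical nonlinearity: a single tanh unit with fixed random weights
# over 2 output delays and 2 input delays.
rng = np.random.default_rng(1)
W = rng.normal(size=(4,))

def psi(z):
    return np.tanh(W @ z)

y = [0.0, 0.0]                      # output delay line (order 2)
u = [0.0, 0.0]                      # input delay line (order 2)
outputs = []
for u_t in [1.0, 0.0, 1.0, 1.0, 0.0]:
    u = [u_t] + u[:-1]              # shift new input into the tap line
    y_t = narx_step(psi, np.array(y), np.array(u))
    y = [y_t] + y[:-1]              # feed the output back as a tap
    outputs.append(y_t)
```

The point of the result summarized above is that even this output-tap-only feedback, with a suitable (larger) network in place of the toy `psi`, suffices for Turing equivalence.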
A support vector clustering method
We present a novel kernel method for data clustering using a description of the data by support vectors. The kernel reflects a projection of the data points from data space to a high-dimensional feature space.
The Dynamic Universality of Sigmoidal Neural Networks
TLDR
The techniques can be applied to a much more general class of "sigmoidal-like" activation functions, suggesting that Turing universality is a relatively common property of recurrent neural network models.