Corpus ID: 12806324

Towards an automated method based on Iterated Local Search optimization for tuning the parameters of Support Vector Machines

@article{Consoli2017TowardsAA,
  title={Towards an automated method based on Iterated Local Search optimization for tuning the parameters of Support Vector Machines},
  author={Sergio Consoli and Jacek Kustra and Pieter C. Vos and Monique Hendriks and Dimitrios Mavroeidis},
  journal={ArXiv},
  year={2017},
  volume={abs/1707.03191}
}
We provide preliminary details and the formulation of an optimization strategy, currently under development, that automatically tunes the parameters of a Support Vector Machine on new datasets. The strategy is a heuristic based on Iterated Local Search, a modification of classic hill climbing that iterates calls to a local search routine.
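For intuition, the sketch below shows what such an ILS loop for an RBF-kernel SVM could look like in Python with scikit-learn. The search space (log10 of C and gamma), the perturbation strength, the step size, and the better-only acceptance rule are illustrative assumptions, not the authors' exact formulation.

# Minimal illustrative sketch of Iterated Local Search (ILS) for tuning an
# RBF-kernel SVM's C and gamma. Neighbourhood, kick strength, and acceptance
# rule are assumptions for illustration, not the paper's exact method.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)

def score(params):
    # Objective: mean 5-fold cross-validated accuracy.
    c, g = params
    clf = SVC(C=10.0 ** c, gamma=10.0 ** g)
    return cross_val_score(clf, X, y, cv=5).mean()

def local_search(params, step=0.25, iters=10):
    # Simple hill climbing in (log10 C, log10 gamma) space.
    best, best_score = params, score(params)
    for _ in range(iters):
        cand = best + rng.uniform(-step, step, size=2)
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

# ILS outer loop: restart local search from perturbed copies of the incumbent.
incumbent, incumbent_score = local_search(np.array([0.0, -3.0]))
for _ in range(5):
    perturbed = incumbent + rng.uniform(-1.5, 1.5, size=2)  # kick move
    cand, cand_score = local_search(perturbed)
    if cand_score > incumbent_score:                        # accept only improvements
        incumbent, incumbent_score = cand, cand_score

print(f"best C=10^{incumbent[0]:.2f}, gamma=10^{incumbent[1]:.2f}, "
      f"CV accuracy={incumbent_score:.3f}")

Searching in log space is a common design choice here, since C and gamma typically vary over several orders of magnitude.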
1 Citation
A Distributed Support Vector Machine Using Apache Spark for Semi-supervised Classification with Data Augmentation
TLDR
A distributed implementation of support vector machines with data augmentation is introduced and analyzed, built upon SparkR, a recent and effective platform for distributed computation; experiments show that the proposed approach substantially improves performance in terms of execution time and processing speed.

References

Showing 1-10 of 11 references
Sequential Model-Based Optimization for General Algorithm Configuration
TLDR
This paper extends the explicit regression models paradigm for the first time to general algorithm configuration problems, allowing many categorical parameters and optimization for sets of instances, and yields state-of-the-art performance.
Iterated Local Search: Framework and Applications
The key idea underlying iterated local search is to focus the search not on the full space of all candidate solutions but on the solutions that are returned by some underlying algorithm, typically a local search heuristic.
SVM Parameter Tuning with Grid Search and Its Impact on Reduction of Model Over-fitting
TLDR
This paper addresses the challenge of building robust classification models with support vector machines (SVMs) from time series data, and investigates how parameter tuning of SVMs with grid search affects classification performance and helps prevent over-fitting.
Random Search for Hyper-Parameter Optimization
TLDR
This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
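As a concrete point of comparison for the ILS approach, random search over SVM hyperparameters can be done with scikit-learn's RandomizedSearchCV; the log-uniform search ranges below are illustrative assumptions.

# Minimal sketch of random search over SVM hyperparameters with
# scikit-learn's RandomizedSearchCV; the ranges are illustrative.
from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
search = RandomizedSearchCV(
    SVC(),
    param_distributions={"C": loguniform(1e-2, 1e3),
                         "gamma": loguniform(1e-5, 1e0)},
    n_iter=25, cv=5, random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)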
Parameter Tuning via Kernel Matrix Approximation for Support Vector Machine
TLDR
An approximate parameter tuning algorithm, APT, is designed; it applies MoCIC to compute a low-dimensional, low-rank approximation of the kernel matrix and uses this approximation to efficiently solve the SVM quadratic programming problem.
Job-shop scheduling: Computational study of local search and large-step optimization methods
TLDR
From the computational results, it can be concluded that the large-step optimization methods outperform simulated annealing and find an optimal schedule more frequently than the other methods studied.
SVM parameter selection based on harmony search with an application to hyperspectral image classification
  • O. Ceylan, G. T. Kaya
  • Mathematics, Computer Science
    2016 24th Signal Processing and Communication Application Conference (SIU)
  • 2016
TLDR
The harmony search method, a recently introduced heuristic, is used to optimally determine the parameters of the SVM's radial basis kernel function, and the proposed approach is first evaluated on hyperspectral datasets.
Support-Vector Networks
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
The Nature of Statistical Learning Theory
  • V. Vapnik
  • Computer Science, Mathematics
    Statistics for Engineering and Information Science
  • 2000
Contents include: setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms.
Selection and parameter optimization of SVM kernel function for underwater target classification
The identification and classification of noise sources in the ocean has become a key task of modern underwater acoustic signal processing, and, because of the ever-changing and complicated oceanic environment, it remains a challenging problem.