Corpus ID: 56425999

Conceptualization of an Autonomic Machine Learning Platform for Non-Expert Developers

@article{Lee2017ConceptualizationOA,
  title={Conceptualization of an Autonomic Machine Learning Platform for Non-Expert Developers},
  author={Keon Myung Lee and Jaesoo Yoo and Jiman Hong},
  journal={WSEAS Transactions on Computers},
  year={2017},
  volume={16}
}
Machine learning is an approach to deriving problem-solving algorithms from data of the problem domain, rather than coding them by hand. Although various machine learning tools exist with which applications can be developed relatively easily, non-experts still have difficulty developing machine learning applications. To be a successful developer, one must understand machine learning algorithms and make the right design choices. This paper addresses the decision… 


References

SHOWING 1-10 OF 17 REFERENCES
MLbase: A Distributed Machine-learning System
This work presents the vision for MLbase, a novel system harnessing the power of machine learning for both end-users and ML researchers, which provides a simple declarative way to specify ML tasks and a novel optimizer to select and dynamically adapt the choice of learning algorithm.
A comparison of platforms for implementing and running very large scale machine learning algorithms
We describe an extensive benchmark of platforms available to a user who wants to run a machine learning (ML) inference algorithm over a very large data set, but cannot find an existing implementation
Social big data: Recent achievements and new challenges
Views on the new challenges identified are shared, and application scenarios such as micro-blog data analysis and data processing for building next-generation search engines are covered.
Scaling Distributed Machine Learning with the Parameter Server
A survey on platforms for big data analytics
An in-depth analysis of different hardware platforms available for big data analytics and assesses the advantages and drawbacks of each of these platforms based on various metrics such as scalability, data I/O rate, fault tolerance, real-time processing, data size supported and iterative task support.
Auto-WEKA: combined selection and hyperparameter optimization of classification algorithms
This work considers the problem of simultaneously selecting a learning algorithm and setting its hyperparameters, going beyond previous work that attacks these issues separately and shows classification performance often much better than using standard selection and hyperparameter optimization methods.
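The combined algorithm selection and hyperparameter optimization (CASH) problem this entry describes can be pictured as a joint search over (algorithm, configuration) pairs. Auto-WEKA itself solves CASH with model-based Bayesian optimization (SMAC); the sketch below substitutes plain random sampling, and its scoring functions are toy stand-ins for real train-and-validate runs, purely to illustrate the joint search space:

```python
import random

# Toy "validation scores": hypothetical stand-ins for training and
# evaluating a real model with the given hyperparameters.
def score_knn(params):
    return 1.0 - abs(params["k"] - 7) / 20        # pretend k=7 is optimal

def score_tree(params):
    return 1.0 - abs(params["depth"] - 5) / 15    # pretend depth=5 is optimal

# Joint search space: each algorithm carries its own hyperparameter sampler.
search_space = {
    "knn":  (score_knn,  lambda rng: {"k": rng.randrange(1, 21)}),
    "tree": (score_tree, lambda rng: {"depth": rng.randrange(1, 16)}),
}

rng = random.Random(0)
best = (None, None, float("-inf"))
for _ in range(50):
    name = rng.choice(list(search_space))          # pick an algorithm...
    scorer, sampler = search_space[name]
    params = sampler(rng)                          # ...then its hyperparameters
    s = scorer(params)
    if s > best[2]:
        best = (name, params, s)

print(best)  # (algorithm name, its best-found config, its score)
```

The point of treating selection and tuning jointly, as Auto-WEKA argues, is that the evaluation budget flows automatically toward whichever algorithm's configurations score best.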
Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures
This work proposes a meta-modeling approach to support automated hyperparameter optimization, with the goal of providing practical tools that replace hand-tuning with a reproducible and unbiased optimization process.
Random Search for Hyper-Parameter Optimization
This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
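The random-search baseline this entry describes can be sketched in a few lines. The objective below is a toy quadratic standing in for a real train-and-validate run, and the log-uniform ranges are illustrative assumptions, not values from the paper:

```python
import random

# Hypothetical objective: validation score as a function of two
# hyper-parameters (learning rate, regularization strength).
# A toy quadratic peaking at lr=0.1, reg=0.01 stands in for real training.
def validation_score(lr, reg):
    return -(lr - 0.1) ** 2 - (reg - 0.01) ** 2

random.seed(0)

# Random search: sample each hyper-parameter independently per trial,
# rather than enumerating a fixed grid.
best_score, best_params = float("-inf"), None
for _ in range(100):
    lr = 10 ** random.uniform(-4, 0)    # log-uniform over [1e-4, 1]
    reg = 10 ** random.uniform(-4, 0)
    score = validation_score(lr, reg)
    if score > best_score:
        best_score, best_params = score, (lr, reg)

print(best_params, best_score)
```

The paper's key observation is visible in this setup: if only one of the two hyper-parameters matters, 100 random trials probe 100 distinct values of it, whereas a 10-by-10 grid probes only 10.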
A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning
A tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions using the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function.
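The prior-plus-evidence loop this tutorial describes can be sketched minimally. The sketch assumes a toy 1-D objective, a zero-mean Gaussian-process surrogate with an RBF kernel, and an upper-confidence-bound acquisition function (the tutorial also covers acquisitions such as expected improvement); all constants and names here are illustrative:

```python
import numpy as np

# Toy 1-D objective (expensive to evaluate in practice); maximize over [0, 1].
def f(x):
    return np.sin(6 * x) * x

# RBF kernel between two 1-D point sets; lengthscale is an illustrative choice.
def rbf(a, b, ls=0.15):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
X = list(rng.uniform(0, 1, 3))       # small initial design
y = [f(x) for x in X]
grid = np.linspace(0, 1, 201)        # candidate points for the acquisition

for _ in range(10):
    Xa, ya = np.array(X), np.array(y)
    K = rbf(Xa, Xa) + 1e-6 * np.eye(len(Xa))   # jitter for conditioning
    Ks = rbf(grid, Xa)
    Kinv = np.linalg.inv(K)
    # GP posterior: mean and variance at every candidate point.
    mu = Ks @ Kinv @ ya
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    sigma = np.sqrt(np.maximum(var, 1e-12))
    # Upper-confidence-bound acquisition: evaluate next where the
    # optimistic estimate mean + 2*sigma is highest.
    ucb = mu + 2.0 * sigma
    x_next = grid[np.argmax(ucb)]
    X.append(x_next)
    y.append(f(x_next))

print(max(y))
```

Each iteration spends one expensive evaluation where the posterior says the payoff could plausibly be highest, which is the tradeoff between exploration (high sigma) and exploitation (high mean) the tutorial formalizes.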
An empirical evaluation of deep architectures on problems with many factors of variation
A series of experiments indicate that these models with deep architectures show promise in solving harder learning problems that exhibit many factors of variation.