Corpus ID: 248227825

Finding Materialized Models for Model Reuse

@inproceedings{Zhao2021FindingMM,
  title={Finding Materialized Models for Model Reuse},
  author={Minjun Zhao and Lu Chen and Keyu Yang and Yuntao Du and Yunjun Gao},
  year={2021}
}
Materialized model query aims to find the most appropriate materialized model to serve as the initial model for model reuse. It is a precondition of model reuse and has recently attracted considerable attention. Nonetheless, existing methods suffer from weak privacy protection, a limited range of applications, and inefficiency, because they do not construct a suitable metric to measure the target-related knowledge contained in materialized models. To address this, we present MMQ, a privacy-protected, general, efficient…
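To make the task concrete, below is a minimal sketch of a materialized model query, assuming only the generic setting the abstract describes: rank candidate pre-trained ("materialized") models by how much target-related knowledge their features carry, and return the best one as the initial model for reuse. The scoring metric used here (nearest-centroid accuracy on frozen features) and all names are hypothetical illustrations, not the metric MMQ actually proposes, which the truncated abstract does not specify.

import numpy as np

def nearest_centroid_accuracy(features, labels):
    # Hypothetical stand-in metric: how separable the target classes are
    # in a candidate model's frozen feature space.
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return float((classes[dists.argmin(axis=1)] == labels).mean())

def query_materialized_models(models, target_x, target_y):
    # `models` maps a model name to its feature-extraction function; only
    # the extracted features are inspected, never raw weights or source
    # data, mirroring the privacy-conscious flavor of the problem statement.
    scores = {name: nearest_centroid_accuracy(extract(target_x), target_y)
              for name, extract in models.items()}
    return max(scores, key=scores.get), scores

# Toy usage: two "materialized models" as fixed feature extractors. The
# target label depends only on input dimension 0, which model_b keeps and
# model_a discards, so model_b should win the query.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 32))
y = (x[:, 0] > 0).astype(int)
models = {"model_a": lambda v: v[:, 1:9], "model_b": lambda v: v[:, :8]}
print(query_materialized_models(models, x, y))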
