Corpus ID: 220250564

Train and You'll Miss It: Interactive Model Iteration with Weak Supervision and Pre-Trained Embeddings

@article{Chen2020TrainAY,
  title={Train and You'll Miss It: Interactive Model Iteration with Weak Supervision and Pre-Trained Embeddings},
  author={Mayee F. Chen and Daniel Y. Fu and Frederic Sala and Sen Wu and Ravi Teja Mullapudi and Fait Poms and Kayvon Fatahalian and Christopher R{\'e}},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.15168}
}
Our goal is to enable machine learning systems to be trained interactively. This requires models that perform well and train quickly, without large amounts of hand-labeled data. We take a step forward in this direction by borrowing from weak supervision (WS), wherein models can be trained with noisy sources of signal instead of hand-labeled data. But WS relies on training downstream deep networks to extrapolate to unseen data points, which can take hours or days. Pre-trained embeddings can…
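The abstract's two ingredients can be illustrated with a minimal sketch: noisy labeling sources are combined by majority vote (the simplest weak-supervision aggregator), and the resulting labels are extrapolated to unseen points via nearest neighbors in a pre-trained embedding space rather than by training a deep network. This is an assumption-laden toy, not the paper's actual method; the function names and the vote/kNN choices are illustrative.

```python
import numpy as np

def majority_vote(label_matrix):
    # Weak supervision toy: combine noisy labeling-function outputs.
    # label_matrix has shape (n_points, n_sources) with entries in {-1, 0, +1},
    # where 0 means the source abstains. Ties break toward +1 for simplicity.
    sums = label_matrix.sum(axis=1)
    return np.where(sums >= 0, 1, -1)

def knn_extend(train_emb, train_labels, test_emb, k=3):
    # Extrapolate weak labels to unseen points using pre-trained embeddings:
    # each test point takes the majority label of its k nearest training
    # embeddings (Euclidean distance on L2-normalized vectors).
    train = train_emb / np.linalg.norm(train_emb, axis=1, keepdims=True)
    test = test_emb / np.linalg.norm(test_emb, axis=1, keepdims=True)
    preds = []
    for x in test:
        dists = np.linalg.norm(train - x, axis=1)
        nearest = np.argsort(dists)[:k]
        preds.append(1 if train_labels[nearest].sum() >= 0 else -1)
    return np.array(preds)
```

Because the kNN step only requires distance computations over fixed embeddings, "training" reduces to aggregation plus lookup, which is what makes interactive iteration plausible.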