Semantic Scholar uses AI to extract papers important to this topic.

- Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has…
- Large Transformer models routinely achieve state-of-the-art results on a number of tasks, but training these models can be…
- We present a new method that views object detection as a direct set prediction problem. Our approach streamlines the detection…
- Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining…
- Transformers have a potential of learning longer-term dependency, but are limited by a fixed-length context in the setting of…
- Image generation has been successfully cast as an autoregressive sequence generation or transformation problem. Recent work has…
- Convolutional Neural Networks define an exceptionally powerful class of models, but are still limited by the lack of ability to…
- The distribution transformer has been in use by utilities throughout the twentieth century. Until now, it has consisted of a…
- Winding deformation in power transformers can be measured externally using a new frequency response analysis (FRA) method. Field…