PaLM: A Hybrid Parser and Language Model

@inproceedings{Peng2019PaLMAH,
  title={{PaLM}: A Hybrid Parser and Language Model},
  author={Hao Peng and Roy Schwartz and Noah A. Smith},
  booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
  year={2019}
}
We present PaLM, a hybrid parser and neural language model. Building on an RNN language model, PaLM adds an attention layer over text spans in the left context. An unsupervised constituency parser can be derived from its attention weights, using a greedy decoding algorithm. We evaluate PaLM on language modeling, and empirically show that it outperforms strong baselines. If syntactic annotations are available, the attention component can be trained in a supervised manner, providing syntactically…
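The abstract describes two pieces: attention over left-context positions within a language model, and a greedy top-down decoder that turns attention scores into an unlabeled bracketing. The sketch below is a hypothetical simplification, not the paper's actual model: it attends over single context positions rather than spans, and `span_attention`, `greedy_brackets`, and the per-position split preferences are all illustrative names and inputs invented here.

```python
import numpy as np


def span_attention(hiddens, query):
    """Softmax attention over left-context hidden states.

    Hypothetical stand-in for PaLM's span attention: ``hiddens`` is a
    (T, d) matrix of context states, ``query`` a (d,) vector for the
    current word. Returns one weight per context position.
    """
    scores = hiddens @ query
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    return weights / weights.sum()


def greedy_brackets(split_pref, i, j, out):
    """Greedy top-down decoding of an unlabeled binary bracketing.

    ``split_pref[k]`` is an (illustrative) score for splitting after
    position k. Each span (i, j) with at least two words is recorded,
    then split at its highest-scoring internal point and recursed.
    """
    if j - i < 1:          # single word: no bracket
        return
    out.append((i, j))
    k = i + int(np.argmax(split_pref[i:j]))  # split point in [i, j-1]
    greedy_brackets(split_pref, i, k, out)
    greedy_brackets(split_pref, k + 1, j, out)
```

For example, decoding a four-word sentence with split preferences `[0.9, 0.1, 0.5, 0.2]` over span `(0, 3)` yields the brackets `[(0, 3), (1, 3), (1, 2)]` — a right-branching structure induced entirely by the scores, which is the sense in which a parser "falls out" of the attention weights.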
