Text Representation Enrichment Utilizing Graph based Approaches: Stock Market Technical Analysis Case Study

@article{Salamat2022TextRE,
  title={Text Representation Enrichment Utilizing Graph based Approaches: Stock Market Technical Analysis Case Study},
  author={Sara Salamat and Nima Tavassoli and Behnam Sabeti and Reza Fahmi},
  journal={ArXiv},
  year={2022},
  volume={abs/2211.16103}
}
Graph neural networks (GNNs) have recently been applied to various natural language processing (NLP) tasks. Their ability to encode corpus-wide features in a graph representation has made GNN models popular for tasks such as document classification. One major shortcoming of such models is that they mainly operate on homogeneous graphs, whereas representing a text dataset as a graph requires several node types, which leads to a heterogeneous schema. In this paper, we propose a transductive hybrid…
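To make the heterogeneity point in the abstract concrete, here is a minimal sketch (toy corpus, networkx, not the paper's implementation) of a corpus-level text graph with two node types and two edge types:

```python
# A minimal sketch (toy corpus, not the paper's implementation) of why a corpus
# graph is heterogeneous: document nodes and word nodes are distinct types,
# connected by distinct edge types (document-word occurrence vs. word co-occurrence).
import networkx as nx

docs = ["the market opened higher", "the index closed lower"]

g = nx.Graph()
for i, doc in enumerate(docs):
    doc_node = f"doc_{i}"
    g.add_node(doc_node, node_type="document")
    tokens = doc.split()
    for tok in tokens:
        g.add_node(tok, node_type="word")
        g.add_edge(doc_node, tok, edge_type="occurs_in")   # document-word edge
    for a, b in zip(tokens, tokens[1:]):
        g.add_edge(a, b, edge_type="co_occurs")            # word-word edge

# Two node types and two edge types: a heterogeneous schema that plain
# (homogeneous) GCN layers do not model directly.
print(nx.get_node_attributes(g, "node_type"))
```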


References

Showing 1-10 of 41 references.

HeteGCN: Heterogeneous Graph Convolutional Networks for Text Classification

A heterogeneous graph convolutional network (HeteGCN) modeling approach is proposed that unites the best aspects of PTE and TextGCN, enabling faster training and improving performance in small labeled-training-set scenarios.

Every Document Owns Its Structure: Inductive Text Classification via Graph Neural Networks

This work proposes TextING for inductive text classification via GNNs: it first builds an individual graph for each document and then uses a GNN to learn fine-grained word representations based on their local structure, which also lets it effectively produce embeddings for words unseen during training.
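As a hedged illustration of that per-document construction, the sketch below links words that co-occur within a fixed-size sliding window; the window size, tokenizer, and example sentence are assumptions, and the GNN that propagates over each graph is omitted:

```python
# Hedged sketch of a per-document graph: words co-occurring within a fixed-size
# sliding window become connected nodes. Window size and tokenization are
# illustrative assumptions; the GNN itself is omitted.
import networkx as nx

def document_graph(text: str, window: int = 3) -> nx.Graph:
    tokens = text.lower().split()
    g = nx.Graph()
    g.add_nodes_from(tokens)                    # one node per unique word
    for i, word in enumerate(tokens):
        for other in tokens[i + 1:i + window]:  # neighbors inside the window
            if other != word:
                g.add_edge(word, other)
    return g

g = document_graph("volume spiked as the breakout confirmed the trend")
print(g.number_of_nodes(), g.number_of_edges())
```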

Tensor Graph Convolutional Networks for Text Classification

This paper investigates graph-based neural networks for the text classification problem with a new framework, TensorGCN (tensor graph convolutional networks), which presents an effective way to harmonize and integrate heterogeneous information from different kinds of graphs.

Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification

A novel heterogeneous graph neural network method for semi-supervised short text classification is proposed, taking full advantage of scarce labeled data and large amounts of unlabeled data through information propagation along the graph.

A Sequential Graph Neural Network for Short Text Classification

This work proposes an improved sequence-based feature propagation scheme that fully exploits word representations and document-level word interactions, overcoming the limitations of textual features in short texts.

TextGTL: Graph-based Transductive Learning for Semi-supervised Text Classification via Structure-Sensitive Interpolation

A novel semi-supervised framework for text classification, Text-oriented Graph-based Transductive Learning (TextGTL), is proposed; it refines graph topology under theoretical guidance and shares information across different text graphs.

Graph Neural Networks for Natural Language Processing: A Survey

A new taxonomy of GNNs for NLP is proposed, which systematically organizes existing research on GNNs for NLP along three axes: graph construction, graph representation learning, and graph-based encoder-decoder models.

Inductive Representation Learning on Large Graphs

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
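For reference, a minimal NumPy sketch of GraphSAGE's mean-aggregator layer on a toy graph; the weights here are random rather than trained, and neighbor sampling, which the full framework uses for scalability, is omitted:

```python
# Minimal NumPy sketch of GraphSAGE's mean aggregator (one layer) on a toy graph.
# Weights are random rather than trained; neighbor sampling is omitted.
import numpy as np

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))                    # 4 nodes, 8-dim input features
neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
W = rng.normal(size=(8, 16))                       # maps [self ; mean(neighbors)] -> 8 dims

def sage_layer(h, neighbors, W):
    out = []
    for v in range(len(h)):
        neigh_mean = h[neighbors[v]].mean(axis=0)        # aggregate the neighborhood
        z = W @ np.concatenate([h[v], neigh_mean])       # combine self and neighborhood
        z = np.maximum(z, 0.0)                           # ReLU
        out.append(z / (np.linalg.norm(z) + 1e-8))       # L2-normalize, as in the paper
    return np.stack(out)

print(sage_layer(feats, neighbors, W).shape)             # (4, 8)
```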

Knowledge-driven graph similarity for text classification

This paper introduces weighted co-occurrence graphs to represent text documents, which weight the terms and their dependencies based on their relevance to text classification, and proposes a novel method to automatically enrich the weighted graphs using semantic knowledge in the form of a word similarity matrix.
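A hedged sketch of that enrichment idea follows: a count-weighted co-occurrence graph for one short text, plus extra "semantic" edges added wherever a word-similarity score exceeds a threshold. The random embeddings, threshold, and weighting are illustrative assumptions, not the paper's relevance-based scheme:

```python
# Hedged sketch: count-weighted co-occurrence graph enriched with similarity-based
# edges. Embeddings, threshold, and weights are illustrative assumptions only.
import numpy as np
import networkx as nx

tokens = "support level held and the support zone attracted buyers".split()
vocab = sorted(set(tokens))
rng = np.random.default_rng(1)
emb = {w: rng.normal(size=16) for w in vocab}       # stand-in word embeddings

g = nx.Graph()
for a, b in zip(tokens, tokens[1:]):                # co-occurrence edges, count-weighted
    if a != b:
        prev = g.get_edge_data(a, b, {"weight": 0})["weight"]
        g.add_edge(a, b, weight=prev + 1, kind="cooccurrence")

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

for i, a in enumerate(vocab):                       # enrichment from the similarity matrix
    for b in vocab[i + 1:]:
        sim = cosine(emb[a], emb[b])
        if not g.has_edge(a, b) and sim > 0.5:
            g.add_edge(a, b, weight=sim, kind="semantic")

print(g.number_of_edges())
```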

Graph Convolutional Networks for Text Classification

This work builds a single text graph for a corpus based on word co-occurrence and document-word relations, then learns a Text Graph Convolutional Network (Text GCN) for the corpus, which jointly learns embeddings for both words and documents, supervised by the known class labels of the documents.
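A condensed sketch of that corpus-level graph construction is shown below, using TF-IDF weights for document-word edges and positive PMI over sliding windows for word-word edges, as described in the TextGCN paper; the toy corpus and window size are assumptions, and the GCN itself is omitted:

```python
# Condensed sketch of the corpus-level graph: document-word edges weighted by TF-IDF,
# word-word edges by positive PMI over sliding windows. Toy corpus and window size
# are assumptions; the GCN layers are omitted.
from collections import Counter
from itertools import combinations
from math import log

from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["stocks rallied on strong earnings", "weak earnings dragged stocks lower"]
window = 3

# Document-word edges: TF-IDF weights.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(docs)              # (n_docs, n_words) edge weights

# Word-word edges: positive PMI computed over sliding windows.
windows = []
for doc in docs:
    toks = doc.split()
    for i in range(max(1, len(toks) - window + 1)):
        windows.append(toks[i:i + window])

word_count, pair_count = Counter(), Counter()
for win in windows:
    uniq = sorted(set(win))
    word_count.update(uniq)
    pair_count.update(combinations(uniq, 2))

n_win = len(windows)
ppmi_edges = {}
for (a, b), c in pair_count.items():
    pmi = log((c / n_win) / ((word_count[a] / n_win) * (word_count[b] / n_win)))
    if pmi > 0:
        ppmi_edges[(a, b)] = pmi

print(tfidf.shape, len(ppmi_edges))
```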