Corpus ID: 238419545

Exploiting Language Model for Efficient Linguistic Steganalysis

@inproceedings{Yi2021ExploitingLM,
  title={Exploiting Language Model for Efficient Linguistic Steganalysis},
  author={Biao Yi and Hanzhou Wu and Guorui Feng and Xinpeng Zhang},
  year={2021}
}
Recent advances in linguistic steganalysis have successively applied CNNs, RNNs, GNNs, and other efficient deep models to detect secret information in generative texts. These methods tend to seek ever stronger feature extractors to achieve better detection performance. However, we have found through experiments that there is in fact a significant difference between automatically generated stego texts and carrier texts in terms of the conditional probability distribution of individual words. Such…
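
The observation above, that a language model assigns systematically different conditional probabilities to words in stego texts versus carrier texts, can be illustrated with a short sketch. The snippet below is a minimal example, not the authors' implementation: it assumes a pretrained GPT-2 from the Hugging Face transformers library and reduces each text to a few summary statistics of its per-token log-probabilities, which a lightweight downstream classifier could then use to separate cover texts from stego texts.

    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    # Illustrative choice of language model; the paper's actual model may differ.
    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    @torch.no_grad()
    def token_logprobs(text: str) -> torch.Tensor:
        """Conditional log-probability of each token given its prefix."""
        ids = tokenizer(text, return_tensors="pt").input_ids
        logits = model(ids).logits                        # (1, T, vocab)
        logps = torch.log_softmax(logits[:, :-1], dim=-1)
        # Pick out the probability assigned to the token that actually follows.
        return logps.gather(-1, ids[:, 1:].unsqueeze(-1)).squeeze(-1).squeeze(0)

    def features(text: str) -> torch.Tensor:
        """Summary statistics of token log-probabilities (assumed feature set)."""
        lp = token_logprobs(text)
        return torch.stack([lp.mean(), lp.std(), lp.min()])

    print(features("The quick brown fox jumps over the lazy dog."))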


References

Showing 1-10 of 27 references
Linguistic Steganalysis With Graph Neural Networks
TLDR: In the proposed method, texts are represented as directed graphs in which nodes denote words and edges capture associations between them; a globally shared matrix records correlation strengths between words, so that each text can exploit this global information to obtain a better self-representation.
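
As an illustration of the graph construction described above (a sketch under assumed details, not the cited paper's exact procedure), the snippet below represents each text as a directed graph over consecutive words and weights edges with counts from a globally shared correlation table; the adjacency window and count-based weighting are hypothetical choices.

    from collections import defaultdict

    # Globally shared correlation strengths between word pairs,
    # accumulated over the whole corpus (assumed to be simple counts here).
    global_corr = defaultdict(float)

    def build_graph(tokens):
        """Return a directed word graph as {(u, v): edge_weight}."""
        edges = {}
        for u, v in zip(tokens, tokens[1:]):        # consecutive-word associations
            global_corr[(u, v)] += 1.0              # update the shared table
            edges[(u, v)] = global_corr[(u, v)]     # edge weight from global info
        return edges

    corpus = [
        "the cat sat on the mat".split(),
        "the cat ate the fish".split(),
    ]
    graphs = [build_graph(toks) for toks in corpus]
    print(graphs[1])    # edges of the second text, weighted by global counts
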
RNN-Stega: Linguistic Steganography Based on Recurrent Neural Networks
TLDR: A linguistic steganography method based on recurrent neural networks that automatically generates high-quality text covers from the secret bitstream to be hidden, achieving state-of-the-art performance.
TS-RNN: Text Steganalysis Based on Recurrent Neural Networks
TLDR: This letter observes that the conditional probability distribution of each word in automatically generated steganographic text is distorted once hidden information is embedded, and uses recurrent neural networks to extract these distribution differences and classify texts into cover and stego categories.
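
A generic recurrent classifier of the kind this summary describes can be sketched as follows; this is a hedged PyTorch example, not the TS-RNN architecture itself, and the vocabulary size, embedding and hidden dimensions, and binary output head are all assumptions.

    import torch
    import torch.nn as nn

    class RNNSteganalyzer(nn.Module):
        """Embed words, run a GRU, map the final state to cover/stego logits."""
        def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, 2)    # two classes: cover, stego

        def forward(self, token_ids):               # token_ids: (batch, seq_len)
            x = self.embed(token_ids)
            _, h = self.gru(x)                      # h: (1, batch, hidden_dim)
            return self.head(h.squeeze(0))          # (batch, 2) class logits

    model = RNNSteganalyzer()
    dummy = torch.randint(0, 10000, (4, 20))        # a batch of 4 token sequences
    print(model(dummy).shape)                       # torch.Size([4, 2])
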
Convolutional Neural Network Based Text Steganalysis
TLDR: This letter proposes a novel text steganalysis model based on a convolutional neural network, which captures complex dependencies and learns feature representations automatically from texts, using a word embedding layer to extract the semantic and syntactic features of words.
VAE-Stega: Linguistic Steganography Based on Variational Auto-Encoder
TLDR: Experimental results show that the proposed model greatly improves the imperceptibility of the generated steganographic sentences and thus achieves state-of-the-art performance.
A Fast and Efficient Text Steganalysis Method
TLDR: This letter proposes a fast and efficient text steganalysis method that achieves high detection accuracy and state-of-the-art performance.
Generative Text Steganography Based on LSTM Network and Attention Mechanism with Keywords
TLDR: Experiments show that the steganographic text generated by the proposed method has higher semantic quality and is better able to resist steganalysis, demonstrating its superiority.
Real-Time Text Steganalysis Based on Multi-Stage Transfer Learning
TLDR: The experimental results show that the proposed text steganalysis method outperforms previously reported methods, improving detection accuracy and inference efficiency simultaneously.
Steganalysis against substitution-based linguistic steganography based on context clusters
TLDR: This paper presents a new steganalysis scheme against substitution-based linguistic steganography that introduces context clusters to estimate context fitness, and shows how statistics of the context-fitness values can be used to distinguish normal texts from stego texts.
An Efficient Linguistic Steganography for Chinese Text
TLDR: A Chinese linguistic steganography algorithm is presented that utilizes existing Chinese information processing techniques based on the substitution of synonyms and variant forms of the same word, so as to decrease the interaction between the substituted word and its surrounding words.