Tokenization (data security)

Known as: Token, Tokenization 
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as… 
Wikipedia
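The substitution described above is typically backed by a secure lookup table (a "token vault") that alone can map tokens back to the original values. The sketch below is a minimal, hypothetical illustration of that idea; the `TokenVault` class and its methods are invented for this example, and a production system would use a hardened, access-controlled store rather than an in-memory dictionary.

```python
import secrets


class TokenVault:
    """Illustrative token vault: maps random surrogate tokens
    to the sensitive values they replace."""

    def __init__(self):
        self._store = {}  # token -> sensitive value

    def tokenize(self, sensitive: str) -> str:
        # The token is random, so it cannot be reversed
        # mathematically; only the vault holds the mapping.
        token = secrets.token_hex(8)
        self._store[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        # Detokenization is a privileged lookup in the vault.
        return self._store[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"          # stored systems only see the token
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Because the token carries no exploitable relationship to the original value, systems that handle only tokens fall outside the scope of protections required for the sensitive data itself.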

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2016
This paper describes the goals, design and results of a shared task on the automatic linguistic annotation of German language… 
2016
We introduce the first generic text representation model that is completely nonsymbolic, i.e., it does not require the… 
Review
2016
In this research paper, the authors address the problem of algorithms for secured transmission over wireless LANs. This…
2016
We present a detailed description of our submission to the EmpiriST shared task 2015 for tokenization and part-of-speech tagging… 
Highly Cited
2015
We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than… 
2013
We present a comprehensive Arabic tagging system: from the raw text to tagging disambiguation. For each processing step in the… 
2010
We describe an approach to simultaneous tokenization and part-of-speech tagging that is based on separating the closed and open… 
2010
The paper describes our experiments with English-Czech machine translation for WMT10 in 2010. Focusing primarily on the… 
2009
This chapter describes several issues that are fundamental to achieving accurate Chinese parsing given available Chinese… 
2002
Current taggers assume that input texts are already tokenized, i.e. correctly segmented in tokens or high level information units…