
Tokenization (data security)

Known as: Token, Tokenization
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token.
Source: Wikipedia
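The substitution described above can be sketched as a small token vault: sensitive values are exchanged for random tokens, and the mapping back to the original data lives only inside the vault. This is a minimal illustrative sketch (the `TokenVault` class and its method names are hypothetical, not a real product API):

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps sensitive values to random tokens."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, sensitive: str) -> str:
        # Reuse the existing token so equal inputs map to the same token.
        if sensitive in self._value_to_token:
            return self._value_to_token[sensitive]
        token = secrets.token_hex(8)  # random; carries no exploitable meaning
        self._token_to_value[token] = sensitive
        self._value_to_token[sensitive] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the original sensitive value.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
assert t != "4111-1111-1111-1111"            # token is non-sensitive
assert vault.detokenize(t) == "4111-1111-1111-1111"  # round-trip recovers the value
```

Because tokens are generated randomly rather than derived from the input, they cannot be reversed without access to the vault's mapping, which is the key distinction from encryption.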

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2016
This paper describes the goals, design and results of a shared task on the automatic linguistic annotation of German language… 
2013
Cooperative spectrum sensing is envisaged to increase reliability in cognitive radio networks. Several users cooperate to detect… 
2013
Numerous path-finding applications do not take into account actual road conditions such as congestion or traffic… 
Highly Cited
2007
We present an approach for extracting relations between named entities from natural language documents. The approach is based… 
2007
Tokenization is a fundamental preprocessing step in Information Retrieval systems in which text is turned into index terms. This… 
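The IR sense of tokenization mentioned above, turning raw text into index terms, can be sketched with a minimal tokenizer (the normalization choices here, lowercasing and splitting on non-alphanumerics, are one common assumption, not the paper's specific method):

```python
import re

def tokenize(text: str) -> list[str]:
    # Lowercase, split on runs of non-alphanumeric characters,
    # and drop empty strings to yield index terms.
    return [t for t in re.split(r"[^0-9a-z]+", text.lower()) if t]

print(tokenize("Tokenization: a fundamental preprocessing step!"))
# → ['tokenization', 'a', 'fundamental', 'preprocessing', 'step']
```

Real IR systems layer further steps (stopword removal, stemming) on top of this basic term extraction.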
2006
In the TREC 2005 Spam Evaluation Track, a number of popular spam filters – all owing their heritage to Graham’s A Plan for Spam… 
Review
2004
This paper describes a multi-word expression processor for preprocessing Turkish text for various language engineering… 
2003
Parallel sub-word recognition (PSWR) is a new model that has been proposed for language identification (LID) which does not need… 
2000
We are facing a growing demand for deploying large-scale sensor networks in the real world. However, there are still many… 
1996
Early automated systems for detecting plagiarism in student programs employed attribute counting techniques in their comparisons…