General Cost Models for Evaluating Dimensionality Reduction in High-Dimensional Spaces

@article{Lian2009GeneralCM,
  title={General Cost Models for Evaluating Dimensionality Reduction in High-Dimensional Spaces},
  author={Xiang Lian and Lei Chen},
  journal={IEEE Transactions on Knowledge and Data Engineering},
  year={2009},
  volume={21},
  pages={1447-1460}
}
Similarity search in high-dimensional spaces suffers from a serious problem known as the "curse of dimensionality". To speed up retrieval, most previous approaches reduce the dimensionality of the entire data set to a fixed lower value before building indexes, referred to as global dimensionality reduction (GDR). More recent work instead reduces the dimensionality of data locally to different values, called local dimensionality reduction (LDR). In…
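The GDR/LDR distinction in the abstract can be illustrated with a small sketch. The snippet below is not from the paper; it assumes a PCA-based reduction and a precomputed clustering (the function names and the 3-cluster partition are hypothetical), purely to contrast reducing all points to one fixed dimensionality (GDR) against reducing each cluster to its own dimensionality (LDR).

# Minimal sketch (assumption: PCA-based reduction, numpy only).
import numpy as np

def pca_project(X, k):
    """Project rows of X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def global_dr(X, k):
    """GDR: one projection of the whole data set to a fixed dimension k."""
    return pca_project(X, k)

def local_dr(X, labels, ks):
    """LDR: each cluster (given by labels) is reduced to its own dimension ks[c]."""
    return {c: pca_project(X[labels == c], ks[c]) for c in np.unique(labels)}

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))           # 1000 points in 64 dimensions
labels = rng.integers(0, 3, size=1000)    # hypothetical 3-cluster partition
reduced_global = global_dr(X, k=8)                            # every point kept in 8 dims
reduced_local = local_dr(X, labels, ks={0: 4, 1: 8, 2: 16})   # per-cluster dimensionalities

The paper's contribution (per the title) is a cost model for evaluating such reductions, not a specific reduction algorithm, so the PCA choice above is only an illustrative stand-in.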
