
- Mina Ghashami, Jeff M. Phillips
- SODA
- 2014

We consider processing an n × d matrix A in a stream with row-wise updates according to a recent algorithm called Frequent Directions (Liberty, KDD 2013). This algorithm maintains an ℓ × d matrix Q deterministically, processing each row in O(dℓ²) time; the processing time can be decreased to O(dℓ) with a slight modification in the algorithm and a constant…
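For illustration, the core Frequent Directions update can be sketched in a few lines of numpy. This is a minimal, unoptimized version (the function name and O(dℓ²)-per-shrink structure are ours, not the paper's faster O(dℓ) variant): rows are appended to an ℓ × d buffer, and whenever the buffer fills, an SVD shrinks every direction by the smallest squared singular value, freeing one row.

```python
import numpy as np

def frequent_directions(A, ell):
    """Stream the rows of A into an ell x d sketch B (Frequent Directions).
    Minimal sketch for illustration; not the optimized variant."""
    _, d = A.shape
    B = np.zeros((ell, d))
    next_zero = 0                      # index of the next empty row in B
    for row in A:
        if next_zero == ell:           # sketch full: shrink via SVD
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            s = np.sqrt(np.maximum(s ** 2 - s[-1] ** 2, 0.0))
            B = s[:, None] * Vt        # rows rescaled by shrunken values
            next_zero = ell - 1        # smallest direction is now all zeros
        B[next_zero] = row
        next_zero += 1
    return B

# toy usage: the sketch obeys ||A^T A - B^T B||_2 <= ||A||_F^2 / ell
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 40))
B = frequent_directions(A, ell=10)
err = np.linalg.norm(A.T @ A - B.T @ B, 2)
```

The shrink step is what makes the guarantee deterministic: each shrink removes ℓ·δ of Frobenius mass while perturbing the covariance by at most δ, so the total covariance error is bounded by ‖A‖²_F/ℓ.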

- Mina Ghashami, Edo Liberty, Jeff M. Phillips, David P. Woodruff
- SIAM J. Comput.
- 2016

We describe a new algorithm called Frequent Directions for deterministic matrix sketching in the row-updates model. The algorithm is presented with an arbitrary input matrix A ∈ R^{n×d} one row at a time. It performs O(dℓ) operations per row and maintains a sketch matrix B ∈ R^{ℓ×d} such that for any k < ℓ, ‖AᵀA − BᵀB‖₂ ≤ ‖A − A_k‖²_F/(ℓ − k) and ‖A − π_{B_k}(A)‖²_F ≤…

- Amey Desai, Mina Ghashami, Jeff M. Phillips
- IEEE Transactions on Knowledge and Data…
- 2014

Matrices have become essential data representations for many large-scale problems in data analytics, and hence matrix sketching is a critical task. Although much research has focused on improving the error/size tradeoff under various sketching paradigms, the many forms of error bounds make these approaches hard to compare in theory and in practice. This…

- Mina Ghashami, Edo Liberty, Jeff M. Phillips
- KDD
- 2016

This paper describes Sparse Frequent Directions, a variant of Frequent Directions for sketching sparse matrices. It resembles the original algorithm in many ways: both receive the rows of an input matrix A ∈ R^{n×d} one by one in the streaming setting and compute a small sketch B ∈ R^{ℓ×…

- Mina Ghashami, Jeff M. Phillips, Feifei Li
- PVLDB
- 2014

Tracking and approximating data matrices in streaming fashion is a fundamental challenge. The problem requires more care and attention when data comes from multiple distributed sites, each receiving a stream of data. This paper considers the problem of "tracking approximations to a matrix" in the distributed streaming model. In this model, there are m…

- Mina Ghashami, Daniel J. Perry, Jeff M. Phillips
- AISTATS
- 2016

Kernel principal component analysis (KPCA) provides a concise set of basis vectors which capture nonlinear structures within large data sets, and is a central tool in data analysis and learning. To allow for nonlinear relations, typically a full n × n kernel matrix is constructed over n data points, but this requires too much space and time for large values…
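For context, the full-kernel-matrix approach that this abstract contrasts against can be sketched as follows. This is a toy RBF-kernel PCA in numpy (the function name and `gamma` parameter are ours); the explicit n × n Gram matrix K is exactly the object whose O(n²) space cost streaming KPCA methods avoid.

```python
import numpy as np

def kpca(X, n_components, gamma=1.0):
    """Toy RBF-kernel PCA: builds the full n x n Gram matrix,
    the O(n^2)-space bottleneck that streaming KPCA sidesteps."""
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    # RBF kernel K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # center the kernel matrix in feature space
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]   # largest first
    # embeddings: eigenvectors scaled by sqrt of eigenvalues
    return vecs[:, :n_components] * np.sqrt(np.maximum(vals[:n_components], 0))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
Y = kpca(X, n_components=2)
```

Both the kernel construction and the eigendecomposition scale quadratically (or worse) in n, which is why a small-space sketch of the kernel feature map is attractive.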

- Mina Ghashami, Amirali Abdullah
- ArXiv
- 2015

Big data is becoming ever more ubiquitous, ranging over massive video repositories, document corpuses, image sets and Internet routing history. Proximity search and clustering are two algorithmic primitives fundamental to data analysis, but suffer from the "curse of dimensionality" on these gigantic datasets. A popular attack for this problem is to…

Random Projection is another class of methods used for low-rank matrix approximation. A random projection algorithm projects datapoints from a high-dimensional space R^n onto a lower-dimensional subspace R^r (r ≪ n) using a random matrix S ∈ R^{r×n}. The key idea of random mapping comes from the Johnson-Lindenstrauss lemma [7] (we will explain later in detail)…
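A minimal sketch of such a random projection, assuming numpy (the function name is ours, and we project rows rather than columns, i.e. we apply Sᵀ on the right): a Gaussian matrix scaled by 1/√r preserves squared Euclidean norms in expectation, and by the Johnson-Lindenstrauss lemma r = O(ε⁻² log m) dimensions suffice to preserve all pairwise distances among m points within a (1 ± ε) factor, with high probability.

```python
import numpy as np

def random_projection(X, r, seed=0):
    """Project the rows of X from R^n down to R^r with a Gaussian map,
    scaled by 1/sqrt(r) so squared distances are preserved in expectation."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    S = rng.standard_normal((n, r)) / np.sqrt(r)
    return X @ S

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 1000))
Y = random_projection(X, r=50)
```

The map is linear and data-oblivious: S is drawn once, independently of X, which is what makes random projection cheap in streaming and distributed settings.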

- Mina Ghashami, Hoda Mashayekhi, Jafar Habibi
- IDCS
- 2012

Document indexing using dimension reduction has been widely studied in recent years. Application of these methods in large distributed systems may be inefficient due to the required computational, storage, and communication costs. In this paper, we propose DLPR, a distributed locality preserving dimension reduction algorithm, to project a large distributed…
