
Large-scale ℓ1-regularized loss minimization problems arise in high-dimensional applications such as compressed sensing and high-dimensional supervised learning, including classification and regression problems. High-performance algorithms and implementations are critical to solving these problems efficiently. Building upon previous work on coordinate…
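The ℓ1-regularized problem above is classically solved by coordinate descent with a soft-thresholding update. The following is a minimal sequential sketch for the lasso objective 0.5·||Xw − y||² + λ·||w||₁; the function names and the incremental-residual bookkeeping are illustrative choices, not the paper's implementation.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of t * |.|_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iters=100):
    """Cyclic coordinate descent for 0.5*||Xw - y||^2 + lam*||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)   # per-coordinate curvature
    r = y - X @ w                   # residual, kept up to date incrementally
    for _ in range(n_iters):
        for j in range(d):
            if col_sq[j] == 0.0:
                continue
            # correlation of column j with the partial residual excluding j
            rho = X[:, j] @ r + col_sq[j] * w[j]
            w_new = soft_threshold(rho, lam) / col_sq[j]
            r += X[:, j] * (w[j] - w_new)
            w[j] = w_new
    return w
```

Each coordinate update is O(n), which is what makes coordinate descent attractive at the scales the snippet describes.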

- David J. Haglin, Shankar M. Venkatesan
- IEEE Trans. Computers
- 1991

- David J. Haglin, Anna M. Manning
- DMIN
- 2007

A new algorithm for minimal infrequent itemset mining is presented. Potential applications of finding infrequent itemsets include statistical disclosure risk assessment, bioinformatics, and fraud detection. This is the first algorithm designed specifically for finding these rare itemsets. Many itemset properties used implicitly in the algorithm are proved.…

- Ted Fischer, Andrew V. Goldberg, David J. Haglin, Serge A. Plotkin
- Inf. Process. Lett.
- 1993

- Eric L. Goodman, David J. Haglin, Chad Scherrer, Daniel G. Chavarría-Miranda, Jace Mogill, John Feo
- 2010 IEEE International Symposium on Parallel…
- 2010

Two of the most commonly used hashing strategies, linear probing and hashing with chaining, are adapted for efficient execution on a Cray XMT. These strategies are designed to minimize memory contention. Datasets that follow a power-law distribution cause significant performance challenges for shared-memory parallel hashing implementations. Experimental…
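The two strategies named above differ in where colliding keys go: chaining stores them in a per-slot list, while linear probing scans forward to the next empty slot. The sketch below shows serial versions only; the contention-avoidance machinery for the Cray XMT described in the snippet is not reproduced here.

```python
class ChainedHash:
    """Hashing with chaining: each slot holds a list of (key, value) pairs."""
    def __init__(self, capacity=1024):
        self.slots = [[] for _ in range(capacity)]

    def put(self, key, value):
        bucket = self.slots[hash(key) % len(self.slots)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)   # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self.slots[hash(key) % len(self.slots)]:
            if k == key:
                return v
        raise KeyError(key)

class LinearProbingHash:
    """Open addressing with linear probing: on collision, scan forward."""
    def __init__(self, capacity=1024):
        self.keys = [None] * capacity
        self.vals = [None] * capacity

    def put(self, key, value):
        i = hash(key) % len(self.keys)
        while self.keys[i] is not None and self.keys[i] != key:
            i = (i + 1) % len(self.keys)   # probe the next slot
        self.keys[i], self.vals[i] = key, value

    def get(self, key):
        i = hash(key) % len(self.keys)
        while self.keys[i] is not None:
            if self.keys[i] == key:
                return self.vals[i]
            i = (i + 1) % len(self.keys)
        raise KeyError(key)
```

Under a power-law key distribution, a few "hot" keys hash to the same slots repeatedly, which is why naive parallel versions of both structures suffer from memory contention.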

- David J. Haglin, Kenneth R. Mayes, +4 authors John A. Keane
- Concurrency and Computation: Practice and…
- 2009

- Anna M. Manning, David J. Haglin, John A. Keane
- Data Mining and Knowledge Discovery
- 2007

A new algorithm, SUDA2, is presented which finds minimally unique itemsets, i.e., minimal itemsets of frequency one. These itemsets, referred to as Minimal Sample Uniques (MSUs), are important for statistical agencies who wish to estimate the disclosure risk of their datasets. SUDA2 is a recursive algorithm which uses new observations about the properties…
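To make the MSU definition concrete: for each record, an MSU is a set of attributes whose combined values occur in exactly one record, such that no proper subset is already uniquely identifying. A brute-force search (nothing like SUDA2's recursive pruning, which is what makes the real algorithm scale) might look like this; the function name and bounded search size are illustrative assumptions.

```python
from itertools import combinations
from collections import Counter

def minimal_sample_uniques(rows, max_size=3):
    """Exhaustive MSU search: for each record, find attribute subsets whose
    values identify that record uniquely, keeping only subsets that contain
    no smaller uniquely identifying subset. rows: equal-length tuples."""
    n_cols = len(rows[0])
    msus = {i: [] for i in range(len(rows))}
    for size in range(1, max_size + 1):
        for cols in combinations(range(n_cols), size):
            counts = Counter(tuple(row[c] for c in cols) for row in rows)
            for i, row in enumerate(rows):
                if counts[tuple(row[c] for c in cols)] == 1:
                    # minimal only if no already-found MSU is a subset
                    if not any(set(s) <= set(cols) for s in msus[i]):
                        msus[i].append(cols)
    return msus
```

Because sizes are scanned in increasing order, any uniquely identifying subset found earlier blocks its supersets, which is exactly the minimality condition in the definition above.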

- Cliff Joslyn, Bob Adolf, +5 authors David Mizell

As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to…

We present a generic framework for parallel coordinate descent (CD) algorithms that includes, as special cases, the original sequential algorithms Cyclic CD and Stochastic CD, as well as the recent parallel Shotgun algorithm. We introduce two novel parallel algorithms that are also special cases, Thread-Greedy CD and Coloring-Based CD, and give performance…
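The unifying idea in such a framework is that the CD variants share one update loop and differ only in how the next coordinate is selected. A minimal sequential sketch of that idea (the parallel variants named above would apply updates to several coordinates concurrently, which is not shown, and the function signature is an assumption):

```python
import random
import numpy as np

def cd_minimize(grad, L, d, rule="cyclic", n_steps=200, seed=0):
    """Coordinate descent skeleton for a smooth objective: only the
    coordinate-selection rule changes between variants.
    grad: full-gradient function; L: per-coordinate Lipschitz constants."""
    rng = random.Random(seed)
    x = np.zeros(d)
    for t in range(n_steps):
        g = grad(x)
        if rule == "cyclic":
            j = t % d                       # Cyclic CD
        elif rule == "stochastic":
            j = rng.randrange(d)            # Stochastic CD
        else:
            j = int(np.argmax(np.abs(g)))   # greedy (Gauss-Southwell) rule
        x[j] -= g[j] / L[j]                 # coordinate-wise gradient step
    return x
```

Factoring the selection rule out of the update loop is what lets one analysis cover all the variants at once.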

To-date, the application of high-performance computing resources to Semantic Web data has largely focused on commodity hardware and distributed memory platforms. In this paper we make the case that more specialized hardware can offer superior scaling and close to an order of magnitude improvement in performance. In particular we examine the Cray XMT. Its…