Adam L. Buchsbaum

We describe a new external memory data structure, the buffered repository tree, and use it to provide the first non-trivial external memory algorithm for directed breadth-first search (BFS) and an improved external algorithm for directed depth-first search. We also demonstrate the equivalence of various formulations of external undirected BFS, and we use…
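The buffered repository tree supports only two operations, insert(key, value) and extract(key), the latter reporting and deleting every value stored under a key. As a hedged illustration, the following Python stand-in mimics that interface in memory; the real structure is a balanced tree whose internal-node buffers are flushed lazily to keep the amortized I/O cost per operation low.

```python
from collections import defaultdict

class BufferedRepositoryTree:
    """In-memory stand-in for the buffered repository tree (BRT) interface.

    The external-memory BRT is a balanced tree whose internal nodes carry
    buffers of pending insertions that are flushed toward the leaves lazily;
    here a dictionary of lists plays that role so the interface can be shown.
    """

    def __init__(self):
        self._items = defaultdict(list)

    def insert(self, key, value):
        # BRT: append (key, value) to the root buffer; flush lazily.
        self._items[key].append(value)

    def extract(self, key):
        # BRT: flush buffers on the root-to-leaf path for `key`, then
        # report and delete every stored value with that key.
        return self._items.pop(key, [])


# Rough usage pattern in external graph search: record edges keyed by one
# endpoint as vertices are settled, then retrieve them in bulk later
# instead of paying a random disk access per edge.
brt = BufferedRepositoryTree()
brt.insert("v", "u1")
brt.insert("v", "u2")
print(brt.extract("v"))  # ['u1', 'u2']
```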
We present a new approach for designing external graph algorithms and use it to design simple, deterministic and randomized external algorithms for computing connected components, minimum spanning forests, bottleneck minimum spanning forests, maximal independent sets (randomized only), and maximal matchings in undirected graphs. Our I/O bounds compete with…
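For reference, here is the classical in-memory union-find computation of connected components. It only fixes the problem being solved and is not the external-memory technique of the paper.

```python
def connected_components(n, edges):
    """Classical in-memory union-find; labels vertices 0..n-1 by component.

    This is only the problem statement made concrete, not the external
    algorithm of the paper, which is organized so that edge lists are
    scanned sequentially rather than accessed at random.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv

    return [find(v) for v in range(n)]


print(connected_components(5, [(0, 1), (1, 2), (3, 4)]))  # [2, 2, 2, 4, 4]
```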
We present a new linear-time algorithm to find the immediate dominators of all vertices in a flowgraph. Our algorithm is simpler than previous linear-time algorithms: rather than employ complicated data structures, we combine the use of microtrees and memoization with new observations on a restricted class of path compressions. We have implemented our…
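To make the input and output of the dominator problem concrete, here is the well-known simple iterative algorithm (in the style of Cooper, Harvey, and Kennedy). It is a near-linear baseline, not the linear-time algorithm of the paper.

```python
def immediate_dominators(succ, root):
    """Simple iterative dominator computation on a flowgraph.

    Returns idom[v] for every vertex reachable from `root`; `succ` maps
    each vertex to its list of successors.
    """
    # Reverse postorder of the reachable subgraph.
    order, seen = [], {root}
    def dfs(u):
        for v in succ.get(u, []):
            if v not in seen:
                seen.add(v)
                dfs(v)
        order.append(u)
    dfs(root)
    rpo = list(reversed(order))
    rpo_num = {v: i for i, v in enumerate(rpo)}

    preds = {v: [] for v in rpo}
    for u in rpo:
        for v in succ.get(u, []):
            if v in preds:
                preds[v].append(u)

    idom = {root: root}

    def intersect(a, b):
        # Walk both candidates up the current dominator tree until they meet.
        while a != b:
            while rpo_num[a] > rpo_num[b]:
                a = idom[a]
            while rpo_num[b] > rpo_num[a]:
                b = idom[b]
        return a

    changed = True
    while changed:
        changed = False
        for v in rpo:
            if v == root:
                continue
            candidates = [p for p in preds[v] if p in idom]
            new = candidates[0]
            for p in candidates[1:]:
                new = intersect(new, p)
            if idom.get(v) != new:
                idom[v] = new
                changed = True
    return idom


# Diamond flowgraph: r -> a, r -> b, a -> c, b -> c; idom[c] == r.
print(immediate_dominators({"r": ["a", "b"], "a": ["c"], "b": ["c"]}, "r"))
```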
We present two new data structure tools—disjoint set union with bottom-up linking, and pointer-based radix sort—and combine them with bottom-level microtrees to devise the first linear-time pointer-machine algorithms for off-line least common ancestors, minimum spanning tree (MST) verification, randomized MST construction, and computing dominators in a…
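The off-line least-common-ancestor problem mentioned above is classically solved with Tarjan's union-find-based algorithm, sketched below; the paper's contribution is to reach true linear time on a pointer machine, which plain union-find does not.

```python
import collections

def offline_lca(parent, root, queries):
    """Classical Tarjan off-line LCA via disjoint set union.

    `parent` maps each tree node to its parent (root maps to None);
    `queries` is a list of (u, v) pairs, answered in one DFS pass.
    """
    children = collections.defaultdict(list)
    for v, p in parent.items():
        if v != root:
            children[p].append(v)

    pending = collections.defaultdict(list)   # node -> [(other node, query index)]
    for i, (u, v) in enumerate(queries):
        pending[u].append((v, i))
        pending[v].append((u, i))

    dsu = {v: v for v in parent}
    anc = {}
    visited = set()
    answers = [None] * len(queries)

    def find(x):
        while dsu[x] != x:
            dsu[x] = dsu[dsu[x]]
            x = dsu[x]
        return x

    def visit(u):
        anc[find(u)] = u
        for c in children[u]:
            visit(c)
            dsu[find(c)] = find(u)   # union child's set into u's set
            anc[find(u)] = u
        visited.add(u)
        for other, i in pending[u]:
            if other in visited:
                answers[i] = anc[find(other)]

    visit(root)
    return answers


# Tree: a is the root, b and c are children of a, d is a child of b.
parent = {"a": None, "b": "a", "c": "a", "d": "b"}
print(offline_lca(parent, "a", [("d", "c"), ("d", "b")]))  # ['a', 'b']
```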
DYNAMIC STORAGE ALLOCATION is the problem of packing given axis-aligned rectangles into a horizontal strip of minimum height by sliding the rectangles vertically but not horizontally. Where L = LOAD is the maximum sum of heights of rectangles that intersect any vertical line and OPT is the minimum height of the…
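A minimal first-fit sketch of the allocation problem, assuming each rectangle is given as (left, right, height) and only its vertical offset may be chosen; this heuristic is for illustration only and carries none of the approximation guarantees discussed in the paper.

```python
def first_fit_allocation(rects):
    """First-fit heuristic for DYNAMIC STORAGE ALLOCATION.

    Each rectangle's horizontal span [left, right) is fixed; we choose a
    vertical offset so that horizontally overlapping rectangles do not
    overlap vertically, and report the resulting strip height.
    """
    placed = []  # (left, right, y, height)

    def overlaps(a_lo, a_hi, b_lo, b_hi):
        return a_lo < b_hi and b_lo < a_hi

    offsets = []
    for left, right, height in rects:
        # Candidate offsets: 0 and the top of every conflicting rectangle.
        conflicts = [(y, h) for (l, r, y, h) in placed if overlaps(left, right, l, r)]
        candidates = sorted({0.0} | {y + h for y, h in conflicts})
        for y in candidates:
            if all(not overlaps(y, y + height, cy, cy + ch) for cy, ch in conflicts):
                break
        placed.append((left, right, y, height))
        offsets.append(y)

    strip_height = max((y + h for (_, _, y, h) in placed), default=0.0)
    return offsets, strip_height


# LOAD for these three rectangles is 3; first fit also reaches height 3 here.
print(first_fit_allocation([(0, 4, 1), (2, 6, 2), (5, 8, 1)]))
```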
We formalize the problem of maintaining views of graphs. These are graphs induced by the contraction of vertex subsets that are defined by associated hierarchies. We provide data structures that allow applications to refine and coarsen such views interactively and efficiently, in time linear in the number of changes induced by any exploration operation. The…
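As a toy illustration of what a view is, the sketch below recomputes from scratch the graph obtained by contracting the subtrees rooted at a chosen antichain of hierarchy nodes; the paper's data structures instead refine and coarsen an existing view incrementally.

```python
def graph_view(edges, hierarchy_parent, view_nodes):
    """Compute the view of a graph induced by contracting hierarchy subtrees.

    `hierarchy_parent` maps each hierarchy node to its parent (root maps to
    None); leaves of the hierarchy are the graph's vertices.  `view_nodes`
    must be an antichain whose subtrees cover all vertices.
    """
    def view_ancestor(v):
        # Climb the hierarchy until the enclosing view node is reached.
        while v is not None and v not in view_nodes:
            v = hierarchy_parent[v]
        return v

    contracted = set()
    for u, v in edges:
        cu, cv = view_ancestor(u), view_ancestor(v)
        if cu != cv:
            contracted.add((min(cu, cv), max(cu, cv)))
    return contracted


# Hierarchy: root R has children A (over leaves 1, 2) and B (over leaves 3, 4).
parent = {"R": None, "A": "R", "B": "R", 1: "A", 2: "A", 3: "B", 4: "B"}
edges = [(1, 2), (2, 3), (3, 4)]
print(graph_view(edges, parent, {"A", "B"}))   # {('A', 'B')}
```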
We study the problem of compressing massive tables within the partition-training paradigm introduced by Buchsbaum et al. [2000], in which a table is partitioned by an off-line training procedure into disjoint intervals of columns, each of which is compressed separately by a standard, on-line compressor like gzip. We provide a new theory that unifies…
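A small sketch of the partition-training idea, assuming the cost of a candidate column interval is simply its zlib-compressed size on the training sample and using dynamic programming over contiguous intervals; real systems use more refined cost measures and compressors.

```python
import zlib

def train_partition(sample_rows):
    """Choose a partition of columns into contiguous intervals by dynamic
    programming, scoring each candidate interval by the zlib-compressed
    size of its columns on a training sample.
    """
    n_cols = len(sample_rows[0])

    def interval_cost(i, j):
        # Cost of compressing columns i..j-1 together, measured on the sample.
        blob = "\n".join(",".join(row[i:j]) for row in sample_rows)
        return len(zlib.compress(blob.encode()))

    # best[j] = (cost of best partition of columns 0..j-1, predecessor cut)
    best = [(0, None)] + [(float("inf"), None)] * n_cols
    for j in range(1, n_cols + 1):
        for i in range(j):
            cost = best[i][0] + interval_cost(i, j)
            if cost < best[j][0]:
                best[j] = (cost, i)

    # Recover the chosen intervals from the predecessor links.
    intervals, j = [], n_cols
    while j > 0:
        i = best[j][1]
        intervals.append((i, j))
        j = i
    return list(reversed(intervals))


sample = [["2024", "01", "NYC", "3.5"], ["2024", "02", "NYC", "4.1"]]
print(train_partition(sample))  # e.g. [(0, 4)] or finer, depending on the data
```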
We study the problem of compressing massive tables. We devise a novel compression paradigm—training for lossless compression—which assumes that the data exhibit dependencies that can be learned by examining a small amount of training material. We develop an experimental methodology to test the approach. Our result is a system, pzip, which outperforms gzip…
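A hedged sketch of the train-then-apply workflow, with made-up column data: a partition chosen off-line on a small sample is reused to compress the full table interval by interval, with zlib standing in for the on-line compressor.

```python
import zlib

def compress_by_partition(rows, intervals):
    """Serialize and compress each trained column interval separately.

    The partition `intervals` is assumed to have been chosen off-line on a
    small training sample; here it is simply given.
    """
    chunks = []
    for i, j in intervals:
        blob = "\n".join(",".join(row[i:j]) for row in rows)
        chunks.append(zlib.compress(blob.encode()))
    return chunks


# Hypothetical table: year, month, city, reading.
rows = [["2024", "{:02d}".format(m), "NYC", str(3.0 + m / 10)] for m in range(1, 13)]

whole = len(zlib.compress("\n".join(",".join(r) for r in rows).encode()))
split = sum(len(c) for c in compress_by_partition(rows, [(0, 3), (3, 4)]))
print(whole, split)  # compare layouts; on large real tables a trained partition can win
```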
Suppose we are given a set of objects that cover a region and a duration associated with each object. Viewing the objects as jobs, can we schedule their beginning times to maximize the length of time that the original region remains covered? We call this problem the SENSOR COVER PROBLEM. It arises in the context of covering a region with sensors. For…
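To pin down the objective, the following discrete-time checker (an illustration only, with hypothetical interval sensors) measures how long a one-dimensional region stays covered under a given assignment of start times.

```python
def covered_duration(region_end, sensors, starts, horizon):
    """Number of whole time steps from 0 for which [0, region_end) stays
    fully covered, given integer start times.

    Each sensor is (lo, hi, duration) and is active on [start, start + duration).
    A brute-force discrete-time checker, just to make the objective concrete.
    """
    for t in range(horizon):
        active = [(lo, hi) for (lo, hi, d), s in zip(sensors, starts)
                  if s <= t < s + d]
        # Sweep the active intervals to check they cover [0, region_end).
        reach = 0
        for lo, hi in sorted(active):
            if lo > reach:
                break
            reach = max(reach, hi)
        if reach < region_end:
            return t
    return horizon


# Two identical sensors over [0, 2): scheduling them back to back keeps the
# region covered twice as long as starting both at time 0.
sensors = [(0, 2, 3), (0, 2, 3)]
print(covered_duration(2, sensors, [0, 0], 10))  # 3
print(covered_duration(2, sensors, [0, 3], 10))  # 6
```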