Revisiting wavelet compression for large-scale climate data using JPEG 2000 and ensuring data precision

@inproceedings{Woodring2011RevisitingWC,
  title={Revisiting wavelet compression for large-scale climate data using JPEG 2000 and ensuring data precision},
  author={J. Woodring and S. Mniszewski and C. Brislawn and David E. DeMarle and J. Ahrens},
  booktitle={2011 IEEE Symposium on Large Data Analysis and Visualization},
  year={2011},
  pages={31--38}
}
We revisit wavelet compression by using a standards-based method to reduce large-scale data sizes for production scientific computing. Many of the bottlenecks in visualization and analysis come from limited bandwidth in data movement, from storage to networks. The majority of the processing time for visualization and analysis is spent reading or writing large-scale data, or moving data from a remote site in a distance scenario. Using wavelet compression in JPEG 2000, we provide a mechanism to…
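The paper's mechanism is standards-based JPEG 2000 coding of wavelet coefficients. As an illustrative stand-in (not the paper's codec), the sketch below performs the transform-and-threshold step with PyWavelets, assumed installed; its bior4.4 filter is close to the CDF 9/7 wavelet used by lossy JPEG 2000.

```python
# Illustrative sketch, not the paper's JPEG 2000 pipeline: lossy wavelet
# compression of a 3-D field by zeroing small detail coefficients, which
# makes the coefficient array highly compressible by any entropy coder.
import numpy as np
import pywt  # assumption: PyWavelets is installed

def wavelet_compress(field, keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedecn(field, wavelet="bior4.4", level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0  # sparsify the transform
    return arr, slices

def wavelet_decompress(arr, slices):
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedecn")
    return pywt.waverecn(coeffs, wavelet="bior4.4")

field = np.random.rand(64, 64, 64).astype(np.float32)
arr, slices = wavelet_compress(field)
recon = wavelet_decompress(arr, slices)
print("max abs error:", float(np.max(np.abs(recon - field))))
```

JPEG 2000 additionally entropy-codes the quantized coefficients and, as the paper stresses, exposes precision controls so that reconstruction error can be bounded explicitly.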
Citations

Statistical Analysis of Compressed Climate Data
TLDR
This work evaluates the effects of two leading compression algorithms, SZ and zfp, on daily average and monthly maximum temperature data, and on daily average precipitation rate data, from a historical run of CESM1 CAM5.
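As a concrete illustration of this kind of experiment, the sketch below runs zfp in fixed-accuracy mode through its zfpy Python binding, assumed installed; the temperature field is synthetic, standing in for CESM1 CAM5 output.

```python
# Hypothetical example of applying zfp to temperature-like data via the
# zfpy binding; synthetic values stand in for real model output.
import numpy as np
import zfpy  # assumption: the zfpy binding is installed

temps = 280.0 + 15.0 * np.random.rand(30, 192, 288)   # synthetic daily means (K)
buf = zfpy.compress_numpy(temps, tolerance=1e-2)      # fixed-accuracy mode
recon = zfpy.decompress_numpy(buf)

print("compression ratio: %.1f:1" % (temps.nbytes / len(buf)))
print("max abs error:", float(np.max(np.abs(recon - temps))))
```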
Video Compression for Ocean Simulation Image Databases
TLDR
Overall, it is shown that video compression techniques provide an efficient means of storing image databases at a shareable size while preserving image quality; this enables wise use of available disk space, so scientists can more easily study the physical features of interest.
Climate research requires monitoring a large range of spatial and temporal scales to understand the climate system and potential future impacts. Climate simulations are now run with very high…
Reducing the HPC-datastorage footprint with MAFISC—Multidimensional Adaptive Filtering Improved Scientific data Compression
TLDR
A lossless algorithm is developed and its compression ratio is compared to that of standard compression tools; the paper also discusses the economics of data compression in HPC environments, using the German Climate Computing Center as an example.
A Statistical Analysis of Compressed Climate Model Data
The data storage burden resulting from large climate model simulations continues to grow. While lossy data compression methods can alleviate this burden, they introduce the possibility that key…
Evaluating Lossy Compression on Climate Data
TLDR
This paper examines the effects of three lossy compression methods (GRIB2 encoding, GRIB2 using JPEG 2000 and LZMA, and the commercial Samplify APAX algorithm) on decompressed data quality, compression ratio, and processing time.
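The evaluation loop is straightforward to reproduce in miniature. The sketch below is a stand-in assuming only NumPy and the standard library: quantization to a fixed absolute precision (loosely analogous to GRIB2 simple packing) followed by LZMA, reporting ratio, time, and maximum error.

```python
# Minimal evaluation harness (a stand-in, not the study's code): quantize
# to a fixed absolute precision, compress with stdlib LZMA, and report
# compression ratio, compression time, and maximum reconstruction error.
import lzma
import time
import numpy as np

def evaluate(field, abs_precision):
    t0 = time.perf_counter()
    q = np.round(field / abs_precision).astype(np.int32)  # lossy step
    buf = lzma.compress(q.tobytes())                      # lossless step
    seconds = time.perf_counter() - t0
    recon = np.frombuffer(lzma.decompress(buf), dtype=np.int32)
    recon = recon.reshape(field.shape) * abs_precision
    return {
        "ratio": field.nbytes / len(buf),
        "compress_s": seconds,
        "max_abs_err": float(np.max(np.abs(recon - field))),
    }

print(evaluate(np.random.rand(128, 128, 64).astype(np.float32), 1e-3))
```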
A High Performance Compression Method for Climate Data
TLDR
This paper proposes a lossless compression algorithm for time-spatial climate floating-point arrays that eliminates data redundancy efficiently through adaptive prediction, XOR-differencing, and multi-way compression; static regions are identified and compressed more efficiently.
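The XOR-differencing step is easy to demonstrate. The sketch below is a simplification (predecessor prediction only, none of the paper's adaptive prediction or multi-way compression): XORing each value's bit pattern with a good prediction leaves residuals full of zero bits for a downstream coder to exploit.

```python
# Simplified XOR-differencing (predecessor prediction only): XOR each
# value's bit pattern with its prediction; matching bits become zero.
import numpy as np

def xor_diff_encode(values):
    bits = np.ascontiguousarray(values, dtype=np.float64).view(np.uint64)
    res = bits.copy()
    res[1:] ^= bits[:-1]          # residual = value XOR previous value
    return res

def xor_diff_decode(residuals):
    # A running XOR scan undoes the differencing exactly.
    return np.bitwise_xor.accumulate(residuals).view(np.float64)

x = np.cumsum(np.random.rand(1000))   # smooth, hence predictable, series
assert np.array_equal(xor_diff_decode(xor_diff_encode(x)), x)  # lossless
```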
Evaluating lossy data compression on climate simulation data within a large ensemble
TLDR
This paper reports the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which climate scientists were challenged to examine features of the data relevant to their interests and to identify which of the ensemble members had been compressed and reconstructed.
A Collaborative Effort to Improve Lossy Compression Methods for Climate Data
TLDR
This paper reports the initial results of a successful and mutually beneficial collaboration between the two communities, which led to improvements in a well-regarded compression algorithm and more effective compression of climate simulation data.
A methodology for evaluating the impact of data compression on climate simulation data
TLDR
It is found that the diversity of climate data requires the individual treatment of variables; in doing so, the reconstructed data can fall within the natural variability of the system while achieving compression rates of up to 5:1.
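The "within natural variability" criterion can be made concrete by comparing pointwise reconstruction error against pointwise ensemble spread. Everything in the sketch below is synthetic; the ensemble size, grid, and error magnitude are invented for illustration.

```python
# Hypothetical check of the "within natural variability" idea: is the
# reconstruction error smaller than the ensemble spread at each point?
import numpy as np

rng = np.random.default_rng(0)
ensemble = rng.normal(288.0, 2.0, size=(30, 96, 144))      # fake members (K)
member = ensemble[0]
recon = member + rng.normal(0.0, 0.05, size=member.shape)  # fake decompressed data

spread = ensemble.std(axis=0)          # pointwise natural variability
within = np.abs(recon - member) < spread
print(f"{within.mean():.1%} of grid points lie within one ensemble sigma")
```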

References

Showing 1-10 of 61 references.
Wavelet-Based 3D Compression Scheme for Very Large Volume Data
TLDR
This paper describes an effective 3D compression scheme for very large volume data that exploits the power of wavelet theory, achieves fairly good compression ratios, and minimizes the overhead caused during run-time reconstruction of voxel values.
An efficient wavelet-based compression method for volume rendering
  • Tae-Young Kim, Y. Shin
  • Computer Science
  • Proceedings of the Seventh Pacific Conference on Computer Graphics and Applications
  • 1999
TLDR
This paper presents an efficient wavelet-based compression method providing fast visualization of large volume data: the volume is divided into individual blocks with regular resolution, resulting in a fairly good compression ratio and fast reconstruction.
A multiresolution volume rendering framework for large-scale time-varying data visualization
TLDR
A new parallel multiresolution volume rendering framework for large-scale time-varying data visualization is presented; it uses the wavelet-based time-space partitioning (WTSP) tree to eliminate the parent-child data dependency during reconstruction and to achieve load-balanced rendering.
Wavelet‐Based 3D Compression Scheme for Interactive Visualization of Very Large Volume Data
TLDR
This paper presents an effective 3D compression scheme for interactive visualization of very large volume data that exploits the power of wavelet theory and minimizes the overhead caused during run-time reconstruction of voxel values.
Fast and Efficient Compression of Floating-Point Data
TLDR
This work proposes a simple scheme for lossless, online compression of floating-point data that transparently integrates into the I/O of many applications and achieves state-of-the-art compression rates and speeds.
Wavelet based 3D compression with fast random access for very large volume data
  • Flemming Friche Rodler
  • Computer Science
  • Proceedings of the Seventh Pacific Conference on Computer Graphics and Applications
  • 1999
TLDR
Experimental results on the CT dataset of the Visible Human show that the proposed wavelet-based method for compressing volumetric data with little loss in quality provides very high compression rates with fairly fast random access.
FPC: A High-Speed Compressor for Double-Precision Floating-Point Data
TLDR
This paper describes and evaluates FPC, a fast lossless compression algorithm for linear streams of 64-bit floating-point data that works well on hard-to-compress scientific data sets and meets the throughput demands of high-performance systems.
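FPC's central trick, hash-table predictors whose predictions are XORed with the true values, fits in a few lines. The sketch below implements a single FCM-style predictor only; real FPC pairs it with a DFCM predictor and encodes leading-zero byte counts, and the table size and hash here are illustrative.

```python
# Toy FCM-style predictor in the spirit of FPC (single predictor only;
# real FPC also runs a DFCM predictor and codes leading-zero byte counts).
import numpy as np

TABLE_BITS = 16
MASK = (1 << TABLE_BITS) - 1

def fcm_xor_residuals(values):
    bits = np.ascontiguousarray(values, dtype=np.float64).view(np.uint64)
    table = [0] * (1 << TABLE_BITS)   # hash-indexed last-value table
    res = np.empty_like(bits)
    h = 0
    for i in range(len(bits)):
        v = int(bits[i])
        res[i] = v ^ table[h]          # XOR true value with prediction
        table[h] = v                   # learn: remember value for this context
        h = ((h << 6) ^ (v >> 48)) & MASK
    return res  # high bytes are mostly zero when predictions are close

x = np.cumsum(np.random.rand(4096))    # smooth, hence predictable, series
res = fcm_xor_residuals(x)
print("residuals with a zero top byte: %.0f%%" % (100 * ((res >> 56) == 0).mean()))
```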
Advanced techniques for high-quality multi-resolution volume rendering
TLDR
A prototype implementation of this compression-based multi-resolution approach performs high-quality rendering of very large volume data sets at interactive to real-time frame rates on a single off-the-shelf PC.
Wavelets applied to lossless compression and progressive transmission of floating point data in 3-D curvilinear grids
TLDR
A method of lossless compression using wavelets is presented that enables progressive transmission of computational fluid dynamics (CFD) data in PLOT3D format and provides lossless reconstruction of the original data.
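Losslessness hinges on integer-to-integer wavelets. A minimal example is the S-transform, an integer Haar step built by lifting; this is a generic illustration, not the paper's specific transform.

```python
# Integer Haar step via lifting (the S-transform): exactly invertible in
# integer arithmetic, the property that enables lossless wavelet coding.
import numpy as np

def s_transform(x):
    a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = a - b              # detail coefficients
    s = b + (d >> 1)       # approximation = floor((a + b) / 2)
    return s, d

def inverse_s_transform(s, d):
    b = s - (d >> 1)
    a = d + b
    out = np.empty(s.size + d.size, dtype=np.int64)
    out[0::2], out[1::2] = a, b
    return out

x = np.random.randint(-1000, 1000, size=64)          # even-length integers
s, d = s_transform(x)
assert np.array_equal(inverse_s_transform(s, d), x)  # perfect reconstruction
```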
Interactive remote large-scale data visualization via prioritized multi-resolution streaming
TLDR
This work presents a generalized distance visualization architecture for large remote data that aims to provide interactive analysis, delivering the necessary interactivity and full-resolution results dynamically on demand while maintaining a full-featured visualization framework.