On the use of data compression measures to analyze robust designs
  • Irad Ben-Gal
  • Published 6 September 2005
  • Computer Science
  • IEEE Transactions on Reliability
In this paper, we suggest a potential use of data compression measures, such as entropy and Huffman coding, to assess the effects of noise factors on the reliability of tested systems. In particular, we extend the Taguchi method for robust design by computing the entropy of the percent-contribution values of the noise factors. The new measures are computed as early as the parameter-design stage and, together with the traditional S/N ratios, enable the specification of a robust design…
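The entropy measure described in the abstract can be illustrated with a short sketch (the data and function below are illustrative assumptions about the general idea, not the paper's exact procedure): normalize the noise factors' percent-contribution values into a probability distribution and compute its Shannon entropy, so a low value indicates that variability is concentrated in a few noise factors.

```python
import math

def entropy_of_contributions(percent_contributions):
    """Shannon entropy (in bits) of noise-factor percent-contribution values.

    The values are normalized into a probability distribution first; zero
    contributions are skipped, since 0*log(0) is taken as 0.
    """
    total = sum(percent_contributions)
    probs = [c / total for c in percent_contributions if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# One dominant noise factor -> low entropy (noise effect is concentrated)
print(entropy_of_contributions([90.0, 5.0, 5.0]))    # ~0.569 bits
# Evenly spread contributions -> maximal entropy for three factors
print(entropy_of_contributions([33.3, 33.3, 33.3]))  # log2(3) ~ 1.585 bits
```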

Figures and Tables from this paper

The alpha error of the Taguchi method with L16 array for the LTB response variable using simulation
Taguchi's quality engineering concepts are of great importance in designing and improving product quality and processes. However, most of the controversy and mystique have been centred on Taguchi's…
Alpha Error of Taguchi Method with Different OAs for NTB Type QCH by Simulation
The Taguchi method has been widely used for parameter design in many industrial applications. Nevertheless, it has been the subject of discussion and much debate on different platforms. This…
Novel Cyclic Redundancy Check software for compressed & decoded Data
The aim of this paper is to describe software that can reduce the size of any file using the CRC (Cyclic Redundancy Check) algorithm.
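For context, a CRC is ordinarily an error-detection checksum rather than a compressor; the cited paper's use of it for compression is unusual. A minimal example of computing a CRC-32 value over a byte stream with Python's standard `zlib` module (the input bytes are arbitrary sample data):

```python
import zlib

data = b"hello, world"
checksum = zlib.crc32(data)
print(f"CRC-32 of {data!r}: {checksum:#010x}")

# A single flipped bit changes the checksum, which is how CRCs detect corruption.
corrupted = b"hello, World"
assert zlib.crc32(corrupted) != checksum
```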
Image compression using pictorial representation of data
  • M. Raj, Divjot Kaur
  • Computer Science
    2017 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET)
  • 2017
A data compression method that is independent of redundancy and achieves a compression ratio of 2.5 or more is discussed.
Adaptive Cross-Correlation Compression Method in Lossless Audio Streaming Compression
A novel method, the Adaptive Cross-Correlation Compression Method (ACCM), compresses audio data by using cross-correlation values between samples to remove unused space, collapse repeated values, and encrypt the data.
Robust design optimisation via surrogate network model and soft outer array design
This study presents a soft-computing-based robust optimisation that merges control and noise factors into a combined experimental design to build an artificial-neural-network surrogate; it yields a superior robust optimum using a much smaller sample and a lower controlling cost than the Taguchi method and a conventional response surface method.
Alpha Risk of Taguchi Method with L18 Array for NTB Type QCH by Simulation
The Taguchi method is a widely used approach for parameter design to achieve quality and yield improvements in many business applications. Nevertheless, there has been much discussion in the literature…
Landauer’s Principle of Minimum Energy Might Place Limits on the Detectability of Gravitons of Certain Mass
According to Landauer’s principle, the energy of a particle may be used to record or erase N information bits within a thermal bath. The maximum number of information bits N recorded by the…
Improving Websites' Quality of Service by Shortening Their Browsing Expected Path Length
The proposed method deletes links among webpages to minimize the expected path length of the website and achieves optimality in more than 80% of the tested cases; however, this comes at a much higher computational cost.

References

Performance Measures Independent of Adjustment: An Explanation and Extension of Taguchi's Signal-to-Noise Ratios
Parameter design is a method, popularized by Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust to uncontrollable variations. In parameter design,…
Self-correcting inspection procedure under inspection errors
This work develops a Self-Correcting Inspection (SCI) decision rule that does not require complete knowledge of inspection-error probabilities, and shows that the proposed rule assures correct classification if the number of inspection errors is below a certain threshold.
Modern Industrial Statistics: Design and Control of Quality and Reliability
A mathematical theory of communication
In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a…
Analysis of dynamic robust design experiments
Robust design is an important method for improving product or manufacturing process design by making the output response insensitive (robust) to difficult-to-control variations (noise). Most of the…
Quality engineering (Taguchi methods) for the development of electronic circuit technology
Most technology development engineers use traditional reliability engineering methods to calibrate the objective functions of their new systems to meet various marketing requirements. These methods…
Robust design: seeking the best of all possible worlds
  • S. Sanchez
  • Computer Science
    2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)
  • 2000
It is shown how a loss function that incorporates both the system mean and system variability can be used to efficiently and effectively carry out system optimization and improvement efforts.
Dispersion Effects From Fractional Designs
It is shown how it is sometimes possible to use unreplicated fractional designs to identify factors that affect variance in addition to those that affect the mean.
A method for the construction of minimum-redundancy codes
  • D. Huffman
  • Computer Science, Business
    Proceedings of the IRE
  • 1952
A minimum-redundancy code is one constructed in such a way that the average number of coding digits per message is minimized.
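Huffman's construction, as summarized above, repeatedly merges the two lowest-weight subtrees. A minimal sketch using Python's standard `heapq` module (the symbols and frequencies are illustrative, not from the cited paper):

```python
import heapq

def huffman_code(freqs):
    """Build a minimum-redundancy (Huffman) code for {symbol: frequency}."""
    # Each heap entry: (weight, tie_breaker, {symbol: code_so_far}).
    # The integer tie-breaker prevents Python from comparing dicts on ties.
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prefix the two lightest subtrees' codes with 0 and 1, then merge.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

freqs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.15}
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(c) for s, c in code.items())
# "a", the most frequent symbol, gets the shortest codeword (1 bit);
# the average code length here is 1.85 bits per symbol.
```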
Multi-criteria Decision Making Methods: A Comparative Study
This paper presents an introduction to multi-criteria decision-making (MCDM) methods, some cases of ranking abnormalities that arise when certain MCDM methods are used, and a computational evaluation of the original and revised AHP.