BACKGROUND Purchasers can play an important role in eliminating racial and ethnic disparities in health care. A need exists to develop a compelling "business case" from the employer perspective to put, and keep, the issue of racial/ethnic disparities in health care on the quality improvement agenda for health plans and providers. METHODS To illustrate a… (More)
We used data from the U.S. National Health Interview Survey to estimate the effect of diabetes on labor market outcomes. In 2050, an estimated 1.46 million U.S. adults will not be working; 597,000 will be work disabled; and 780,000 will have work limitations as a result of diabetes.
Reducing the dimensionality of a classification problem produces a more computationally efficient system. Since the dimensionality of a classification problem is equivalent to the number of neurons in the first hidden layer of a network, this work shows how to eliminate neurons on that layer and simplify the problem. In the cases where the dimensionality… (More)
Hyperspectral images provide a vast amount of information about a scene. However, much of that information is redundant as the bands are highly correlated. For computational and data compression reasons, it is desired to reduce the dimensionality of the data set while maintaining good performance in image analysis tasks. This work presents a method of… (More)
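The abstract's own method is truncated above, so as an illustrative stand-in, the band-correlation argument can be sketched with a plain PCA projection: highly correlated bands concentrate their variance in a few principal components, so most bands can be discarded with little information loss. The data here is simulated; nothing in this sketch is from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a hyperspectral cube: 32x32 pixels, 100 highly correlated bands
# generated from only 5 underlying spectral factors plus a little noise.
base = rng.normal(size=(32 * 32, 5))
mixing = rng.normal(size=(5, 100))
cube = base @ mixing + 0.01 * rng.normal(size=(32 * 32, 100))

# PCA: project onto the top-k eigenvectors of the band covariance matrix.
centered = cube - cube.mean(axis=0)
cov = centered.T @ centered / (centered.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
top_k = eigvecs[:, -5:]                  # keep the 5 strongest components
reduced = centered @ top_k               # (1024, 5) instead of (1024, 100)

# Fraction of total variance retained by the 5 kept components.
explained = eigvals[-5:].sum() / eigvals.sum()
print(reduced.shape, float(explained))
```

Because the simulated bands are nearly rank-5, the retained variance is close to 1; on real hyperspectral data the same computation quantifies how much compression a given number of components buys.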
ZENG, HUIWEN. DIMENSIONALITY REDUCTION AND FEATURE SELECTION USING A MIXED-NORM PENALTY FUNCTION. (Under the direction of Professor H. Joel Trussell). Dimensionality reduction, which is the process of mapping high-dimension patterns to lower dimension subspaces, is a key issue in enhancing the processing efficiency of high dimensional data such as… (More)
The dimensionality of a problem that is addressed by neural networks is related to the number of hidden neurons in the network. Pruning neural networks to reduce the number of hidden neurons reduces the dimensionality of the system, produces a more efficient computation and yields a network with better ability to generalize beyond the training data. This… (More)
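The pruning idea can be sketched with simple magnitude-based pruning of a one-hidden-layer network (the specific scoring rule and the numbers are illustrative assumptions, not the paper's method): a hidden neuron whose incoming weights are all near zero contributes little, so its column in the first weight matrix and its row in the second can both be dropped.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-hidden-layer network: 10 inputs -> 8 hidden units -> 2 outputs.
W1 = rng.normal(size=(10, 8))   # input-to-hidden weights (one column per neuron)
W2 = rng.normal(size=(8, 2))    # hidden-to-output weights (one row per neuron)

# Make three hidden units nearly inactive by shrinking their incoming weights.
W1[:, [1, 4, 6]] *= 1e-4

# Score each hidden neuron by the L2 norm of its incoming weight column and
# prune neurons whose score falls below a small absolute threshold.
scores = np.linalg.norm(W1, axis=0)
keep = scores > 1e-2
W1_pruned, W2_pruned = W1[:, keep], W2[keep, :]

print(W1.shape, "->", W1_pruned.shape)   # hidden layer shrinks from 8 to 5
```

Dropping a hidden neuron removes both its incoming column and its outgoing row, so the pruned network computes the same function minus the (negligible) contribution of the removed units.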
Feature selection is the process of selecting subsets of features that are effective in performing a given task. We propose an approach using a penalty function combined with a neural network to select a subset from a collection of features while maintaining the performance possible with the larger set. The penalty function is related to a… (More)
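The penalty-function mechanism can be illustrated in miniature with an L1 penalty on a linear model trained by gradient descent (a deliberate simplification: the paper pairs its penalty with a neural network, and its exact penalty is truncated above; the data and thresholds here are assumptions). The penalty drives weights on uninformative features toward zero, and the surviving large weights identify the selected subset.

```python
import numpy as np

rng = np.random.default_rng(2)

# 200 samples, 10 candidate features; only features 0 and 3 drive the target.
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)

# Minimize mean squared error plus an L1 penalty via (sub)gradient descent.
# The L1 term shrinks weights of irrelevant features toward zero, which is
# the feature-selection mechanism.
w = np.zeros(10)
lam, lr = 0.1, 0.01
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y) + lam * np.sign(w)
    w -= lr * grad

# Features whose weights survived the penalty are the selected subset.
selected = np.flatnonzero(np.abs(w) > 0.5)
print(selected)
```

The same shrink-then-threshold pattern generalizes to networks: applying the penalty to the input-layer weights and discarding inputs whose weight group collapses yields feature selection without retraining from scratch on every candidate subset.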
Often in dealing with images, the training data must be extracted from a limited data set. In particular, the illumination conditions of the sample images are limited and, in many cases, unknown. In this paper, we show that artificial variation of the illuminant of hyperspectral images can be used to overcome the limitations of a small sample set.
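Artificially varying the illuminant amounts to re-rendering the same reflectance spectra under different light sources, i.e., a per-band multiplication by an illuminant spectral power distribution. A minimal sketch, assuming a blackbody (Planck) illuminant as the parametric light source and simulated reflectances (the paper's actual illuminant model may differ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated training set: 50 reflectance spectra sampled at 31 bands.
reflectance = rng.uniform(0.1, 0.9, size=(50, 31))
wavelengths = np.linspace(400, 700, 31)   # visible range, nm

def blackbody_illuminant(temp_k, wl_nm):
    """Planck's law (unnormalized) as a simple parametric illuminant."""
    wl = wl_nm * 1e-9
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return 1.0 / (wl**5 * (np.exp(h * c / (wl * k * temp_k)) - 1.0))

# Augment: render the same reflectances under several color temperatures.
augmented = []
for temp in (3000, 5000, 6500, 9000):
    spd = blackbody_illuminant(temp, wavelengths)
    spd = spd / spd.max()                 # normalize peak to 1
    augmented.append(reflectance * spd)   # per-band scaling by the illuminant
augmented = np.concatenate(augmented)     # 4x the original sample count

print(reflectance.shape, "->", augmented.shape)
```

Each color temperature yields a physically plausible variant of every sample, so a classifier trained on the augmented set sees illumination variation that the small original set lacked.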
With the proliferation of digital cameras, more consumers are faced with the problem of making color corrections to their pictures. While most picture editors allow some white point, or illuminant, correction and some enhancements, these methods work on the image as a whole rather than on specific regions or colors. This work describes a simple method that… (More)