Reduction of detection limit and quantification uncertainty due to interferent by neural classification with abstention

@article{Hagen2022ReductionOD,
  title={Reduction of detection limit and quantification uncertainty due to interferent by neural classification with abstention},
  author={Alex Hagen and Kenneth D. Jarman and Jesse D. Ward and Gregory C. Eiden and Charles J. Barinaga and Emily K. Mace and Craig E. Aalseth and Anthony J. Carado},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.07609}
}

References

Showing 1–10 of 24 references
Limits for qualitative detection and quantitative determination
A visiting professor at NIST once pointed out that our measurement professionals are given a difficult task by some of our customers. In a (macroscopically) continuum universe, we are asked to …
Combating Label Noise in Deep Learning Using Abstention
A loss function is proposed that permits abstention during training, allowing the DNN to abstain on confusing samples while continuing to learn and improve classification performance on the non-abstained samples, yielding a novel method to combat label noise when training deep neural networks for classification.
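For concreteness, a minimal PyTorch sketch of an abstention loss along these lines, assuming a (K+1)-way softmax whose last output is the abstention class; the function name, the `alpha` weight, and the numerical clamps are illustrative choices, not values taken from the paper:

```python
import torch
import torch.nn.functional as F

def abstention_loss(logits, targets, alpha=1.0):
    # logits:  (N, K+1) tensor; the last column is the abstention "class"
    # targets: (N,) tensor of true labels in [0, K-1]
    probs = F.softmax(logits, dim=1)
    p_abstain = probs[:, -1].clamp(max=1.0 - 1e-7)
    p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    # cross-entropy on class probabilities renormalized over the K real
    # classes, down-weighted by how much mass the network keeps on them
    class_term = -(1.0 - p_abstain) * torch.log(p_true / (1.0 - p_abstain) + 1e-12)
    # penalty term that discourages abstaining on everything
    abstain_term = alpha * torch.log(1.0 / (1.0 - p_abstain))
    return (class_term + abstain_term).mean()
```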
Decision trees for optimizing the minimum detectable concentration of radioxenon detectors.
Generalized ODIN: Detecting Out-of-Distribution Image Without Learning From Out-of-Distribution Data
This work builds on the popular method ODIN, proposing two strategies that free it from the need to tune on OoD data while improving its OoD detection performance: a decomposed confidence score and a modified input pre-processing method.
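A sketch of the decomposed-confidence idea under stated assumptions: the classifier head splits the logits into a class-dependent numerator h(x) and a positive scalar denominator g(x), so the network trains with ordinary cross-entropy and can be scored for OoD without any OoD data. The layer shapes and the sigmoid parameterization of g are assumptions for illustration:

```python
import torch.nn as nn

class DecomposedConfidenceHead(nn.Module):
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.h = nn.Linear(feat_dim, num_classes)                     # numerator h_i(x)
        self.g = nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())  # scalar denominator g(x)

    def forward(self, feats):
        # logits f_i(x) = h_i(x) / g(x); train with ordinary cross-entropy
        return self.h(feats) / self.g(feats)

    def ood_score(self, feats):
        # score in-distribution-ness without OoD data, e.g. via max_i h_i(x)
        return self.h(feats).max(dim=1).values
```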
Background rejection in NEXT using deep neural networks
We investigate the potential of using deep learning techniques to reject background events in searches for neutrinoless double beta decay with high pressure xenon time projection chambers capable of …
Selective Classification for Deep Neural Networks
A method to construct a selective classifier from a trained neural network: the user sets a desired risk level, and the classifier rejects instances as needed to guarantee that risk (with high probability).
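A simplified NumPy sketch of the thresholding step: choose a confidence threshold on held-out data so the empirical risk among accepted samples meets the target. The paper's actual procedure guarantees the risk via a binomial tail bound; using the raw empirical risk here is a simplification:

```python
import numpy as np

def select_threshold(confidence, correct, target_risk):
    # confidence: (N,) validation confidence scores (e.g. max softmax)
    # correct:    (N,) boolean array, True where the prediction was right
    order = np.argsort(-confidence)          # most confident first
    errors = np.cumsum(~correct[order])      # errors among accepted samples
    accepted = np.arange(1, len(order) + 1)
    risk = errors / accepted                 # empirical risk at each cutoff
    ok = np.where(risk <= target_risk)[0]
    if len(ok) == 0:
        return np.inf                        # must reject everything
    return confidence[order][ok[-1]]         # loosest threshold meeting the target
```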
Background rejection in atmospheric Cherenkov telescopes using recurrent convolutional neural networks
A new, high-performance algorithm for background rejection in imaging atmospheric Cherenkov telescopes is presented, with significantly improved performance over the current standard methods: applying the recurrent neural network analysis reduces the background rate by 20–25%.
On the psychology of prediction
In this paper, we explore the rules that determine intuitive predictions and judgments of confidence and contrast these rules to the normative principles of statistical prediction. Two classes of …
On Mixup Training: Improved Calibration and Predictive Uncertainty for Deep Neural Networks
DNNs trained with mixup are significantly better calibrated and less prone to over-confident predictions on out-of-distribution and random-noise data, suggesting that mixup should be employed for classification tasks where predictive uncertainty is a significant concern.
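For reference, standard mixup forms convex combinations of input pairs and their one-hot labels; a minimal PyTorch sketch (alpha = 0.2 is a commonly used value, not one taken from this paper):

```python
import numpy as np
import torch
import torch.nn.functional as F

def mixup_batch(x, y, num_classes, alpha=0.2):
    # draw the mixing coefficient from Beta(alpha, alpha)
    lam = np.random.beta(alpha, alpha)
    perm = torch.randperm(x.size(0))
    # convex combinations of inputs and of one-hot labels
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_onehot = F.one_hot(y, num_classes).float()
    y_mix = lam * y_onehot + (1.0 - lam) * y_onehot[perm]
    return x_mix, y_mix

# training step with soft labels:
#   loss = -(y_mix * F.log_softmax(model(x_mix), dim=1)).sum(dim=1).mean()
```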
The Nearest Neighbor Classification Rule with a Reject Option
  • M. Hellman · Computer Science · IEEE Trans. Syst. Sci. Cybern. · 1970
Here the (k, k′) nearest neighbor rule with a reject option is examined: the rule looks at the k nearest neighbors and rejects if fewer than k′ of these are from the same class; if k′ or more are from one class, a decision is made in favor of that class.
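The rule is simple enough to state in a few lines; a NumPy sketch assuming Euclidean distance (Hellman's rule itself is metric-agnostic) and array-valued training labels:

```python
import numpy as np
from collections import Counter

def knn_with_reject(train_X, train_y, x, k, k_prime):
    # train_X: (N, d) array, train_y: (N,) array of class labels
    # distances from the query point to every training point
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    # majority class among the k nearest neighbors and its vote count
    label, count = Counter(train_y[nearest]).most_common(1)[0]
    # decide only if at least k' neighbors agree; otherwise reject
    return label if count >= k_prime else None
```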