In this paper we consider generalized classes of potentially non-monotone risk functions for use as evaluation metrics in learning tasks. The resulting risks are in general non-convex and non-smooth, which makes both the computational and inferential sides of the learning problem difficult. For random losses belonging to any Banach space, we obtain sufficient conditions for the risk functions to be weakly convex, and to admit unbiased stochastic directional derivatives. We then use recent work…
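For reference, a standard definition of the weak convexity property invoked above (this formulation is not stated in the excerpt; it is the usual one on a Hilbert space, where the squared-norm perturbation characterization holds):

```latex
% A function $f$ is $\rho$-weakly convex if
% $x \mapsto f(x) + \tfrac{\rho}{2}\|x\|^{2}$ is convex; equivalently,
% for all $x, y$ and $\lambda \in [0,1]$,
f\bigl(\lambda x + (1-\lambda)y\bigr)
  \le \lambda f(x) + (1-\lambda) f(y)
  + \frac{\rho}{2}\,\lambda(1-\lambda)\,\|x - y\|^{2}.
```

Weak convexity permits non-convex, non-smooth objectives while still guaranteeing that standard subgradient-type methods have well-defined stationarity measures, which is presumably why the sufficient conditions mentioned in the abstract are useful.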