Corpus ID: 232035861

DNN2LR: Automatic Feature Crossing for Credit Scoring

Qiang Liu, Zhaocheng Liu, Hao Zhang, Yuntian Chen, Jun Zhu
Credit scoring is a major application of machine learning, used by financial institutions to decide whether to approve or reject a credit loan. For the sake of reliability, credit scoring models need to be both accurate and globally interpretable. Simple classifiers, e.g., Logistic Regression (LR), are white-box models, but not powerful enough to model complex nonlinear interactions among features. Fortunately, automatic feature crossing is a promising way to find cross features to make… 
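As a hedged illustration of the idea (not the paper's DNN2LR algorithm), the sketch below hand-crafts one cross feature from two categorical fields, one-hot encodes it, and fits a plain logistic regression on the crosses; all field names and data are hypothetical.

```python
import numpy as np

def cross_features(rows):
    """One-hot encode the cross of two categorical columns."""
    vocab = {pair: i for i, pair in enumerate(sorted(set(rows)))}
    X = np.zeros((len(rows), len(vocab)))
    for r, pair in enumerate(rows):
        X[r, vocab[pair]] = 1.0
    return X, vocab

def fit_lr(X, y, lr=0.5, steps=200):
    """Plain logistic regression by gradient descent: a white-box
    linear model whose weights are directly interpretable."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Hypothetical toy rows: (occupation, region) pairs with loan labels.
rows = [("engineer", "east"), ("teacher", "west"),
        ("engineer", "west"), ("teacher", "east")]
y = np.array([1.0, 0.0, 0.0, 1.0])

X, vocab = cross_features(rows)   # cross feature: occupation x region
w = fit_lr(X, y)
preds = (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(float)
```

The label here depends on the interaction of the two fields (an XOR-like pattern), which an LR on the raw one-hot fields cannot separate; the cross feature makes it linearly separable while the model stays a white-box LR.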

AutoFIS: Automatic Feature Interaction Selection in Factorization Models for Click-Through Rate Prediction

This work proposes a two-stage algorithm called Automatic Feature Interaction Selection (AutoFIS), which can automatically identify important feature interactions for factorization models with computational cost just equivalent to training the target model to convergence.

AutoCross: Automatic Feature Crossing for Tabular Data in Real-World Applications

This paper presents AutoCross, an automatic feature crossing tool provided by 4Paradigm to its customers, and shows that AutoCross can significantly enhance the performance of both linear and deep models.

xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems

A novel Compressed Interaction Network (CIN) is proposed to generate feature interactions explicitly at the vector-wise level; combined with a DNN, the resulting model, named eXtreme Deep Factorization Machine (xDeepFM), learns certain bounded-degree feature interactions explicitly and arbitrary low- and high-order feature interactions implicitly.

Deep & Cross Network for Ad Click Predictions

This paper proposes the Deep & Cross Network (DCN), which keeps the benefits of a DNN model, and beyond that, it introduces a novel cross network that is more efficient in learning certain bounded-degree feature interactions.
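A single DCN cross layer has a compact closed form, x_{l+1} = x_0 (x_l^T w_l) + b_l + x_l, and stacking layers raises the explicit interaction degree by one each time. A minimal sketch (dimensions and values hypothetical):

```python
import numpy as np

def cross_layer(x0, xl, w, b):
    """One DCN cross layer: x_{l+1} = x0 * (xl . w) + b + xl.
    The scalar xl @ w keeps the parameter count linear in the input dim."""
    return x0 * (xl @ w) + b + xl

rng = np.random.default_rng(0)
d = 4                               # hypothetical embedding dimension
x0 = rng.normal(size=d)             # stacked input embeddings
w, b = rng.normal(size=d), np.zeros(d)

x1 = cross_layer(x0, x0, w, b)      # contains degree-2 interactions
x2 = cross_layer(x0, x1, w, b)      # contains up to degree-3 interactions
```

The residual term `+ xl` means each layer only adds new interaction terms on top of what the previous layer already represents.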

Attention is All you Need

A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as shown by its successful application to English constituency parsing with both large and limited training data.

Higher-Order Factorization Machines

The first generic yet efficient algorithms for training arbitrary-order higher-order factorization machines (HOFMs) are presented, along with new variants of HOFMs with shared parameters, which greatly reduce model size and prediction times while maintaining similar accuracy.
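For the second-order case that HOFMs generalize, the pairwise interaction term can be computed in O(kd) rather than O(d^2) via a standard algebraic identity; a minimal sketch, with symbols w0, w, V illustrative rather than taken from the paper:

```python
import numpy as np

def fm2_score(x, w0, w, V):
    """Second-order FM score: w0 + w.x + sum_{i<j} <v_i, v_j> x_i x_j.
    Uses the identity
    sum_{i<j} <v_i,v_j> x_i x_j = 0.5 * sum_f [(V[:,f].x)^2 - (V[:,f]^2).(x^2)]."""
    interactions = 0.5 * np.sum((x @ V) ** 2 - (x ** 2) @ (V ** 2))
    return w0 + w @ x + interactions

# Sanity check against the brute-force O(d^2) pairwise sum.
rng = np.random.default_rng(1)
d, k = 3, 2
x, w, V = rng.normal(size=d), rng.normal(size=d), rng.normal(size=(d, k))
brute = sum(V[i] @ V[j] * x[i] * x[j]
            for i in range(d) for j in range(i + 1, d))
fast = fm2_score(x, 0.0, w, V)
```

The factorized form is what makes FMs tractable on high-dimensional sparse inputs, and HOFMs extend the same trick to orders above two.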

Practical Lessons from Predicting Clicks on Ads at Facebook

This paper introduces a model which combines decision trees with logistic regression, outperforming either method on its own by over 3%, an improvement with significant impact on overall system performance.
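The hybrid above treats each tree as a categorical feature whose value is the leaf a sample lands in, one-hot encodes those leaves, and feeds them to LR. A minimal sketch with two hypothetical hand-written stumps standing in for a trained boosted-tree ensemble:

```python
import numpy as np

# Stand-ins for trained boosted trees: each maps a sample to a leaf index.
trees = [
    lambda x: 0 if x[0] < 0.5 else 1,   # stump splitting on feature 0
    lambda x: 0 if x[1] < 0.5 else 1,   # stump splitting on feature 1
]

def tree_features(X, trees, n_leaves=2):
    """One-hot encode each sample's leaf index per tree; the result
    becomes the input to a downstream logistic regression."""
    Z = np.zeros((len(X), len(trees) * n_leaves))
    for r, x in enumerate(X):
        for t, tree in enumerate(trees):
            Z[r, t * n_leaves + tree(x)] = 1.0
    return Z

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Z = tree_features(X, trees)
```

Each tree path is itself a learned cross feature, which is why the transformed inputs give the linear model access to nonlinear interactions.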

Accurate intelligible models with pairwise interactions

This paper develops a novel, computationally efficient method called FAST for ranking all possible pairs of features as candidates for inclusion into the model, and shows the effectiveness of FAST in ranking candidate pairs of features.

AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks

An effective and efficient method called AutoInt is proposed to automatically learn high-order interactions of input features, mapping both numerical and categorical features into the same low-dimensional space.

Visualizing and Understanding Neural Models in NLP

Four strategies for visualizing compositionality in neural models for NLP, inspired by similar work in computer vision, are described, including LSTM-style gates that measure information flow and gradient back-propagation.