Corpus ID: 243986094

Characterization of Frequent Online Shoppers using Statistical Learning with Sparsity

@article{Sambasivan2021CharacterizationOF,
  title={Characterization of Frequent Online Shoppers using Statistical Learning with Sparsity},
  author={Rajiv Sambasivan and Mark Alexander Burgess and J{\"o}rg Schad and Arthur K. Keen and Christopher Woodward and Alexander Geenen and Sachin Sharma},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.06057}
}
Developing shopping experiences that delight the customer requires businesses to understand customer taste. This work reports a method to learn the shopping preferences of frequent shoppers to an online gift store by combining ideas from retail analytics and statistical learning with sparsity. Shopping activity is represented as a bipartite graph. This graph is refined by applying sparsity-based statistical learning methods. These methods are interpretable and reveal insights about customers… 
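The bipartite-graph representation described above can be sketched as follows. This is a minimal illustration with hypothetical transaction data, not the paper's actual pipeline: customers form one side of the graph, products the other, and edge weights record purchase quantities.

```python
# Minimal sketch of shopping activity as a bipartite graph
# (hypothetical customers and products; edge weight = quantity purchased).
from collections import defaultdict

# Hypothetical transactions: (customer_id, product_id, quantity)
transactions = [
    ("c1", "mug", 2),
    ("c1", "candle", 1),
    ("c2", "mug", 1),
    ("c2", "card", 3),
]

# Adjacency from the customer side; products appear only as neighbors,
# so the graph is bipartite by construction.
customer_edges = defaultdict(dict)
for customer, product, qty in transactions:
    customer_edges[customer][product] = customer_edges[customer].get(product, 0) + qty

print(dict(customer_edges))
```

The sparsity-based refinement the paper applies would then operate on the (weighted) biadjacency matrix implied by this structure.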

References

SHOWING 1-10 OF 43 REFERENCES

Data mining for the online retail industry: A case study of RFM model-based customer segmentation using data mining

Many small online retailers and new entrants to the online retail sector are keen to practice data mining and consumer-centric marketing in their businesses yet technically lack the necessary…
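RFM scoring, the model this case study applies, can be sketched in a few lines. The orders, dates, and customer IDs below are hypothetical; real deployments would also bin each dimension into quantile-based scores.

```python
# Hedged sketch of RFM (Recency, Frequency, Monetary) aggregation.
# Hypothetical order data: (customer_id, order_date, order_value).
from datetime import date

orders = [
    ("c1", date(2021, 1, 5), 20.0),
    ("c1", date(2021, 3, 1), 35.0),
    ("c2", date(2020, 11, 20), 10.0),
]
today = date(2021, 3, 15)  # reference date for recency

rfm = {}
for cust, when, amount in orders:
    r, f, m = rfm.get(cust, (None, 0, 0.0))
    recency = (today - when).days
    r = recency if r is None else min(r, recency)  # days since most recent order
    rfm[cust] = (r, f + 1, m + amount)             # frequency count, monetary total

print(rfm)
```

Segmentation then proceeds by clustering or rule-based binning over these three per-customer features.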

Session-Based Recommendations Using Item Embedding

It is found that a recurrent neural network that preserves the order of a user's clicks outperforms a standard neural network, item-to-item similarity, and SVD, and produces a highly condensed item vector-space representation (an item embedding) with behaviorally meaningful sub-structure.

Statistical Learning with Sparsity: The Lasso and Generalizations

Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data and extract useful and reproducible patterns from big datasets.
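The core idea of this book, sparsity via the lasso's ℓ₁ penalty, can be illustrated with scikit-learn on synthetic data (the data here is made up; only the first feature carries signal, and the lasso recovers that):

```python
# Small illustration of sparsity via the lasso (scikit-learn's Lasso,
# which solves (1/2n)||y - Xw||^2 + alpha*||w||_1 by coordinate descent).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] + 0.01 * rng.normal(size=100)  # only feature 0 matters

model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)  # coefficients of irrelevant features shrink exactly to zero
```

The exact zeros (rather than merely small values) are what make lasso-style models interpretable, which is the property the paper exploits when characterizing shoppers.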

Latent Dirichlet Allocation

Graph Reduction with Spectral and Cut Guarantees

Sufficient conditions are derived for a small graph to approximate a larger one in the sense of restricted similarity; these conditions give rise to nearly-linear algorithms that, compared to both standard and advanced graph-reduction methods, find coarse graphs of improved quality, often by a large margin, without sacrificing speed.

Convex and Semi-Nonnegative Matrix Factorizations

This work considers factorizations of the form X = FGᵀ and focuses on algorithms in which G is restricted to nonnegative entries while the data matrix X may have mixed signs, extending the applicable range of NMF methods.
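A compact numpy sketch of semi-NMF in this form follows, using the multiplicative updates from this line of work (the data matrix is random and mixed-sign; dimensions and iteration counts are arbitrary choices for illustration):

```python
# Hedged sketch of semi-NMF: X ≈ F Gᵀ with G ≥ 0, X of mixed sign.
import numpy as np

def seminmf(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((m, k)) + 0.1                 # nonnegative factor
    for _ in range(iters):
        F = X @ G @ np.linalg.pinv(G.T @ G)      # least-squares update of F
        A, B = X.T @ F, F.T @ F
        Ap, An = np.maximum(A, 0), np.maximum(-A, 0)   # positive/negative parts
        Bp, Bn = np.maximum(B, 0), np.maximum(-B, 0)
        # Multiplicative update keeps G elementwise nonnegative.
        G *= np.sqrt((Ap + G @ Bn) / (An + G @ Bp + 1e-12))
    return F, G

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 12))                    # mixed-sign data
F, G = seminmf(X, 3)
err = np.linalg.norm(X - F @ G.T) / np.linalg.norm(X)
print(err)
```

Because only G is sign-constrained, the nonnegative factor can be read as soft cluster memberships, which is the interpretability angle the paper draws on.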

On Classification and Regression

A unified framework to approach the problem of computing various types of expressive tests for decision trees and regression trees is presented, and the design of efficient algorithms for computing important special cases is revisited.

Orthogonal NMF through Subspace Exploration

This work presents a new ONMF algorithm with provable approximation guarantees, which relies on a novel approximation to the related Non-negative Principal Component Analysis (NNPCA) problem; given an arbitrary data matrix, NNPCA seeks k nonnegative components that jointly capture most of the variance.

Algorithmic Aspects of Machine Learning

This book bridges theoretical computer science and machine learning by exploring what the two sides can teach each other. It emphasizes the need for flexible, tractable models that better capture not…

The Lasso Problem and Uniqueness

The LARS algorithm is extended to cover the non-unique case, so that the path algorithm works for any predictor matrix; a simple method, based on linear programming, is also derived for computing the component-wise uncertainty in lasso solutions of any given problem instance.