Synthesizing Configuration Tactics for Exercising Hidden Options in Serverless Systems

@article{Kuhlenkamp2022SynthesizingCT,
  title={Synthesizing Configuration Tactics for Exercising Hidden Options in Serverless Systems},
  author={J{\"o}rn Kuhlenkamp and Sebastian Werner and Chinh Tran and Stefan Tai},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.15904}
}
A proper configuration of an information system can ensure accuracy and efficiency, among other system objectives. Conversely, a poor configuration can have a significant negative impact on the system’s performance, reliability, and cost. Serverless systems, which are composed of many functions and managed services, are especially at risk of misconfiguration, with many provider- and platform-specific, often opaque and ‘hidden’ settings. In this paper, we argue to pay close attention to the…
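
As a rough illustration of the kind of provider-specific setting the abstract refers to (a sketch under assumptions, not an example from the paper): on AWS Lambda, a function’s memory size also determines its CPU share, so a single configuration value affects performance, reliability, and cost at once. The snippet below adjusts it with boto3; the function name and chosen values are hypothetical, and valid AWS credentials are assumed.

```python
# Sketch: exercising one provider-specific configuration option (AWS Lambda memory)
# via boto3. "my-etl-function" is a hypothetical function name; the values are
# illustrative, not recommendations from the paper.
import boto3

lambda_client = boto3.client("lambda")

# MemorySize (MB) also controls the CPU share the function receives, so this one
# setting influences execution time and cost simultaneously.
lambda_client.update_function_configuration(
    FunctionName="my-etl-function",
    MemorySize=1024,  # allowed range is 128-10240 MB
    Timeout=30,       # seconds
)
```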

References

Showing 1–10 of 27 references.

Sizeless: predicting the optimal size of serverless functions

This paper introduces an approach to predict the optimal resource size of a serverless function using monitoring data from a single resource size, which enables cloud providers to implement resource sizing on a platform level and automate the last resource management task associated with serverless functions.
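
As a much-simplified, hedged sketch of the idea summarized above (not the paper’s learned prediction model): assume the CPU-bound share of a function’s duration shrinks inversely with the memory/CPU allocation, predict duration and cost at candidate sizes from one measurement, and pick the cheapest. The scaling model, the CPU-bound fraction, and the per-GB-second price are illustrative assumptions.

```python
# Simplified stand-in for resource-size prediction from a single measurement.
GB_SECOND_PRICE = 0.0000166667  # example per-GB-second price, for illustration only

def predict_duration_ms(measured_ms: float, measured_mb: int, candidate_mb: int,
                        cpu_bound_fraction: float = 0.8) -> float:
    """Assume the CPU-bound part of the duration scales inversely with memory."""
    cpu_part = measured_ms * cpu_bound_fraction * (measured_mb / candidate_mb)
    io_part = measured_ms * (1.0 - cpu_bound_fraction)
    return cpu_part + io_part

def cheapest_size(measured_ms: float, measured_mb: int,
                  candidates=(256, 512, 1024, 2048)) -> int:
    def cost(mb: int) -> float:
        duration_s = predict_duration_ms(measured_ms, measured_mb, mb) / 1000.0
        return duration_s * (mb / 1024.0) * GB_SECOND_PRICE
    return min(candidates, key=cost)

# One measurement at 512 MB taking 900 ms; predict the cheapest candidate size.
print(cheapest_size(measured_ms=900.0, measured_mb=512))
```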

COSE: Configuring Serverless Functions using Statistical Learning

This paper presents COSE, a framework that uses Bayesian Optimization to find the optimal configuration for serverless functions, and uses statistical learning techniques to intelligently collect samples and predict the cost and execution time of a serverless function across unseen configuration values.
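
A minimal sketch of the same idea using off-the-shelf Bayesian Optimization from scikit-optimize (not COSE’s own model or sampling strategy): the objective below is a synthetic placeholder; in a real setup it would invoke the function at the sampled memory size and combine measured cost and latency.

```python
# Bayesian Optimization over a single memory-size dimension (sketch).
from skopt import gp_minimize
from skopt.space import Integer

def objective(params):
    memory_mb = params[0]
    # Synthetic placeholder model: latency falls with memory, cost rises with it.
    latency_ms = 2000.0 / (memory_mb / 128.0) + 50.0
    cost = memory_mb * 1e-6 * (latency_ms / 1000.0)
    return cost + 1e-4 * latency_ms  # weighted single objective

result = gp_minimize(objective, [Integer(128, 3008, name="memory_mb")],
                     n_calls=20, random_state=0)
print("suggested memory size:", result.x[0], "MB")
```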

AI-based Resource Allocation: Reinforcement Learning for Adaptive Auto-scaling in Serverless Environments

This work investigates the applicability of a reinforcement learning approach to request-based auto-scaling in a serverless framework, and shows that within a limited number of iterations the proposed model learns an effective scaling policy per workload, improving performance compared to the default auto-scaling configuration.
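
A toy tabular Q-learning sketch in the spirit of request-based scaling (the states, actions, reward shaping, and environment dynamics below are illustrative assumptions, not the paper’s setup):

```python
# Toy Q-learning loop that learns whether to add or remove replicas based on
# queue length, penalizing queued requests (latency proxy) and replicas (cost proxy).
import random
from collections import defaultdict

ACTIONS = (-1, 0, 1)          # remove a replica, keep, add a replica
q_table = defaultdict(float)  # (state, action) -> estimated value
alpha, gamma, epsilon = 0.1, 0.9, 0.2

def choose_action(state):
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])

def reward(queue_len, replicas):
    return -(queue_len + 0.5 * replicas)

def update(state, action, r, next_state):
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    key = (state, action)
    q_table[key] += alpha * (r + gamma * best_next - q_table[key])

# Crude simulated environment: each replica serves 2 queued requests per step,
# and 3 new requests arrive per step.
replicas, queue_len = 1, 5
for _ in range(200):
    state = (min(queue_len, 10), replicas)
    action = choose_action(state)
    replicas = max(1, min(10, replicas + action))
    queue_len = max(0, queue_len - 2 * replicas) + 3
    next_state = (min(queue_len, 10), replicas)
    update(state, action, reward(queue_len, replicas), next_state)
```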

BATCH: Machine Learning Inference Serving on Serverless Platforms with Adaptive Batching

BATCH uses an optimizer to provide inference tail-latency guarantees, optimize cost, and enable adaptive batching support, and it demonstrates performance and cost advantages over the state-of-the-art method MArk and the state-of-the-practice tool SageMaker.
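
The core batching mechanism can be sketched as buffering requests and flushing when either a size or a time threshold is reached; in BATCH such thresholds come out of the optimizer, whereas the values below are fixed, illustrative assumptions.

```python
# Sketch of adaptive batching: buffer inference requests, flush on size or timeout.
import time

class AdaptiveBatcher:
    def __init__(self, max_batch_size=8, max_wait_s=0.05):
        self.max_batch_size = max_batch_size
        self.max_wait_s = max_wait_s
        self.buffer = []
        self.first_arrival = None

    def add(self, request):
        """Buffer a request; return a full batch if a threshold was crossed, else None."""
        if not self.buffer:
            self.first_arrival = time.monotonic()
        self.buffer.append(request)
        waited = time.monotonic() - self.first_arrival
        if len(self.buffer) >= self.max_batch_size or waited >= self.max_wait_s:
            return self.flush()
        return None

    def flush(self):
        batch, self.buffer, self.first_arrival = self.buffer, [], None
        return batch  # hand the whole batch to the model in a single invocation

batcher = AdaptiveBatcher(max_batch_size=3, max_wait_s=0.05)
for i in range(7):
    batch = batcher.add({"request_id": i})
    if batch:
        print("dispatching batch:", [r["request_id"] for r in batch])
```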

Primula: a Practical Shuffle/Sort Operator for Serverless Computing

This paper reports on the experience of designing Primula, a serverless sort operator that abstracts users away from the complexities of resource provisioning, skewed data, and stragglers, yielding the most accessible sort primitive to date.
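
One ingredient of such an operator, coping with skewed key distributions, can be sketched with sampling-based range partitioning (an illustrative fragment, not Primula’s actual design): sample the keys, derive balanced split points, route records into ranges, and sort each range independently, e.g., one function invocation per range.

```python
# Sampling-based range partitioning for a distributed sort (sketch).
import bisect
import random

def split_points(keys, num_partitions, sample_size=1000):
    sample = sorted(random.sample(keys, min(sample_size, len(keys))))
    step = len(sample) / num_partitions
    return [sample[int(i * step)] for i in range(1, num_partitions)]

def partition_and_sort(records, splits):
    buckets = [[] for _ in range(len(splits) + 1)]
    for rec in records:
        buckets[bisect.bisect_right(splits, rec)].append(rec)
    return [sorted(b) for b in buckets]  # each bucket could be one function invocation

data = [random.randint(0, 10_000) for _ in range(5_000)]
parts = partition_and_sort(data, split_points(data, num_partitions=4))
assert [x for p in parts for x in p] == sorted(data)
```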

Sequoia: enabling quality-of-service in serverless computing

Results with controlled and realistic workloads show that Sequoia seamlessly adapts to policies, eliminates mid-chain drops, reduces queuing times by up to 6.4X, enforces tight chain-level fairness, and improves run-time performance by up to 25X.

An effective resource management approach in a FaaS environment

This paper investigates and proposes a new resource management approach for a FaaS platform, based on intelligent techniques, which is employed to deliver optimal solutions in a multi-objective environment.

Costless: Optimizing Cost of Serverless Computing through Function Fusion and Placement

This paper presents an efficient algorithm that optimizes the price of serverless applications in AWS Lambda and shows that the algorithm can find solutions that reduce the price by 35%-57% with only a 5%-15% increase in latency.
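
A worked toy example of the fusion trade-off (with illustrative, assumed prices, not the paper’s cost model): fusing two chained functions saves one invocation charge and one transition, but both steps then run at the larger memory size.

```python
# Toy cost comparison: two separate chained functions vs. one fused function.
REQUEST_PRICE = 0.20 / 1_000_000  # $ per invocation (example value)
GB_SECOND_PRICE = 0.0000166667    # $ per GB-second (example value)

def invocation_cost(duration_ms, memory_mb):
    gb_seconds = (duration_ms / 1000.0) * (memory_mb / 1024.0)
    return REQUEST_PRICE + gb_seconds * GB_SECOND_PRICE

# Step 1: 300 ms at 256 MB; step 2: 500 ms at 1024 MB.
separate = invocation_cost(300, 256) + invocation_cost(500, 1024)

# Fused: both steps in one function sized for the larger requirement.
fused = invocation_cost(300 + 500, 1024)

print(f"separate: ${separate:.8f}  fused: ${fused:.8f}")
```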

EMARS: Efficient Management and Allocation of Resources in Serverless

We introduce EMARS, an efficient resource management system for serverless cloud computing frameworks, with the goal of enhancing resource (in particular, memory) allocation among containers. We have built…

An Evaluation of FaaS Platforms as a Foundation for Serverless Big Data Processing

A novel evaluation method (SIEM) is proposed to understand the impact of automatic infrastructure management on serverless big data applications, which so far remains unexplored, and new metrics to quantify quality in different big data application scenarios are introduced.