Estimating the capacities of function-as-a-service functions

@article{Jindal2021EstimatingTC,
  title={Estimating the capacities of function-as-a-service functions},
  author={Anshul Jindal and Mohak Chadha and Shajulin Benedict and Michael Gerndt},
  journal={Proceedings of the 14th IEEE/ACM International Conference on Utility and Cloud Computing Companion},
  year={2021}
}
  • Anshul Jindal, Mohak Chadha, M. Gerndt
  • Published 6 December 2021
  • Computer Science
  • Proceedings of the 14th IEEE/ACM International Conference on Utility and Cloud Computing Companion
Serverless computing is a cloud computing paradigm that allows developers to focus exclusively on business logic, while cloud service providers handle resource management tasks. Serverless applications follow this model, where the application is decomposed into a set of fine-grained Function-as-a-Service (FaaS) functions. However, the obscurities of the underlying system infrastructure and the dependencies between FaaS functions within the application pose a challenge for estimating the performance of…
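
A minimal sketch of one way such a capacity estimate can be obtained empirically: ramp up the request rate against a deployed FaaS function and record the highest rate at which an assumed p95-latency SLO still holds. The endpoint URL, SLO threshold, and ramp schedule below are illustrative assumptions, not parameters from the paper, and requests are paced sequentially rather than issued by a real concurrent load generator.

    # Illustrative capacity probe: find the highest probed request rate (RPS)
    # at which an assumed p95-latency SLO still holds for a deployed FaaS function.
    # The URL, SLO, and ramp schedule are assumptions for this sketch.
    import time
    import urllib.request

    FUNCTION_URL = "https://example.com/my-faas-function"  # hypothetical endpoint
    SLO_P95_SECONDS = 0.5                                  # assumed latency SLO

    def measure_p95(rps: int, duration_s: int = 10) -> float:
        """Send paced requests approximating `rps` and return the p95 latency."""
        latencies = []
        for _ in range(rps * duration_s):
            start = time.perf_counter()
            urllib.request.urlopen(FUNCTION_URL, timeout=5).read()
            elapsed = time.perf_counter() - start
            latencies.append(elapsed)
            time.sleep(max(0.0, 1.0 / rps - elapsed))  # pace to the target rate
        latencies.sort()
        return latencies[int(0.95 * (len(latencies) - 1))]

    def estimate_capacity(max_rps: int = 200, step: int = 10) -> int:
        """Return the largest probed RPS whose p95 latency stays within the SLO."""
        capacity = 0
        for rps in range(step, max_rps + 1, step):
            if measure_p95(rps) <= SLO_P95_SECONDS:
                capacity = rps      # SLO still holds at this rate
            else:
                break               # first violation: stop ramping
        return capacity

    if __name__ == "__main__":
        print("Estimated capacity (RPS within SLO):", estimate_capacity())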

FaDO: FaaS Functions and Data Orchestrator for Multiple Serverless Edge-Cloud Clusters

FaDO is presented, designed to allow data-aware function scheduling across multiple serverless compute clusters at different locations, such as at the edge and in the cloud, while being capable of load balancing high-throughput workloads.
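
A rough sketch of the data-aware placement idea, under the assumption that each cluster advertises which data buckets it holds locally: an invocation is routed to a cluster holding its input bucket, falling back to the least-loaded cluster otherwise. The cluster names, bucket layout, and load metric are invented for illustration and are not FaDO's actual interfaces.

    # Illustrative data-aware scheduler: prefer a cluster that already holds the
    # invocation's input bucket; otherwise pick the least-loaded cluster.
    # Cluster names, bucket placement, and the load metric are assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Cluster:
        name: str
        buckets: set = field(default_factory=set)   # data buckets stored locally
        inflight: int = 0                           # simple load metric

    def schedule(invocation_bucket: str, clusters: list) -> Cluster:
        """Return the cluster to run an invocation that reads `invocation_bucket`."""
        local = [c for c in clusters if invocation_bucket in c.buckets]
        candidates = local or clusters              # fall back to all clusters
        chosen = min(candidates, key=lambda c: c.inflight)
        chosen.inflight += 1
        return chosen

    clusters = [
        Cluster("edge-1", {"sensor-data"}),
        Cluster("edge-2", {"images"}),
        Cluster("cloud", {"sensor-data", "images", "logs"}),
    ]
    print(schedule("images", clusters).name)        # -> edge-2 (data-local)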

FedLesScan: Mitigating Stragglers in Serverless Federated Learning

FedLesScan is a novel clustering-based semi-asynchronous training strategy, specifically tailored for serverless FL that dynamically adapts to the behaviour of clients and minimizes the effect of stragglers on the overall system.
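
As a toy illustration of the clustering idea, clients can be grouped by their recent round durations so that slow clients are scheduled less eagerly. The k-means grouping below is a stand-in; FedLesScan's actual clustering strategy and semi-asynchronous scheduling differ in detail, and the durations are invented.

    # Illustrative straggler grouping: cluster clients by their recent round
    # durations. k-means is a stand-in for FedLesScan's own clustering strategy.
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical per-client mean training-round durations in seconds.
    durations = np.array([[4.1], [3.8], [4.4], [19.7], [4.0], [21.2], [3.9]])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(durations)

    # Treat the cluster with the larger mean duration as the straggler group.
    slow_label = max(set(labels), key=lambda l: durations[labels == l].mean())
    stragglers = [i for i, l in enumerate(labels) if l == slow_label]
    print("Straggler client indices:", stragglers)   # -> [3, 5]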

FedLess: Secure and Scalable Federated Learning Using Serverless Computing

FedLess is the first system to enable FL across a large fabric of heterogeneous FaaS providers while providing important features such as security and Differential Privacy; its practical viability is demonstrated by comparing it against a traditional FL system and showing that it can be cheaper and more resource-efficient.

Courier: delivering serverless functions within heterogeneous FaaS deployments

Two approaches were developed, Auto Weighted Round-Robin (AWRR) and Per-Function Auto Weighted Round-Robin (PFAWRR), which use function execution times to deliver serverless functions within a FaaS platform and reduce the overall execution time; Courier can improve the overall performance of function invocations within a HeteroFaaSD compared to traditional load-balancing algorithms.
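
A minimal sketch of the weighting idea behind such per-function delivery, assuming each deployment's weight is inversely proportional to its observed mean execution time for that function. The sketch uses weighted random selection rather than a strict weighted round-robin, and the deployment names and timings are assumptions, not Courier's implementation.

    # Illustrative per-function weighted selection: weight each FaaS deployment
    # by 1 / (observed mean execution time), so faster deployments receive
    # proportionally more invocations. Names and timings are assumptions.
    import random
    from collections import defaultdict

    # observed_times[function][deployment] -> execution times in seconds
    observed_times = {
        "resize-image": {
            "openwhisk-edge": [0.90, 1.10, 0.95],
            "openfaas-cloud": [0.30, 0.35, 0.28],
        }
    }

    def pick_deployment(function: str) -> str:
        """Pick a deployment for `function`, weighted by 1 / mean execution time."""
        times = observed_times[function]
        weights = {d: 1.0 / (sum(t) / len(t)) for d, t in times.items()}
        deployments = list(weights)
        return random.choices(deployments,
                              weights=[weights[d] for d in deployments])[0]

    counts = defaultdict(int)
    for _ in range(1000):
        counts[pick_deployment("resize-image")] += 1
    print(dict(counts))   # the faster deployment receives roughly 3x more calls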

References


Architectural Implications of Function-as-a-Service Computing

FaaS containerization brings up to a 20x slowdown compared to native execution; cold starts can take over 10x a short function's execution time; branch mispredictions per kilo-instruction are 20x higher for short functions; memory bandwidth increases by 6x due to the invocation pattern; and IPC decreases by as much as 35% due to inter-function interference.

Sizeless: predicting the optimal size of serverless functions

This paper introduces an approach to predict the optimal resource size of a serverless function using monitoring data from a single resource size, which enables cloud providers to implement resource sizing on a platform level and automate the last resource management task associated with serverless functions.
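
A toy sketch of the sizing decision this enables, under a simplifying assumption I am adding for illustration: execution time measured at a single memory size is extrapolated to other sizes by scaling only the CPU-bound share inversely with allocated memory (as on platforms where CPU is proportional to memory), and the cheapest size is picked. Sizeless itself trains a prediction model from monitoring data rather than using this closed-form assumption; the price constant is roughly AWS Lambda's per-GB-second rate.

    # Illustrative memory-size selection from a single measured configuration.
    # The inverse-scaling assumption and constants below are for this sketch only.
    MEASURED_MEMORY_MB = 512
    MEASURED_TIME_S = 1.20
    CPU_BOUND_FRACTION = 0.8          # assumed CPU-bound share of execution time
    PRICE_PER_GB_SECOND = 0.0000167   # roughly AWS Lambda's x86 rate

    def predicted_time(memory_mb: int) -> float:
        cpu_part = MEASURED_TIME_S * CPU_BOUND_FRACTION * MEASURED_MEMORY_MB / memory_mb
        io_part = MEASURED_TIME_S * (1 - CPU_BOUND_FRACTION)
        return cpu_part + io_part

    def cost(memory_mb: int) -> float:
        return predicted_time(memory_mb) * (memory_mb / 1024) * PRICE_PER_GB_SECOND

    sizes = [128, 256, 512, 1024, 2048]
    for m in sizes:
        print(f"{m:5d} MB  time={predicted_time(m):.2f}s  cost=${cost(m):.8f}")
    print("Cheapest predicted size:", min(sizes, key=cost), "MB")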

Performance evaluation of heterogeneous cloud functions

This paper develops a cloud function benchmarking framework, consisting of one suite based on the Serverless Framework and one based on HyperFlow, and evaluates all the major cloud function providers: AWS Lambda, Azure Functions, Google Cloud Functions, and IBM Cloud Functions.

Function delivery network: Extending serverless computing for heterogeneous platforms

This work introduces the Function Delivery Network (FDN), an extension of FaaS to heterogeneous clusters that supports heterogeneous functions through a network of distributed heterogeneous target platforms; compared to scheduling on a high-end target platform, the FDN reduced overall energy consumption by 17× without violating the SLO requirements.

Architecture-Specific Performance Optimization of Compute-Intensive FaaS Functions

The underlying processor architectures for Google Cloud Functions (GCF) are examined and their prevalence across the 19 available GCF regions is determined, showing that optimizing FaaS functions for the specific architecture is very important.
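
A minimal sketch of how a deployed function can discover the host CPU at runtime and select an architecture-specific code path, assuming a Linux-based runtime exposing /proc/cpuinfo. The flag-to-variant mapping is an illustrative assumption, not the paper's methodology.

    # Illustrative runtime CPU detection inside a cloud function: read the host
    # CPU model and flags and choose an architecture-specific code path.
    def cpu_info() -> dict:
        info = {"model": "", "flags": set()}
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("model name"):
                    info["model"] = line.split(":", 1)[1].strip()
                elif line.startswith("flags"):
                    info["flags"] = set(line.split(":", 1)[1].split())
        return info

    def select_variant(info: dict) -> str:
        if "avx512f" in info["flags"]:
            return "kernel_avx512"   # hypothetical AVX-512-targeted build
        if "avx2" in info["flags"]:
            return "kernel_avx2"
        return "kernel_generic"

    info = cpu_info()
    print(info["model"], "->", select_variant(info))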

Peeking Behind the Curtains of Serverless Platforms

This work conducts the largest measurement study to date, launching more than 50,000 function instances across AWS Lambda, Azure Functions, and Google Cloud Functions, in order to characterize their architectures, performance, and resource management efficiency.

Serverless Computing: An Investigation of Factors Influencing Microservice Performance

Results are presented from a comprehensive investigation into the factors that influence the performance of microservices afforded by serverless computing; hosting implications related to infrastructure elasticity, load balancing, provisioning variation, infrastructure retention, and memory reservation size are examined.

Performance Modeling for Cloud Microservice Applications

The paper addresses the challenge of identifying the MSC individually for each microservice, and the proposed performance models for individual microservices serve as a major input for performance modeling of the overall microservice application.

Evaluation of Production Serverless Computing Environments

This work claims that current serverless computing environments can support dynamic applications in parallel when a partitioned task is executable on a small function instance, and it deploys a series of functions for distributed data processing to address elasticity.

COSE: Configuring Serverless Functions using Statistical Learning

This paper presents COSE, a framework that uses Bayesian Optimization to find the optimal configuration for serverless functions, and uses statistical learning techniques to intelligently collect samples and predict the cost and execution time of a serverless function across unseen configuration values.
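
A small sketch of a Bayesian-optimization search over candidate memory configurations, using scikit-optimize's gp_minimize with a synthetic cost model standing in for real invocation measurements. COSE's actual statistical learning and sampling strategy are more elaborate; the latency curve and price constant here are assumptions.

    # Illustrative Bayesian-optimization search over serverless memory configs.
    # The cost model is synthetic and stands in for measured invocation cost.
    from skopt import gp_minimize
    from skopt.space import Integer

    def invoke_and_measure(memory_mb: int) -> float:
        """Synthetic stand-in for deploying at `memory_mb` and measuring the
        billed cost per invocation; replace with real measurements in practice."""
        exec_time = 0.2 + 300.0 / memory_mb                  # assumed latency curve
        return exec_time * (memory_mb / 1024) * 0.0000167    # approx. $/invocation

    result = gp_minimize(
        func=lambda params: invoke_and_measure(params[0]),
        dimensions=[Integer(128, 3008, name="memory_mb")],
        n_calls=15,
        random_state=42,
    )
    print("Best memory size:", result.x[0], "MB, estimated cost:", result.fun)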