The likelihood ratio (LR) is a probabilistic method that has been championed as a 'simple rule' for evaluating the probative value of forensic evidence in court. Intuitively, if the LR is greater than one then the evidence supports the prosecution hypothesis; if the LR is less than one it supports the defence hypothesis; and if the LR is equal to one then the evidence is neutral, favouring neither hypothesis.
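For reference, the standard definition behind this rule (not specific to this entry) is the ratio of the probabilities of the evidence E under the prosecution and defence hypotheses; it scales prior odds into posterior odds:

```latex
\[
  \mathrm{LR} = \frac{P(E \mid H_p)}{P(E \mid H_d)},
  \qquad
  \underbrace{\frac{P(H_p \mid E)}{P(H_d \mid E)}}_{\text{posterior odds}}
  \;=\; \mathrm{LR} \times
  \underbrace{\frac{P(H_p)}{P(H_d)}}_{\text{prior odds}} .
\]
```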
TTL caching models have recently regained significant research interest, largely due to their ability to fit popular caching policies such as LRU. In this extended abstract we briefly describe our recent work on two exact methods to analyze TTL cache networks. The first method generalizes existing results for line networks under renewal requests to …
TTL caching models have recently regained significant research interest due to their connection to popular caching policies such as LRU. This paper advances the state-of-the-art analysis of TTL-based cache networks by developing two exact methods with orthogonal generality and computational complexity. The first method generalizes existing results for line networks under renewal requests to …
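To make the TTL model behind the two entries above concrete, here is a minimal single-cache sketch (my illustration, not either paper's exact method): one object under Poisson requests with a reset-on-access timer, whose simulated hit ratio can be checked against the standard closed form 1 - exp(-lambda * T).

```python
import math
import random

def ttl_hit_ratio(lam: float, ttl: float, n_requests: int = 200_000) -> float:
    """Simulate one object in a TTL cache with a reset-on-access timer.

    A request hits iff the previous request to the object arrived less
    than `ttl` ago; under Poisson(lam) requests the hit probability is
    1 - exp(-lam * ttl), used below as a sanity check.
    """
    hits = 0
    t, t_last = 0.0, None
    for _ in range(n_requests):
        t += random.expovariate(lam)          # Poisson request arrivals
        if t_last is not None and t - t_last < ttl:
            hits += 1
        t_last = t                            # timer resets on every access
    return hits / n_requests

lam, ttl = 2.0, 1.0
print(f"simulated  : {ttl_hit_ratio(lam, ttl):.4f}")
print(f"closed form: {1 - math.exp(-lam * ttl):.4f}")
```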
Frequency jamming is the fiercest tool for disrupting wireless communication, and its malicious aspects have received much attention in the literature. Yet several recent works propose to turn the tables and employ so-called friendly jamming for the benefit of a wireless network. For example, recently proposed friendly jamming applications include hiding …
Much of the work in this unpublished draft paper has subsequently been published in the following papers, which should be cited instead: (2014), "When 'neutral' evidence still has probative value (with implications from the Barry George Case)"; and (2014), "Calculating and understanding the value of any type of match evidence when there are potential testing errors".
TTL cache models provide an attractive unified approximation framework for caching policies like LRU and FIFO, whose exact analysis is notoriously hard. In this paper, we advance the understanding of TTL models by explicitly considering stochastic capacity constraints. We find in particular that reducing the variance of the cache occupancy is instrumental …
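As a toy illustration of cache occupancy statistics (my sketch under simplifying assumptions, not this paper's model): in a non-reset TTL cache under independent Poisson request streams, object i is in cache a fraction p_i = lam_i*T / (1 + lam_i*T) of the time (a renewal-reward argument), so the occupancy is a sum of independent Bernoulli variables with directly computable mean and variance.

```python
def occupancy_stats(lams, ttl):
    """Mean and variance of occupancy for a non-reset TTL cache under
    independent Poisson(lam_i) request streams: object i is in cache
    with probability p_i = lam_i*ttl / (1 + lam_i*ttl), and occupancy
    is the sum of independent Bernoulli(p_i) indicators."""
    probs = [l * ttl / (1 + l * ttl) for l in lams]
    mean = sum(probs)
    var = sum(p * (1 - p) for p in probs)
    return mean, var

lams = [5.0, 2.0, 1.0, 0.5, 0.1]      # hypothetical per-object request rates
for ttl in (0.5, 1.0, 2.0):
    m, v = occupancy_stats(lams, ttl)
    print(f"TTL={ttl}: mean occupancy={m:.2f}, variance={v:.2f}")
```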
Adversarial models of traffic generation replace probabilistic assumptions by considering the deterministic worst case. The framework of adversarial queueing theory (AQT) has produced unexpected results on the stability of networks and has seen continuous research effort for more than 15 years. So far, almost all AQT results have been derived …
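To illustrate the deterministic worst-case traffic model (a minimal sketch under my own assumptions, not a result from this entry): a leaky-bucket style (rho, sigma)-bounded adversary may inject at most rho*(t - s) + sigma packets into any time window [s, t]. The snippet below checks an injection trace against that bound.

```python
def is_rho_sigma_bounded(injections, rho, sigma):
    """Check whether an injection trace satisfies a (rho, sigma)
    constraint: every window [s, t] carries at most rho*(t - s) + sigma
    injected packets.  `injections` is a time-sorted list of
    (time, num_packets) pairs."""
    for i, (s, _) in enumerate(injections):
        total = 0
        for t, n in injections[i:]:
            total += n
            if total > rho * (t - s) + sigma:
                return False
    return True

# A bursty but admissible trace for rho=1, sigma=3:
trace = [(0.0, 3), (1.0, 1), (2.0, 1), (3.0, 1)]
print(is_rho_sigma_bounded(trace, rho=1.0, sigma=3.0))  # True
print(is_rho_sigma_bounded(trace, rho=0.5, sigma=2.0))  # False
```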
Meeting tail latency Service Level Objectives (SLOs) in shared cloud networks is both important and challenging. One primary challenge is determining limits on the multi-tenancy such that SLOs are met. Doing so involves estimating latency, which is difficult, especially when tenants exhibit bursty behavior as is common in production environments.
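As a toy illustration of why burstiness makes tail-latency estimation hard (my sketch, unrelated to this entry's actual method): the snippet below simulates a FIFO single-server queue via the Lindley recursion and compares the 99th-percentile sojourn time under smooth versus bursty arrivals with the same mean rate.

```python
import random

def p99_latency(interarrival, service_rate=1.2, n=100_000, seed=0):
    """Simulate a FIFO single-server queue (Lindley recursion on
    sojourn times) and return the 99th-percentile sojourn time."""
    rng = random.Random(seed)
    sojourn, samples = 0.0, []
    for _ in range(n):
        a = interarrival(rng)
        s = rng.expovariate(service_rate)
        sojourn = max(0.0, sojourn - a) + s   # Lindley recursion
        samples.append(sojourn)
    samples.sort()
    return samples[int(0.99 * n)]

def smooth(rng):
    return rng.expovariate(1.0)   # Poisson arrivals, rate 1

def bursty(rng):
    # Half of all arrivals come back-to-back in bursts; same mean rate 1.
    return 0.0 if rng.random() < 0.5 else rng.expovariate(0.5)

print("smooth p99:", p99_latency(smooth))
print("bursty p99:", p99_latency(bursty))
```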