We revisit the rate-1 blockcipher based hash functions as first studied by Preneel, Govaerts and Vandewalle (Crypto'93) and later extensively analysed by Black, Rogaway and Shrimpton (Crypto'02). We analyse a further generalization where any pre- and postprocessing is considered. This leads to a clearer understanding of the current classification of rate-1…
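A rate-1 blockcipher based compression function makes one blockcipher call per message block. The following sketch shows the Davies–Meyer shape, one of the classical PGV schemes; the blockcipher here is a toy Feistel permutation standing in for a real cipher (the function and parameter names are illustrative, not from the paper):

```python
import hashlib

def toy_blockcipher(key: int, block: int) -> int:
    """A toy 64-bit keyed permutation (4 Feistel rounds).

    Stands in for a real blockcipher E_k; NOT cryptographically secure.
    """
    l, r = block >> 32, block & 0xFFFFFFFF
    for rnd in range(4):
        f = int.from_bytes(
            hashlib.sha256(key.to_bytes(8, "big") + bytes([rnd])
                           + r.to_bytes(4, "big")).digest()[:4], "big")
        l, r = r, l ^ f  # Feistel swap: invertible for any round function
    return (l << 32) | r

def davies_meyer(h: int, m: int) -> int:
    """Rate-1 PGV compression: h' = E_m(h) XOR h (Davies-Meyer)."""
    return toy_blockcipher(m, h) ^ h

def hash_message(blocks, iv=0x0123456789ABCDEF):
    """Iterate the compression function: one cipher call per block (rate 1)."""
    h = iv
    for m in blocks:
        h = davies_meyer(h, m)
    return h
```

The XOR feed-forward of the chaining value `h` is the "postprocessing" step; the generalization the abstract mentions considers arbitrary pre- and postprocessing around the single blockcipher call.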

This paper investigates the Random Oracle Model (ROM) feature known as programmability, which allows security reductions in the ROM to dynamically choose the range points of an ideal hash function. This property is interesting for at least two reasons: first, because of its seeming artificiality (no standard model hash function is known to support such…
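Programmability can be made concrete as a lazily sampled oracle whose table the reduction may fill in itself. The class below is a minimal sketch of that idea (the class and method names are assumptions for illustration):

```python
import os

class ProgrammableRO:
    """Lazily sampled random oracle whose table a reduction may program."""

    def __init__(self, out_len: int = 32):
        self.out_len = out_len
        self.table = {}

    def query(self, x: bytes) -> bytes:
        # Lazy sampling: fix a fresh uniform range point on first query.
        if x not in self.table:
            self.table[x] = os.urandom(self.out_len)
        return self.table[x]

    def program(self, x: bytes, y: bytes) -> bool:
        # A reduction may choose H(x) itself, but only before x has
        # been queried -- otherwise the adversary's view would change.
        if x in self.table:
            return False
        self.table[x] = y
        return True
```

The `program` method is exactly what no standard-model hash function offers: once SHA-256 is fixed, nobody can later decide what it outputs on a given input.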

We present a new approach to the design of IND-CCA2 secure hybrid encryption schemes in the standard model. Our approach provides an efficient generic transformation from 1-universal to 2-universal hash proof systems. The transformation involves a randomness extractor based on a 4-wise independent hash function as the key derivation function. Our…
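A standard way to obtain a 4-wise independent hash family is to evaluate a uniformly random polynomial of degree less than 4 over a prime field; this is the textbook construction, sketched here as a minimal illustration (not the paper's exact extractor):

```python
import secrets

P = (1 << 61) - 1  # a Mersenne prime; the field F_p

class FourWiseHash:
    """4-wise independent family: h(x) = a3*x^3 + a2*x^2 + a1*x + a0 mod p.

    A uniformly random degree-<4 polynomial over F_p is 4-wise
    independent, which is what the leftover hash lemma needs when the
    family is used as a randomness extractor for key derivation.
    """

    def __init__(self):
        # Four uniformly random coefficients (a3, a2, a1, a0).
        self.coeffs = [secrets.randbelow(P) for _ in range(4)]

    def __call__(self, x: int) -> int:
        acc = 0
        for c in self.coeffs:  # Horner: ((a3*x + a2)*x + a1)*x + a0
            acc = (acc * x + c) % P
        return acc
```

Any four distinct inputs are mapped to independently uniform outputs over the choice of coefficients, which is the property the key derivation step relies on.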

Loosely speaking, an obfuscation O of a function f should satisfy two requirements: firstly, using O, it should be possible to evaluate f; secondly, O should not reveal anything about f that cannot be learnt from oracle access to f alone. Several definitions for obfuscation exist. However, most of them are very hard to satisfy, even when focusing on specific…
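Point functions are the classic special case where something like the two requirements can be met. The sketch below obfuscates f(x) = 1 iff x equals a fixed secret, by storing only a salted hash; this is a well-known heuristic construction (treating the hash as a random oracle), not a claim about the paper's definitions:

```python
import hashlib
import hmac
import secrets

def obfuscate_point_function(secret: bytes):
    """Return an obfuscation O of the point function f(x) = (x == secret).

    O stores only (salt, H(salt || secret)), so evaluating O reveals no
    more than oracle access to f, heuristically modelling the hash as a
    random oracle.
    """
    salt = secrets.token_bytes(16)
    tag = hashlib.sha256(salt + secret).digest()

    def O(x: bytes) -> bool:
        # Requirement 1: O computes f.  Requirement 2 (heuristically):
        # salt and tag alone do not reveal the secret.
        return hmac.compare_digest(hashlib.sha256(salt + x).digest(), tag)

    return O
```

This illustrates why most general definitions are hard to satisfy: the trick above exploits the very special structure of point functions and does not generalize to arbitrary circuits.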

We consider how to build an efficient compression function from a small number of random, non-compressing primitives. Our main goal is to achieve a level of collision resistance as close as possible to the optimal birthday bound. We present a 2n-to-n-bit compression function based on three independent n-to-n-bit random functions, each called only once. We…
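One natural shape for such a construction is to combine the three calls as f3(f1(x) ⊕ f2(y)) ⊕ f1(x), with a feed-forward blocking trivial inversion. The sketch below shows that shape, modelling each independent random function by a domain-separated hash; it is an illustrative sketch in the spirit of the abstract, not necessarily the paper's exact scheme:

```python
import hashlib

N_BYTES = 16  # n = 128 bits

def random_function(index: int):
    """Model an independent n-to-n-bit random function f_i by a
    domain-separated hash (illustration only)."""
    def f(x: bytes) -> bytes:
        return hashlib.sha256(bytes([index]) + x).digest()[:N_BYTES]
    return f

f1, f2, f3 = (random_function(i) for i in range(1, 4))

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(u ^ v for u, v in zip(a, b))

def compress(x: bytes, y: bytes) -> bytes:
    """2n-to-n-bit compression: exactly one call to each of f1, f2, f3."""
    a = f1(x)
    return xor(f3(xor(a, f2(y))), a)
```

Each primitive is non-compressing (n bits in, n bits out) and is queried exactly once per compression call, matching the efficiency constraint stated in the abstract.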

Although identity based cryptography offers a number of functional advantages over conventional public key methods, the computational costs are significantly greater. The dominant part of this cost is the Tate pairing which, in characteristic three, is best computed using the algorithm of Duursma and Lee. However, in hardware and constrained environments this…

This paper describes several speedups and simplifications for XTR. The most important results are new XTR double and single exponentiation methods where the latter requires a cheap precomputation. Both methods are on average more than 60% faster than the old methods, thus more than doubling the speed of the already fast XTR signature applications. An…

This paper describes several speedups for computation in the order p + 1 subgroup of F*_{p^2} and the order p^2 − p + 1 subgroup of F*_{p^6}. These results are in a way complementary to LUC and XTR, where computations in these groups are sped up using trace maps. As a side result, we present an efficient method for XTR with p ≡ 3 mod 4.
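The condition p ≡ 3 mod 4 makes x² + 1 irreducible over F_p, so F_{p^2} can be represented as F_p[i] with i² = −1, and the norm-1 elements a + bi (those with a² + b² ≡ 1 mod p) form exactly the order p + 1 subgroup of F*_{p^2}. A tiny demonstration with p = 11 (not the paper's trace-based arithmetic, just the subgroup structure):

```python
def mulmod(x, y, p):
    """Multiply (a + b*i) * (c + d*i) in F_{p^2} = F_p[i], i^2 = -1.

    This representation needs p ≡ 3 (mod 4), so that -1 is a
    quadratic non-residue and x^2 + 1 is irreducible over F_p.
    """
    a, b = x
    c, d = y
    return ((a * c - b * d) % p, (a * d + b * c) % p)

def powmod(x, e, p):
    """Square-and-multiply exponentiation in F_{p^2}."""
    result, base = (1, 0), x
    while e:
        if e & 1:
            result = mulmod(result, base, p)
        base = mulmod(base, base, p)
        e >>= 1
    return result

p = 11           # p ≡ 3 (mod 4)
g = (3, 5)       # norm 3^2 + 5^2 = 34 ≡ 1 (mod 11): g is in the subgroup
assert (g[0] ** 2 + g[1] ** 2) % p == 1
# Norm 1 means g * g^p = g^(p+1) = 1, i.e. g lies in the order-(p+1) subgroup:
assert powmod(g, p + 1, p) == (1, 0)
```

LUC and XTR speed up arithmetic in such subgroups by working with trace representations instead of full field elements; the abstract's results target the same subgroups directly.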