Chi-Kit Lam

1 Overview In this lecture we will talk about adaptive sparse recovery. 2 Adaptivity in group testing In sparse recovery, we have y = Ax, where the rows of the measurement matrix A are the measurement vectors v_i. In our previous non-adaptive setting, the v_i's were chosen independently. Intuitively, it seems we may be able to do better if they are not independent. So, in the adaptive setting we talk about today, each v_i may depend on the outcomes of the earlier measurements. The idea is, for …
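To make the adaptive/non-adaptive contrast concrete, here is a minimal sketch under assumed conditions (the signal is 1-sparse and all function and variable names are illustrative, not from the lecture notes): each measurement vector v_i is chosen as the indicator of half of the current candidate support, based on the previous outcome, so the nonzero entry is located with about log2(n) measurements.

```python
# Illustrative sketch (assumed setup, not the lecture's scheme): recovering a
# 1-sparse vector x in R^n from linear measurements y_i = <v_i, x>, where each
# v_i is chosen adaptively from the outcomes of earlier measurements.
import numpy as np

def adaptive_recover_1sparse(x):
    """Binary-search the support of a 1-sparse x using ~log2(n) adaptive measurements."""
    n = len(x)
    lo, hi = 0, n              # candidate support is the index range [lo, hi)
    num_measurements = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        v = np.zeros(n)
        v[lo:mid] = 1.0        # v depends on previous outcomes (adaptivity)
        y = v @ x              # one linear measurement <v, x>
        num_measurements += 1
        if y != 0:             # the nonzero entry lies in the left half
            hi = mid
        else:
            lo = mid
    return lo, x[lo], num_measurements

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1024
    x = np.zeros(n)
    x[rng.integers(n)] = rng.normal()   # a 1-sparse signal
    idx, val, m = adaptive_recover_1sparse(x)
    print(f"recovered index {idx}, value {val:.3f}, using {m} measurements")
```

A non-adaptive scheme would instead fix all v_i (for example, as independent random vectors) before seeing any outcome, which is exactly the independence the snippet refers to.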
Consider a complete weighted bipartite graph G in which each left vertex u has two real numbers, an intercept and a slope, each right vertex v has a real number quality, and the weight of any edge (u, v) is defined as the intercept of u plus the slope of u times the quality of v. Let m (resp., n) denote the number of left (resp., right) vertices, and assume that …
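As a concrete reading of this edge-weight definition, here is a small sketch with made-up data (only the quantities m, n and the weight formula come from the abstract; the numbers and names are illustrative):

```python
# Sketch of the weight definition: weight(u, v) = intercept(u) + slope(u) * quality(v).
# The data below is invented purely to illustrate the formula.
import numpy as np

intercepts = np.array([1.0, -2.0, 0.5])      # one intercept per left vertex u (m = 3)
slopes     = np.array([3.0,  1.0, -4.0])     # one slope per left vertex u
qualities  = np.array([0.2, 1.5, -1.0, 2.0])  # one quality per right vertex v (n = 4)

# m x n matrix of edge weights: W[u, v] = intercepts[u] + slopes[u] * qualities[v]
W = intercepts[:, None] + slopes[:, None] * qualities[None, :]
print(W)
```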
1 Overview In this lecture we will talk about information and compression, which Huffman coding can achieve with the average number of bits sent almost precisely equal to the entropy. Then we will talk about communication complexity and information cost, which is the asymptotic number of bits two parties need to transmit in a conversation in order to …
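The compression claim can be checked with a short sketch of standard textbook Huffman coding (this is not code from the lecture; the distribution below is an arbitrary example): the expected codeword length comes out between H(p) and H(p) + 1 bits.

```python
# Sketch: build a Huffman code for a distribution p and compare its expected
# codeword length to the entropy H(p), which it matches to within one bit.
import heapq
import math

def huffman_code_lengths(probs):
    """Return the codeword length of each symbol under a Huffman code."""
    # heap entries: (subtree probability, unique tiebreak id, symbols in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:   # merging deepens every symbol in both subtrees
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, syms1 + syms2))
        uid += 1
    return lengths

if __name__ == "__main__":
    p = [0.5, 0.25, 0.125, 0.125]
    lengths = huffman_code_lengths(p)
    avg_bits = sum(pi * li for pi, li in zip(p, lengths))
    entropy = -sum(pi * math.log2(pi) for pi in p)
    print(f"average codeword length = {avg_bits:.3f} bits, entropy = {entropy:.3f} bits")
```

For this dyadic distribution the two quantities coincide exactly (1.75 bits); for general distributions the average length exceeds the entropy by less than one bit.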