Rad Niazadeh

Recently, it was proved in [1] that in noisy compressed sensing, a joint typical estimator can asymptotically achieve the Cramér-Rao lower bound of the problem. To prove this result, [1] used a lemma, provided in [2], that forms the main building block of the proof. This lemma rests on the assumption of Gaussianity of the measurement …
In this paper, we first propose an adaptive method, based on the idea of the Least Mean Square (LMS) algorithm and the concept of the smoothed l0 (SL0) norm presented in [1], for the estimation of sparse Inter-Symbol Interference (ISI) channels, which arise in wireless and underwater acoustic transmissions. Afterwards, a new non-adaptive fast channel estimation method …
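The smoothed l0 (SL0) idea mentioned above replaces the combinatorial l0 "norm" with a smooth Gaussian surrogate, maximizing sum(exp(-s_i^2 / 2σ^2)) while staying feasible, and gradually shrinking σ. A minimal sketch for a generic underdetermined system As = x follows; the function name and parameter defaults (μ₀ = 2, geometric σ decay, a few inner iterations) are illustrative choices, not this paper's exact channel-estimation procedure:

```python
import numpy as np

def sl0(A, x, sigma_min=1e-3, sigma_decay=0.5, inner_iters=3, mu=2.0):
    """Sketch of smoothed-l0 minimization: approximate ||s||_0 by
    n - sum(exp(-s_i^2 / (2 sigma^2))) and alternate a gradient step on
    the smooth surrogate with a projection back onto {s : A s = x}."""
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                       # minimum-l2-norm feasible start
    sigma = 2.0 * np.max(np.abs(s))      # start with a large smoothing width
    while sigma > sigma_min:
        for _ in range(inner_iters):
            # steepest-ascent step on the Gaussian surrogate
            s = s - mu * s * np.exp(-s**2 / (2 * sigma**2))
            # project back onto the feasible set A s = x
            s = s - A_pinv @ (A @ s - x)
        sigma *= sigma_decay             # anneal toward the true l0 objective
    return s
```

Shrinking σ slowly lets each smoothed problem start from the previous solution, which is what keeps the non-convex surrogate from getting stuck early.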
In this paper, which is an extended version of our work at LVA/ICA 2010 [1], the problem of sparse Inter-Symbol Interference (ISI) channel estimation and equalization is investigated. We first propose an adaptive method based on the idea of the Least Mean Square (LMS) algorithm and the concept of the smoothed l0 (SL0) norm presented in [2] for the estimation of …
The problem of estimating a sparse channel, i.e., a channel with only a few non-zero taps, arises in many areas of communication, including underwater acoustic and wireless transmissions. In this paper, we develop an algorithm based on an Iterative Alternating Minimization technique that iteratively detects the locations and values of the channel taps. In …
For a number of problems in the theory of online algorithms, it is known that the assumption that elements arrive in uniformly random order enables the design of algorithms with much better performance guarantees than are possible under worst-case assumptions. The quintessential example of this phenomenon is the secretary problem, in which an algorithm attempts to stop …
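The classic secretary problem mentioned above admits a simple optimal stopping rule: observe the first n/e candidates without accepting, then accept the first candidate who beats everything seen so far. A minimal sketch (the function name is illustrative):

```python
import math

def secretary_stop(values):
    """Classic 1/e stopping rule for the secretary problem: observe the
    first n/e values as a calibration sample, then accept the first later
    value exceeding all of them; if none does, take the last value."""
    n = len(values)
    cutoff = int(n / math.e)
    best_seen = max(values[:cutoff], default=float("-inf"))
    for v in values[cutoff:]:
        if v > best_seen:
            return v
    return values[-1]  # forced to accept the final candidate
```

Under a uniformly random arrival order, this rule selects the single best candidate with probability approaching 1/e ≈ 0.368, which no online rule can beat asymptotically.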
For selling a single item to agents with independent but non-identically distributed values, the revenue-optimal auction is complex. Relative to this optimum, Hartline and Roughgarden showed that the approximation factor of the second-price auction with an anonymous reserve is between two and four. We consider the more demanding problem of approximating the …
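The benchmark mechanism discussed above is easy to state operationally: run a second-price auction, but only sell if the highest bid clears a single reserve price applied to all bidders, and charge the larger of the reserve and the second-highest bid. A minimal sketch (function name illustrative):

```python
def second_price_with_reserve(bids, reserve):
    """Second-price auction with an anonymous reserve r: the item goes to
    the highest bidder iff their bid meets r, at price max(r, second bid).
    Returns (winner_index, price), or (None, 0.0) if the item is unsold."""
    if not bids:
        return None, 0.0
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winner = order[0]
    if bids[winner] < reserve:
        return None, 0.0  # reserve not met; item stays unsold
    second = bids[order[1]] if len(bids) > 1 else 0.0
    return winner, max(reserve, second)
```

The "anonymous" qualifier is the crux: a single reserve ignores the bidders' differing distributions, which is exactly what costs the mechanism its constant-factor gap to the optimal auction.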
A prevalent market structure in the Internet economy consists of buyers and sellers connected by a platform (such as Amazon or eBay) that acts as an intermediary and keeps a share of the revenue of each transaction. While the optimal mechanism that maximizes the intermediary’s profit in such a setting may be quite complicated, the mechanisms observed in …
Bidders often want to get as much as they can without violating constraints on what they spend. For example, advertisers seek to maximize the impressions, clicks, sales, or market share generated by their advertising, subject to budget or return-on-investment (ROI) constraints. Likewise, when bidders have no direct utility for leftover money – e.g., because …
Nearly fifteen years ago, Google unveiled the generalized second price (GSP) auction. By all theoretical accounts, including their own [29], this was the wrong auction – the Vickrey-Clarke-Groves (VCG) auction would have been the proper choice – yet GSP has succeeded spectacularly. We give a deep justification for GSP’s success: advertisers’ preferences map …
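The contrast drawn above is concrete in the standard position-auction model: under GSP, the bidder in slot k pays the (k+1)-th highest bid per click, while VCG charges each bidder the externality they impose, computable bottom-up from the slots' click-through rates. A minimal sketch, with illustrative function names and with click-through rates (ctrs) listed from best slot to worst:

```python
def gsp_prices(bids, ctrs):
    """GSP per-click prices: slot k goes to the k-th highest bidder, who
    pays the (k+1)-th highest bid per click (0 if no lower bid exists)."""
    b = sorted(bids, reverse=True)
    return [b[k + 1] if k + 1 < len(b) else 0.0 for k in range(len(ctrs))]

def vcg_prices(bids, ctrs):
    """VCG per-click prices via the standard bottom-up recursion:
    price_k * ctr_k = b_{k+1} * (ctr_k - ctr_{k+1}) + price_{k+1} * ctr_{k+1}."""
    b = sorted(bids, reverse=True)
    n = len(ctrs)
    prices = [0.0] * n
    for k in range(n - 1, -1, -1):
        lower_bid = b[k + 1] if k + 1 < len(b) else 0.0
        next_ctr = ctrs[k + 1] if k + 1 < n else 0.0
        next_pay = prices[k + 1] * next_ctr if k + 1 < n else 0.0
        # externality on lower slots, averaged per click of slot k
        prices[k] = (lower_bid * (ctrs[k] - next_ctr) + next_pay) / ctrs[k]
    return prices
```

On the same bid profile, GSP's per-click prices weakly dominate VCG's, which is one reason the "wrong" auction has been so lucrative in practice.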
Quasilinearity is a ubiquitous and questionable assumption in the standard study of Walrasian equilibria. Quasilinearity implies that a buyer’s value for goods purchased in a Walrasian equilibrium is always additive with goods purchased with unspent money. It is a particularly suspect assumption in combinatorial auctions, where buyers’ complex preferences …