Claude Shannon established the field of Information Theory in 1948. His groundbreaking work provided a quantitative characterization of information (the notions of bits and entropy), quantitative limits on information transmission (the notion of channel capacity), and methods to achieve the capacity through coding. With this new language for describing information, compression, and coding, the technologies for information communication and storage have flourished.

The past fifteen years have presented new challenges for reasoning about information and networks, including (i) the marriage of information storage and communications, namely, the emergence of the World Wide Web, and (ii) continuous connectivity between people, namely, the emergence of wireless networks. In a sense, these years saw the emergence of a connected civilization—or a natural information network. These natural information networks are not limited to the macro level; it is also intriguing to consider micro-level networks, such as biological networks, including neural networks and gene regulatory networks. The emergence of natural information networks has also led to studies in social science and economics.

My research program benefited strongly from the generous support of the Lee Center, which allowed me to explore topics relating to natural networks that are not yet supported by traditional funding agencies. Examples of these topics include: wireless networks and percolation theory, computation with biological networks, networks of relation and capacity of storage, and, most recently, information representation in flash memories. In addition, the support provided a flexible framework to attract and train the best graduate students and postdocs. In fact, two of my students supported by the Lee Center won the Wilts Prize for the best PhD thesis in Electrical Engineering at Caltech (M. Franceschetti in 2003 and M. Riedel in 2004).