22C:118 Fall 2004 Week 16 Summary

  1. Monday briefly covered the way that applications like BitTorrent and eDonkey split files into chunks. Splitting enables parallel downloading: a server (the tracker) keeps track of which chunks are available and where they are. Parallel downloading could make freeloading problems worse, and BitTorrent addresses this by regulating download rates: only users that upload chunks are allowed higher-speed, parallel downloading of chunks. This mechanism enforces "fair play" among the clients.
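The chunking idea can be sketched in a few lines of Python. This is not BitTorrent's actual wire format, only an illustration: a file is cut into fixed-size pieces, and each piece is paired with a SHA-1 digest (BitTorrent does use per-piece SHA-1 hashes) so a downloader can verify pieces fetched from different peers independently and in any order. The chunk size and file contents below are made up for the example.

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # illustrative piece size (256 KiB)

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split a file's bytes into fixed-size chunks, each paired with a
    SHA-1 digest so chunks downloaded in parallel from different peers
    can be verified independently."""
    chunks = []
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        chunks.append((hashlib.sha1(chunk).hexdigest(), chunk))
    return chunks

# Stand-in for a real file; 600,000 bytes split into 256 KiB pieces
file_bytes = b"x" * 600_000
pieces = split_into_chunks(file_bytes)
print(len(pieces))  # 3 chunks: two full pieces plus a shorter final one
```

Because every piece carries its own hash, a client can request pieces from many peers at once and reject any corrupted or forged piece without re-downloading the whole file.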
    Trust is a significant issue with peer-to-peer networks, as well as with other network applications. While security techniques (passwords, encryption) can solve some problems of how we trust the network hardware, we need other ideas for how to trust the quality of a web search or the response to a P2P file search. One idea for trust is algorithmic: the PageRank algorithm used by Google, for instance, attempts to find the most valuable (trusted) links that match a search. PageRank is a graph-theoretic measure of how many URL links refer to a page; the more links, the more valued (it is hoped) the page. Another example of an algorithmic technique is the way that spam filters identify email as untrusted. Beyond algorithms, another idea is to use existing "social networks" that model the way humans trust each other. The lecture looked at http://trust.mindswap.org/trustProject.shtml where there is a presentation about how email could rank incoming mail by a trust metric. The trust metric could be set by instant message contacts or other ways to identify which email senders are trusted. Once each user has a database of trusted contacts, a networked application could actually calculate the trust of unknown parties ("friend of a friend"), which is the theme of several social network software applications.
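The graph-theoretic flavor of PageRank can be shown with a small power-iteration sketch. This is the textbook form of the algorithm, not Google's production version: each page spreads its score evenly over the pages it links to, and a damping factor models a surfer who occasionally jumps to a random page. The three-page "web" is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank on a dict mapping each page to the
    list of pages it links to. A page's rank is split evenly among
    its out-links; damping models random jumps."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
        rank = new
    return rank

# Toy web: C is linked to by both A and B, so it ranks highest.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(web)
print(max(scores, key=scores.get))  # C
```

The intuition matches the lecture's point: pages with more incoming links accumulate more rank, so the score serves as a rough algorithmic proxy for how valued (trusted) a page is.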
  2. Wednesday continued the theme of trust, looking again at the topic of reputation networks. The study of reputation networks and how to use them to improve network software is an emerging area; the lecture looked at one example, buddybuzz.org, which proposes ranking reading recommendations based on the structure and measures of a social network.
    Wednesday also mentioned another new area in networking: ubiquitous computing, pervasive computing, and proactive computing (a search engine will turn up many interesting resources for these terms). What will happen when the number of computers vastly exceeds the number of humans? How will they be managed and networked? The new generation of such computers may be very tiny, even down to the size of a grain of dust, using wireless links for communication, designed for a specific purpose, yet capable of supporting more general applications.