My First IETF

Last week I had the privilege to attend the 72nd meeting of the Internet Engineering Task Force (IETF), the Internet’s longest-established technical standardization body. CDT has long been engaged in the work of the IETF and other standards bodies, with an eye towards both increasing public interest input into standards processes and fostering understanding among Internet technologists, policymakers and advocates.

The IETF describes itself as a “large open international community of network designers, operators, vendors, and researchers.” It has no membership: anyone who wants to attend a meeting or submit a proposal for a new protocol or standard is welcome to do so, and the organization is composed entirely of volunteers. No other experience I’ve had has strengthened and renewed my appreciation for the Internet as much as IETF 72. What I witnessed last week was 1,200 dedicated individuals who travel on their own dime three times per year – and spend countless hours in the interim – to ensure that your computer, my computer, and every other device on the Internet can communicate with one another as fully as possible. IETFers fight tooth and nail to make certain that the protocols and standards they develop are the absolute best they can be. It is in part because of this group’s efforts that the Internet sustains an unprecedented level of interoperability and continues to flourish.

The IETF is, as always, attacking some of the most critical issues facing the Internet today. I took particular interest in two sessions dealing with the strain that peer-to-peer file-sharing has placed on some ISPs’ networks. The first session centered on the notion that if the computers (or “peers”) participating in peer-to-peer file-sharing could find better peers on the network, the load that file-sharing imposes might be lessened. There is still much work to do to determine what “better” means, how peers might learn this information, and where it would come from. But as one prong of a multi-pronged approach to reducing the impact of file-sharing, a standard way of communicating this information between the network and the peers drew substantial support from the crowd last week.
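
To make the first session’s idea a bit more concrete, here is a minimal, purely illustrative sketch (in Python) of how a file-sharing client might ask a network-operated “peer ranking” service to order the peers it has discovered. The service, its URL, and the response format are all invented for this example; they are not part of any IETF protocol, existing or proposed.

```python
# Purely illustrative: a file-sharing client asks a hypothetical network-operated
# "peer ranking" service to order candidate peers by preference, then connects to
# the most preferred ones first. The service URL and response format are invented.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def rank_peers(candidate_peers, ranking_service="http://ranking.example.net/rank"):
    """Return candidate_peers ordered from most to least preferred.

    Falls back to the original, unranked list if the service is unreachable,
    so peer selection degrades gracefully instead of failing outright.
    """
    query = urlencode({"peers": ",".join(candidate_peers)})
    try:
        with urlopen(f"{ranking_service}?{query}", timeout=2) as response:
            # Hypothetical response: {"192.0.2.5": 1, "198.51.100.7": 2, ...}
            ranking = json.load(response)
        return sorted(candidate_peers, key=lambda p: ranking.get(p, len(candidate_peers)))
    except OSError:
        return list(candidate_peers)

# Usage: connect to the ten most preferred peers first.
# for peer in rank_peers(peers_from_tracker)[:10]:
#     connect(peer)
```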

The second session explored how applications like file-sharing that don’t involve real-time interactivity could act more flexibly when the network gets congested. The general idea is that the user experience for applications across the board might be improved if file-sharing programs take a backseat to applications like VoIP and online gaming during times of congestion (a rough sketch of what that might look like appears at the end of this post). Although there was some disagreement about the best way to proceed in this area (it wouldn’t be an IETF session without a little bit of heat), there was clear consensus that the idea is worth pursuing.

What does all of this mean for the policy world? As we’ve noted previously, open technical standards are essential to continued innovation and interoperability on the Internet. Software developers, including peer-to-peer application providers, leverage standards to ensure that their applications are widely deployable. Network operators use standards as the basis of interconnection with other operators’ networks. As ISPs grapple with increased traffic, the IETF is going to be a crucial venue for developing open, interoperable solutions that any software developer or network architect can build to. And in the long run, the creation of standards in these areas will provide a better experience for software developers, network operators, and consumers alike.
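
For the technically curious, here is an equally rough, purely illustrative sketch of the behavior the second session discussed: a background transfer that cuts its own send rate when rising round-trip delay suggests the path is getting congested. The thresholds, rates, and function names are invented for this illustration and do not come from any IETF specification.

```python
# Purely illustrative: a background (non-interactive) transfer that "takes a
# backseat" by lowering its own send rate when round-trip delay climbs above
# an assumed uncongested baseline. All numbers and names are invented.

BASE_DELAY_MS = 50     # assumed round-trip delay on an uncongested path
MAX_RATE_KBPS = 800    # rate used when the path appears idle
MIN_RATE_KBPS = 50     # floor so the transfer still makes slow progress

def choose_send_rate(measured_delay_ms):
    """Scale the send rate down as measured delay rises above the baseline."""
    extra_delay = max(0, measured_delay_ms - BASE_DELAY_MS)
    # Illustrative rule: every 100 ms of extra queueing delay halves the rate.
    rate = MAX_RATE_KBPS / (2 ** (extra_delay / 100))
    return max(MIN_RATE_KBPS, rate)

# Usage: a client might re-evaluate its rate every few seconds while transferring.
# rate = choose_send_rate(measure_round_trip_delay())   # hypothetical helper
# send_for(seconds=5, at_kbps=rate)                     # hypothetical helper
```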