
IETF Standards Coming to a Device Near You

The Internet runs on technical standards developed in a variety of forums, most notably the Internet Engineering Task Force (IETF). These standards are what make the Internet possible: by using standardized communications protocols from the IETF and elsewhere, computers, services, and software built by different companies can seamlessly speak with one another.

Last week, the 86th meeting of the IETF highlighted a number of new standards efforts with the potential to impact user privacy, net neutrality, and global governance. These efforts include:

  • White spaces – Regulators in several countries, including the US and the UK, are in the process of converting certain radio spectrum that was once reserved for broadcast television (so-called “white spaces”) into unlicensed spectrum available to the general public for network connectivity. The IETF is creating standards to support these efforts, including a communications protocol that lets a mobile device send its location to a white spaces database to determine whether there is available spectrum in its geographic area. This protocol leverages work already completed in the IETF’s GEOPRIV working group by participants from CDT and elsewhere. The “PRIV” part of “GEOPRIV” refers to usage rules that, not unlike Do Not Track, are sent along with users’ geographic location information and specify users’ preferences about the retention and further distribution of that information. The white spaces protocol could be a promising avenue for conveying those rules to white spaces database operators so they know how users would like their data to be protected (a rough sketch of such a query appears after this list).
  • Broadband measurement – The IETF is moving closer to forming a working group on large-scale measurement of broadband performance. Part of the motivation for this work is to standardize components of the measurement frameworks already rolled out across Europe, the US, and Asia. With widespread use of such standards, every mobile device or home router could run the same suite of tests, allowing consumers, regulators, and network operators to compare performance across ISPs. These tests could also reveal differences in how particular applications perform, providing data to inform net neutrality debates.
  • End of the PSTN and call origin identification – The meeting’s plenary session featured the FCC CTO, Henning Schulzrinne, discussing “The End of Plain Old Telephone Service (POTS)”. One of the key topics of the talk was the lack of reliable, secure identifiers for the source and destination of phone calls (i.e., phone numbers) on the hybrid phone/IP system we now all use to make calls. In the context of the World Conference on International Telecommunications (WCIT), CDT and others raised concerns last year about governments’ desires to require the source of all traffic (voice or otherwise) to be identifiable. The potential future development of technical mechanisms to make call origin identifiers more secure and reliable – mechanisms that would be optional to use and user-controlled – could provide a helpful counterbalance to governments’ claims about the necessity of mandated call origin identification.
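To make the white spaces idea concrete, here is a minimal sketch of what a device’s query to a white spaces database might look like if it carried GEOPRIV-style usage rules alongside its location. This assumes a JSON-over-HTTPS exchange; the endpoint, field names, and device identifier below are hypothetical illustrations rather than anything drawn from the actual protocol drafts.

```python
import json
from datetime import datetime, timedelta, timezone

# Hypothetical illustration only: the field names and structure are NOT taken
# from any IETF specification. The point is that a device's query to a white
# spaces database could carry GEOPRIV-style usage rules along with its location.

usage_rules = {
    # "Do Not Track"-like preferences attached to the location data
    "retransmission-allowed": False,  # please do not pass my location to third parties
    "retention-expiry": (datetime.now(timezone.utc) + timedelta(days=1)).isoformat(),
}

available_spectrum_query = {
    "device-id": "example-device-0001",  # hypothetical identifier
    "location": {"latitude": 38.9047, "longitude": -77.0164},
    "usage-rules": usage_rules,
}

# In a real deployment, a JSON body like this would be sent over HTTPS to a
# white spaces database operator, which would respond with the channels
# available at that location.
print(json.dumps(available_spectrum_query, indent=2))
```

The key design point is that the privacy preferences travel with the location data itself, so any database operator receiving the query also receives the user’s instructions about retention and redistribution.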

With more than 100 working group meetings taking place over the course of the week, this list just scratches the surface of the exciting standards work under way at the IETF. More than highlighting technical developments, these examples also show how deeply intertwined Internet engineering, policy, and governance have become, and how important standards decisions are for keeping the Internet open, innovative, and free.