Original post by Jeff Winter via Daily Barrage
In 1997, the Institute of Electrical and Electronics Engineers (IEEE) developed a set of standards for implementing wireless communication between computers and pre-existing networks, including home and office networks and the Internet. It was an awesome idea, and in the decade and a half since its release, WiFi has matured and grown immensely in capability, changing the face of computing and of communication in the world at large. However, WiFi has also become a large, confusing jumble of standards, tethered to the past by necessary backwards compatibility. What is WiFi, and how does it work? Is 802.11ac WiFi worth upgrading to? Why does WiFi use multiple frequencies? If 802.11a WiFi on the 5 GHz band was so bad when it first came out, why are new routers pushing 5 GHz so hard? I will try to answer these questions in this article, in as plain English as I can manage without oversimplification. I will focus on home WiFi so I can avoid some unnecessary discussion (like basic WiFi before 802.11a), but this article is applicable to WiFi in all usage cases.
WiFi for the home began in earnest in 1999 with the release of routers,[i] or wireless access points,[ii] that used technology based on the first two commercial wireless standards: 802.11a and 802.11b. Wired computer networking was already standardized under the designation IEEE 802, so WiFi, as a subset of computer networking, became IEEE 802.11. Deciding to start at the beginning of the alphabet when naming the first WiFi protocol, the IEEE called it 802.11a. It was supposed to be simple! Unfortunately, physics got in the way of simplicity.