Transcript
If you’ve done your share of flying, you are probably familiar with the three-letter airport codes which identify every commercial airport in the world.
Airports like DFW, LGA, and HOU are easy to figure out. However, why is there an X in LAX? How did Washington Dulles wind up with IAD? And what is the deal with almost every airport code in Canada?
Learn more about airport codes and the weird logic behind them on this episode of Everything Everywhere Daily.
———————–
This episode is sponsored by Audible.com
My audiobook recommendation today is Skygods: The Fall of Pan Am by Robert Gandt.
Skygods is the saga of America’s most glamorous airline – from its meteoric ascent to its plunge to extinction. Pan Am blazed the way across the world’s oceans with its magnificent Clipper ships, launched the first international jet service, was the first to fly the behemoth 747, was the lead customer for America’s SST and the Concorde, and was even taking reservations for the first commercial flights to the moon.
You can get a free one-month trial to Audible and 2 free audiobooks by going to audibletrial.com/EverythingEverywhere or clicking on the link in the show notes.
———————–
The history of airport codes dates back to before airplanes were even invented.
The United States National Weather Service was created after the Civil War. They would take weather observations at military bases around the country and would transmit the weather reports via telegraph to other bases.
To facilitate sending the reports over the telegraph, codes were developed for American cities. These two-letter codes were shorthand for the telegraph operators.
When commercial aviation developed in the United States in the 1930s, pilots began to use the National Weather Service codes to identify the cities they flew to.
However, there were two big problems. First, not all US cities had a weather service code, and second, there weren’t enough two-letter combinations to cover all the cities, especially if the system were to be used outside of the United States. There are only 676 possible 2-letter combinations.
To solve these limitations, a three-letter system was developed. Assuming that every combination of letters was used, there are 17,576 possible 3-letter combinations.
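The arithmetic behind those two figures is straightforward exponentiation, and can be checked with a few lines of Python (the function name here is just illustrative, not anything from IATA):

```python
# Count the possible codes of a given length, assuming every one of the
# 26 letters may appear in every position (repeats allowed).
def possible_codes(length: int, alphabet_size: int = 26) -> int:
    return alphabet_size ** length

print(possible_codes(2))  # 676 two-letter combinations
print(possible_codes(3))  # 17,576 three-letter combinations
```

Note that 17,576 is an upper bound: in practice many combinations are reserved or unassigned, so the usable pool is smaller.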
This three-letter system was eventually codified by the International Air Transport Association or IATA. Today they are in charge of assigning the three-letter codes to airports and municipalities around the world.
The IATA codes are only given to airports with regular commercial flights. If you count every minor landing strip and airport, including those that have closed, there are 46,465 worldwide.
There is another set of codes that are also used for airports that have four letters. The four-letter codes are assigned by the International Civil Aviation Organization or ICAO.
The ICAO is a UN-based intergovernmental organization, whereas the IATA is a trade group. The four-letter ICAO airport codes are used for official purposes, and most airline passengers have no clue what they are.
For the purposes of the rest of this episode, I’ll be focusing on IATA codes, which are the ones we all use when booking flights, and are prominently displayed on our luggage tags.
So why are airport codes the way they are?
They fall into several categories.
The first category is the codes that make perfect sense. These are usually taken right from the first three letters of the name of the city.
For example, HOU is the code for Houston, DEN is the code for Denver, ATL is the code for Atlanta, and AMS is the code for Amsterdam. Fukuoka, Japan falls into this category and I’ll let you figure out what that is.
There is another small category of codes that are holdovers from the days when pilots used the National Weather Service codes. These airport codes were created by simply appending the letter X to the original two-letter city code.
LAX for Los Angeles, PDX for Portland, and PHX for Phoenix are all based on their original weather service codes.
Sioux City, Iowa has the unfortunate code of SUX for this reason.
Some codes are based on the area the airport is in, or from multiple cities.
MSP is for Minneapolis and Saint Paul. FLL is for Fort Lauderdale-Hollywood. DFW is Dallas-Fort Worth, and DTW is Detroit-Wayne County.
Other airport codes are harder to decipher, but they do make sense if you know the reasons behind them.
CVG is the code for Cincinnati, which at first doesn’t make sense. However, the airport isn’t located in Cincinnati, but in Kentucky, and the closest town is Covington, which is what the code is named after. The code CIN is actually used for Carroll, Iowa.
Many codes are named after the name of the airport, not the name of the city.
JFK is for John F. Kennedy International and LGA is for LaGuardia in New York. CDG is for Charles de Gaulle in Paris.
Dulles International Airport in Washington has the code IAD, which at first glance makes no sense. There is, however, a good reason for it. The code used to be DIA, simply the initials of the airport. However, DIA was easily confused with DCA, the code for Washington's other airport, so in 1968 the letters were rearranged. DCA stands for District of Columbia-Arlington.
Other airport codes are a holdover from what the airports used to be named.
O’Hare Airport in Chicago is ORD, which has nothing to do with either the city or the airport. ORD comes from the old name for the airport, Orchard Field.
Likewise, Orlando’s airport code is MCO. MCO was the code for McCoy Air Force Base, which is what the airport used to be.
The Kahului Airport in Maui has the code OGG, which at first makes no sense. However, it’s named after Bertram J. “Jimmy” Hogg, an aviation pioneer in Hawaii. OGG comes from the last three letters of “Hogg.”
New Orleans’ code is MSY, which stands for Moisant Stock Yards. The name comes from daredevil aviator John Moisant, who died in 1910 when his plane crashed on the farm where the airport now stands.
There are some very obvious codes that are not assigned to airports. They are used for entire cities when doing searches.
CHI is for all of Chicago. NYC is for all of New York, and LON is for London. If you search those codes it will find all airports in that city. Likewise, other codes exist for cities with multiple airports including Moscow, Rio, Rome, Seoul, Jakarta, and Buenos Aires.
Because every airport code is limited to three letters, you get all sorts of combinations. Russia’s Bolshoye Savino Airport code is PEE, and Brazil’s Poços de Caldas Airport’s code is POO.
The Omega Airport in Namibia is OMG and Lovelock City, Nevada is LOL.
No matter how weird the code sounds, there is usually a method behind the madness, and there is some reason, even if it is historical, why it has the airport code it does.
Now there is one final category of airport code that requires extra explaining because they make absolutely no sense at first glance, but here too there is a reason behind the naming conventions. That would be Canadian airports.
Toronto’s Lester B. Pearson International Airport has a code of YYZ. There are no Y’s or Z’s to be found anywhere in the name of the city or the airport. Nor is there any historical name for the airport which uses those letters.
In fact, most major Canadian airport codes start with the letter Y.
Montreal is YUL, Vancouver is YVR, Calgary is YYC, and Ottawa is YOW.
So, what is the deal with the letter Y and Canadian airports?
Just as with the United States, the development of airport codes in Canada has to do with the weather.
Pilots needed to know whether an airport had a weather station. In Canada, a one-letter prefix was added in front of the two-letter station code to indicate this: Y meant “yes,” there was a weather station, and W meant there was not.
Fast forward to 1947, when all of the airport codes were codified. By then, every major Canadian airport had a weather station, so their codes all began with Y. When it came time to select permanent codes, the Canadian airports simply stuck with what they had been using.
OK, so that explains why everything begins with a Y. But what about the other two letters? Why is Toronto YYZ? What’s with the YZ part?
That has to do with telegraph stations which were used by the Canadian National Railway. Each telegraph station had a two-letter code which really had little to do with the name of the city it was located in. The Malton, Ontario telegraph station where the airport is now located was YZ.
Put them all together and you get Y, for yes there is a weather station, at location YZ.
This is basically how all of the first airports in Canada were named, and the convention of using the letter Y just stuck.
I’ll close by noting that there are three letters that do not start any airport code in the United States: K, N, and W.
The letters K and W are avoided so there is no confusion with North American radio stations, whose call signs all begin with one of those two letters.
The letter N is used for US Naval Bases.
So, the takeaway from all of this is that there is always some sort of reason why an airport has the code it does. Sometimes it’s obvious, sometimes it might be shrouded in history, or sometimes it might be Canadian.
Either way, there is always some sort of hidden logic behind the creation of airport codes.
The original content (article & images) is owned by Gary Arndt. Visit the site here for other interesting stories.