Altitude can affect the performance of computing equipment and the datacenters that house it.
Just ask the people in Chile building a massive radio telescope called the Atacama Large Millimeter/submillimeter Array (ALMA). It consists of 66 radio antenna dishes, 50 of them measuring 12 meters in diameter. (The rest are 7 meters.) They're being used to look deep into the universe and collect information about its earliest days and how stars and planets form.
All this is taking place at an altitude of around 5,000 meters above sea level, or three-plus miles, in a cold, barren landscape that calls to mind the images of Mars that NASA receives from its Curiosity rover: red dirt, mountain peaks on the horizon, dust blowing in the wind, not a soul in sight.
By collecting signals from space with a widely scattered array of antennas and using computers to filter and correlate those signals, it's possible to achieve super-high resolution -- as if, in fact, a single huge dish more than a mile across were being used. Together, these antennas create what's called an interferometer with a spatial resolution (a measure of how well a camera or other imaging device can distinguish small details) of 10 milliarcseconds, five times better than that of the Hubble space telescope. It's also 10 times better than the next-best radio-astronomy setup.
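The scaling at work here is the diffraction limit: an interferometer's resolving power goes as the observing wavelength divided by the longest antenna-to-antenna baseline, which is why spreading dishes miles apart acts like one enormous dish. The sketch below works through that arithmetic; the 0.3 mm wavelength and 16 km baseline are illustrative assumptions, not a specific ALMA configuration.

```python
import math

# 1 radian expressed in milliarcseconds: degrees -> arcseconds -> milliarcseconds.
MAS_PER_RADIAN = 180 / math.pi * 3600 * 1000


def angular_resolution_mas(wavelength_m: float, baseline_m: float) -> float:
    """Diffraction-limited resolution, theta ~ lambda / D, in milliarcseconds."""
    return wavelength_m / baseline_m * MAS_PER_RADIAN


# Illustrative values: a 0.3 mm observing wavelength over a 16 km baseline
# yields a resolution of a few milliarcseconds.
print(round(angular_resolution_mas(3e-4, 16_000), 2))
```

The same function shows why a single dish can't compete: plug in a 12-meter "baseline" and the resolution is over a thousand times coarser.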
Key to it all, as you might suppose, is the ability to collect and analyze the incredibly weak signals streaming in from space. To do that, the telescope relies on what is essentially a supercomputer that will, when completed, comprise some 134 million processors and execute as many as 17 quadrillion operations per second, according to Space Daily. The machine packs 2,912 printed circuit boards, 5,200 interface cables, and more than 20 million solder points.
What's altitude got to do with all this? It turns out that with this correlator, as the supercomputer is called, sitting three miles up, the air is so thin that much more airflow is needed to remove a given amount of heat than at normal altitudes.
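How much more airflow? Air-cooling capacity tracks the mass of air moved, not the volume, so the extra volumetric airflow needed is roughly the ratio of sea-level air density to density at the site. A back-of-the-envelope estimate using the International Standard Atmosphere model (the correlator's actual cooling spec isn't given here, so the numbers are illustrative):

```python
def isa_density(altitude_m: float) -> float:
    """Air density (kg/m^3) from the International Standard Atmosphere,
    valid in the troposphere (below ~11 km)."""
    T0 = 288.15       # sea-level temperature, K
    P0 = 101_325.0    # sea-level pressure, Pa
    L = 0.0065        # temperature lapse rate, K/m
    g = 9.80665       # gravitational acceleration, m/s^2
    M = 0.0289644     # molar mass of dry air, kg/mol
    R = 8.31447       # universal gas constant, J/(mol*K)
    T = T0 - L * altitude_m
    P = P0 * (T / T0) ** (g * M / (R * L))
    return P * M / (R * T)  # ideal-gas law


# Density ratio between sea level and the 5,000 m ALMA site:
# fans must move roughly this many times more air by volume.
print(round(isa_density(0) / isa_density(5_000), 2))
```

The result works out to roughly two-thirds more air by volume for the same heat removal, before accounting for the reduced efficiency of fans themselves in thin air.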
What's more, that thin air keeps the read/write heads in hard disk drives from working properly. Normally, those heads fly just microns above the surface of the disk platter, held there by the same aerodynamic forces that keep airplanes in the sky. (To get an idea of the tiny tolerances involved, imagine a 747 flying a few feet above the ground.) In the thinner air, though, there's not enough lift, so the correlator uses 100 percent solid-state drives (SSDs).
Seismic activity regularly shakes the ground in these parts, so the datacenter has had to be built with that in mind. And finally, the correlator's location is so remote that it's not practical for anyone to be stationed there on a regular basis. Just unpacking and installing the gear in such thin air took 20 weeks.
Not too many everyday datacenters are going to get built under these conditions, it's true. The telescope array is high up to avoid interference from the atmosphere. But it turns out that altitude is definitely a factor even in the commercial market. Most backup-power generators, for example, are rated to run at altitudes of no more than about 3,300 feet, which is a potential problem for a state such as Wyoming, which has been touting its cool air, cheap electricity, and favorable tax situation to lure new datacenter construction.
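Generator makers publish altitude-derating curves for exactly this reason: above the rated altitude, the engine can't pull in enough air to deliver full power. A common rule of thumb, used here purely for illustration (real derating factors vary by engine and come from the manufacturer's datasheet), is a few percent of output lost per 1,000 feet above the rated altitude:

```python
def derated_output_kw(rated_kw: float, site_alt_ft: float,
                      rated_alt_ft: float = 3_300,
                      loss_per_1000_ft: float = 0.03) -> float:
    """Approximate generator output after altitude derating.

    Illustrative rule of thumb only: assumes ~3% output loss per
    1,000 ft above the rated altitude, with no derating below it.
    """
    excess_kft = max(0.0, site_alt_ft - rated_alt_ft) / 1_000
    return rated_kw * max(0.0, 1 - loss_per_1000_ft * excess_kft)


# A hypothetical 2 MW backup generator installed at ~6,100 ft
# (roughly the elevation around Cheyenne, Wyoming):
print(round(derated_output_kw(2_000, 6_100)))
```

Under these assumptions the site loses close to a tenth of its backup capacity, which is exactly the kind of gap a sea-level project plan would never surface.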
Indeed, the moral of the story seems to be, as a commenter at Slashdot put it:
Don't take anything for granted, regardless of altitude. The more you scrutinize [a datacenter] project plan ahead of time, the more you can sidestep major issues like disk functionality or cooling needs. Just because projects at sea level don't suffer from such exotic problems doesn't mean that there are no problems at all.
There you have it. Plan ahead, be prepared, when out at night, wear white.