
We often take for granted the vast array of tools at our disposal for weather observation and forecasting, and that makes it easy to forget just how incredible much of this technology is once you step back and break down how it all works. My specialty, for example, is Doppler radar – if you told somebody 100 years ago what we’re doing with it these days, they’d look at you cross-eyed and think you a lunatic. Even today, though, we have technology that would have blown the minds of scientists just 10 years ago. One such example, in my opinion, is the GOES series of satellites in their most modern form.

From the very early days of weather observation, the need for “big picture” analyses was clear, and the first efforts to expand our view beyond the horizon began in the early 1950s. Our first successes came from radar, which extended our view from whatever we could see from the tallest hill to hundreds of miles in any direction. Its key limitation, however, is that the earth curves away with distance – beyond a certain range the beam is effectively pointing into space, looking clear over any incoming weather. As the USSR and the US began launching their first satellites, the obvious next step became clear: satellite-based weather observation.

Throughout the first decade of American space history, low-earth-orbit satellites experimented with space-based cameras, forecasting, and more, laying the foundation for what was to come – geostationary imaging. Less than two decades after the very first satellite, GOES was born. Its imagery was coarse and black-and-white, but the fact that it appeared to hang motionless over one spot rather than racing around the earth changed forecasting forever. It manages this by sitting in geostationary orbit – the altitude at which the orbital period matches the earth’s rotation exactly, giving the satellite the appearance of hovering over a fixed point. That altitude is roughly 22,300 miles above the earth’s surface.
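That 22,300-mile figure isn’t something you have to take on faith – it falls straight out of Kepler’s third law. Here is a quick back-of-the-envelope sketch in Python; the constants are ordinary textbook values and have nothing to do with GOES itself.

    import math

    # Textbook constants -- nothing here comes from the satellite itself.
    G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.972e24    # mass of the earth, kg
    R_EARTH = 6.371e6     # mean radius of the earth, m
    T_SIDEREAL = 86164.1  # one sidereal day (earth's true rotation period), s

    # Kepler's third law for a circular orbit: r^3 = G * M * T^2 / (4 * pi^2),
    # where r is measured from the earth's center.
    r = (G * M_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)

    altitude_km = (r - R_EARTH) / 1000
    altitude_miles = altitude_km * 0.621371
    print(f"Geostationary altitude: ~{altitude_km:,.0f} km (~{altitude_miles:,.0f} miles)")
    # Prints roughly 35,800 km, i.e. about 22,200 miles above the surface.
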
In its modern incarnation, GOES has been upgraded into a beast of a platform – it takes full-color and infrared imagery of the earth’s full disk every 15 minutes, and of the continental US every 5 minutes. If the need arises, it can even image small areas every 30 seconds! Needless to say, your iPhone camera isn’t going to cut it when it comes to snapping pictures from this distance. Perhaps the most difficult hurdle in the modern era of satellite-based observation was building a camera capable of doing all of this and more from so far away, so frequently, and at a quality good enough to make anything out in the first place.


To snap high-quality images of earth from so far away, GOES uses a camera suite the size of three grown men called the ABI, or “Advanced Baseline Imager”. It images in 16 different spectral bands, more than half of which are infrared – that is what lets us see the clouds both day and night (if you’ve ever been curious how we get satellite pictures at night, that is how!). With it, we can make out supercells, hurricane eyes, and even wildfires on the ground – again, from roughly 22,300 miles away. For comparison, this is like being able to see someone holding a penny from ten miles away. With our in-house Baron Lynx software, we pump every image this thing takes into our app and onto our website for free, and the end result is forecasts that grow more accurate year by year, month by month, and day by day. New technology promises bigger and better hardware, so who knows what’s next!
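If you’d like to poke at this same imagery yourself, NOAA publishes the ABI data on public Amazon S3 buckets (for example, noaa-goes16) as part of its open data program. Here is a minimal sketch of browsing it with Python and boto3 – the product name and the year/day-of-year/hour folder layout are my best understanding of how the bucket is organized, so double-check them against the bucket listing, and note this is illustrative only, not how Baron Lynx ingests its feed.

    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # The bucket is public, so use anonymous (unsigned) requests -- no AWS account needed.
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

    BUCKET = "noaa-goes16"
    # ABI Level-2 Cloud and Moisture Imagery, CONUS sector (the 5-minute scans).
    # Assumed folder layout: <product>/<year>/<day-of-year>/<hour>/
    PREFIX = "ABI-L2-CMIPC/2023/100/18/"

    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
    for obj in resp.get("Contents", []):
        print(f"{obj['Key']}  ({obj['Size'] / 1e6:.1f} MB)")

    # Each key is a netCDF file covering one band and one scan; grab one with
    # s3.download_file(BUCKET, key, "scene.nc") and open it with xarray or netCDF4.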