I listened to and watched the Mars Science Lab mission land the Curiosity rover last Sunday night. It was a mix of high drama and advanced technology. Below is one of the first pictures returned after the landing, showing a fish-eye view of the rover's wheel and, at the upper right corner, the edge of Gale crater.
Did you ever wonder what it takes to get that image to Earth?
What is it like to design software that can capture that image and run on a spacecraft on another planet where there is no “help desk”? How do you design the code to process it with a 20 MHz CPU (far slower than what’s in your iPhone or laptop), with very limited memory of a bit less than 10 MB (an iPhone 4s can have 64 GB, or about 6,400 times more), and then send it anywhere from 36,000,000 to 250,000,000 miles (+/-) to Earth using a 10 watt transmitter (one fluorescent tube in the light above you is 32 watts)?
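For a sense of that scale, here is a quick back-of-the-envelope sketch (my own illustration, not from any mission document) of how long even a light-speed radio signal takes to cross the distances quoted above. Every command sent to the rover, and every image sent back, is subject to this one-way delay:

```python
# One-way light travel time between Mars and Earth at the
# closest and farthest distances mentioned in the text.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second
METRES_PER_MILE = 1_609.344

def one_way_delay_minutes(miles: float) -> float:
    """Minutes for a radio signal to cover `miles` at the speed of light."""
    return miles * METRES_PER_MILE / SPEED_OF_LIGHT_M_S / 60

closest = one_way_delay_minutes(36_000_000)    # Mars near its closest approach
farthest = one_way_delay_minutes(250_000_000)  # Mars near its farthest

print(f"closest: {closest:.1f} min, farthest: {farthest:.1f} min")
# roughly 3 minutes at best, over 20 minutes at worst
```

With round-trip delays of up to three quarters of an hour, joystick-style remote control is impossible, which is one reason the flight software has to operate so autonomously.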
Here is an interesting description of what it took to design the software for an earlier Mars mission, Phoenix, which was sent to the Martian north pole to look for frozen water: http://cdn.oreilly.com/radar/2012/08/Beautiful_Data_Chapter3.pdf
You can read about the tradeoffs that were made to meet the design goals. Sometimes we forget that products and technologies have limitations, and the act of engineering includes figuring out how to accomplish the goal without exceeding those limitations. That’s the art in engineering.
The Phoenix mission did find water, and that discovery helped drive the design of the current Mars Science Lab mission with the Curiosity rover to search for biological precursors of life on Mars.
BTW, my name is on the Phoenix lander, along with many others, inscribed on a silica mini-DVD with a collection of literature written about Mars provided by The Planetary Society.
I wonder who might find and decode the contents of that digital DVD time capsule one day?