"Starship captains are like children. They want everything right now and they want it their way. The secret is to give them what they need, not what they want."- Scotty, offering La Forge advice on handling Starfleet captains</p>"I cannot change the laws of physics! I've got to have thirty minutes."
Another tip of the hat to one of my favorite hacker archetypes.
Almost without exception, writing trustworthy software takes two or three times longer than junior engineers and managers estimate. Seasoned engineers know better. Plug and play is a saying best reserved for Legos.
Tuesday I began coding an image simulation program in C++ which I suspect will weigh in at under 1k lines of new source. I was given 24 hours to complete the task. It's a straightforward optical and digital focal plane sim that takes in a description of sensor parameters and objects, and produces image data (without complex backgrounds).
In addition to the functional requirements, the software should have minimal dependencies for easy porting (it will run under Linux although our libs have been crafted on Windows systems), and it needs to run faster than real time (single process). Those requirements translated into the necessity for a small bundle of new code to do the job.
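To make that concrete, here's roughly the shape of the inputs I have in mind. The names are illustrative only; the real headers are a bit richer:

```cpp
// Illustrative sketch of the input layout, not the actual headers.
#include <vector>

// Parameters shared by every field of view.
struct CommonParams {
    double frameRate_Hz;      // sensor frame rate
    double integrationTime_s; // per-frame integration time
    int    rows, cols;        // focal plane dimensions
};

// Parameters specific to one field of view.
struct FovParams {
    double fovX_rad, fovY_rad; // angular extent
    double boresightAz_rad;    // pointing, azimuth
    double boresightEl_rad;    // pointing, elevation
};

// A point-like object to be imaged (no complex backgrounds).
struct ObjectState {
    double position[3]; // world-frame position [m]
    double velocity[3]; // world-frame velocity [m/s]
    double radiance;    // in-band source radiance
};

// The whole run description: common block plus per-FOV blocks.
struct SimInput {
    CommonParams common;
    std::vector<FovParams> fovs;
    std::vector<ObjectState> objects;
};
```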
I got a bit cocky while slapping together the headers and structures. Instead of writing and testing one element at a time, I wrote all the processing steps in one fell swoop. It was great seeing everything compile so quickly without the mental overhead of a heavyweight code base, in contrast to my current main project, which is coming to a close. Everything went smoothly until I began carefully thinking about all the steps that require testing:
- I/O for common params
- I/O for individual fields of view
- transformations of input data into common units
- initialization of all object state
- ensuring base classes held in STL container structures have clone methods so copying their child pointers is handled correctly (see the sketch after this list). I came across auto_ptrs for future tasks, but tracking down that careless error took 20 minutes longer than it had to
- identifying a test case that exercises all components
- constructing test data including reasonable sensor attitudes, object state, and varied conditions
- generating intermediate output data at each step. The Visual Studio 2010 debugger is inadequate for reviewing object state compared to Visual Studio 2008
- visualization of output data. My old standard cube viewer and its predecessors work with a specific file format. Matlab provided a rapid plotting tool
- and the list goes on...
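That clone-method item deserves a closer look. Here's a minimal sketch of the idiom in the C++03 style Visual Studio 2010 limits me to; the class names are made up:

```cpp
// Minimal sketch of the clone idiom for polymorphic copies.
// Illustrative names; C++03-era style to match the auto_ptr vintage.
#include <cstddef>
#include <vector>

struct Element {
    virtual ~Element() {}
    virtual Element* clone() const = 0; // derived classes copy themselves
};

struct PointSource : Element {
    double radiance;
    virtual Element* clone() const { return new PointSource(*this); }
};

// A container that owns Element pointers must deep-copy via clone();
// otherwise a copied container shares its children with the original.
struct Scene {
    std::vector<Element*> elems;

    Scene() {}
    Scene(const Scene& other) {
        for (std::size_t i = 0; i < other.elems.size(); ++i)
            elems.push_back(other.elems[i]->clone());
    }
    // Copy-and-swap assignment: the by-value parameter does the deep copy.
    Scene& operator=(Scene other) { elems.swap(other.elems); return *this; }
    ~Scene() {
        for (std::size_t i = 0; i < elems.size(); ++i) delete elems[i];
    }
};
```

Without the deep copy, two Scenes end up owning the same pointers and the second destructor double-deletes them; that's the flavor of careless error that ate those 20 minutes.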
One bunny trail after another
Just shaking down the file I/O took an hour before it passed muster. Along the way I uncovered a number of required inputs that were left out of the first pass, and moved a few parameters from specific elements into the common structures. After that review everything fell into place on the input/control side.
Constructing a fairly robust test case demanded additional geometric transformations. After adding those components I noticed the conversions from radiance to counts looked off. Tomorrow morning I'll do a thorough review of those optical transfer equations, and make sure the results match what I expect.
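For reference, the chain I'll be double-checking looks roughly like the following. The numbers and names are placeholders, not my actual sensor parameters:

```cpp
// Back-of-the-envelope radiance-to-counts chain for an extended source,
// assuming a simple in-band model. Placeholder values throughout.
#include <cstdio>

int main() {
    const double pi = 3.14159265358979;
    const double h  = 6.626e-34; // Planck constant [J s]
    const double c  = 2.998e8;   // speed of light [m/s]

    double L      = 10.0;    // in-band radiance [W / m^2 / sr]
    double fnum   = 2.0;     // optics f-number
    double tau    = 0.8;     // optics transmission
    double pitch  = 15e-6;   // pixel pitch [m]
    double tint   = 1e-3;    // integration time [s]
    double qe     = 0.7;     // quantum efficiency [e-/photon]
    double lambda = 0.55e-6; // band-center wavelength [m]

    // Focal plane irradiance from an extended source:
    // E = pi * tau * L / (4 N^2)  [W/m^2]
    double E = pi * tau * L / (4.0 * fnum * fnum);

    // Energy collected by one pixel over the integration time [J]
    double energy = E * pitch * pitch * tint;

    // Photons, then photoelectrons (the "counts" before A/D gain)
    double photons   = energy / (h * c / lambda);
    double electrons = qe * photons;

    std::printf("%.3g e- per pixel per frame\n", electrons);
    return 0;
}
```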
Modular testing is a wide-angle lens for the mind's eye
What I reminded myself over the past couple of days is that there's no shortcut to writing good software. When you're rushed to deliver functional programs on a tight budget, it's worth putting the brakes on to review each element individually first.
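In that spirit, even a framework-free check like the following (the function under test is a stand-in) beats debugging the whole chain at once:

```cpp
// The kind of small, element-at-a-time check I mean: no framework,
// just assert on one transformation before wiring it into the chain.
#include <cassert>
#include <cmath>

// Hypothetical unit conversion under test.
double degToRad(double deg) { return deg * 3.14159265358979 / 180.0; }

int main() {
    assert(std::fabs(degToRad(180.0) - 3.14159265358979) < 1e-12);
    assert(std::fabs(degToRad(90.0)  - 1.57079632679490) < 1e-12);
    return 0; // each element earns its way in before the next is written
}
```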
"I'm giving her all she's got captain!"
On rare occasions it's ok to pull out all the stops and crank out reckless code like a runaway train.
Just be sure to clean up after yourself when things settle down. If the situation never settles into a sustainable rhythm, odds are you're in a soon-to-be-extinct business.