Friday, September 19, 2008

Presubmit testing at Google

Here is an interesting blog post from Marc Kaplan, test engineering manager at Google, on their strategy of running what they call 'presubmit tests' -- tests that run automatically before the code gets checked in. These include performance tests that compare the new code against baselines from the previous week and report back nice graphs showing the delta. Very cool.
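To make the idea concrete, here is a minimal sketch of what such a presubmit performance check could look like. This is not Google's actual tooling; the benchmark runner, the baseline file, and the 5% threshold are all assumptions for illustration.

```python
# Hypothetical presubmit-style perf check: run each benchmark, compare
# against last week's baseline, and fail the check if it regressed too much.
import json
import subprocess
import sys
import time

BASELINE_FILE = "perf_baselines.json"   # assumed to be refreshed weekly
THRESHOLD = 0.05                        # fail if more than 5% slower (assumption)

def run_benchmark(cmd):
    """Time a benchmark command and return elapsed seconds."""
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start

def main():
    with open(BASELINE_FILE) as f:
        baselines = json.load(f)        # {"bench_name": seconds, ...}

    failures = []
    for name, baseline in baselines.items():
        elapsed = run_benchmark(["./benchmarks/" + name])
        delta = (elapsed - baseline) / baseline
        print(f"{name}: {elapsed:.2f}s vs baseline {baseline:.2f}s ({delta:+.1%})")
        if delta > THRESHOLD:
            failures.append(name)

    if failures:
        print("Presubmit perf check failed for:", ", ".join(failures))
        sys.exit(1)

if __name__ == "__main__":
    main()
```

In practice such a script would be wired into the commit workflow (for example as a pre-commit or pre-submit hook), so the delta report runs before the change ever lands.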

1 comment:

Doug Napoleone said...

We do something similar in my group at Nuance. There, all the tests must pass before you can check in, including the performance tests (takes about 30-40 minutes on a slow single-processor machine). Usually any change you make affects at least one test baseline, so it's not as onerous as it sounds.

The accuracy tests are a bit different as they run on the grid and take much longer. Those are run for releases (about 4 a month).

We have gotten some flak in the past about being so strict about our testing and commit policies. Every so often we are forced to give an 'under the table' release to a researcher, and without exception this has come back to haunt them. Not sure why they keep thinking it's a good idea...
