I found this podcast very interesting and worth listening to more than once. A group of Microsoft engineers responsible for both SQL Azure and the boxed version of SQL Server discuss the new approaches they are using to provide a cloud-based database service. The changes are also having an effect on the conventional database product.
The interesting section is about six minutes in, when the engineers discuss the question of “testing versus tasting”. They feel that if you can push changes (and corrections) into production quickly, by working on one main line of code and automating the distribution, you gain benefits that make this a better approach than the conventional “branch, merge, test exhaustively and then release” process. They have set up operational targets which they are now hitting on a regular basis, and they are serving some massive customers with this approach.
There are questions in my mind about how this “cloud cadence” of quick releases would marry up with a conventional development and testing regime at a cloud customer. Everywhere I’ve worked, a new database release has always needed testing. That was largely because it was going to run on local hardware, which isn’t the case when the hardware is managed by the service provider. But what about the feature set? We would test our application against the new version to check that the results had not changed.
In development testing, the assumption was always that we would keep the database version constant. In a cloud scenario, the feature set and the internal code will be moving targets (even if everything is generally moving in the right direction). Perhaps that isn’t a problem? If not, maybe the business should adopt the same methods? How would we know when the risks would outweigh the benefits?
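One way to cope with a moving-target database is to pin the expected results of key application queries and record the server version on every run, so that a change in results can be correlated with a change in the service. Here is a minimal sketch in Python, using an in-memory SQLite database as a stand-in for the cloud service; the table, queries, and check names are all hypothetical:

```python
import sqlite3

def check_results(conn, version_query, checks):
    """Run each named check query, compare against its expected rows,
    and return the server version alongside any failures, so a failing
    check can be correlated with a database upgrade."""
    version = conn.execute(version_query).fetchone()[0]
    failures = []
    for name, (query, expected) in checks.items():
        actual = conn.execute(query).fetchall()
        if actual != expected:
            failures.append((name, version, expected, actual))
    return version, failures

# Illustrative setup: SQLite standing in for the managed database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

checks = {
    "order_totals": ("SELECT id, total FROM orders ORDER BY id",
                     [(1, 9.5), (2, 20.0)]),
}
version, failures = check_results(conn, "SELECT sqlite_version()", checks)
```

Run on a schedule rather than only at release time, a suite like this would at least tell you *when* the moving target moved, even if it can’t tell you whether the move was safe.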
I don’t have a point of view yet, but it’s clearly something that needs thinking about, and if the big effort Microsoft are putting into selling cloud services succeeds, it’s an issue some of us will need to deal with.