A few months ago, a client made a few changes to a nifty little WinForms application I’d written for them and then wanted me to pick it up again from there.
I know that sounds like a nightmare, but the person who did the changes is really sharp, and he definitely improved on my work. So it wasn’t like I had to go and fix all the problems he introduced. There weren’t any.
No, I’m writing this because of a different complication.
See, I’d used Microsoft Visual Studio 2005 to write the original program. The client was standardized on Visual Studio 2008, so they were easily able to load and build the C# code. But then I couldn’t load their updated version into my one-version-behind installation.
It was no big deal, actually, because I needed to buy and install that version upgrade sooner or later anyway.
Yes. I’m lazy. A good kind of lazy.
I realized, though, that this is my upgrade strategy: “buy and install, sooner or later.” It’s like “lazy evaluation” in some programming languages: putting off expensive, time-consuming work until you know you’re actually going to need the result. Sometimes it never needs to be done at all.
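To make the analogy concrete, here’s a minimal sketch of lazy evaluation, in Python rather than the C# the app was written in (the `Lazy` class and names here are mine, purely for illustration): the expensive work runs only on first access, and if nothing ever asks for the value, the cost is never paid.

```python
class Lazy:
    """Defer an expensive computation until someone actually needs it."""

    def __init__(self, compute):
        self._compute = compute  # the deferred work, as a zero-arg callable
        self._done = False
        self._value = None

    def value(self):
        # Pay the cost only once, and only on first request.
        if not self._done:
            self._value = self._compute()
            self._done = True
        return self._value


# The "upgrade" happens only when something forces it -- like needing
# to open a project saved in the newer format.
upgrade = Lazy(lambda: "Visual Studio 2008 installed")

# Until upgrade.value() is called, nothing has been bought or installed.
```

(.NET itself later shipped a `System.Lazy<T>` type that works much the same way.)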
In Agile terms, we put it this way: you ain’t gonna need it, or “YAGNI” for short. There’s never a benefit to having a feature implemented before someone needs to see it or use it in some way, because the seeing and the using are the only reasons you’d bother in the first place.
As an independent consultant who generally pays for his own tools, I find this a great strategy. There are products I never ended up buying, operating systems I never installed, tools I never got around to trying. In a way, I count each of those as a win.
Is that so wrong? It feels like I’m proud of my ignorance. True, I tend to miss some swirls and eddies in the tides of technology, such as never having had Vista on any computer I’ve owned. On the other hand, was missing out on Vista so bad?
A necessary evil?
Maybe not. Upgrades are not always evil, but they’re not always necessary either, so any firm position here is necessarily somewhat speculative.
When I ran this question past my LinkedIn network (thanks everyone!) I got some really interesting and surprising responses:
- “We often wait until versions go definitively out of support before making changes.”
- “The reason to upgrade regularly is that employers are asking you to have experience with the latest tools and software.”
- “Each upgrade to our toolchain poses a risk, so we only upgrade to fix major showstopper bugs…. We are still on VC6 for the primary project I work on.”
- “When I do finally upgrade my dev environment… I always take a complete snapshot of the VM.”
In summary, what I heard from my network came down to:
- In the production toolchain, upgrade only when you simply have to.
- Alternatively, upgrade the production toolchain when there is a specific quality benefit to doing so.
- In big shops, an upgrade is a project unto itself. Treat it like one.
- In sandbox and R&D environments, use virtual machines to try out new versions of everything, check compatibility, and learn.
- Experimenting with the new releases, at least, is probably good for your career.
Excellent advice! How do you manage dev software upgrades? Hit the comments and let me know.