I’m curious to hear how different teams and individuals approach this. How do you handle frequent upgrades? Do you see it as essential, or do you take a different approach?
What has your experience been?
2. As a general rule, newer means fewer known security vulnerabilities, particularly if the project is careful about separating new features from bug fixes. Not always, and maybe you don't want bleeding-edge releases, but mostly.
3. I've worked in some areas with bureaucratic or governmental impediments, where you want to avoid anything that might trigger re-testing or re-certification. That's a reason not to upgrade often, but it does mean you need to actually read the changelogs and have some sort of process for noticing when something is important enough to upgrade anyway.
Staying up to date is important but not urgent.
What you really don't want is to put it off until one day it becomes urgent (usually for some external reason).
You don't want to upgrade your networking the day after TLS 1.0 is rejected by that server you interact with. That seldom ends well.
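For instance, a minimal Python sketch (mine, not from the thread): modern TLS stacks let you state your minimum protocol version explicitly, so a deadline like "that server drops TLS 1.0 next month" becomes a one-line, testable setting instead of an emergency.

```python
import ssl

# Build a client-side context and refuse anything older than TLS 1.2,
# so a peer dropping TLS 1.0/1.1 can't silently break you later.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Any connection made with this context will now negotiate TLS 1.2+
# or fail fast, which is exactly the failure you want to see in testing.
```

Flipping the setting ahead of time in CI surfaces legacy peers while it is still a routine ticket rather than an outage.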
When things are important to do, you should schedule them as part of the routine. So you might have a twice-yearly event of "get everything up to date". Doing it regularly keeps it manageable, plus you get better at it. The longer the gap, the more work it is, and the more work it will cause.
It is 99% easier to do when your system is working than when it's broken. The phrase "don't fix it if it's not broken" is the dumbest thing any programmer can tell you. Preventing it from breaking is a gazillion times easier than fixing it after it's broken, while everyone around you is screaming.
OTOH, upgrading your server to support TLS 1.2 also meant upgrading to an OpenSSL that shipped Heartbleed.
Whether "up to date" beats "old but seems to work" depends on your dependencies. Some dependencies have breaking changes in every release, so skipping a few releases may just mean skipping ahead on the cycle of churn.
Picking stable and high quality dependencies is nice, but not always an option.
Is it an open source package where every user inherits your dependencies? Then dependency updates are important.
The immediate benefits (or lack thereof) all come down to user impact. The longer term benefits are a balancing act between opportunity cost (what could you be developing instead of updating?) and tech debt (if you don’t update frequently, you’ll eventually need to do a really painful one).
I’d say the most important thing is making sure your project will build successfully in five years even if you do no updates. Make sure all dependencies are cached, versions pinned, lockfiles used, etc. As long as your build process is deterministic, and you control when updates happen, then “when to update” is a manageable problem. You get into trouble when your tools are pulling in minor updates to dependencies just because the author pushed a new version. Don’t do that. Pin your versions.
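To make the pinning point concrete, here is a hypothetical package.json fragment (the package names and versions are illustrative, not from the thread):

```json
{
  "dependencies": {
    "express": "4.18.2",
    "lodash": "4.17.21"
  }
}
```

A range like `"^4.18.2"` would let the build pull in any new 4.x release the moment its author publishes it; the exact version `"4.18.2"` plus a committed lockfile means nothing changes until you decide it should.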
I’m also curious to hear how teams handle dependency updates in software development projects, things like versions listed in package.json, build.xml, or similar files. How do you decide when to update these kinds of dependencies, and how frequently do you do it?
Provided the pipeline is green, any minor, patch, or image-digest update can be merged automatically (with an approval coming from the Renovate approve bot), while major updates need the approval of a developer.
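A Renovate configuration along those lines might look roughly like this (a sketch using Renovate's `packageRules` format; the exact split is a policy choice, not the thread author's actual config):

```json
{
  "packageRules": [
    {
      "matchUpdateTypes": ["minor", "patch", "digest"],
      "automerge": true
    },
    {
      "matchUpdateTypes": ["major"],
      "automerge": false
    }
  ]
}
```

Automerged updates still have to pass the pipeline first, so a green CI suite is what actually carries the policy.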
It's similar logic to why continuous deployment is better than releases a few times a year.
Otherwise, compiler upgrades have improved our runtime and prevented errors (which in our case typically result in the user experiencing a 500), and database upgrades have improved query performance.
So, yes. Update your stuff.
IMO the sooner the better; it lets us plan the maintenance work.