I have been using git for about four months now. While I like most of its features, I find it rather inconvenient that multiple developers can make concurrent changes to the same file and a need for merging then ensues. In fact, I would prefer waiting for a file to get unlocked over tedious and brain-hurting merges.
I was wondering if there is a team etiquette or workflow to circumvent this terrible predicament. In my ideal world of development, there would never be a need to merge. For example, is there a combination of hooks, commands, and workflows for git such that a remote repository is notified when a local repository changes a file, so that other developers get a notification if they try to modify the same file concurrently (e.g. when they run git status or something like that)?
4 Answers
In an ideal world, merge conflicts never happen. But there are certain scenarios where merge conflicts are more common, so my advice to you is to avoid those scenarios. Here they are.
Tightly coupled concerns
Let's say you have a front-end engineer and a back-end engineer. And you run into conflicts. This is usually because your code is not architected such that the two parts can evolve without interrupting each other, like mixing database access with templating. You get around this by introducing just enough abstraction to add distance between the two concerns.
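As a rough sketch (the names and schema here are hypothetical, not from the answer), keeping storage access and presentation behind a small boundary means the front-end and back-end engineers mostly edit different files:

```
# Sketch only: hypothetical names, SQLite-style query placeholders assumed.
from dataclasses import dataclass

@dataclass
class Order:
    id: int
    total: float

class OrderRepository:
    """Back-end concern: how orders are fetched. Only this class knows about storage."""
    def __init__(self, connection):
        self._conn = connection

    def recent_orders(self, limit=10):
        rows = self._conn.execute(
            "SELECT id, total FROM orders ORDER BY id DESC LIMIT ?", (limit,))
        return [Order(id=row[0], total=row[1]) for row in rows]

def render_order_list(orders):
    """Front-end concern: how orders are shown. Sees plain objects, knows nothing about SQL."""
    items = "".join(f"<li>Order {o.id}: ${o.total:.2f}</li>" for o in orders)
    return f"<ul>{items}</ul>"
```

The back-end engineer can rework the SQL and the front-end engineer the markup without either diff touching the other's file.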
API changes and concurrent feature development
Let's say I'm adding a small feature to my product, and it is orthogonal to the innards. It will be a point release. Meanwhile, the lead architect is refactoring the guts of the product to bring it from 1.0 to 2.0. I'm expecting 1.0 to 1.1 for the release of my feature.
Here, the architect will be touching a lot of surface area. And if you don't have another layer of abstraction between your feature and the raw API, you will run into a very messy conflict.
Sometimes in large organizations this cannot be helped.
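When it can be helped, one way to picture that extra layer is a thin facade (a sketch with made-up names, not the answer's actual design): the feature talks only to the facade, so the 1.0-to-2.0 rewrite collides with the feature in at most one small file:

```
# Sketch only: hypothetical names for the product's internals.
class ReportingFacade:
    """The single place where the feature touches the product's guts."""
    def __init__(self, engine):
        self._engine = engine  # the raw 1.x API the architect is refactoring

    def monthly_summary(self, year, month):
        # Adapt whatever the internal API currently looks like to one stable
        # call that the feature relies on across the 1.x -> 2.0 rewrite.
        raw = self._engine.query(kind="orders", year=year, month=month)
        return {"count": len(raw), "total": sum(item["total"] for item in raw)}

# The feature depends on the facade, never on engine.query() directly:
def print_monthly_report(facade, year, month):
    summary = facade.monthly_summary(year, month)
    print(f"{year}-{month:02d}: {summary['count']} orders, ${summary['total']:.2f}")
```

When the internals change, only the facade body is edited; the feature code and the architect's refactor no longer overlap textually.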
Overly aggressive refactoring
This is actually the same as API changes. Even though there are no functional changes, there are lots of textual changes, and from Git's perspective, that is the same thing.
Remedies
- Keep changes physically small: small diffs, a small number of lines changed
- Release to master frequently
- Integrate master frequently (see the hook sketch after this list)
- Communicate about disruptive changes (like API changes/large refactors)
- Separate business concerns across "physical" boundaries (modules/files/repos)
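To make the integration remedy concrete, here is a minimal pre-push hook sketch, assuming Python 3 and an origin/master upstream (the threshold and wording are arbitrary), that warns when a branch has drifted far behind master instead of blocking the push:

```
#!/usr/bin/env python3
"""Illustrative pre-push hook: warn (but never block) when the current branch
has fallen far behind origin/master, as a nudge to integrate master frequently."""
import subprocess
import sys

THRESHOLD = 20  # arbitrary: warn after this many unmerged upstream commits

def commits_behind_master():
    subprocess.run(["git", "fetch", "--quiet", "origin", "master"], check=True)
    result = subprocess.run(
        ["git", "rev-list", "--count", "HEAD..origin/master"],
        check=True, capture_output=True, text=True,
    )
    return int(result.stdout.strip())

if __name__ == "__main__":
    behind = commits_behind_master()
    if behind > THRESHOLD:
        print(f"warning: this branch is {behind} commits behind origin/master; "
              "consider merging or rebasing before it gets worse", file=sys.stderr)
    sys.exit(0)  # warn only; never reject the push
```

Dropped into .git/hooks/pre-push and made executable, it is only a nudge: it does not prevent conflicts, it just makes drift visible before you publish.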
- This is a good answer regardless of the VCS implementation: +1. – user22815, Nov 27, 2013 at 1:57
If you need that kind of functionality, then git is not for you. It is fundamentally designed to allow independent, disconnected working. Choose another VCS that fits your needs.
As far as minimizing merge conflicts, this can be done by communication about what you're working on, as well as good code architecture. If you follow good design principles, you can minimize how widespread any changes are, which minimizes the chance of a conflict, although you can't eliminate conflicts altogether.
Also, I don't know if you've ever worked on a large team that used a locking VCS. There's a reason hardly anyone uses them anymore, except for binary file formats. You say you'd be fine waiting for a file to be unlocked, but most people don't feel that way. What ends up happening is that people make changes locally anyway while they're waiting, and merge them in manually outside of version control once the file gets unlocked. That's a very error-prone system, and it's very difficult to get people not to do it.
In other words, merges happen whether you let your VCS help you with them or not. Best to accept it and use a VCS that makes it easy.
- +1 for identifying he has a nail and a screwdriver and suggesting a hammer is the tool he needs. – mattnz, Nov 26, 2013 at 20:37
- +1 for stating that people will do what they are doing regardless of whether the tool has the feature or not. Software needs to conform to real-world processes, not the other way around. That is core to software engineering, and is the reason that modern VCSs do not use locking. – user22815, Nov 27, 2013 at 1:59
Git really is designed to improve the merge process, not remove it. If you find that merging is overly painful, then I would suggest that you need to refine your process.
I would also take a look at this question ...
Excellent answer from Mark Canlas. This is how we apply the same strategy in our environment:
Commit often, and push often
One of the strategies we use in our office is to commit often and push often. If I commit and push to the server often, the person making large changes will have to perform the merge, not me. This effectively encourages members of the team to keep their changes simple so that merges are not too complicated.
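As a sketch of that habit, here is a small hypothetical helper (assuming the shared branch is origin/master) that collapses the cycle into one command so that small changes really do get pushed right away:

```
#!/usr/bin/env python3
"""Hypothetical helper: stage, commit, rebase on the shared branch, and push
in one step, so small changes are published while they are still small."""
import subprocess
import sys

def git(*args):
    # Run a git command and abort immediately if it fails.
    subprocess.run(["git", *args], check=True)

if __name__ == "__main__":
    if len(sys.argv) < 2:
        sys.exit('usage: quickpush.py "commit message"')
    git("add", "--all")                          # stage everything in the working tree
    git("commit", "-m", sys.argv[1])             # small, frequent commits
    git("pull", "--rebase", "origin", "master")  # integrate the shared branch first
    git("push", "origin", "HEAD")                # publish before the change grows stale
```

Because it rebases before pushing, the person with the small change integrates first, and the colleague sitting on a large change is the one left to resolve whatever conflicts remain.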
Have a CI server verify changes
We also have a CI server that verifies commits; it runs unit tests in addition to building the solution, so whoever breaks the build has to fix it. The worst time to break the build is right before leaving the office, as our policy is to not leave the build broken overnight. The risk of a broken build on Friday afternoon is enough for developers to keep their changes small.
Have the CI server send out emails on broken builds
Our CI server sends out a notification email when a build breaks, or when a broken build has been fixed. This email goes to the owner of the commit as well as the project administrator. Our broken builds were fixed quickly thanks to this.