Briefly, a version control system is a repository that stores source code and keeps track of changes over time. It allows multiple developers to share source code, change it, merge their work, and so on. In particular, anyone can walk back through the history to retrieve a particular version of a piece of code.

One of the first technical things to do when setting up a new software project (i.e., a new business) is to choose a version control system. It is an essential tool, and you need to use one; do not run without it, or project costs and risks will increase dramatically. It is almost crazy to start a new project without the support of such a system... Currently, the best-known and most widely used systems for storing source code are SVN and Git. Of course other systems exist. As I'll explain shortly, the two are very different in architecture: the first is centralised, the second is distributed. But I'll come back to this soon.

Given that the aim is to reduce sources of risk as much as possible, another decision to make is whether to run a private instance or rent one in the cloud. It is not only a matter of the cost of installing a private virtual machine versus buying an existing, ready-to-use service. A key point, in my opinion, is: who will manage the repository's security? It is the biggest invisible cost, and a difficult one to estimate. It covers managing firewalls, guaranteeing uptime, and so on. All of this is technically doable, but it needs someone in charge, able to react in an emergency, so the costs are not easy to estimate in advance. How much do you estimate it costs to have an entire team of developers waiting because your source code repository is facing a DDoS attack? Thus, my personal opinion is to use an external cloud service, and there are many of them on the Internet.

But the story does not end there. The cloud service itself can experience outages and downtime (just as the Internet connection can suffer technical malfunctions), a risk that needs to be considered. Am I exaggerating? No, I'm not. Two recent cases are meaningful. A source code hosting service closed in 2014, and the title of one article on the case is eloquent: how to lose your business in one week. GitHub, the popular repository for open source code, recently faced a DDoS attack. The question at this point is: how can this risk be mitigated as well?

As previously said, Git has a distributed architecture, and one of its great features is that it downloads the entire repository locally (i.e., all branches, all commits, all files, all revisions, everything!). Technically speaking, there is a remote repository in the cloud (i.e., the one you are paying for) somewhere in the world. Cloning fetches the full repository onto the developer's workstation; as you may notice, that local copy is a backup! Every source code change is committed to the local copy, and only when everything is satisfactory does a push send the updated code to the remote repository. So, what is the advantage of using Git to reduce the risks described above?
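As a minimal sketch of that everyday workflow (the repository URL, file name, and branch here are just placeholders, not any specific service):

    git clone https://git.example.com/acme/project.git   # the full history lands on the workstation
    cd project
    # ... edit some files ...
    git add src/parser.c
    git commit -m "Fix the parser"    # recorded only in the local copy
    git push origin master            # only now does the change reach the remote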

In case of a remote outage, every developer simply continues to work on their own local copy. Furthermore, any machine on the Intranet can act as a repository to merge the source code into, so just elect one of the local repositories as the Intranet copy and continue to work, pulling and pushing from it. Finally, when the remote service comes back up, it is easy to push everything back to the remote and continue working as usual. Sounds very good!
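A rough sketch of that fallback, assuming SSH access to a machine I'll call intranet-box (the host name and paths are hypothetical):

    # on the machine elected as the Intranet copy, expose the repository as a bare clone
    git clone --bare ~/work/project /srv/git/project.git

    # every developer then adds it as a temporary remote
    git remote add intranet ssh://intranet-box/srv/git/project.git
    git pull intranet master
    git push intranet master

    # when the cloud service is back, push the accumulated work upstream again
    git push origin master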

But wait, is there any drawback? In my experience the Git learning curve is steeper than with other version control systems (e.g., SVN), especially when it comes to branches and merging changes. In my opinion, this initial effort pays off in the long run.
