Version control system exclusively on shared drive?


My company manages a lot of web sites, and we are currently working without a VCS. I want to propose a VCS for a few reasons, but I think our requirements may be too unique. We work off of a shared drive, mapped in Windows Explorer, that holds a mirror of each site. When edits are made they get pushed to staging, and then, once approved, to the live site.

I’m not too familiar with VCSs, but I have looked into Git and SVN, and I don’t think we would be able to use them (correct me if I’m wrong). The problem is that we have so many sites, and we are constantly working on different ones, that keeping a local copy of every site to push to the shared drive would be too much.

What I picture, ideally, is something stored on the shared drive that a developer can “branch”: they work on a temporary copy that is then merged back into the file they opened. It is also important that a developer who has never worked on a given site can start editing without much setup. The goal is a system that tracks changes without changing our workflow too much (or no one will use it, or even consider implementing it) and gets rid of the piles of “*_backupTodaysdate.html” files.
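To make that concrete, here is roughly what I imagine it could look like with Git, if I understand it correctly (the paths and names are made up, and I may be wrong about whether a repository can sensibly live on the shared drive at all):

    # one-time setup: the "master" copy of a site lives as a bare repo on the share
    git init --bare S:/repos/example-site.git
    # (the site's current files get imported into it once; after that everyone works like this)

    git clone S:/repos/example-site.git
    cd example-site
    git checkout -b header-fix        # work on a temporary branch
    # ...edit files...
    git commit -am "Fix header"
    git checkout master
    git merge header-fix              # merge back into the main copy
    git push origin master            # publish to the shared-drive repo

If something along those lines is workable without everyone keeping a permanent local copy of every site, that would cover what we need.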


I would quit using a shared drive for shared editing. Go with a proper version control system and personal work spaces. Shared work spaces lead to all sorts of issues, including lost changes.

There is no reason to have all the sites checked out all the time. Check out the sites you need and remove the related work space when you are done with a site. Your workflows will need to change accordingly.
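As a sketch of what that looks like in practice with Git (the repository URL is hypothetical):

    # grab only the site you are about to work on
    git clone https://your-git-server/acme-site.git
    cd acme-site

    # ...edit, commit, push as usual...
    git commit -am "Update contact page"
    git push

    # done with this site? just delete the work space
    cd ..
    rm -rf acme-site        # or simply delete the folder in Explorer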

I’ll avoid repeating a lengthy answer. This question is similar to: What version control system can manage all aspects?


The short answer is: DON’T DO THAT! This won’t just end in tears; it will probably end up on the 10 o’clock news as a workplace mass shooting!

With a modern DVCS (Mercurial or Git) there is no sane reason to use a shared drive or directory for anything. Each dev has their own machine and directory. They pull from a “master repository” (although with a DVCS “master” is more a convention than a technical distinction), they do whatever they need to, and then they either push to the master or ask someone higher up the food chain to pull from their repository.
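In day-to-day terms that boils down to something like this (Git shown here; the server and repository names are placeholders):

    # get your own copy on your own machine
    git clone git@yourserver:sites/example-site.git
    cd example-site

    # stay current with the master repository
    git pull

    # do whatever, committing locally as often as you like
    git commit -am "Rework navigation"

    # then either push to the master...
    git push origin master
    # ...or publish your branch and ask someone upstream to pull from it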

Our standard workflow is N dev machines with their own copies, a staging repository on the production machine (or an exact duplicate of it), and a master/stable repository. We have scripts to handle most of the scut work, including a revert script to roll back to the previous version, just in case all hell breaks loose when the production server goes belly-up on the latest “stable” release!
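A revert script like that is nothing exotic; a stripped-down Git version might look roughly like this (the path and the assumption that every release is tagged are placeholders, not a description of any particular setup):

    #!/bin/sh
    # Rough sketch: roll the production checkout back to the previous release.
    # Assumes each deploy is tagged; path and names are illustrative only.
    set -e
    cd /var/www/example-site
    previous=$(git describe --tags --abbrev=0 HEAD^)   # most recent tag before the current one
    echo "Rolling back to $previous"
    git checkout --force "$previous"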


“The problem is that we have so many sites, and we are constantly working on different ones, that keeping a local copy of every site to push to the shared drive would be too much.”

This is not an unusual situation at all. I regularly use Git clones of 20 microservices I have push access to, plus another 10 or so from open-source projects or other teams in the company that I depend on. I still have around 200 GB of free disk space, but even if I didn’t, cloning typically takes under 10 seconds. I think nothing of cloning a repo just for the duration of a code review, for example.

My suggestion is to just try Git out on your local machine for a few days, so you get a better idea of what the workflow and overhead would be. SVN is a little harder to set up for local use, but it might be a better fit if you share a lot of files between projects in a hierarchy.
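Trying it out is genuinely a five-minute job; something like this on a copy of one site is enough to get a feel for it (the folder and branch names are placeholders):

    cd my-site-copy
    git init
    git add .
    git commit -m "Import current state of the site"

    # work on a branch instead of making *_backupTodaysdate.html copies
    git checkout -b new-banner
    # ...edit...
    git commit -am "Try a new banner"

    # see the history and what changed
    git log --oneline
    git diff master new-banner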
