I'm sorry if this has been covered, but I can honestly say I have spent at least three solid days trying to come up with a version control solution, and my head is about to explode.
I have also skimmed through the Subversion book, but I am still very confused.
Basically I have a SaaS application that has been growing steadily. Currently it's really only one developer (me) working on the app, but if interest in it continues I might have to start hiring.
The application is written in PHP, uses a MySQL database and is hosted on a bog standard LAMP stack.
Currently I have Git installed on my development machine; however, my lack of understanding has meant that my commits are irregular and often irrelevant, and I am having problems with it not tracking changes to directories.
My main concern is deploying to our production server. Our clients each have their own application folders and their own databases.
Currently, when we run an update, we have to write a custom update script that:

1. Duplicates the client's installation into a backup folder
2. Removes the live installation folder
3. Copies the new, updated installation folder into place
4. Copies the user's config files from the backup to the live install
5. Tells the operator to make the changes to the user's database using a third-party app
6. Cleans up
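For reference, the manual steps above could be sketched as a shell script along these lines. All paths and file names are hypothetical, and the demo runs in a throwaway temp directory with stand-in files so it can be executed anywhere:

```shell
#!/bin/sh
# Sketch of the per-client update flow (steps 1-6 above).
# Paths and file names are hypothetical illustrations only.
set -e

SANDBOX=$(mktemp -d)
LIVE="$SANDBOX/clients/acme/app"        # hypothetical live install
BACKUP="$SANDBOX/clients/acme/app.bak"  # backup location (step 1)
NEW="$SANDBOX/release/app"              # the new version to roll out

# --- demo fixtures standing in for a real install ---
mkdir -p "$LIVE" "$NEW"
echo "old code"     > "$LIVE/index.php"
echo "db=acme_live" > "$LIVE/config.php"   # per-client config
echo "new code"     > "$NEW/index.php"

# 1. Duplicate the client's installation into a backup folder
cp -a "$LIVE" "$BACKUP"
# 2. Remove the live installation folder
rm -rf "$LIVE"
# 3. Copy the new, updated installation folder into place
cp -a "$NEW" "$LIVE"
# 4. Restore the client's config files from the backup
cp -a "$BACKUP/config.php" "$LIVE/config.php"
# 5. Database changes are still applied manually by the operator
echo "REMINDER: apply schema changes for this client"
# 6. Clean up
rm -rf "$BACKUP"
```

Doing this by hand (or with a throwaway script) per client is exactly what stops scaling past a handful of installs.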
It was boring with 5 users, but now we are approaching 50 and it's an absolute nightmare.
To make things complicated (and a little more secure), each install folder contains unique database settings, which means database schemas can only be updated from within that application.
I have been looking into setting up a Gitorious server, but thought I would seek some advice on how to proceed before I dig myself any deeper.
Thanks
Maybe you could keep your application files in a separate directory outside the users' home directories and create a symbolic link in each user directory that points to your application? Example:
ln -s /var/myapp/publicfiles /home/someuser/lib
That way you would only need to update the code once and then update the schema for each user. The publicfiles directory could be updated from a Git repository, so no manual file copying would be needed.
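To make the layout concrete, here is a small sketch of the shared-directory-plus-symlinks idea. The paths and user names are hypothetical, and the demo builds everything inside a temp sandbox so it is runnable as-is:

```shell
#!/bin/sh
# One shared copy of the application, symlinked into each client's directory.
# All paths/names are hypothetical; runs in a temp sandbox.
set -e
SANDBOX=$(mktemp -d)

# The single shared copy of the application code
mkdir -p "$SANDBOX/var/myapp/publicfiles"
echo "v2" > "$SANDBOX/var/myapp/publicfiles/version.txt"

# One symlink per client, all pointing at the shared copy
for user in alice bob; do
  mkdir -p "$SANDBOX/home/$user"
  ln -s "$SANDBOX/var/myapp/publicfiles" "$SANDBOX/home/$user/lib"
done

# Updating the shared copy (e.g. `git pull` inside publicfiles) is
# immediately visible through every client's symlink:
cat "$SANDBOX/home/bob/lib/version.txt"  # -> v2
```

The per-client config and database settings would stay in each client's own directory; only the shared code lives behind the symlink.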
I think your problem here is the number of deployments you have. That's what's causing the biggest nightmare. However, it's still manageable in this form. Basically, what you want to do is make each client's install a checkout of a central Git repository and pull the new code into it on each deployment.
This should merge in changes without messing up anything within the application. The only addition then is writing a script that can do your database upgrades/migrations after the new code is deployed.
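A pull-per-client deployment could look roughly like the following. Everything here is illustrative (repository layout, client names, branch handling), and the demo constructs a bare central repository plus two "client" clones in a temp sandbox so the whole flow actually runs:

```shell
#!/bin/sh
# Illustrative pull-based deployment: each client install is a clone of a
# central bare repository; deploying a release is one pull per client.
# Names and layout are hypothetical; runs in a temp sandbox.
set -e
SANDBOX=$(mktemp -d)

# Central bare repository that releases get pushed to
git init -q --bare "$SANDBOX/central.git"

# Seed it with an initial release
git clone -q "$SANDBOX/central.git" "$SANDBOX/work" 2>/dev/null
echo "v1" > "$SANDBOX/work/app.php"
git -C "$SANDBOX/work" add app.php
git -C "$SANDBOX/work" -c user.email=dev@example.com -c user.name=dev commit -qm "release v1"
git -C "$SANDBOX/work" push -q -u origin HEAD

# Two client installs, each its own clone
for c in client1 client2; do
  git clone -q "$SANDBOX/central.git" "$SANDBOX/$c"
done

# Publish a new release...
echo "v2" > "$SANDBOX/work/app.php"
git -C "$SANDBOX/work" -c user.email=dev@example.com -c user.name=dev commit -qam "release v2"
git -C "$SANDBOX/work" push -q origin HEAD

# ...and deploy it to every client in one loop
for c in client1 client2; do
  git -C "$SANDBOX/$c" pull -q --ff-only
  # hypothetical follow-up: run this client's DB migration script here
done
```

In a real setup the loop body would also invoke the migration script against that client's database, using the database settings stored in that client's install.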
If each client has a different configuration and potentially slightly different source files, you might want to keep a separate Git repository for each client and use rebase to bring in the changes from the master repository.
I.e., have a directory on the server holding the master repository (a bare Git repository). You push changes into this master repository, then perform a rebase operation for each client.
In each client you can check in additional files or change existing files. The way rebase works is by undoing the local changes, pulling in all changes from the master, and then re-applying the local changes on top. So essentially, the local changes/configuration override the master (common) configuration.
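The per-client rebase flow described above could be sketched as follows. Repository names, paths, and file contents are all hypothetical; the demo builds a bare master repository and one client repository (with a local config commit) in a temp sandbox, then rebases the client onto a new release:

```shell
#!/bin/sh
# Sketch of per-client repositories rebasing local changes onto a shared
# master. All names/paths are hypothetical; runs in a temp sandbox.
set -e
SANDBOX=$(mktemp -d)
# helper: git with identity preset for demo commits
G() { git -c user.email=dev@example.com -c user.name=dev "$@"; }

# Bare master repository that releases are pushed to
G init -q --bare "$SANDBOX/master.git"
G clone -q "$SANDBOX/master.git" "$SANDBOX/work" 2>/dev/null
echo "shared v1" > "$SANDBOX/work/app.php"
G -C "$SANDBOX/work" add app.php
G -C "$SANDBOX/work" commit -qm "release v1"
G -C "$SANDBOX/work" push -q -u origin HEAD

# A client repository with its own local commit (e.g. its config)
G clone -q "$SANDBOX/master.git" "$SANDBOX/client1"
echo "db=client1" > "$SANDBOX/client1/config.php"
G -C "$SANDBOX/client1" add config.php
G -C "$SANDBOX/client1" commit -qm "client1-specific config"

# A new release lands in the master...
echo "shared v2" > "$SANDBOX/work/app.php"
G -C "$SANDBOX/work" commit -qam "release v2"
G -C "$SANDBOX/work" push -q origin HEAD

# ...and the client rebases its local changes on top of it
G -C "$SANDBOX/client1" pull -q --rebase

cat "$SANDBOX/client1/app.php"     # -> shared v2
cat "$SANDBOX/client1/config.php"  # -> db=client1
```

The client ends up with the new shared code, with its own configuration commit replayed on top.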
You probably want to use migration files similar to Rails's (and probably other frameworks'). In your source tree, have a directory of migration files (probably SQL commands) making the necessary changes to the database (it's best if you also have both a 'migrate forward' and a 'migrate back' file for each change).
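A minimal version of such a migration runner might look like this. The file-naming scheme, the `schema_version` tracking file, and the SQL are all hypothetical, and the actual `mysql` invocation is left as a comment since it would need each client's own credentials; the demo just tracks which migrations would be applied:

```shell
#!/bin/sh
# Minimal "apply pending migrations" loop, assuming numbered files like
# migrations/001_up.sql and a per-client file recording the last applied
# number. All names are hypothetical; runs in a temp sandbox.
set -e
SANDBOX=$(mktemp -d)
mkdir -p "$SANDBOX/migrations"
echo "ALTER TABLE users ADD COLUMN email VARCHAR(255);" > "$SANDBOX/migrations/001_up.sql"
echo "CREATE INDEX idx_email ON users (email);"         > "$SANDBOX/migrations/002_up.sql"
echo 0 > "$SANDBOX/schema_version"   # this client is at version 0

applied=$(cat "$SANDBOX/schema_version")
for f in "$SANDBOX"/migrations/*_up.sql; do
  n=$(basename "$f" | cut -d_ -f1)
  if [ "$n" -gt "$applied" ]; then
    echo "applying $f"
    # real version would run it against this client's database, e.g.:
    #   mysql --defaults-file=/path/to/client.cnf < "$f"
    applied=$n
  fi
done
echo "$applied" > "$SANDBOX/schema_version"
```

Each matching `NNN_down.sql` file would hold the reverse change, so a bad release can be rolled back the same way.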
In principle, you could have a Git hook (e.g. the post-rewrite hook, which Git runs after a rebase) automatically run a script that applies the new migration files and updates the database. In practice, I'd be more careful with client data.