I needed to put an older site under git version control and I wanted to be able to at least semi-automatically publish that site to the webserver. This is an old project for very good friends. Time to put it under git!
Copying files to the server and configuring the application there is called deployment.
The local development system and the webserver are called environments.
Common names for environments are Development, Testing, Staging, and Production (also called Live).
In my scenario, a smaller site, there are two environments: Development and Live.
Taking a look around
How are these environments organized? I wanted to find out, so I took a little look around. First, I looked into Symfony2.
Environments in a Symfony2 based Application
Depending on which file you request, the default application starts with a specific environment. This is tied to the Kernel component, so it is very central:
- app.php for Production (prod)
- app_dev.php for Development (dev)
Further on, it loads a configuration file based on the environment (e.g. /app/config/config_dev.yml). There is also the concept of a parameters file that contains credentials. I suppose this will come in handy later for keeping security-related credentials under control: passwords should not be part of the github repository, while I do want to keep the configuration data (apart from secrets) under version control, and naturally per environment.
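As a rough sketch of how that looks in the Symfony2 standard edition (trimmed down to the relevant lines; the parameter name is just an example):

```yaml
# app/config/config_dev.yml -- loaded when the dev environment is active
imports:
    - { resource: config.yml }    # the shared base configuration

# app/config/parameters.yml -- credentials, kept out of version control
# and referenced from the config files as %database_password%
parameters:
    database_password: not-in-git
```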
As in my little scenario the environment also represents a different server, I think having environments will solve most of my deployment problems. However, for more advanced configurations and deployments this would not be enough. Let's take another look around, this time at Heroku.
Environments on the Heroku Platform
Applications on Heroku are deployed by git push to the server. Everything needed to configure the application is inside the git repository. With some setup on the local machine, it's possible to run it locally as well. So there are two environments by default: 1.) Development on the local machine and 2.) Production on the Heroku platform. Additional environments can be added (e.g. Staging, to test local changes in a similar environment).
One key point of Heroku is that, next to the many possibilities git itself offers, there is a local toolbelt taking care of diverse settings. Heroku for example models applications around processes controlled by a Procfile. It also has a concept of environment variables that carry configuration information.
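A Procfile for a php app could look like this (a minimal sketch; the command is just one of several possibilities):

```text
web: php -S 0.0.0.0:$PORT
```

Configuration values then travel as environment variables, set via the toolbelt, e.g. `heroku config:set DB_PASSWORD=secret`.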
Revisiting the Scenario
All this is quite interesting and really worth looking into; however, it does not map well onto my scenario. Let's review that scenario:
Two environments: a local Development one and a remote Production one.
The main website is mostly static files with some php scripts. It makes use of a .htaccess file that takes care of some redirects and domain normalization. This file differs between the two environments; the other files do not.
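On the Live environment, such a .htaccess could normalize the domain roughly like this (a hypothetical sketch; example.org stands in for the real domain):

```apache
RewriteEngine On
# domain normalization: redirect the www variant to the bare domain
RewriteCond %{HTTP_HOST} ^www\.example\.org$ [NC]
RewriteRule ^(.*)$ http://example.org/$1 [R=301,L]
```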
Next to that part, there is a simple WordPress blog installation. It has its typical wp-config.php configuration file containing database information and credentials. The passwords should not go into revision control, but the differing configurations should.
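One way to get the passwords out of the repository, sketched here as an assumption (the wp-secrets.php split is my own idea, not a WordPress convention), is to require a small gitignored file from the versioned wp-config.php:

```php
// wp-config.php (versioned) -- pulls the secrets from a gitignored file
require_once __DIR__ . '/wp-secrets.php';  // defines DB_PASSWORD, keys, salts

define('DB_NAME', 'blog');       // hypothetical values
define('DB_USER', 'bloguser');
define('DB_HOST', 'localhost');
```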
Thinking about the Processing
So this leaves some open questions, but it already shows that the concept of two different environments offers a handle to grab.
One might also need to think about blocking access while the site is being deployed. With Heroku this is simple, because the Dyno isn't even started yet: configuration is made before the deployment kicks in.
On a live system that is a plain webserver, things are different. So probably, regardless of Development or Production environment, one could create a bare environment/configuration. That one is then copied into the active environment, and the active environment is modified based on environment configuration data and server parameters.
Sounds like a good process which should be easy to map onto shell scripts, or, for some lame windows-linux compatibility, onto php scripts. Such a script could also be used to simulate a deployment locally (or to replay a configuration).
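A minimal sketch of that processing in shell, with entirely hypothetical paths (bare/, config/&lt;env&gt;/ and active/), including a tiny local simulation of the layout:

```shell
#!/bin/sh
# deploy.sh sketch: copy the bare tree into the active one, then overlay
# the environment-specific files on top of it.
set -e

deploy() {
    env="$1"            # e.g. dev or live
    rm -rf active
    cp -R bare active               # start from the environment-neutral tree
    cp -R "config/$env/." active/   # overlay per-environment files (.htaccess etc.)
}

# tiny local simulation of the layout described above
mkdir -p bare config/live
echo 'shared content' > bare/index.php
echo 'live htaccess'  > config/live/.htaccess
deploy live
cat active/.htaccess   # -> live htaccess
```

Running the same function with another environment name (or on another machine) replays the corresponding configuration, which is exactly the local simulation mentioned above.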
One might also think about the data flow in the other direction, e.g. backing up specific data from the production database, just to keep the mind open.
But more important right now is to just put it under git. Hopefully I'll find the time later on to throw some tests at this processing idea.