
Re: running a website the Debian way



On 13/03/14 06:18, Peter Michaux wrote:
Hi,

I've had some small hobby-type websites running on a Debian VPS for
several years. My sysadmin approach has been very amateur and ad hoc.
It has worked fine because I have not depended on these sites for
anything. I'd like to learn how to install and maintain a website in a
more professional way. If there is such a thing, I'd like to know what
"the Debian way" would be.

The website in question could be something like

  * an Apache virtual host configuration file
  * a PostgreSQL database with a few users with various permissions
  * some static web content: HTML, CSS, JavaScript, images
  * a few Perl CGI scripts

What I'm thinking is that I'd like to have some .deb packages that
I can install on a brand new Debian VPS. These packages would create
database users, databases, and tables as necessary. These packages
would configure and restart Apache to use the new configuration files.
I would be trying to avoid typing anything other than the commands to
install packages. Preferably one package that installs all the other
packages.

Any pointers to good resources that describe doing all of this for a
website the most Debian way possible?

Does this even seem like a reasonable objective?

Thanks,
Peter



I am not sure how much of this is the Debian way, but this is what I do.

1) Every week, in a cron job, I run dpkg --get-selections to create a list of packages installed on my virtual machine (the cron jobs in 1), 2) and 4) are sketched after this list).
2) Every week, in a cron job, I tar /etc.
3) The files produced in 1) and 2) are kept in off-machine storage, and are also copied to the archive I describe in point 5). These two files are sufficient, I hope, to record the current state of the Debian installation. I have never had to use them - currently my virtual machine has an uptime of 341 days, and the uptime before the last reboot was of the same order of magnitude.
4) In a cron job I run daily, I back up any databases used by any of the web sites. These are rotated so there are always two copies: the latest and one a day old.
5) Daily, I copy the backups produced in 4) to another machine, and have them cycle through daily, then weekly, then monthly backup archives (i.e. I keep every day for a week, then one snapshot from the dailies each week for a month, and then one snapshot from the weeklies each month for 6 months). Each 6 month segment gets written to a DVD+R when complete.
6) I do the same as 4) and 5) for any volatile files. For instance, I host some smf forums, and the attachment directory gets the same treatment (via tar).
7) At a varying frequency - mainly monthly - I take a backup of the static elements of the site.
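
To give a flavour, the cron jobs behind 1), 2) and 4) amount to something like the two scripts below. This is only a sketch: the paths and the database name are made-up placeholders, I have assumed pg_dump because the original post mentioned PostgreSQL (swap in mysqldump or whatever fits), and the dump has to run as a user the database will let in.

#!/bin/sh
# /etc/cron.weekly/record-system-state - rough sketch; paths are placeholders
BACKUPDIR=/var/local/backups
dpkg --get-selections > "$BACKUPDIR/package-selections.txt"
tar -czf "$BACKUPDIR/etc-$(date +%Y%m%d).tar.gz" /etc

#!/bin/sh
# /etc/cron.daily/backup-databases - rough sketch; the database name is an example
BACKUPDIR=/var/local/backups
DB=mysite
# keep the latest dump plus the one from the day before
[ -f "$BACKUPDIR/$DB.sql.gz" ] && mv -f "$BACKUPDIR/$DB.sql.gz" "$BACKUPDIR/$DB.sql.gz.1"
pg_dump "$DB" | gzip > "$BACKUPDIR/$DB.sql.gz"
# push everything to the off-machine archive (host and path are placeholders)
rsync -a "$BACKUPDIR/" backuphost:/srv/archive/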

Obviously the backup of /etc includes everything inside /etc/apache2/sites-available.

For each web site that I manage, I maintain a mirrored development copy on my development machine. These are all stored under git, with the master branch being the development version.

I generally split this development directory into two - one holds the main web site, the other holds various supporting files, such as the database creation script (if not part of the main web site) and the Apache configuration file.

I use a local copy of Apache to host local domain names that represent each of these web sites (so, for instance, for client xxx.com I have a local domain at home called xxx.home, with a web site to match). This includes database copies - the location Apache serves is the development directory.
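
Setting one of these local sites up is then just a hosts entry plus a vhost whose DocumentRoot is the development directory. Roughly the below, as a sketch only: xxx.home comes from the example above, the directory path is invented, and the Require line is the Apache 2.4 spelling (on the older 2.2 packages it would be "Order allow,deny" / "Allow from all").

# run as root on the development machine - names and paths are examples
echo "127.0.0.1 xxx.home" >> /etc/hosts

cat > /etc/apache2/sites-available/xxx.home.conf <<'EOF'
<VirtualHost *:80>
    ServerName xxx.home
    # serve straight out of the development directory
    DocumentRoot /home/alan/dev/xxx/website
    <Directory /home/alan/dev/xxx/website>
        Require all granted
    </Directory>
</VirtualHost>
EOF

a2ensite xxx.home.conf
service apache2 reload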

Each web site has a separate branch in git (generally called "site"), which I branched off early in the development and in which I made the changes necessary for the site configuration (i.e. different database passwords etc). I have a git hook so that, on a merge or commit to this branch, the web contents are rsync'ed into place on the virtual machine. This means I do development and test it on the master branch (or sometimes on branches off the master branch, before merging back). When I am ready to put it live, I check out the site branch and issue a git merge master. This makes the appropriate configuration changes for the site and updates the site with the latest changes.
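
The hook itself does not need to be anything clever. A post-commit hook along the lines of the sketch below (symlinked as post-merge as well, so merges trigger it too) is enough; the branch name matches my "site" convention, while the destination host and the paths are placeholders.

#!/bin/sh
# .git/hooks/post-commit - rough sketch; also symlink as .git/hooks/post-merge
# Only deploy when the "site" branch is the one being updated.
branch=$(git rev-parse --abbrev-ref HEAD)
if [ "$branch" = "site" ]; then
    # working-tree path and remote destination below are placeholders
    rsync -az --delete --exclude='.git' ./ vps.example.com:/srv/www/xxx.com/
fi

Doing the deploy from a hook rather than by hand just means it cannot be forgotten after the merge.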

I have maintained one rather complicated smf forum (it's the base forum software, with lots and lots of personal local modifications) up to date with the latest smf distribution since 2007 this way.

If you want any details of how I do this exactly, just ask. Also see http://www.chandlerfamily.org.uk/2011/03/managing-smf-software-in-git/









--
Alan Chandler
http://www.chandlerfamily.org.uk

