Basic Security Steps?

What basic steps do you use to secure your server and site?

I’m new to the Estore scene and want to make sure our site is as secure as possible.

We’re using Hostgator. I know they have some layer of protection but there must be more we can do?

OK - No one is taking any security steps?

You don’t need to be specific about your site, just general.

I’ve looked at the cart structure and there isn’t anything restrictive in Robots.txt or .htaccess. That could use some work.

What about something like Crawl Protect, Bad Behavior or a firewall of some sort?

There are many different levels of security, and they vary greatly between shared hosting and dedicated/VPS servers and their PHP environments, so this isn’t an easy question.


If you are on a shared hosting plan, you will be very limited in what you can apply to further increase security. In this case, your host should already have adequate measures in place to prevent cross-account access and many other common vulnerabilities. They should also be running intrusion detection systems that block all known attack patterns. Those are only a few of the many security enhancements a quality host should provide. They should all have exceptional firewall and logging policies as well, so every event can be traced to its source IP and then blocked or reported as needed.

If you are on a VPS or dedicated server, there is much more at play. Unless you are a Linux guru, you should hire a professional to at least properly secure and test the server environment. Once that is done, it is not very difficult to maintain a secure server until major updates are needed. (I do offer this service if anyone needs it.)

Possibilities for enhanced security

Regardless of shared or dedicated hosting, you will first need to determine and understand how PHP is loaded:

mod_php (DSO):

This setup is the best for performance because PHP runs as a native Apache module, without the limitations of CGI emulation, but there are major drawbacks when it is used in a shared environment. It isn’t very friendly for users without root shell access and the ability to chown files back to the user level for editing, since otherwise you need PHP itself to do it. This means that if you upload files using a PHP script (images and your store cache files fall into this), they will not be editable or deletable over FTP, because they are no longer owned by you. They are owned by the Apache (web server) user and can no longer be edited or removed using your account credentials.

This creates another security problem. Since files uploaded or created by PHP are owned by the web server, they have access to everywhere on the server the web server does, including other user accounts and many critical system files. Servers running mod_php are targets for the hacking scene because, if an attacker can find a single vulnerable account or script on the server, it may allow them to remote-include or upload a PHP shell script, which can give them access to every other account hosted on it (if the host applies poor or average security). You can now see the problem with shared hosting and mod_php.

Your host can correct the ownership problems on request when you need to edit or remove these files, but this is very inconvenient since you must wait for them each time. If you are currently on one of these systems, ask your host whether they will install a cron job that automatically corrects your file ownerships at 30-minute intervals. This will save you both headaches, but some hosts will refuse if your site has many files, since the job puts a strain on resources.
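If your host gives you SSH access, a quick way to check whether this has already happened to your account is to list anything under your store root that is no longer owned by you. This is only a sketch; run it from your own document root:

```shell
# List every file or directory below the current location that is NOT
# owned by your own user. Anything printed was probably created by the
# web server under mod_php and will need a chown from your host before
# you can manage it over FTP again.
find . ! -user "$(id -un)" -ls
```

The root-side fix the host would put on a cron is essentially a recursive chown of your docroot back to your account user, which only root can run.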

CS-Cart was, and still is to some extent, engineered to run on mod_php systems, but they really need to change this since less than 10% of shared hosting providers use DSO anymore because of the security risks and the extra headaches associated with it. We ran mod_php combined with extreme security policies on our shared servers for nearly 8 years, and this is why we quickly became a popular host for CS-Cart, since it was designed to run best on that platform. We eventually had to switch to suPHP like most other hosts because it was just too much work maintaining adequate security and managing the file ownership problems mentioned above.

Another headache with mod_php is file permissions.

To make a file writable it must be chmodded to 666, and directories to 777 if they are to be written to. This makes it more difficult to install and upgrade some scripts, but afterwards it is no problem, and often some of these can be reverted after installation. Many of you might think that making files and directories writable creates a security vulnerability, but that isn’t true at all. Actually, limiting only certain files and directories to be writable in this fashion is much more secure than the PHP/CGI methods below, where EVERY file and folder is writable.
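As a sketch, opening up the writable pieces on a mod_php box looks like this. The file and directory names below are only stand-ins; check your cart’s documentation for the real list of writable items:

```shell
# demo setup: create stand-ins for the writable items
# (in a real store these already exist)
touch settings_cache.txt
mkdir -p images var/cache

# mod_php: loosen permissions ONLY on the items PHP must write to
chmod 666 settings_cache.txt   # a single writable file
chmod 777 images var/cache     # writable directories
```

Everything else in the store stays at the tighter defaults, which is the point being made above.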

Mod_php is also more susceptible to manipulation of its configuration by the end user, and this is another potential security problem on a shared system. This would require another book to explain in detail, but briefly: any user on that shared system can consume all system memory and crash the entire server, either accidentally or intentionally. These incidents are difficult to trace and correct because all PHP processes are owned by the same user (the web server).

For obvious reasons, shared hosting on a server running mod_php should be avoided. But if you are on a dedicated or VPS system and have a qualified server admin, it is your best-performing option and can also be very secure with a little knowledge and effort.

suPHP or fCGI:

There are differences between these that won’t matter much to users on shared systems but are important to those running dedicated systems, such as the ability to use eAccelerator or XCache with fCGI but not with suPHP.

Both of these run PHP as a CGI process owned by the user rather than the web server. This means that any files managed by a PHP script will still be owned by the user PHP is running as. You can immediately see the security and convenience advantages in a shared environment, but it comes with a large performance cost.

Under these systems, all files and directories are writable by your user and should be given permissions of 644 (files) and 755 (directories).

Many methods of PHP manipulation can still be done on these systems, but no longer within .htaccess files, and the host has much more control and can easily trace these problems to their source (often resulting in an account suspension).

PHP shell scripts are still a potential problem if a vulnerability is found within your account that allows a hacker to include one, but they will be limited to the confines of your account and its server privileges. This means they can still destroy your site, but all other accounts sharing the server will be safe. A good quality host will have disabled most of the dangerous functions these scripts require, but some damage will still be possible.

Ok, the boring stuff is done, so let’s move on to better securing your site itself.

• PHP file permissions

Ask your host what minimal permission level PHP will run at on your server, and set at least your most critical PHP files to that level. A chmod of 600 will work on most servers, and some allow as low as 400 (overkill). This protects you if PHP ever fails to load during an Apache restart and your PHP files are exposed to the public as plain text: 644 permissions will allow the file to be opened and read, but 600 will not. I suggest a permission of 600 for all configuration files because they contain private data. If you insist on keeping site or database backups on the server (you shouldn’t), these should also be protected from public access.
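For CS-Cart that means, at minimum, the two config files mentioned later in this thread. A sketch, run from the store root:

```shell
# demo stand-ins; in a real store these files already exist
touch config.php config.local.php

# 600 = owner read/write only; nothing is exposed if PHP ever
# fails and Apache serves the files as plain text
chmod 600 config.php config.local.php
```

If pages start erroring after this, your server’s PHP handler needs a higher minimum (try 640 or 644 and ask your host).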

• Remove unneeded directories and files

Proper housekeeping is one of the best security tips I can give. It helps not only security but also server performance during maintenance tasks like backups.

Many people keep 3 or more versions of their stores online, plus other scripts they have tested and forgotten for whatever reason, and that is asking for trouble.

  1. Older scripts become vulnerable and should be removed
  2. They also provide additional hiding places for bad things if a hacker ever gets access. One of the first things an attacker will do is make multiple copies of their tools and hide them in many different places, so that if one copy is found, they can get back in using one of the hidden ones.
  3. You can also clean the installs you decide to keep by removing any unused folders such as skins or old image folders

• You mentioned the robots.txt file

This file does contain necessary info for the spiders, but it can also disclose information you don’t want others to see, so be careful not to use it to disallow paths you wish to remain private. Instead, use a crafty .htaccess file in those private directories to keep spiders out. Programs like Crawl Protect, Bad Behavior and other visitor analyzers might be cool, but they will degrade your site’s performance at least slightly. They are overkill and pointless if your host runs intrusion detection with a quality rule set.
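For example, rather than listing a private directory in robots.txt, you could drop an .htaccess like this into it (Apache 2.2 syntax, which most hosts ran at the time; Apache 2.4 replaces it with Require all denied):

```apache
# Deny all direct web access to this directory;
# nothing here is advertised in robots.txt
Order allow,deny
Deny from all
```

Your own scripts can still include files from the directory; only direct browser and crawler requests are refused.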

• Rename the admin.php file as CS-Cart suggests

Ok, I think that is pointless, but it will at least get rid of the stupid warning in the control panel. Use secure login information and you have nothing to worry about.

• Move configuration files outside of the web root

This is possible to do but pointless if you apply the more secure permissions mentioned above. The reason you would do either is the same: if PHP fails, the file cannot be read by the public.

• .htaccess files

Many things can be done to control or prevent access within these files, and many of the tricks are unique to each situation, so I won’t go into it here. I will just add that this file is read before each page load, so don’t clutter it too badly or it will slow your site.
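As one small example of the kind of access control .htaccess allows, blocking a single abusive visitor looks like this on Apache 2.2 (the address below is just a documentation placeholder):

```apache
# block one abusive IP (203.0.113.45 is a placeholder address)
Order allow,deny
Allow from all
Deny from 203.0.113.45
```

A couple of lines like this cost nothing, but dozens of rules checked on every page load start to add up, which is the clutter warning above.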



Nicely written book! :wink:

[QUOTE]I suggest a permission of 600 for all configuration files[/QUOTE]

Regarding CS-Cart, are you only referring to config.php & config.local.php, or are there other configuration files as well?


Those 2 files are the most important because they contain database, path and other sensitive info.

Most other PHP files will be worthless to anyone reading them. An exception would be someone looking for exploitable code that may exist in some of the files, if they know where to find them.

Normally when PHP fails, it will present the visitor with the option to save or open the index.php file, or whatever other PHP file they attempt to load. I have also seen servers that will list the entire directory instead of loading the index file, which allows anyone to view and open any file listed.

This will help to prevent that possibility for some users but may not work on every server.

Open your root .htaccess file and, below the existing DirectoryIndex line, add the Options command shown:

DirectoryIndex index.html index.php

Options -Indexes

Now, if Apache does ignore the index.php file during a failure, it should no longer list the open directory.