It has been a few days since the incident, so I am posting an update on our situation for anyone who wants more information.
We experienced two VERY HIGH LOAD incidents on the server: one on Monday, May 12th and one on Monday, May 26th. The server was hitting load averages of 100-300 and was unresponsive and "heavy" for most of the day. Neither I nor our datacenter technicians were able to determine what caused these high load values. It appeared to be MySQL and I/O related.
On both occasions the load went away by itself...
- As soon as the vulnerability was announced publicly, we checked our 25+ CS-Cart installations and found that almost all of them (about 21 out of 25) were affected. We found both files (test.gif & thumbs.php) in those installations, modified on Sunday, May 25th.
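For anyone who wants to run the same check, here is a minimal sketch of how we hunted for the two dropped files and their modification times. The demo below builds a throwaway directory tree as a stand-in for the real docroots (on a real server you would point `find` at your web root, e.g. something under /var/www -- that path is an assumption, adjust to your layout):

```shell
# Stand-in for several CS-Cart docroots (real check would target the web root).
demo=$(mktemp -d)
mkdir -p "$demo/shop1/images" "$demo/shop2/images"
touch "$demo/shop1/images/thumbs.php" "$demo/shop1/images/test.gif"
touch "$demo/shop2/images/thumbs.php"

# The actual check: list the two dropped files together with their mtimes.
# (-printf is GNU find; on BSD/macOS use -exec stat instead.)
find "$demo" \( -name 'thumbs.php' -o -name 'test.gif' \) \
  -printf '%TY-%Tm-%Td %TH:%TM  %p\n'
```

Seeing all the hits share the same modification date (as ours did) is a strong hint they were dropped by the same automated attack.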
- Checking our Apache access logs showed NO access to those files at all!!! Furthermore, we did not find the attacking IP (the one most of you here mentioned) or any trace of access to the hsbc & atos files in any of our logs (access or error)... So the attacker has either cleaned up his tracks, or something more sinister is going on.
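The log check itself is a one-line grep. Sketch below, run against a stand-in log with a single clean request; on a real box you would grep the actual files, e.g. under /var/log/apache2/ on Debian-style systems (path is an assumption, it varies by distribution):

```shell
# Stand-in access log with one clean request (real logs live under
# /var/log/apache2/ or /var/log/httpd/ depending on the distro).
log=$(mktemp)
echo '1.2.3.4 - - [25/May/2008:03:14:07 +0000] "GET /index.php HTTP/1.1" 200 512' > "$log"

# The actual check: was either dropped file ever requested over HTTP?
if grep -q -E 'thumbs\.php|test\.gif' "$log"; then
  echo "hits found"
else
  echo "no hits"   # what we saw on our server
fi
```

Remember to include rotated logs (`*.log.1`, `*.gz` via `zgrep`) in the search, since the files were dropped days before some of us looked.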
- We have checked our server with ClamAV several times and found no other known threats.
- We have changed all admin, MySQL & FTP passwords, and we also changed the admin panel path.
- What I do not completely understand is why people need to change their MySQL passwords. If the installation is now clean and remote MySQL access is not allowed, how could someone connect to MySQL even with the correct credentials? Is this necessary only as a precaution, in case some other malicious file was left behind that reads the database and logs and/or sends data to the attackers?
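For what it's worth, my current understanding (happy to be corrected): disabling remote access only blocks network clients, but any leftover script running on the server itself still connects locally, over the loopback interface or the Unix socket, using whatever credentials it was seeded with, so rotating the passwords is the only thing that invalidates stolen credentials. A minimal sketch of what loopback-only MySQL looks like in the server config (path and option names assume a Debian-style my.cnf, so treat them as an example, not gospel):

```ini
; /etc/mysql/my.cnf (path is an assumption -- distribution-dependent)
[mysqld]
bind-address = 127.0.0.1   ; accept TCP connections from localhost only
; skip-networking           ; or disable TCP entirely; the Unix socket still works
```

Note that neither setting stops a local backdoor script, which is exactly why the password change is recommended even on servers that were never open to remote MySQL connections.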
In any case, the inconvenience, the damage and the frustration that this incident has caused are GREAT!