Google Data Feed Export Hangs At ~100 Products Mark

Hi,

We have ~2500 products in the store, but the Google data feed export and download stop progressing at around 100 products. Any ideas on how to diagnose and fix this?

Thanks!

Hello,

It could be that the web server (probably NGINX) kills the process once it exceeds the 60-second mark. To get around that you would have to run the export from the CLI instead.

The best approach would be to create a PHP file outside of your store directory that handles everything, and call it only through the CLI. You would simply include the init.php file and run the functions you need (e.g. the sitemap rebuild function).

If you need more information, please contact us at info@poppedweb.com

Kind regards,

Thanks for the suggestion! Isn't there an existing command to launch the feed generation from the CLI, though? I've seen one for the regular products export, for instance.

Thank you!

If you can't export 100 products in 60 seconds then you have a much more serious issue on your server. My guess is that your memory limit is way too low.

How long is it actually taking to die?

Well, you can be surprised by how slow some servers actually are, but you could indeed try increasing the limits in the /app/addons/google_sitemap/init.php file.
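If you want to try that, the override might look something like this (the values and the exact placement are assumptions; adjust them for what your host actually allows):

```php
// Illustrative limits to add near the top of app/addons/google_sitemap/init.php
// (values are examples, not recommendations).
@ini_set('memory_limit', '512M'); // more headroom for building the feed
@set_time_limit(0);               // 0 = no PHP execution time limit
```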

Around a minute, but there's some progress first (the numbers do increase). Doesn't it work iteratively? Why would a memory limit kick in after the nth batch? Also, I can't see any errors in the log (and I do get memory errors there, e.g. when resizing huge images).

What about server error logs? Do you see any new records?

Unfortunately not

Could anyone please suggest how the data feed process could be launched from the CLI? I guess that way some web server configuration could indeed be circumvented.

Hello,

Make a PHP file with the following contents in a certain path (preferably outside of your current store directory):

// Define the area the script runs in ('C' = the customer/storefront area)
define('AREA', 'C');

// Initialize all the code required
require('path/to/install/init.php');

echo 'Init::success' . PHP_EOL;

// Generate the sitemap
fn_google_sitemap_get_content();

echo 'GoogleSitemap::success' . PHP_EOL;

Now you can call it from the CLI as follows.

If you want to save the output to an HTML file:
php -f createdFile.php > results.html

If you just want to execute it:
php -f createdFile.php

If you need more customization regarding this matter you may contact us at sales@poppedweb.com. If you just have a few more questions, feel free to contact us at info@poppedweb.com.

Kind regards,

No, it is not "iterative". It processes everything in memory and then writes out the data in the prescribed format.
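To illustrate the difference (a sketch of the general pattern only, with hypothetical helper names, not the add-on's actual code): a streaming exporter would write each batch out and free it, keeping memory roughly flat, whereas building the whole feed in memory first grows with every batch processed.

```php
// Sketch of a streaming export pattern; fn_get_product_batches() and
// render_feed_items() are hypothetical helpers, not CS-Cart functions.
$fp = fopen('feed.xml', 'w');
fwrite($fp, "<feed>\n");
foreach (fn_get_product_batches(100) as $batch) {
    fwrite($fp, render_feed_items($batch)); // write this batch out...
    unset($batch);                          // ...so PHP can free it
}
fwrite($fp, "</feed>\n");
fclose($fp);
```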

When you click on the datafeeds addon from the addons menu the right sidebar shows you the cron command to use. That is what you would use for CLI.

Indeed, thanks!
Unfortunately, when running from the CLI there's a 'Killed' message at some point. It could be memory or something else. I think the data feed export may be written inefficiently if memory usage keeps growing during execution. Shouldn't it release the resources used for already-exported products?
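For what it's worth, a bare "Killed" with nothing in the PHP error log usually means the kernel OOM killer (or a container/cgroup limit) terminated the process, rather than PHP's own memory_limit, which would log a fatal error instead. That can be checked with something like this (the log locations are assumptions for a typical Linux box; adjust for your distro):

```shell
# Look for OOM-killer activity in the kernel ring buffer
dmesg 2>/dev/null | grep -iE 'killed process|out of memory' | tail -n 5

# Some distros log it to syslog instead
grep -i 'oom' /var/log/syslog 2>/dev/null | tail -n 5
```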

If it's memory, there should be a message in your php error_log file.

What are your memory settings? Are you on a dedicated, VPS or cloud server? If you're on a shared server, please move to a more commercially rugged environment.
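A quick way to see what the CLI PHP binary is actually allowed to use (the CLI and the web server often load different php.ini files, so it's worth checking both):

```shell
# Show the memory limit the CLI PHP actually runs with
php -r 'echo ini_get("memory_limit"), PHP_EOL;'

# Show which php.ini file(s) the CLI loads (often different from the web one)
php --ini
```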

NGINX kills the process regardless of the memory usage. It could be that, just like for many other people, his database isn't set up correctly, which causes huge slowdowns, hence the small number of products exported.

Where does the user state he's using NGINX? Apache timeouts can be adjusted and the default is 600 seconds. Some hosts reduce that down to 60, but it can easily be changed via WHM.

Also, if he's seeing the same symptoms running PHP from the CLI, then neither NGINX nor Apache is involved in the process, which takes us back to memory_limit being the most likely suspect.

Hello,

If you want to try and see whether the memory limit is indeed the cause you could add the following declaration in the beginning of the PHP file:

@ini_set('memory_limit', '1024M');

Kind regards,

If his hosting environment allowed it to be set that high. We're all guessing because not enough info about the environment is available.

Well, if it affects the result we will know :)

You'll know either that the request for 1GB of memory was ignored, or that it wasn't and his problem went away. But if it doesn't go away, you can't say it wasn't memory_limit. Maybe he should add:

ini_set('memory_limit', '1024M');
$v = ini_get('memory_limit');
if ($v !== '1024M')
    die("Current memory limit is $v");

Then he'd really know.