
I have a script which reads a text file containing SKU information.

Then, in a loop, the script looks up each product's ID and creates the product object:

$id = Mage::getModel('catalog/product')->getIdBySku($isbn); 
$product = Mage::getModel('catalog/product')->load($id);

Then the stock data is updated:

$product->setStockData(array(
    'manage_stock'            => 1,
    'use_config_manage_stock' => 0,
    'is_in_stock'             => 1,
    'qty'                     => $qty,
));
$product->save();
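
For reference, here's a minimal sketch of the full loop (untested as shown; the file name is a placeholder and I'm assuming one sku,qty pair per line):

<?php
require_once 'app/Mage.php';
Mage::app('admin'); // bootstrap Magento for a standalone script

// Placeholder file name; assumes one "sku,qty" pair per line
$handle = fopen('stock_update.txt', 'r');
while (($row = fgetcsv($handle)) !== false) {
    list($isbn, $qty) = $row;

    $id = Mage::getModel('catalog/product')->getIdBySku($isbn);
    if (!$id) {
        continue; // unknown SKU, skip the row
    }

    $product = Mage::getModel('catalog/product')->load($id);
    $product->setStockData(array(
        'manage_stock'            => 1,
        'use_config_manage_stock' => 0,
        'is_in_stock'             => 1,
        'qty'                     => $qty,
    ));
    $product->save();

    unset($product); // release the model between iterations
}
fclose($handle);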

This works, but with a larger number of products the loop just suddenly stops. For example, one text file had 14,000 lines, and at approximately the 7,000th iteration the script simply stopped with no error message.

If we change the text file the same thing occurs.

I have checked the max_execution_time on the server and it looks fine.

I'm unsure whether there's a Magento-specific reason the script breaks, or something else entirely.

Any help is much appreciated.

Thanks.

asked Oct 29, 2013 at 15:05
  • Are you able to split the text file into more than one? It might be handy to do so, setting the maximum size parsed at once to an acceptable value. I ran into a similar problem once building a resource model from a CSV and found it worked much better to split the CSV data into smaller chunks. A rough sketch of that idea follows. Commented Oct 29, 2013 at 23:51
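
A rough sketch of that chunking idea (file name, chunk size, and the CLI-argument scheme are all arbitrary choices here):

// Process the file in fixed-size slices so a single run never has to
// survive all 14,000 rows; the chunk index comes in as a CLI argument.
$lines  = file('stock_update.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$chunks = array_chunk($lines, 1000);
foreach ($chunks[(int)$argv[1]] as $line) {
    list($sku, $qty) = explode(',', $line);
    // ... existing per-row update ...
}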

3 Answers


If you have a CSV file of the format:

sku,qty
IDE123,12

you can just use Import/Export to update your qty. It is faster than your script and fires one bulk query at the database instead of one query per product (14,000 in your case).
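
As an illustration of the bulk idea only (this is not what ImportExport does internally, just a hedged sketch that writes all the stock rows in one multi-row statement; table names are Magento 1 defaults and $rows is assumed to be the parsed CSV as sku => qty):

<?php
require_once 'app/Mage.php';
Mage::app('admin');

$resource = Mage::getSingleton('core/resource');
$write    = $resource->getConnection('core_write');

// Map SKUs to product IDs in one query
$select = $write->select()
    ->from($resource->getTableName('catalog/product'), array('sku', 'entity_id'))
    ->where('sku IN (?)', array_keys($rows));
$skuToId = $write->fetchPairs($select);

// Build one row per product and write them all at once
$data = array();
foreach ($rows as $sku => $qty) {
    if (isset($skuToId[$sku])) {
        $data[] = array(
            'product_id'  => $skuToId[$sku],
            'stock_id'    => 1, // default stock
            'qty'         => $qty,
            'is_in_stock' => 1,
        );
    }
}
$write->insertOnDuplicate(
    $resource->getTableName('cataloginventory/stock_item'),
    $data,
    array('qty', 'is_in_stock')
);
// Note: direct writes bypass indexers and cache, so reindex stock afterwards.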

Which version of Magento are you using?

We built a plugin that uses bulk queries based on ImportExport but goes through the Magento product API; you can find it here: https://github.com/magento-hackathon/cutesave

answered Oct 29, 2013 at 21:50

If it's not the server's PHP max execution time, you are probably running out of memory: calling Mage::getModel('catalog/product')->load($id) leaks memory, and over many iterations the leak can become quite large. How does your script run? In the CLI?

You should still get an error message stating you ran out of memory, though. Increasing the memory limit is not really a good solution if you have tens of thousands of products, but I think 14K is OK.
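
To confirm whether memory is the culprit, you could log consumption every few hundred rows (a quick sketch; the interval is arbitrary):

$i = 0;
foreach ($lines as $line) {
    // ... existing load/save work ...

    if (++$i % 500 === 0) {
        // If models are leaking, this number climbs steadily until the crash
        Mage::log(sprintf(
            'row %d: %.1f MB used, %.1f MB peak',
            $i,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576
        ));
    }
}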

(Struck out in a later edit:) If you suspect a memory problem and can't afford any more memory, try loading a collection and saving the product model from that. In my experience that does not cause memory leaks. So, something like this (untested):

$productCollection = Mage::getModel('catalog/product')->getCollection()
    ->addAttributeToSelect('*') // note: the original selectAttributesToFilter('*') is not a real collection method
    ->addAttributeToFilter('entity_id', $id)
    ->load();
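
Continuing that (equally untested) idea, the stock update would then go through the item pulled from the collection:

$product = $productCollection->getFirstItem();
if ($product->getId()) {
    $product->setStockData(array(
        'manage_stock'            => 1,
        'use_config_manage_stock' => 0,
        'is_in_stock'             => 1,
        'qty'                     => $qty,
    ));
    $product->save();
}
$productCollection->clear(); // drop loaded items before the next iteration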
answered Oct 30, 2013 at 1:06
  • this should be the same... :) a model is created and all attributes are set via setData() Commented Oct 31, 2013 at 19:45

This works, but with a larger number of products the loop just suddenly stops. For example, one text file had 14,000 lines, and at approximately the 7,000th iteration the script simply stopped with no error message.

It looks like a server configuration problem. Try setting a bigger value for max_execution_time and also for memory_limit.
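
If the script runs from the command line, you can also raise both limits inside the script itself (the values here are example values only):

ini_set('memory_limit', '2048M'); // or per invocation: php -d memory_limit=2048M script.php
set_time_limit(0);                // 0 = no execution time limit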

If that doesn't help, check your server log files.

answered Oct 29, 2013 at 15:16
