I am trying to download products from Bigcommerce and import them into my local database so that I can check that the inventory values are accurate. Then, I plan on updating BC with the new inventory values.
I built my script using the BC API, calling getProducts(Limit=250, Page=X). The problem is that I have over 30K products and the script dies partway through. Each call to the BC API takes about a minute to return a response.
Is there a better way of approaching this? Or is there a way, through the API, to get a dump of all the products in the production environment?
Thank you,
Akshay
Two thoughts:
1) Have you tried lowering the limit? It may be choking on too much data.
2) Are you hitting your API call limit? If that isn't handled correctly, the script will die prematurely rather than wait for calls to become available again.
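On point 2, a pagination loop that backs off when the API signals a rate limit is usually all it takes to keep the script alive. Here is a minimal sketch in Python; `get_page` and `RateLimitError` are hypothetical stand-ins for whatever wrapper you have around the actual BC `getProducts` call, not part of any BigCommerce SDK:

```python
import time

class RateLimitError(Exception):
    """Hypothetical signal raised by the API wrapper when the call
    quota is exhausted; carries how long to wait before retrying."""
    def __init__(self, retry_after):
        self.retry_after = retry_after

def fetch_all_products(get_page, limit=250, max_retries=5):
    """Download every product page by page.

    get_page(page, limit) is assumed to wrap the actual BC API call
    and return a list of products (empty list when past the last page),
    raising RateLimitError when the call limit is hit.
    """
    products = []
    page = 1
    while True:
        for attempt in range(max_retries):
            try:
                batch = get_page(page=page, limit=limit)
                break
            except RateLimitError as e:
                time.sleep(e.retry_after)  # wait out the quota, then retry
        else:
            raise RuntimeError("page %d failed after %d retries" % (page, max_retries))
        if not batch:  # empty page -> no more products
            return products
        products.extend(batch)
        page += 1
```

Because the fetcher is injected, you can also drop the `limit` to something smaller without touching the loop, which addresses point 1 at the same time.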
I guess I'd ask, from an application perspective, why you're checking the inventory levels and then updating them; that's 2x the amount of data moving around, and the product JSON response is pretty beefy.
If you just want to update 30k products, that's ~120 service calls at 250 per page. A simple loop with a thread wait (so you don't blow past BigCommerce's rate limit) does the trick for me.
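The loop-with-a-wait approach above can be sketched like this; `update_batch` is a hypothetical callable wrapping however you push inventory changes to BC (it is not a real BigCommerce SDK function), and the 0.25 s delay is just an illustrative pause to stay under the rate limit:

```python
import time

def chunked(items, size):
    """Yield successive fixed-size slices of items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def push_inventory(updates, update_batch, batch_size=250, delay=0.25):
    """Send inventory updates in batches, pausing between service
    calls so the rate limit isn't exceeded. Returns the number of
    calls made (~120 for 30k products at 250 per batch)."""
    calls = 0
    for batch in chunked(updates, batch_size):
        update_batch(batch)  # assumed wrapper around the BC update call
        calls += 1
        time.sleep(delay)    # thread wait between calls
    return calls
```

Since only the changed inventory fields need to go out, each batch payload stays small compared to pulling the full product JSON back down first.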