I have a large-scale WordPress site with 100K+ users.
I need to loop over every user and write out some of the data stored in their wp_users row along with some of the metadata from the wp_usermeta table.
The problem is that with 100K+ users this runs out of memory pretty quickly, and it generates three queries per user. You can see I have a batch size and a paged argument set up to paginate the query; what I want to know is how I can keep running it until it has iterated over every user (I've sketched roughly what I mean after the code below). With a batch size of 500 that's 195 pages, and I can't sit there and trigger each batch by loading the webpage by hand. Plus the user base is only growing, so it's going to get worse. How can I automate this process? Is a message queue like RabbitMQ the answer, is there a quick tool for this, or am I just an idiot? I did write some JavaScript that hits the webpage over and over, 195 times, incrementing the paged variable each time, but it's very slow and requires me to leave my machine sitting on a specific webpage.
<?php
// Batch size and page come from the request; this page gets hit once per batch (e.g. ?paged=3).
$batch_size = 500;
$paged      = isset($_GET['paged']) ? (int) $_GET['paged'] : 1;

// 'fields' => 'ID' returns plain user IDs rather than full WP_User objects.
$users = get_users(array('fields' => 'ID', 'number' => $batch_size, 'paged' => $paged));

// $filename is set elsewhere to the path of the CSV I'm appending to.
$filetowrite = fopen($filename, 'a');

foreach ($users as $user_id) {
    $userdata = get_userdata($user_id);
    if (!$userdata) {
        continue;
    }

    $phone_number = get_user_meta($user_id, 'billing_phone', true);

    // Prefer the core first/last name fields, fall back to the billing_* meta.
    $first_name = isset($userdata->first_name) ? $userdata->first_name : '';
    if (empty($first_name)) {
        $first_name = get_user_meta($user_id, 'billing_first_name', true);
    }

    $last_name = isset($userdata->last_name) ? $userdata->last_name : '';
    if (empty($last_name)) {
        $last_name = get_user_meta($user_id, 'billing_last_name', true);
    }

    fwrite($filetowrite, $phone_number . ',' . $userdata->user_email . ',' . $first_name . ',' . $last_name . "\n");
}

fclose($filetowrite);
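
To make the goal concrete, here's roughly the shape of what I'm trying to end up with: one run that keeps paging until get_users() comes back empty, instead of me triggering each page by hand. This is just a sketch, not something I have working; the wp_cache_flush() call is a guess at keeping memory flat between batches, and run as a normal web request this still blows past the memory limit and max execution time.

<?php
// Sketch only: page through every user in a single run ($filename as in the code above).
$batch_size  = 500;
$paged       = 1;
$filetowrite = fopen($filename, 'a');

do {
    // Pull the next page of user IDs.
    $user_ids = get_users(array('fields' => 'ID', 'number' => $batch_size, 'paged' => $paged));

    foreach ($user_ids as $user_id) {
        // ...same per-user lookups and fwrite() as in the code above...
    }

    // Guess: flush the object cache between batches so cached user/meta rows don't pile up.
    wp_cache_flush();

    $paged++;
} while (!empty($user_ids));

fclose($filetowrite);

Even if that loop shape is right, I still don't know the proper way to kick it off and keep it running for 100K+ (and growing) users without a browser tab involved, which is really what I'm asking.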