Limit PHP curl requests per second when fetching all links on a web page

I'm trying to get all the links on a specific web page. I only need 'a' tags with a specific attribute, but first, as far as I know, I have to download the whole page.

I use this code (mostly not mine):

<?php
function file_get_contents_curl($url) 
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // Make curl return the data instead of outputting it to the browser.
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FORBID_REUSE, true);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$startUrl = 'address';
$data = file_get_contents_curl($startUrl);
echo($data);
?>
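Once the page is downloaded, the filtering step itself doesn't need any extra requests. Here is a minimal sketch of pulling out only the 'a' tags that carry a given attribute, using PHP's built-in DOMDocument; the attribute name `data-id` below is just a placeholder for whatever parameter you actually need.

```php
<?php
// Extract href values from 'a' tags that carry a specific attribute.
// $attribute ('data-id' in the example call) is an assumption --
// substitute the parameter you are actually filtering on.
function extract_links($html, $attribute)
{
    $links = array();
    $dom = new DOMDocument();
    // Suppress warnings caused by imperfect real-world markup.
    @$dom->loadHTML($html);
    foreach ($dom->getElementsByTagName('a') as $a) {
        if ($a->hasAttribute($attribute)) {
            $links[] = $a->getAttribute('href');
        }
    }
    return $links;
}

$html = '<a href="/one" data-id="1">one</a><a href="/two">two</a>';
print_r(extract_links($html, 'data-id')); // only /one has data-id
?>
```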

With this I'm getting a "Too many requests" error. The question is: can I limit the rate of the requests I make while finding the elements of the links array? I thought about curl_multi, but as far as I understand, it assumes I already have the array of URLs and only need to run the requests in parallel.
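A "Too many requests" (HTTP 429) response usually means the server enforces a rate limit, so parallelizing with curl_multi would make it worse, not better. One simple approach is to pace the requests with usleep. Below is a minimal sketch, assuming the limit can be expressed as requests per second; the fetcher is passed in as a callable so the `file_get_contents_curl` function above can be dropped in, and `$maxPerSecond` is an assumption you'd tune to the server's actual policy (check for a `Retry-After` header if one is sent).

```php
<?php
// Fetch a list of URLs sequentially, sleeping between requests so we
// never exceed $maxPerSecond. $fetch is any callable that takes a URL
// and returns the body, e.g. 'file_get_contents_curl' from above.
function fetch_all_throttled($urls, $maxPerSecond, $fetch)
{
    $delayMicros = (int)(1000000 / $maxPerSecond); // gap between requests
    $results = array();
    foreach ($urls as $url) {
        $results[$url] = $fetch($url);
        usleep($delayMicros); // wait before issuing the next request
    }
    return $results;
}
?>
```

Usage would be `fetch_all_throttled($linkArray, 2, 'file_get_contents_curl');` to stay at roughly two requests per second.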

Help, please.