I need to retrieve quite a lot of remote data using SOAP requests (about 12,000 delivery orders, with a separate request for each order). On my test server (Windows with XAMPP) it works fine and takes about an hour. On the production server (Linux), the connection gets lost after about 10 minutes (it varies).
I tried changing the header from "Connection: Close" to "Connection: Keep-Alive", but that makes retrieving each record very slow for some reason - about 3 records per minute instead of 170. Not sure if it is because of the fputs or the fgets loop.
Can anybody tell me how to keep the connection alive for the whole run?
Basically my script is:
if (!is_resource($fp))
{
    $fp = @fsockopen($host, $port, $errno, $errstr);
    if (!$fp) {
        echo "soap_parser error message is: $errstr ($errno)<br />";
        exit;
    }
    stream_set_timeout($fp, 3600); // one hour
}

// HTTP requires CRLF ("\r\n") line endings in the request headers
$soap_out  = "POST " . $path . " HTTP/1.1\r\n";
$soap_out .= "Host: " . $hostname . "\r\n";
$soap_out .= "User-Agent: MYSOAPREQUEST\r\n";
$soap_out .= "Content-Type: text/xml; charset=UTF-8\r\n";
$soap_out .= "Content-Length: " . strlen($xml_request) . "\r\n";
$soap_out .= "Connection: Close\r\n";
$soap_out .= "Cache-Control: no-cache\r\n";
$soap_out .= "SOAPAction: " . $soap_action . "\r\n";
$soap_out .= "\r\n"; // blank line terminates the headers
$soap_out .= $xml_request;

fputs($fp, $soap_out); // send SOAP request

$soap_in = "";
while (!feof($fp))
{
    $soap_in .= fgets($fp, 512); // was $this->fp, a bug outside a class
}
return $soap_in;
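A likely reason Keep-Alive is so slow: with "Connection: Close" the server closes the socket after each response, so feof() returns promptly; with Keep-Alive the socket stays open and the fgets() loop blocks until the read timeout after every response. One way around that is to read the Content-Length header and then exactly that many body bytes. Below is a minimal sketch, assuming the server always sends a Content-Length header and does not use chunked transfer encoding; read_http_response is a hypothetical helper name, not part of the original script:

```php
<?php
// Sketch: read one HTTP response from an already-open socket without
// waiting for the server to close the connection.
// Assumes Content-Length is present and no chunked transfer encoding.
function read_http_response($fp)
{
    // Read header lines until the blank line that terminates them.
    $headers = "";
    while (($line = fgets($fp, 1024)) !== false) {
        $headers .= $line;
        if (rtrim($line, "\r\n") === "") {
            break; // end of headers
        }
    }

    // Extract Content-Length so we know exactly how many bytes to read.
    $length = 0;
    if (preg_match('/^Content-Length:\s*(\d+)/mi', $headers, $m)) {
        $length = (int) $m[1];
    }

    // Read exactly $length body bytes; fread may return short reads.
    $body = "";
    while ($length > 0 && !feof($fp)) {
        $chunk = fread($fp, min($length, 8192));
        if ($chunk === false) {
            break;
        }
        $body   .= $chunk;
        $length -= strlen($chunk);
    }
    return $body;
}
```

With this, the same socket can be reused for the next request immediately after the body is consumed, which is the point of Keep-Alive.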