Hi All,
I am trying to write a script to check the HTTP response code for a
largish database of URLs (some 12,000 sites). I have tried a couple of
third-party classes, such as Snoopy and URLHelper, which let me feed in
a URL and are supposed to return the response header. The problem is
that the script will not handle more than about 30 URLs; any more than
that and it just fails.
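For reference, this is a stripped-down sketch of the kind of loop I am
running. It is illustrative only -- the variable names are invented and
the Snoopy usage is from memory, not my exact code:

<?php
// Illustrative sketch of the checking loop, not the real script.
require_once 'Snoopy.class.php';

// In the real script the URLs come from the database; hard-coded here.
$urls = array('http://example.com/', 'http://example.org/');

foreach ($urls as $url) {
    $snoopy = new Snoopy();               // fresh object per URL
    if ($snoopy->fetch($url)) {
        // response_code is filled in from the status line of the reply
        echo $url . ' => ' . $snoopy->response_code . "\n";
    } else {
        echo $url . ' => fetch failed: ' . $snoopy->error . "\n";
    }
}
?>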
Reading around a little suggests that I am running up against a limit
on the number of open sockets in Linux (I don't know very much about
Linux, I'm afraid). However, as each URL is checked, the socket
*should* be closed by the fclose() call within the third-party code.
Can anyone shed any light on what may be going on, or perhaps suggest
a way that I could monitor how many sockets are being opened?
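(The only monitoring idea I have come up with so far is counting the
script's own open file descriptors via /proc, along these lines. This
is Linux-only and I am not certain it is the right measure:

<?php
// Count this process's open file descriptors by listing /proc/<pid>/fd.
// On Linux every open socket shows up here as one descriptor.
function open_fd_count() {
    $count = 0;
    $dir = opendir('/proc/' . getmypid() . '/fd');
    while (($entry = readdir($dir)) !== false) {
        if ($entry != '.' && $entry != '..') {
            $count++;
        }
    }
    closedir($dir);
    return $count;
}

echo 'Open descriptors: ' . open_fd_count() . "\n";
?>

Calling that at the top of each loop iteration should show whether the
descriptor count is creeping up instead of staying flat.)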
Alternatively, can anyone suggest another way that I might be able to
check these links?
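For what it's worth, the only alternative I have thought of myself is
doing HEAD requests with the cURL extension instead, something like the
sketch below. It is untested at this scale and assumes the cURL
extension is compiled in:

<?php
// Sketch: fetch only the headers with cURL and read the status code.
function check_url($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD: headers only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the reply
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up on slow hosts
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);  // 0 if the connect failed
    curl_close($ch);                                // free the handle explicitly
    return $code;
}

echo check_url('http://example.com/') . "\n";
?>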
Many thanks in advance,
Chris