Re: Retrieve Custom 404 page.

  • Albert Hopkins

    Re: Retrieve Custom 404 page.

    On Mon, 2008-11-17 at 13:59 -0800, godavemon wrote:
    > I'm using urllib2 to pull pages for a custom version of a web proxy
    > and am having issues with 404 errors. Urllib2 does a great job of
    > letting me know that a 404 happened with the following code.
    >
    > import urllib2
    > url = 'http://cnn.com/asfsdafsadfasdf/'
    > try:
    >     page = urllib2.urlopen(url)
    > except urllib2.URLError, e:
    >     print e
    >
    > returns: HTTP Error 404: Not Found

    From the urllib2 docs: HTTPError is also a valid HTTP response, so you
    can treat an HTTP error as an exceptional event or a valid response:

    import urllib2
    url = 'http://cnn.com/asfsdafsadfasdf/'
    try:
        page = urllib2.urlopen(url)
    except urllib2.URLError, e:
        print e.read()
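    The point is that HTTPError really is a file-like response object, not
    just an exception. A minimal offline sketch of that behaviour (written
    for Python 3, where urllib2 was split into urllib.request and
    urllib.error; the URL and 404 body below are made up for illustration):

    ```python
    import io
    from urllib import error

    # urllib2's HTTPError (urllib.error.HTTPError in Python 3) is both an
    # exception and a file-like HTTP response, so a server's custom 404
    # page body can be read directly from the exception object.
    err = error.HTTPError(
        url='http://example.com/missing',              # made-up URL
        code=404,
        msg='Not Found',
        hdrs=None,
        fp=io.BytesIO(b'<h1>Custom 404 page</h1>'),    # stand-in 404 body
    )

    body = err.read()        # same .read() as on a normal response
    print(err.code, body)
    ```

    Against a live server, calling e.read() inside the except block returns
    the server's custom 404 page in exactly the same way.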


  • godavemon

    #2
    Re: Retrieve Custom 404 page.

    Perfect! Thanks!

    On Nov 17, 4:16 pm, Albert Hopkins <mar...@python.invalid> wrote:
    > On Mon, 2008-11-17 at 13:59 -0800, godavemon wrote:
    > > I'm using urllib2 to pull pages for a custom version of a web proxy
    > > and am having issues with 404 errors. Urllib2 does a great job of
    > > letting me know that a 404 happened with the following code.
    > >
    > > import urllib2
    > > url = 'http://cnn.com/asfsdafsadfasdf/'
    > > try:
    > >     page = urllib2.urlopen(url)
    > > except urllib2.URLError, e:
    > >     print e
    > >
    > > returns: HTTP Error 404: Not Found
    >
    > From the urllib2 docs: HTTPError is also a valid HTTP response, so you
    > can treat an HTTP error as an exceptional event or a valid response:
    >
    > import urllib2
    > url = 'http://cnn.com/asfsdafsadfasdf/'
    > try:
    >     page = urllib2.urlopen(url)
    > except urllib2.URLError, e:
    >     print e.read()
