How to save all search engine results urls in a text file

  • dbojan
    New Member
    • Dec 2006
    • 16

    How to save all search engine results urls in a text file

    From this search engine:

https://siteexplorer.search.yahoo.com/mysites

when I search for all subdomains just by typing a domain URL in the search box, such as
blogspot.com, and after I verify my Yahoo email password, I get 28 million webpages listed across thousands of search result pages. I want to save all the URLs, and then separately only the ones that have FLV files, into a text file, so that some flash downloader (preferably freeware) could preview or download them selectively or all at once.
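The core of the request, pulling the result URLs out of each fetched results page and appending them to a text file, might be sketched in Python like this. The regex is an assumption: it treats result links as plain <a href="..."> anchors, and Yahoo Site Explorer's real markup would need to be inspected first.

```python
import re

def extract_result_urls(html):
    # Pull href values out of a fetched results page.
    # Assumption: result links are plain <a href="..."> anchors;
    # the search engine's actual markup may differ.
    return re.findall(r'<a href="(https?://[^"]+)"', html)

def append_urls(urls, path):
    # Append one URL per line, so an interrupted run still
    # leaves a usable text file behind.
    with open(path, "a", encoding="utf-8") as f:
        for url in urls:
            f.write(url + "\n")
```

Fetching each results page (and handling the Yahoo login) would still have to be scripted separately; this only covers the extract-and-save step.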
  • AricC
    Recognized Expert Top Contributor
    • Oct 2006
    • 1885

    #2
    I'm not really sure what you are asking.


    • dbojan
      New Member
      • Dec 2006
      • 16

      #3
      Originally posted by AricC
      I'm not really sure what you are asking.
      On this search engine:

https://siteexplorer.search.yahoo.com/mysites

enter the domain URL in the search box (the one where you would normally enter keywords, before the invention of this search engine). Enter

      blogspot.com

in the top left corner of the search page you will get a notification that there were

28 million websites found. Each search results page can list 50 of them.
I want to copy as many of these URLs as I want (they appear under the title of each found webpage on a results page), have all of them copied to the end if I don't interrupt the process, be able to access the file even while it is unfinished, and resume saving from the place it stopped, writing them into a text file. Note that Yahoo Site Explorer lists only the subdomains of the domains we enter and nothing else.

So, for example, suppose we got only two websites from one search engine query:

1. title A
(www.1domain.com/1subdomain.html)

2. title B
(www.1domain.com/2subdomain.html)

      all I would like is to have them copied in a text file:

      www.1domain.com/1subdomain.html
      www.1domain.com/2subdomain.html

and if there were 28 million of them, then 28 million such lines would be written. Name it

      1a.txt

and again, copy only the URLs of the websites that contain Flash (FLV, SWF) movies
inside them. Save them as 1b.txt

and again, if possible, some SEO data for each webpage, e.g. number of views, rank. As

      1c.txt
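The resume-and-filter side of the request could be sketched in Python like this. Two caveats: the .flv/.swf check only catches URLs that point directly at a Flash file (pages that merely embed Flash would have to be downloaded and their HTML scanned), and the 1a.txt/1b.txt names just follow the post above.

```python
def resume_offset(path):
    # Number of URLs already saved (one per line); fetching can
    # restart from this offset after an interruption.
    try:
        with open(path, encoding="utf-8") as f:
            return sum(1 for _ in f)
    except FileNotFoundError:
        return 0

def looks_like_flash(url):
    # Only catches URLs that point straight at a Flash file;
    # pages that merely embed .flv/.swf need a deeper scan.
    return url.strip().lower().endswith((".flv", ".swf"))

def filter_flash(all_urls_path, flash_urls_path):
    # Read the full list (e.g. 1a.txt) and write the
    # Flash-looking subset (e.g. 1b.txt), one URL per line.
    with open(all_urls_path, encoding="utf-8") as src, \
         open(flash_urls_path, "w", encoding="utf-8") as dst:
        for line in src:
            if looks_like_flash(line):
                dst.write(line)
```

The SEO data for 1c.txt (views, rank) is not in the search results themselves and would need a separate data source, so it is left out of this sketch.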
