I've tried inurl:http but it takes forever to get even a handful of sites right, and I have to think of new keywords every time to find more. Is there some kind of directory or a script I could use to filter the http sites out of all the sites on the web? Or some sort of software?
Can I get a list of all http sites in the world?
-
holy - this is probably something that will always take ages unless you filter out sites by inactivity etc. To get a rough idea of how many websites exist, and to go on from there (maybe you can find more detailed and useful information from there on), you can have a first look here:
-
thanks, gits - I am sorry for my absurdity and for wasting your time. One last thing:
If that won't work, can I create a script that filters out all the https results in a search result, leaving only the http results, then repeat that process over and over and scrape all the website addresses using a web scraper plugin or something?
-
well - so what exactly do you have, and what do you want to achieve? Do you have a search result with URLs that you want to filter for http/https, or what exactly? Basically it's just a simple string comparison, or a regexp, to pick out the http vs https URLs from a list of strings (URLs). But that wasn't my understanding of your question in the first place - there you wanted to do that for 'all http sites in the world'. If you already have a search result and just want to keep the URLs that start with http, that should be pretty simple with a regexp. So, long story short - in that case, yes, you can create such a script.
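For example, a minimal sketch in Python of that kind of filter - assuming you already have your scraped URLs collected in a list of strings (the example list here is made up):

```python
import re

# hypothetical input: URLs you already scraped from a search result
urls = [
    "http://example.com/page",
    "https://secure.example.org",
    "http://another-site.net",
    "https://example.com/login",
]

# keep only URLs that start with "http://" (plain http, not https)
http_only = [u for u in urls if re.match(r"^http://", u)]

print(http_only)  # ['http://example.com/page', 'http://another-site.net']
```

The same check could be done with a plain string comparison like `u.startswith("http://")`; the regexp version is just easier to extend if you later want to pull URLs out of raw page text instead of a clean list.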