yordan · Posted January 24, 2010

Quoting an earlier post: "If one has a Gb downlink to download 100 different websites at a time, he'll use (roughly) 10 Mbps of your website's bandwidth. Nothing to worry about."

If he really wants to do that, he will have tens of 8-gigabit fiber links, so he will exhaust your website server's network bandwidth until the backup completes.
wutske · Posted January 25, 2010

Quoting yordan: "If he really wants to do that, he will have tens of 8-gigabit fiber links, so he will exhaust your website server's network bandwidth until the backup completes."

Then he'll have 800 simultaneous downloads running at the same time.
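For the arithmetic behind that 800 figure, here is a minimal sketch (the 8 Gbps link and 10 Mbps-per-site numbers come straight from the posts above; everything else is illustrative):

```python
# Rough arithmetic from the thread: how many 10 Mbps site downloads
# fit on a single 8-gigabit fiber link?
LINK_CAPACITY_MBPS = 8_000  # one 8 Gbps link, expressed in Mbps
PER_SITE_MBPS = 10          # bandwidth each mirrored site consumes

concurrent_downloads = LINK_CAPACITY_MBPS // PER_SITE_MBPS
print(concurrent_downloads)  # 800 simultaneous downloads per link
```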
tansqrx · Posted January 26, 2010

I don't believe that an unknown Internet user creating a backup of your site will cause a significant slowdown. While reviewing my Xisto-hosted site, ycoderscookbook.com, I have seen several instances of a complete site download. Some of the automated tools even left their calling card in the form of a browser identification string (User-Agent) in the HTTP headers. I'm not sure why someone would want to copy my entire site, but I don't see anything wrong with it. During the months this happens I only see a slight increase in Xisto bandwidth usage, and since Xisto has ISP-level servers and bandwidth, I don't think anyone noticed the site being any slower. Now, if someone decided to copy the entire site a few thousand times a month, then I would have an issue, because that would use up my bandwidth limit for the month. Outside of this, feel free to copy, as long as you remain a respectable human in the process (don't plagiarize or use all of my bandwidth).
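If you are curious what those calling cards look like, here is a minimal sketch for spotting them. It assumes an Apache combined-format access log named access.log, and the signature list is a hand-picked illustration (HTTrack, Wget, WebCopier, and Teleport are real site-mirroring tools), not an exhaustive one:

```python
# Minimal sketch: spot whole-site download tools by the "calling card"
# they leave in the User-Agent field of an Apache combined-format log.
# The log path and the signature list are illustrative assumptions.
MIRROR_SIGNATURES = ("httrack", "wget", "webcopier", "teleport")

with open("access.log") as log:  # hypothetical log file
    for line in log:
        # In combined log format the User-Agent is the last quoted field.
        user_agent = line.rsplit('"', 2)[-2].lower()
        if any(sig in user_agent for sig in MIRROR_SIGNATURES):
            print(user_agent)
```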
BCD · Posted February 4, 2010

Getting back to the main topic: it is very much possible to have a backup of the World Wide Web. To do so we would first need to create a clone of Earth, probably revolving around it. The Moon is the best contender: build enough data centers and other infrastructure there, then start the download engines. Maybe in a few years the Moon would hold a backup of the Internet, plus the infrastructure to store further tons of gigabytes every minute.
FirefoxRocks · Posted February 6, 2010

Well, my main concern here is speed. I have more than enough hard disk space to store infinite copies of data. Right now there are enough hard drives to cover every square centimeter of Canada and half of the USA, and if I need more hard drives, I can get them for free, by the trillions, instantaneously. I don't even know how many hard drives are hooked up, each one of them holding 1,208,925,819,614,629,174,706,176 bytes (2^80 bytes, a yobibyte) of data.

The problem, as tansqrx mentioned, is that I do not have an "insane Internet backbone". I do not run an ISP network or anything close to that; I'm just on a regular personal (not business) high-speed Internet connection (no, not even high-speed Extreme or high-speed Nitro). The maximum upload speed I have seen is around 40 kb/s, and the maximum download speed I have ever seen is 7 MB/s, when downloading a Linux distribution from a server.

Furthermore, network latency and server speed raise another issue.

As for processing power, I only have two computers, neither of them servers: one with a Pentium 4 at 3.00 GHz and the other with an Intel Core 2 Duo at 2.66 GHz.

Even if I do download the Internet, I will have to make backups of the hard drives in case any of them fail, and that is not a problem of capacity, but a problem of CPU power.
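To put that 7 MB/s figure in perspective, here is a back-of-the-envelope sketch; the one-petabyte example size is an arbitrary assumption for illustration, not an estimate of the Internet's size:

```python
# Back-of-the-envelope: how long a download takes at a fixed link speed.
# The 1 PB example size is an arbitrary assumption for illustration.
def transfer_days(size_bytes: float, speed_bytes_per_sec: float) -> float:
    """Return the transfer time in days, ignoring latency and overhead."""
    return size_bytes / speed_bytes_per_sec / 86_400  # 86,400 s per day

ONE_PETABYTE = 10**15
print(f"{transfer_days(ONE_PETABYTE, 7 * 10**6):.0f} days")  # ~1653 days
```

At 7 MB/s, even a single petabyte would take roughly four and a half years, before counting latency, retries, or the backup copies mentioned above.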