[web] Low-Cost Hosting w/ lots of Storage, Transfer/Mo, and Bandwidth?

Started by
24 comments, last by Extrarius 18 years, 4 months ago
The logic is simple, yes, but if you do it like that you don't get double the space, because each file exists in two places. You'd have to pay for more space on a single account, which is usually sold per 100 MB at a high price, so it isn't worth it.
The only way the idea would be viable would be to keep only part of the site at each URL, and then files would have to be moved between servers based on popularity. Moving a file takes time and bandwidth itself; it could MAYBE be automated, but it would still take a long time to transfer each file and would consume bandwidth, so it couldn't be done constantly.
"Walk not the trodden path, for it has borne its burden." -John, Flying Monk
I'd say K.I.S.S.

Since apparently you can't find a single server to put everything on:

Simply put the less popular downloads on the extra server with less space and bandwidth. I'm sure you already have statistics on which are popular and which are not.
What I would do (this is just me, though...):

Make an /index.dat file, which stores which parts of which files are where (i.e. Randomfile-part2.dat is at www.something.com/Shared/Randomfile-part2.dat).

Your program then downloads /index.dat and reads off where each part of the file is.

It then downloads a few parts (like 2-3) simultaneously from different websites (i.e. downloading parts 1, 3, and 72).

This lets you download faster (they do a similar thing in LimeWire, Kazaa, etc., so you're not capped by the other peers' upload speeds), and it lets you share the burden across many different servers.

Then you just join the parts together into one big file at the end (which is pretty easy to do).
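The index-then-parallel-download scheme above can be sketched roughly as follows. This is just a toy illustration, not a real client: the index format (one `name=url` line per part) and the function names are my own assumptions, and the part-number extraction assumes names like Randomfile-part2.dat.

```python
# Sketch of the split-download idea: parse an index of part locations,
# fetch parts from several servers in parallel, and join them in order.
# Index format and helper names are assumptions, not an actual protocol.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen


def parse_index(text):
    """Parse lines like 'Randomfile-part2.dat=http://host/Shared/Randomfile-part2.dat'
    into a list of (part_name, url), sorted by part number."""
    parts = []
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blanks and malformed lines
        name, url = line.split("=", 1)
        parts.append((name, url))
    # Order by the digits embedded in the part name, so reassembly is correct
    parts.sort(key=lambda p: int("".join(ch for ch in p[0] if ch.isdigit()) or "0"))
    return parts


def download_all(parts, fetch=lambda url: urlopen(url).read(), workers=3):
    """Fetch every part (a few at a time, like the 2-3 simultaneous
    downloads described above) and concatenate them into one blob."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = list(pool.map(lambda p: fetch(p[1]), parts))
    return b"".join(chunks)
```

A caller would download /index.dat, pass its text to `parse_index`, then write the result of `download_all` to disk. The `fetch` parameter is injectable so the joining logic can be tested without a network.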

Overall time for project: one day (including debugging, testing, and conversion).

From,
Nice coder
Nice Coder: You are suggesting either "load balancing" (which, done on the server side, is _very_ expensive) or requiring a special download program, which would be insanely annoying to every user of the site. Not only that, it would require a lot of work to set up and maintain, and it would take more than a day of work because of the numerous versions needed to support various operating systems.
Well, it would seem like you're screwed, since all you're doing is shooting down every idea right away and not contributing anything yourself besides a "That won't work, stupid" attitude. There are SEVERAL good solutions and tips on this page (especially the mention of reducing your start page). Good luck asking for more help when you're only bringing a VERY negative attitude to the problem. Since you know everything, why don't you come up with a solution on your own from now on.

I recommend giving up, because it's not your problem and you're not going to be satisfied with any solution that either takes more effort or costs more.
Quote:Original post by kanzler
I just browsed through the Terms & Conditions of 1&1. In section 2.1.5 they state: "Bandwidth use, including but not limited to data retrieval from Your Web Site, e-mail traffic, and downloads, shall not exceed six gigabytes per month. Your combined Mailbox use per account shall not exceed twenty-five gigabytes per month." How does that relate to the 1,500 GB bandwidth they offer?
I sent them an email asking whether the package or the ToC was correct, and the reply was:
Quote:email from 1&1
[...]The Terms and Conditions refer there to packages that do not specify otherwise, such as the Instant Domain package. We are in the process of clarifying this in the Terms and Conditions.[...]

This topic is closed to new replies.
