UnshavenBastard

Unity Network Calculations - what PCs should I get



Howdy there,

Since I'm an impatient person, I am considering getting some el cheapo office mini-PCs, building a network, and splitting up some of my graphics and other calculations. My current main box is an Athlon XP 2000+ with 512MB DDR RAM, plus a crappy no-name notebook: P4 2 GHz, 512MB RAM.

From your experience, would you say: if I have the opportunity, should I get 4-5 really *really* cheap PCs from a company that is throwing out stuff, each a P3 750MHz with 128MB SDRAM, and will that increase the speed of my computations satisfactorily? Or would I be better off searching for maybe 1-2 newer, faster PCs that come closer to my Athlon XP 2000+ (which might cost me more money than a few more P3 750s)?

Anyone got statistics to offer? :-)

Thanks in advance,
unshaven

P.S. "Train to be more patient" or "enroll for a yoga course" are not replies I want to see :-D

Those P3s aren't going to help much.

I would look for a single new computer using a dual-core CPU, such as the Athlon X2. You can build an Athlon X2 3800-based machine with two gigs of RAM for really cheap, especially if you can salvage a network card, case, and hard disk from a "free" computer.

This, of course, has nothing to do with multiplayer programming, so I'm moving the thread.

We really need a hardware forum...

Quote:
Since I'm an impatient person, I am considering getting some el cheapo office mini-PCs, building a network, and splitting up some of my graphics and other calculations.
What kind of calculations?

"What kind of calculations?"

Well, the thing I'm working on right now is global lighting (nothing of the new stuff, nothing exciting or special, but I'm having my fun figuring stuff out myself and refusing to look at other people's implementations, heh. This is me, yeah, I'm back *lol*). I have no experience with network programming whatsoever, but at least I know that a common scheme is to have a server and some clients *grins*. I have already drawn up abstract pseudocode for how I would split those calculations across several PCs with my existing (incomplete but working so far) algorithms...
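Just to give an idea of the kind of split I mean, here's a rough sketch (the shade_tile function is only a placeholder for my actual lighting code, and the process pool just stands in for the remote machines I'd eventually talk to over the network):

```python
# Rough sketch: cut the image into tiles, farm each tile out as an
# independent job, and collect the small per-tile results.
from multiprocessing import Pool

WIDTH, HEIGHT, TILE = 640, 480, 64

def shade_tile(job):
    """Placeholder: compute the lighting for one tile and return its pixels."""
    x0, y0 = job
    pixels = [[(0.0, 0.0, 0.0)] * TILE for _ in range(TILE)]
    # ... the real global-illumination work would happen here ...
    return x0, y0, pixels

def main():
    jobs = [(x, y) for y in range(0, HEIGHT, TILE)
                   for x in range(0, WIDTH, TILE)]
    with Pool(processes=4) as pool:  # roughly one worker per extra PC
        for x0, y0, pixels in pool.imap_unordered(shade_tile, jobs):
            print("tile at", x0, y0, "finished")  # would go into the framebuffer

if __name__ == "__main__":
    main()
```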

Haha, I hadn't been programming my private stuff for 2 or more years, and just recently I started to get back to my hobby. I've been programming apps all day for 2 years now, and had lost motivation for private software projects for a long time, especially after losing the code base I once had due to an HDD crash and the absence of sufficient backup hardware (i.e. a DVD burner...), which left me with pretty old and useless backups... My HDD was only 1.5 years old and was not under big stress the way I used it... it was completely unexpected :-(
Yes, I learned from that...

I shouldn't post when tired, I tend to blether...

A second hard disk drive is usually only $100. Make it the same as the first drive, and turn on mirroring. RAID-1 FTW!

Guest Anonymous Poster
It depends on the data flow versus the processing on the data.

If you have fine-grained batching -- a lot of data interdependency with partial results moving between servers -- then the lag across the network (and other network overhead) can greatly slow things down (tasks waiting for farmed-out results to be integrated for the next step...).

If you have coarse-grained batching -- more independent processing with partial results staying on the same server machines -- you may have overlapping data sets that have to be kept in sync (large update data transfers), with setting up the batches being a significant overhead.


What you want is minimal working-data transfer, small result transmissions, and a LOT of processing done in a mostly independent manner to make splitting a problem across many computers pay off.

If you have high data throughput versus processing, then using fewer (like one) more powerful computers can be a lot faster.
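To put rough numbers on that tradeoff, here is a back-of-the-envelope sketch (the figures are made up -- plug in your own measured compute times, data sizes, and network speed):

```python
# Back-of-the-envelope check: is farming a job out across N machines
# faster than doing it locally, once transfer time is accounted for?
def worth_distributing(compute_sec, bytes_out, bytes_back,
                       net_bytes_per_sec=10e6, latency_sec=0.001, n_workers=4):
    transfer = latency_sec + (bytes_out + bytes_back) / net_bytes_per_sec
    distributed = compute_sec / n_workers + transfer
    return distributed < compute_sec

# Coarse-grained: minutes of work per batch, kilobytes of data -> pays off.
print(worth_distributing(compute_sec=120.0, bytes_out=50e3, bytes_back=50e3))   # True
# Fine-grained: milliseconds of work with constant chatter -> overhead dominates.
print(worth_distributing(compute_sec=0.005, bytes_out=50e3, bytes_back=50e3))   # False
```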

The new multi-core computers are dropping in price (you could easily find a low-end machine that does a lot better than an Athlon 2000), and you have to take into consideration maintenance and failure of older machines (you're going to need a rack to put a gaggle of old machines and all their wires on... adequate power and ventilation...).

Gigabit LAN cards probably won't work efficiently with near-ancient machines...



I have a similar problem on a simulation project, with Zoneservers separated from AI servers (and one Masterserver to coordinate/dispatch) as well as the separate clients, and because of the heavy number crunching I found that fewer heavy-duty computers were more advantageous.



Thanks for the elaborate reply.

Yeah, I have thought about how much has to be transferred etc.; the way I want to implement this thing, I hope it will stay moderate, but I'll have to test it :D

I think since I've never done any networking, this is a good exercise to learn what I can and cannot do, hehe.

The Beowulf link is also interesting, thanks nmi, btw. Hi to you too, countryman ;-)
