

Community Reputation: 522 Good

About Jarrod1937

  1. Check your IDE cable. The older cables were plain 40-conductor cables, but a later revision of the standard increased that to 80 conductors (the extra wires are just additional grounds to limit crosstalk). Using a 40-conductor cable with a faster drive will make it behave very slowly. The other possibility is that the drive is dying. Does HD Tune turn up any bad blocks under its error scanner? Does the SMART report look OK?
  2. Quote:Original post by RivieraKid can you pirate games on steam? Yep, you can download the game files, install Steam, then overwrite the Steam files with a cracked version that bypasses the online check.
  3. I would take a photo of some water spray, add motion blur to it, and then elongate your particles. That should give some good results. Waterfalls look the way they do because of motion blur; if you're not going to add any in your renderer, you need to bake it in beforehand, e.g. in Photoshop. Personally, though, I would model the water and use a simple pixel shader that scrolls two overlaid normal maps at slightly different speeds, then add a single mist particle source.
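A minimal numerical sketch of the two-scrolling-normal-maps trick described above. The scroll speeds, UV coordinates, and sample normals are made up for illustration; the real version would live in a pixel shader sampling actual normal-map textures.

```python
# Sketch of the "two scrolling normal maps" water trick described above.
# Speeds and normals are hypothetical stand-ins for shader inputs.

def scroll_uv(u, v, speed_u, speed_v, t):
    """Offset a UV coordinate over time, wrapping back into [0, 1)."""
    return ((u + speed_u * t) % 1.0, (v + speed_v * t) % 1.0)

def blend_normals(n1, n2):
    """Average two tangent-space normals and renormalize."""
    x, y, z = (a + b for a, b in zip(n1, n2))
    length = (x * x + y * y + z * z) ** 0.5
    return (x / length, y / length, z / length)

# Two layers drifting at slightly different speeds, as the post suggests;
# the small speed difference is what breaks up the repetition.
uv_a = scroll_uv(0.25, 0.5, 0.030, 0.010, t=10.0)
uv_b = scroll_uv(0.25, 0.5, 0.021, 0.013, t=10.0)
print(uv_a, uv_b)
```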
  4. The actual signal traveling through speaker wires is low voltage and high current, so it would take quite a lot to inject interference into it. If you do have interference, it is more likely on the amplification end, so you may need to shield your amp. Even then, it would be odd for the extremely high-frequency carrier wave of a cell phone to cause audible interference: carriers start around 824 MHz and up, far beyond anything a speaker reproduces. And even if it were possible, proper grounding of the amp should prevent it, so make sure your amp is properly grounded. With all of that considered, I have to ask: are you sure the phone is the source of this interference?
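A back-of-envelope check on the frequency gap the post relies on, comparing the cited 824 MHz carrier to the rough upper limit of the audio band. All figures are approximate.

```python
# How far outside the audio band is an 824 MHz cellular carrier?
# Figures are rounded; this is a rough sanity check, not a measurement.

SPEED_OF_LIGHT = 3.0e8   # m/s, rounded
carrier_hz = 824e6       # low end of the cellular band cited in the post
audio_top_hz = 20e3      # rough upper limit of human hearing

wavelength_m = SPEED_OF_LIGHT / carrier_hz
ratio = carrier_hz / audio_top_hz

print(f"carrier wavelength ~ {wavelength_m:.3f} m")
print(f"carrier sits ~{ratio:,.0f}x above the top of the audio band")
```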
  5. I assume he means if you're using a quick mask or the alpha channel. Using brightness/contrast on the 8-bit grayscale channel, you can push the in-between pixels to 100% opacity. I use this technique myself, but you need to clean up the result at the end or you'll get some jaggy edges. As for CS4, I cannot help you there; I only have CS3.
  6. The only way to remove them from existing works is the manual method: either duplicate the layers as you have been doing, or limit your selection to only the 100% opacity pixels and delete everything else. To prevent this from happening in the future, turn off anti-aliasing for the eraser and brush. Google "photoshop turn off anti-aliasing."
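A minimal sketch of the "keep only 100% opacity pixels" cleanup from the two posts above, using a plain list as a stand-in for an 8-bit alpha channel. In Photoshop the same effect comes from thresholding the channel itself; the pixel values here are invented for illustration.

```python
# Hard-threshold an 8-bit alpha channel so anti-aliased in-between
# pixels become either fully transparent (0) or fully opaque (255).

def harden_alpha(alpha, threshold=255):
    """Snap every value below `threshold` to 0 and the rest to 255."""
    return [255 if a >= threshold else 0 for a in alpha]

channel = [0, 64, 128, 200, 255, 255]
print(harden_alpha(channel))                  # drop the anti-aliased fringe
print(harden_alpha(channel, threshold=128))   # a more permissive cutoff
```

As the post warns, a hard cutoff like this leaves jagged edges, so a manual cleanup pass afterwards is still needed.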
  7. I too use Vista 64 and could not be happier. Even though I may get yelled at for this, I enabled the true admin account and did away with UAC completely. In its place I use TeaTimer to monitor all registry edits by any program. To me it's just as safe and a lot more compatible with older apps. So far I've had problems with only one app, DisplayMate, a test-signal generator for video calibration, and I'm fairly sure that's because I'm using an old version. I've had no trouble playing games or developing them (from an artist's standpoint).
  8. Quote:Original post by davepermen I don't see an actual reason for raptors at all. an ordinary disk is close to the same performance. no one needs a raptor or an ssd right now. and if one wants more performance, ssd's are a bigger step than the raptors. First you say not to focus purely on throughput and instead on access/seek times. Then you say a Raptor (or any 10,000 RPM drive) is not that big a leap over regular drives, even though its seek time is generally half that of any 7,200 RPM drive. The larger drives, like the terabyte models, offer nice capacity along with high throughput because of their data density, but they are generally even worse on access times. Now, while it's true SSDs are a bigger step than higher-RPM drives, the whole point is price per gigabyte. For less than the price of two RAID 0'd SSDs like those mentioned earlier, you can get four Raptors, almost twice the throughput, and 600 GB of space. I would love a stack of nice SSDs myself, but they just aren't there yet. As I said before, though, they do have their place in laptops right now; it's hard to compete with the physical size of most SSDs, which easily fit into a laptop.
  9. Quote:Original post by davepermen well, the prices for ssd's are in movement. and the latency difference of x80 from a raptor to an ssd is a feelable and great difference. getting a 64gb ssd for 389$ is not that much of an investment for the highend and definitely enough for a lot of apps installed (not so for games yet :( but there, a cheaper mlc disk would be enough, as only fast read is important)). sure, they're more expensive than raptors. on the other hand, they're really much faster in usage. raw MB/s is not the most important measure. the latency helps much much more. currently my tiny tablet notebook is much faster than my quadcore i'm writing on right now. this only thanks to a 1.8" ssd, which doesn't even run at max performance. numbers here: http://www.davepermen.net/SSDs.aspx there are much faster harddisks. still my notebook is much quicker to boot, to start apps, to do anything where snappiness is important. getting a fast ssd into an existing notebook boosts the notebooks performance much more than getting a new notebook, and you use much less money. i look at them from that point of view. they are hella expensive as an item on their own. but instead of buying a new pc/laptop, they're cheap. Well, it really comes down to what you're doing. Generally, if you graph the performance gains from both throughput and access time, you'll find a crossover point where one matters more than the other. Because of this, faster access times may only be noticed for a certain range of file sizes and queue depths. So it isn't correct to say access times matter more than throughput or vice versa; both are equally important. "currently my tiny tablet notebook is much faster than my quadcore i'm writing on right now. this only thanks to a 1.8" ssd, which doesn't even run at max performance." Well, if your only qualitative measure of speed is loading... and loading small files at that.
Access time is the time it takes to find the file and start delivering it; after that, it's throughput's job to move the data as fast as possible. Faster access times, like those of SSDs, are only truly noticeable if your load is (a) lots of tiny files or (b) a server environment with a large queue depth. Because of this, I recommend most people skip the SSD route for now and instead use 10,000 RPM drives in some sort of RAID config. SSDs are not worth their price per gigabyte when you consider that most people will use them in single-user environments with an average file-size load, where throughput starts to matter more than access time. Granted, even in a single-user environment the cutoff for ever-lower access times is still below that of a 15,000 RPM drive; the point is that it still isn't worth the SSDs' price per gigabyte. Though I suppose this is a matter of opinion: if you have the money, why not go with SSDs. A server or notebook environment is a bit different, and SSDs may be right for those uses; they're especially nice for high-queue-load database servers where high IOPS matters. [Edited by - Jarrod1937 on November 6, 2008 1:41:55 PM]
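The price-per-gigabyte argument above can be made concrete. The $389-per-64 GB SSD figure is quoted in the thread; the $180 per-drive Raptor price is a hypothetical stand-in for era pricing, since no per-drive figure appears in the posts.

```python
# Price-per-gigabyte comparison using the thread's figures: two 64 GB
# SSDs at $389 each vs. four 150 GB Raptors. The $180 Raptor price is
# an assumed placeholder, not a quoted number.

def price_per_gb(total_price, total_gb):
    return total_price / total_gb

ssd_raid0 = price_per_gb(2 * 389, 2 * 64)       # two SSDs in RAID 0
raptor_raid0 = price_per_gb(4 * 180, 4 * 150)   # four Raptors in RAID 0

print(f"SSD RAID 0:    ${ssd_raid0:.2f}/GB over 128 GB")
print(f"Raptor RAID 0: ${raptor_raid0:.2f}/GB over 600 GB")
```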
  10. Quote:Original post by BeanDog Quote:Original post by Jarrod1937 Quote:Original post by davepermen dude, just get some ssd's and raid0 them up to any performance you want :) see here i'm currently waiting for two mtron 3500 64gb for a raid0.. 200mb read/write, 0.1ms latency. <800$ investment. I really feel SSD's are not there yet. Their price per gig is still terrible. If you want good performance with the risk of RAID 0, you can get 4 150 gig 10,000 rpm Raptor X's and raid 0 them. I was able to achieve around 340 MB/s Max throughput with excellent access times. And their price is less than the SSD's in your example and you get a lot more storage space. Remember that the risk of RAID 0 is much less for modern SSDs than mechanical drives. SSDs have no moving parts to speak of, and with wear leveling maturing, SSDs fail very slowly and (more importantly) predictably. That's not entirely true. Because of the arrangement of their "clusters," if one bit/cell goes bad, the entire block is cut off from access. If you're running a stripe with no parity or mirroring, that can potentially corrupt a few bits across a lot of files. Other failure modes and vulnerabilities exist for SSDs too, such as greater susceptibility to ESD and similar electronic faults. The majority of hard drive deaths are actually not from damage to the moving parts but from similar electronic damage and corruption (damaged onboard controller, corrupted firmware, bad head, etc.). They're both about equal in the risk department, IMO.
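The RAID 0 risk point above can be quantified: a striped array with no redundancy is lost if any member fails, so risk compounds with drive count. The per-drive failure probability below is illustrative only, not a measured rate.

```python
# A striped (RAID 0) array survives only if EVERY member drive survives,
# so with n independent drives each failing with probability p over some
# period, P(array loss) = 1 - (1 - p)^n. The p here is purely illustrative.

def raid0_loss_probability(p_drive, n_drives):
    return 1.0 - (1.0 - p_drive) ** n_drives

p = 0.05  # hypothetical per-drive failure probability
for n in (1, 2, 4):
    print(f"{n} drive(s): {raid0_loss_probability(p, n):.4f}")
```

This applies equally to SSDs and mechanical drives in a stripe, which is the post's point: whatever the failure mode, striping multiplies exposure to it.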
  11. Quote:Original post by davepermen dude, just get some ssd's and raid0 them up to any performance you want :) see here i'm currently waiting for two mtron 3500 64gb for a raid0.. 200mb read/write, 0.1ms latency. <800$ investment. I really feel SSDs are not there yet; their price per gigabyte is still terrible. If you want good performance with the risk of RAID 0, you can get four 150 GB 10,000 RPM Raptor X's and RAID 0 them. I was able to achieve around 340 MB/s max throughput with excellent access times, their price is less than the SSDs in your example, and you get a lot more storage space.
  12. Quote:Original post by oliii Yeah, SAS. He is getting 8 gig memory and some funky top of the line multicore processor. They are fast, but I don't know his budget. The good thing is that can be added as an upgrade, but it's bloody expensive. This is for his work, so I would expect his budget would be almost 'unlimited'. SAS are fast, but I'm wondering how far is the next generation, it's not a huge leap from the SATA2. The deal with SAS is that, even though it too is simply a serial data interface like SATA, SAS drives can be linked, just like the old SCSI. Also, the SAS interface protocol is built on the old SCSI protocol, so it is already more mature and faster than the SATA protocol. Depending on your use, SAS may be the better choice, especially if your workload has a large amount of random read/write requests, since SAS's TCQ has been shown to beat SATA's NCQ in high-queue-depth situations.
  13. Quote:Original post by Instigator Still unresolved. I increased my page file to 8Gb's and yet the program becomes unresponsive as soon as I start the radiosity task. It seems like there's a bug in 3DS Max.. If anyone else gets this tutorial to work with 3ds max please let me know! Thanks. Well, first: how much physical RAM do you actually have? Page file size can help considerably, but not if your system is starving for more RAM. Second, watch your radiosity settings; it can be an extremely RAM-hungry render calculation.
  14. Quote:Original post by Chris Reynolds "In all seriousness, we are becoming far too advanced for our own good. We're curing diseases, transplanting vital organs, creating vaccines.. and if/when we socialize healthcare, these will be available to just about anyone. We seem to be fighting natural selection. At what point do we decide to let nature control our population? I know these seem like radical ideas, but we can all agree that we have an often ignored population problem in the world. It may not seem evident in the United States yet, but ~50 years down the line we're going to have twice as many people on this earth and much more than an energy crisis at hand. And with medical and social advances, this rate becomes exponential." At some point we will outgrow our own natural resources, right? Will our human compassion to save lives ultimately become a problem? If you've ever studied sociology, the stats show that as we progress technologically, the birthrate decreases. There are many factors behind why this is the case, but the point is that the whole overpopulation panic was a scare of the 1970s based on old data. "I know these seem like radical ideas" Don't be mistaken: you're far from the first to carry such thoughts.
  15. Quote:Original post by hplus0603 Quote:This create a write speed penalty for all data being written to the array, since it must be written to both drives Not necessarily. First, if you have spare controller bandwidth, because it's writing to two drives at the same time, it can write to both drives in parallel. This is a good reason to put the two drives on different channels. Second, write-behind RAID 1 controllers will buffer up the additional writes, and then flush them out once there's a lull in activity, thus if you're doing something other than full-on video capture, you may never see the slow-down, even if you put both drives on the same channel. Personally, I find that RAID 1 is the easiest to set up, and the best trade-off of performance and reliability for me. Hard drives do go bad after a few years, and in the last five years, I've replaced 4 failed drives in 2 different systems, without losing a single bit of data, all because of RAID 1. I do, however, still have a remote back-up that gets taken once weekly, in case the entire computer burns out, is stolen, etc. And, finally, a spelling nit: it's called "striping" as in "the stars and stripes." "Stripping" is something else, usually found in bars where they've sealed the windows and serve cheap, crappy beer. You're correct actually, though it depends on how intelligent the controller is. I'll make the corrections later.