About This Group

Maximize performance with free optimization guides, testing tools, and QA support while distinguishing yourself from the rest with testing certification.

  1. What's new in this group
  2. It would be nice to have more games that actually work on Linux through software rendering, instead of spending months installing GPU drivers and fixing broken dependencies only to end up with broken GLSL shaders from yet another dialect. You basically have to design your game around what is fast on the CPU, but the reward is determinism for every feature, faster and 100% deterministic 2D GUI drawing, partial updates of static scenes with heavy light effects, faster performance on business machines, and reduced maintenance/support costs. If you build it, they will come.

     Low-end on Linux
     On an Intel Core i5 or i7 CPU, a software renderer can outperform the integrated GPU unless the game logic is also heavy. This allows reaching a few more players on Linux with buggy or non-existent GPU drivers, and provides a quick temporary fallback when a new platform ships its own graphics API. One can also use the CPU for rendering at a low resolution and the GPU for up-scaling, deferred light, and bloom effects, so that more of the hardware is utilized.

     CPU rendering tricks in C/C++ using SSE2/NEON
     * If the camera only slides sideways over a flat surface, the majority of the scene can be drawn by copying one row of pixels at a time with memcpy, which can reach around 800 FPS at 1080p depending on memory bandwidth and how predictable the stride is. SIMD intrinsics can be used to interpolate sub-pixel movement, which is faster than pre-stored offsets because reads stay in a cache-predictable order. If the rest of the scene consists of small items, these can be drawn quickly as software-rendered triangles.
     * Textures can be stored in the system's native color format to avoid the red-blue channel swap on Ubuntu. If rendering uses float calculations, this becomes pointless, because the planar-to-packed conversion can apply a custom pack order at no extra cost.
     * Image uploads and up-scaling to the window can run while the game logic for the next frame is computed, using double buffering.
     * If something is only seen from a few camera angles, pre-render it while loading the game and store it in a 2D atlas. This also works for deferred rendering by storing diffuse, normal, and depth. Height can be used instead of depth for isometric top-down games with deferred light.
     * Normalizing light directions per pixel is very expensive, so directional lights cost far less than point lights on the CPU, while also needing fewer light sources to cover the whole screen.
     * Use eroded models with fewer triangles for casting shadows from light sources. Combined with low-resolution depth mapping, one can hardly notice the difference, while getting cast shadows almost for free.
     * If perspective depth maps are stored in a 1/depth format, the depth can be rendered using SIMD addition without having to approximate any reciprocal.
     * 2x2 nearest-neighbor upscaling can be done with SIMD element zipping, using the same row of pixels for both arguments. CPU up-scaling of the final image is not as fast as changing the resolution in native full-screen, but some Linux distros might crash or require a manual reboot when changing resolution, so it's essential to have a safe full-screen mode to try first.
     Minimal sketches of a few of these tricks are included after this list.
  3. Games live and die by performance; don't let yours die by it. See how Disc Jam* achieved its performance target of 60 fps on Intel® processor graphics using Unreal Engine* 4. Performance is a key focus for this project because maintaining 60 frames per second is integral to Disc Jam’s responsive and fluid gameplay style. As a result, we’ve learned a lot of lessons targeting this framerate with Unreal Engine* 4. Below, I discuss our experience working with integrated graphics processing units from Intel and how we ultimately achieved our performance target without raising our minimum system requirements. Read more
  4. In this 4-part series, explore important AI concepts and learn how to optimize for multi-core processors. Part 1 discussed ways to govern the basic decisions that an intelligent agent (as artificial intelligence (AI) research refers to entities that use AI) may make. In this second part, the focus is on giving the AI context for its decisions: I give our hero (or monster, or any other type of game entity) some context for the decisions it will make. Intelligent agents need to identify points of interest in the game world and then figure out how to get there. Finally, this article shows how to optimize these methods and provides ways of organizing them to account for multithreading. This article gets dangerously close to real artificial intelligence. All intelligent agents need a basic ability to perceive their environment and some means of navigating and moving within the world around them, be it real or otherwise. Your entities will need to do the same, although with a much different approach. You can also cheat, which you will, to make sure everything runs nice and fast. Read more
  5. Designing an adaptive foil that will match the player's moves and encourage growth is no simple task. In this 4-part series, explore important AI concepts and learn how to optimize for multi-core processors. Over the last few decades, the gaming industry has made great strides, beginning with simple games like Pong* and Pac-Man*, which offered players a short escape from reality, and growing into involved games like World of Warcraft* and Call of Duty 4*, which are serious hobbies for those who play them. Today’s gamers, who according to the Entertainment Software Association (ESA) have an average of 13 years of gaming under their belt, have grown accustomed to seeing each new game become increasingly complex, engaging, and intelligent. For developers, the challenge becomes pushing the envelope to create games that are increasingly compelling. Computer-controlled artificial intelligence (AI) has evolved in many forms to meet the test. However, creating an adaptive foil for the player that can match their moves and encourage player growth is no simple task. Read more
  6. Allow the Intel® Test Suite to test your game's performance and playability on Intel® Core™ processors and Iris® Graphics. Stand out by getting your Plays Great on Intel® certification. Learn more
  7. Reaching your quality goals is important, which is why Intel® created the Plays Great on Intel® Program. This solution makes testing for bugs and improving game quality less exhausting. The Intel® Game Developer Program is designed to help game developers at every stage of their journey. We are here to help you write highly optimized games that take advantage of Intel’s latest technology. To make a good game you need good Quality Assurance (QA) testing. Implementing a robust QA process is vital for releasing a problem-free game to your customers. However, QA testing is not a simple thing. Major bugs can be difficult to find and reproduce; some occur only under certain conditions, are difficult to observe, or are not even 100% reproducible. It is impractical to test everything on every platform you intend to support. Efficient time management and comprehensive methodologies are required to get good coverage and reduce bug escapes. Learn more.
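Below are a few minimal sketches of the CPU rendering tricks from the software rendering post above (item 2). First, the row-copy trick for a camera that only slides sideways over a flat surface: each output row is one large memcpy from a pre-rendered background. This sketch assumes 32-bit pixels, a whole-pixel camera offset, and a background wider than the output window; the function and parameter names are illustrative, not from the post.

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical sketch: blit a horizontally scrolling, pre-rendered background
// one row at a time. One large, cache-friendly memcpy per row replaces all
// per-pixel work for the static part of the scene.
void blit_scrolling_rows(const uint32_t* background, int bgWidth,
                         uint32_t* frame, int frameWidth, int frameHeight,
                         int cameraX) // whole-pixel camera offset into the background
{
    for (int y = 0; y < frameHeight; ++y) {
        const uint32_t* src = background + y * bgWidth + cameraX;
        uint32_t*       dst = frame      + y * frameWidth;
        std::memcpy(dst, src, frameWidth * sizeof(uint32_t));
    }
}
```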
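A sketch of the 2x2 nearest-neighbor upscale done with SIMD element zipping: interleaving a register of pixels with itself duplicates every pixel horizontally, and the finished row is reused for the second output row. It assumes SSE2, 32-bit pixels, and a source width that is a multiple of four; names are illustrative.

```cpp
#include <cstdint>
#include <cstring>
#include <emmintrin.h> // SSE2

// Hypothetical sketch: 2x2 nearest-neighbor upscale by "zipping" a register
// with itself (unpacklo/unpackhi duplicate each 32-bit pixel).
void upscale_2x2(const uint32_t* src, int srcWidth, int srcHeight, uint32_t* dst)
{
    const int dstWidth = srcWidth * 2; // dst is (2*srcWidth) x (2*srcHeight)
    for (int y = 0; y < srcHeight; ++y) {
        uint32_t* row0 = dst + (2 * y)     * dstWidth;
        uint32_t* row1 = dst + (2 * y + 1) * dstWidth;
        for (int x = 0; x < srcWidth; x += 4) {
            __m128i p  = _mm_loadu_si128((const __m128i*)(src + y * srcWidth + x));
            __m128i lo = _mm_unpacklo_epi32(p, p); // p0 p0 p1 p1
            __m128i hi = _mm_unpackhi_epi32(p, p); // p2 p2 p3 p3
            _mm_storeu_si128((__m128i*)(row0 + 2 * x),     lo);
            _mm_storeu_si128((__m128i*)(row0 + 2 * x + 4), hi);
        }
        // The duplicated row serves as both output rows of the 2x2 block.
        std::memcpy(row1, row0, dstWidth * sizeof(uint32_t));
    }
}
```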
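A sketch of the 1/depth trick: because 1/z varies linearly across the screen, a depth-map span can be filled with a constant SIMD addition per group of pixels, with no reciprocal or division in the inner loop. The span setup and names are assumptions for illustration.

```cpp
#include <emmintrin.h> // SSE2 (also pulls in the SSE float intrinsics)

// Hypothetical sketch: fill one scanline span of a depth map stored as 1/z.
// invZStart and invZStep come from triangle setup; 1/z is linear in screen
// space, so each step is a plain addition.
void fill_inverse_depth_span(float* invDepth, int count,
                             float invZStart, float invZStep)
{
    __m128 z    = _mm_setr_ps(invZStart,
                              invZStart + 1.0f * invZStep,
                              invZStart + 2.0f * invZStep,
                              invZStart + 3.0f * invZStep);
    __m128 step = _mm_set1_ps(4.0f * invZStep); // advance four pixels at a time
    int x = 0;
    for (; x + 4 <= count; x += 4) {
        _mm_storeu_ps(invDepth + x, z);
        z = _mm_add_ps(z, step);                // pure addition, no reciprocal
    }
    for (; x < count; ++x)                      // scalar tail for the last pixels
        invDepth[x] = invZStart + x * invZStep;
}
```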
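And a sketch of why directional lights are so much cheaper than point lights on the CPU: the light direction is constant for the whole frame, so the expensive per-pixel normalization disappears and each pixel costs one dot product and a clamp. The buffer layout (single-channel diffuse, pre-normalized normals) is an assumption for illustration.

```cpp
#include <algorithm>
#include <cstdint>

struct Vec3 { float x, y, z; };

// Hypothetical sketch: deferred directional light pass. lightDir is normalized
// once per frame, so the inner loop has no per-pixel sqrt or division, unlike
// a point light where the pixel-to-light vector must be normalized every pixel.
void shade_directional(const Vec3* normals, const uint8_t* diffuse,
                       uint8_t* out, int pixelCount, Vec3 lightDir)
{
    for (int i = 0; i < pixelCount; ++i) {
        const Vec3& n = normals[i];
        float ndotl = n.x * lightDir.x + n.y * lightDir.y + n.z * lightDir.z;
        float lit   = std::max(0.0f, ndotl);              // clamp back-facing pixels
        out[i] = (uint8_t)std::min(255.0f, diffuse[i] * lit);
    }
}
```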