    1. Past hour
    2. There are a couple things in your post that aren't quite clear to me, but based on what you posted, the first thing that comes to mind is to make the score an argument to assignValue(), and then pass 'this.inputScore' to that function. (You may already know this, but the value of 'this' within a function depends on the circumstances, and in assignValue(), 'this' likely doesn't have the value you want, depending on how that function is called.)
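A minimal sketch of that suggestion, using only the names that already appear in the question below:

//GLOBAL FUNCTION - takes the score as a parameter instead of relying on 'this'
function assignValue(score) {
    document.getElementById("inputScore").value = score;
}

//Inside updateHighscore(), right after this.inputScore is set:
assignValue(this.inputScore);

Called this way, the function no longer cares what 'this' is at the time it runs, which sidesteps the context problem entirely.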
    3. Gnollrunner

      Java's Weaknesses in Game Creation

As someone heavily into procedural generation, I personally think it's just slow. I used to hate Java when it first came out, mainly because some managers where I worked tried to force us to use it for things it wasn't suited for. Since then I've mellowed on Java a bit, and I've seen some pretty good benchmarks for it. However, benchmarks aside, whenever I actually try to write some high-performance code in Java it ends up being a lot slower. I generally program in C++ these days and I'm someone who likes to pull out all the stops. This means things like designing heaps and memory management, reference counting systems, yada, yada. I use all sorts of tricks that just aren't available in Java. Bottom line: at the end of the day, Java tends to crawl for me. I know there are a lot of successful applications, including certain types of games, written in Java. I guess it just comes down to what your needs are. If most of the performance-sensitive code comes from external libraries, perhaps Java is OK. However, if you end up having to code it yourself, IMO using Java is way sub-optimal.
    4. Today
    5. Hello everyone, in the game code below (written using the Phaser framework) I am trying to pass the variable this.inputScore to the global function assignValue. How can I do this? The assignValue function is being read by an external Index.html file on game over and I need this.inputScore to be passed to the Index.html file. I would greatly appreciate any help - new to programming. Thank you.

//GLOBAL FUNCTION
function assignValue() {
    document.getElementById("inputScore").value = this.inputScore; //how can I get this.inputScore from the game code below?
};

//GAME CODE
var CrystalRunner = CrystalRunner || {};

CrystalRunner.GameState = {

    init: function() {
        //...code here
    },

    create: function() {
        //...code here
    },

    update: function() {
        //..code here
        //check if the player needs to die
        if(this.player.top >= this.game.world.height) {
            this.gameOver();
        }
    },

    gameOver: function(){
        //..code here
        this.updateHighscore();
        //..code here
    },

    updateHighscore: function(){
        this.highScore = +localStorage.getItem('highScore');

        if(this.highScore < this.myScore){
            this.highScore = this.myScore;
            this.inputScore = this.highScore; //I need this.inputScore to be passed into the assignValue function

            this.submitScoreButton = this.game.add.sprite(this.game.world.centerX-135, this.game.world.centerY+100, 'submitScoreButton');
            this.submitScoreButton.events.onInputUp.add(function() {
                window.location.href = "index1.php"; //Index1.php will have a form input value into which this.inputScore will be passed
            }, this);
        }

        localStorage.setItem('highScore', this.highScore);
    },
};
    6. Very nice, definitely relaxing and captures the coldness as well as the serenity. I liked the use of arpeggiation a lot to bring out some variety and motion, and the register change near the end closes it out nicely. I would wander a snowy forest to this (especially if some story or lore just got hinted at recently... it has just enough tension for that kind of thing, I feel). Edit: I also like the softness of the instruments, almost as if they themselves are dampened by snow.
    7. I haven't, but I am curious about them, and what they can do (I could RTFM, but I'm still crawling through Cubase's 1200 page .pdf - fortunately it's mostly familiar things in occasionally unfamiliar places so far). Nuendo seems a big bonus to Cubase for professional sound design, but it is an expensive upgrade to bite before I could say I realistically know what I am doing with game audio, beyond designing sound fx or possibly writing music.
    8. I find this question interesting as an electronic musician interested in working with game audio in sound design, audio engineering, and scoring. Things like adaptive soundtracks are fascinating to me, but I am simply not a programmer - and I know a lot of those things (as well as engines I hope to work with in the future like KYMA, but I digress) require at least a fair bit of scripting knowledge. Like right now I use Cubase, which is a great DAW, can sync to video, but then there's Nuendo which has all kinds of other scripting options and integration with video game engines. I'm really interested in the process by which sound is incorporated into games, so I guess I'm kind of piggy-backing this thread for replies which I'd find relevant too. I know a lot can be done in terms of scoring and sound design just in a DAW or modular environment, and sent off to the programmers as raw audio, but I know that especially nowadays that has limitations. I still want to learn more about programming, as well, but am mostly curious how much a "serious" freelance sound designer "ought" to know, before one can consider oneself remotely serious.
    9. slayemin

      Shipped a pilot

(Warning: Rambling ahead!) I met a new client at the beginning of November through a friend. The client asked my friend if he knew how to use Leap Motion. He said, "No, I don't, but I know just the guy who does! Eric is probably one of the best Leap Motion developers in the country.", and a referral was sent my way! I need to remember to buy my friend a nice dinner as a show of thanks. Anyways, this project was interesting for a variety of reasons. First off, the bulk of development was being outsourced to Ukraine. The outsourcing of work meant that the project owner needed to work late into the night to coordinate efforts with his Ukraine team. They were having some problems with the wrist joint causing the mesh to twist (candy wrapper effect) and nobody was making much headway. So, I was brought into the project. I downloaded the full project, tried it out to see where they were at, saw the problems, and then started looking at the implementation. Pardon my unprofessional trash talk here, but the implementation looked like it was being held together with bubblegum and duct tape. I told them that I was going to redo the project. It was such a simple project that it took me about two days to rebuild. I was able to simplify, condense and cut about 75% of whatever those Ukrainian guys did, and it was a "correct" implementation which was maintainable. It was valuable as a throwaway prototype, but for production work, it needed to be redone. I told them, "If you can't tell the difference between what I did and the original, it means I did my job right." This is always a tough sell for non-technical people, but you see the real value as the project progresses. When everything is done right from the very beginning, you skip over lots of problems you would have otherwise had (which cost time, energy and money). As far as project implementation goes, this was a super easy project (for me) because I had already done several similar projects and had built out a template library to use. The hardest problem was actually dealing with the wrist and elbow joints with Leap Motion. So, I'll describe some of that pain here: The initial problem was that when you rotate your wrist (roll along the arm axis), the wrist bone rotates but you start to get some pretty bad deformation of the arm mesh, to the point where the wrist gets pinched to look like a candy wrapper. The problem was a combination of technical implementation, incorrect rigging, incorrect mesh weighting, and a bit of fighting the Leap Motion plugin. Fixing the problem can sort of be described as trying to straighten wet spaghetti noodles, because so many different factors contributed to it. There was no single silver bullet that magically fixed it all. My greatest secret weapon was my strong debugging skills and approach to debugging. Although my client liked to call other experts with industry experience, they could only do vague hand waving and guesswork at the causes of the problem. The true, best way to figure out what's going on is to dig into the problem, get deep into the weeds, add a ton of debug scaffolding code and visualization helpers, and really spend the time and effort to look at what's going on. A slow, methodical, scientific approach is tried and true for me and far more illuminating than the guesswork of experts unfamiliar with the project. As far as skeletal rigging goes, I ended up downloading a freely available Paragon character with exposed arms and rotating animations, and I looked at the bone setup.
To fix the candy wrapper effect, you need two twist joints in the arm, parented off of the elbow joint. The wrist twist joint needs to be about 1cm away from the wrist. You also need a forearm twist joint in the middle of the arm. When you apply roll rotation to the wrist, you only apply roll to the wrist twist joint, but yaw and pitch are handled at the wrist joint itself. You also need to take about 50% of the wrist roll and apply that to the forearm twist joint as well. Otherwise, you start to get bad mesh deformation and loss of volume. You also need to be very robust with your mesh weighting on each of the bones. It's good to use a working reference model to get an approximation of the best vertex weights. There were also some challenges with the out-of-the-box Leap Motion plugin. The problem is that you get the wrist transform, but the rotation values are for the wrist joint itself, not for the arm twist joint roll values. I discovered that I needed to derive my own arm-aligned wrist rotation, relative to the elbow position. The rotational value needs to be independent of wrist pitch and yaw. Anyways, I won't get into the vector math required to do that, but I ended up having to derive these rotational values myself. It's important to note that the wrist joint is being driven by the user (via IK) and the elbow is an intermediate joint, with transforms being set by the IK solver. The user can put their arm into any position and contortion, and the virtual arm you present needs to match the physical arm location as closely as possible. This is really hard to get right because the animations need to support *anything* the user does -- any popping or glitches are obvious and break immersion. Doing good elbow placement with limited data has been one of the harder challenges facing most VR developers, so it hasn't been a well-solved problem. Most VR developers will skip it altogether and just show people a pair of disembodied hands at the motion controller locations, but if you are trying to use a full body avatar for the player character, it's a problem you eventually need to solve. I think I did a pretty good job on the elbows. I also made a discovery about the hardware capabilities of Leap Motion: though this shouldn't be a surprise, the Leap Motion has great trouble finding elbow placement when people are wearing coats or sleeves. If you want reliable elbow tracking, you need to take off your coat and roll up your sleeves. You also need to be very aware of bright light sources in your environment, because those light sources will interfere with Leap Motion. Keep in mind, the Leap Motion uses an IR camera to track hand gestures, and the HTC Vive uses two laser base stations which emit IR lasers, so there is a slim chance that you can get laser base station interference, depending on placement and what direction you're looking. And... if it's not obvious, sunlight also makes for a bad tracking environment for Leap Motion. Anyways, I nailed it on this project. You can pretty much do anything with your arms and the digital avatar will do the same thing. I'd show a video of the result, but I think the NDA restricts me from showing that. I don't think I'm the "best" Leap Motion developer in the country, but I'm firmly in the top 10%. My bigger worry about Leap Motion right now is the health of the company and what that might mean for their product. Are they going to be in business two years from now? Who knows.
But the library of content built for their hardware platform is slowly growing. On a VR production side note, I made an amazing discovery during this project. If you set up an Oculus to work from your chair, you can develop in VR using the built-in desktop virtualization app. As long as you can type without looking at the keyboard, you can develop in VR. Instead of having a 17 inch monitor, you can have a 5 foot screen in VR. And if you want, you can drag out windows and surround yourself with as many screens as you want. This taxes the GPU a little bit, but who cares? The beauty of this is that if you're developing VR content, you don't have to keep putting the headset on and off between development iterations. Since you're already in VR, switching between your IDE and your VR app is reduced to a single button press. I worked like this for several days to see how I liked it and whether it was feasible. I thought I'd get sweaty face, eye strain, or some other related drawback. Aside from not being able to see my keyboard, it was pretty solid! There are small preference issues I'd like to change. I prefer to listen to music on YouTube while I work, and when I work in VR, I am stuck with the ambient music in the Oculus Home environment, and switching between the IDE and the project switches the audio (this should be a preference). What's extra magical about working in VR is that the hardware has a pair of headphones and a microphone embedded in it, so if you wanted to do a VoIP call over Google Chat and do screen sharing, you can do it all in VR without taking off the headset! I truly believe this is eventually going to become the future of how people work and interact with computers and each other. I feel like I'm on the bleeding edge here and I saw a glimpse of the future, and damn, it looks awesome! In closing, this project was relatively small and simple, and I was instrumental in shipping it. I feel that I have a fantastic relationship with my client, they are extremely happy and impressed with the work I did, and I will be their go-to guy for future VR work. This is how you get repeat business and build a financially self-sustaining company. Now, if only I knew how to market myself... There's another potential upcoming VR project with a much bigger budget that I am extremely excited about. I can't go into details, but I will say... if we get this and pull it off, it's going to create a new industry and there will be lots of magazine articles written about what we did. My experimental research work in AI will not be wasted. Spellbound: As far as Spellbound work is concerned, I still work on it on the side, between projects. Since this is going to be a mostly story-driven VR game, I've been focusing a lot on the story writing. It feels weird to open up a Google doc for my game story and say I'm doing "game development". It's creative story writing, not code writing. And... it's just as hard. I've gone through probably 10 different iterations of the story so far and scrapped most of them. I'm finding that story writing for VR is extra hard and takes a LOT of time. You need to write it like a movie script, but you also need to think about presentation and the fact that the story is not consumed linearly. If you're an indie on a budget (like me), then you need to consider that every word costs money and player patience, so if you're excessively verbose, it's going to be expensive and nobody wants to sit through long, boring performances.
You also have to consider the localization angle: if your scripts have lots of sophisticated word play and language-based puns, you are going to hate yourself when you need to localize for different audiences. Short, simple, sweet, to the point, well worded. A story isn't about how flowery and eloquently you can write your sentences; it's about retelling a sequence of emotional events which leaves an impact on your audience... I recently realized that the story writing doesn't need to be done from an omniscient point of view where you tell every important detail; it can be done from the point of view of the story writer (an in-game historian's perspective, if you will) who has a limited point of view. This lets me skip details up front, hide them a bit, and do a big reveal later on in the plot as a twist in the story. With this style, you can experience the story twice, see the plot structure, and make new sense of various setup clues (similar to how you can watch the movie "The Sixth Sense" twice and get two completely different viewing experiences). Here is the dressed-down storybook intro for Episode 1 in Spellbound: [This is the story book intro. It is presented as a series of illustrated pages in a book and is read by a narrator. We have accompanying voice-over lines and sound effects to go along with the story, so that it feels almost like a cinematic.]
Page 1: Thousands of years ago, evil wizards and witches opened the gates to hell. Swarms of demon spawn entered the mortal realms, led by the arch demon Asmodeus.
Page 2: No army could stand against the demon spawn. Cities were reduced to rubble and ash; tens of thousands of people fell to their blades like wheat to a scythe...
Page 3: When all seemed hopelessly lost to darkness, the eastern horizon erupted in a blinding glow of shimmering light. Asmodeus was no more, and his leaderless demon spawn retreated to hell.
Page 4: Nobody quite knew what happened or how they won, but one thing the remaining kings agreed upon was the outlawing of magic and the execution of all its practitioners.
Page 5: People were finally safe again... all but for wizards and witches. For thousands of years, those who even *looked* like they could be magical were hunted down and killed in every kingdom.
(Short pause / transition break for establishing shot)
Page 6: On the edge of the Blackwood Forest, in a small hamlet named Halfordshire, a humble pair of farmers welcomed a new boy into the world. After seeing his fire-red hair, Father named him "Rupert".
Page 7: Rupert grew up, as all young boys do, and trouble followed naturally, as it does with all young boys.
Page 8: But the natural troubles of boyhood soon turned into supernatural troubles which seemed to haunt Rupert everywhere he went. In a freak magical accident, his home was utterly incinerated. Only Rupert survived.
Page 9: As he stumbled out of the smouldering cinders of his former home, the villagers could see that not even one red hair on Rupert was burned. Was it a miracle?
Page 10: As the initial shock wore off, villagers began to shout that he was a witch. Fearing for his life, Rupert ran, and he ran, and he ran, deep into the Black Forest. As day turned to dusk, the village hunters abandoned the hunt.
Page 11: Rupert survived through the night, wandering through the forest for days, getting hungrier and hungrier. He stumbled upon an old tower of crumbling stone, made it his new home, and lived off whatever edible bark, berries and mushrooms he could find in the forest.
Page 12: Rupert lived in the crumbling stone tower for years, completely alone. Whatever was haunting him seemed to follow him everywhere he went. What he didn't realize was that the next morning, his life would be changed forever...
    10. HunterGaming

      Unreal Engine 4: Level Streaming Demystified

A little update I just found out: it wasn't spawning the player because I hadn't set the game mode overrides for the persistent level. However, the player character still doesn't seem to be placed at the player start; I still have to move it there.
    11. I am curious: would anyone be interested in an RPG adventure in a visual novel art style? I loved Doki Doki, and if I could create something with an RPG element, that would be THE BEST. I am a complete noob, however, that is the thing. I just started adventuring into coding two weeks ago. I love it so far. I think I may be addicted. oof. Which is why I want to create something, I have that itch. lol. Basically, if anyone wants to pitch in, for free or not, I'd be glad to include them in the credits section. Also, I'd love to get the community involved in this, to create more fun RPG-esque things, if that makes sense. Where would I go for that?
    12. blesseddisciple

      Java's Weaknesses in Game Creation

I am moderately experienced with Java and I love the language. It is (arguably) the most popular language in the world, has a rigid but proper and effective structure, great support and libraries, and honestly the speed thing is a non-issue for 95% of applications. If you are doing heavy 3D rendering or very heavy sprite movement, you might have some slowdowns, but even then, tests have shown Java to be very close to C++ in most performance situations. It's true C++ is faster, but honestly, the difference doesn't affect much. I myself am just going back to Java for games after having left HTML5/JavaScript browser development. I find returning to Java refreshing, as the language is structured in such a way that it almost forces me to program well and quickly tells me when I don't, instead of some other languages making me debug for 9 hours over a missed parenthesis. Let me know how your development goes if you decide to stay with Java, and I will help where I can.
    13. Mapet

      Game code architecture

Hello again after a long break. Work stopped me from diving into my project, but I have come back. I read articles about ECS and it looks promising, but I have several questions about it. I think I get the idea of ECS, but a few implementation aspects are unclear to me. First, in a few sentences, here is how I understand ECS:
1) Each game object is an ECS entity. The entity has just an ID and a list of components.
2) Components are just sets of data containing selected pieces of information (for example, PositionComponent will have just a float x, y, z position; RotationComponent a float rad/deg/quaternion, etc.).
3) Systems are logical units that process all entities which contain selected components (for example, RenderingSystem will require TransformComponent and TextureComponent).
So there aren't really separate classes for each game object like enemy or spawn point; I just have to add the required components to a new entity. I can "save" an entity as a prefab, so I can quickly reproduce game objects without adding new components manually. Now my questions. I have some ideas, of course, but it would be great to read your recommendations.
1) If I'm correct, I need something like an EntityManager class that contains all entities and allows creating new ones, destroying unwanted ones, and so on.
2) I think I cannot check every entity in every system for the specific components it contains. How should I do this correctly? Maybe I should create lists for each type of component and keep references / entity IDs / component IDs? Is it even possible to automate? I mean, I could write it like that, but then if I add another component I have to remember to create the corresponding list. In my opinion this can be optimized, but I don't know how at this moment.
3) Where should the logic for selecting entities for computation in a specific system live? In each system separately (somehow select the entities that have the required components -> process -> repeat while there are still entities), or in a SystemManager where systems are registered and which loops over all entities and processes them in the required systems? The latter looks nice, but I have a feeling that I should process each system one by one, not all systems for each entity (but that's just a feeling).
4) Should specific systems and components be part of my ECS module? I mean, let's suppose I'm writing two different games using my ECS module. The first game is 3D, the second 2D. In the first game TransformComponent will have float x, y, z fields; in the second, float x, y. I know that "perfect is the enemy of good", but I'm wondering how I could get around this.
5) How do systems communicate? Is it even required? For example, I have an InputComponent holding a list of pressed keyboard keys. How should I change its values? In an InputSystem which reads the keyboard and updates the list of pressed keys? But then I need a good execution order for the systems mentioned before (input has to be processed before anything else, or I will get a 1-frame delay, which can be annoying in, for example, a hardcore platforming game).
Those are all my ideas, questions and thoughts. Tomorrow I will start writing something, but I am looking forward to your advice.
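A minimal sketch of the kind of storage described in points 1-3 above (my own illustration, not from the post; names like ComponentStore and EntityManager are made up, and a real implementation would also need component removal, entity destruction, and so on):

#include <cstdint>
#include <unordered_map>

using Entity = std::uint32_t;

struct PositionComponent { float x, y, z; };
struct VelocityComponent { float x, y, z; };

// One store per component type; the EntityManager owns the stores and hands out IDs.
template <typename T>
struct ComponentStore {
    std::unordered_map<Entity, T> data;   // entity id -> component
    T* get(Entity e) {
        auto it = data.find(e);
        return it == data.end() ? nullptr : &it->second;
    }
};

struct EntityManager {
    Entity nextId = 0;
    ComponentStore<PositionComponent> positions;
    ComponentStore<VelocityComponent> velocities;
    Entity create() { return nextId++; }
};

// A system iterates only the entities that have the components it needs:
// walk one store and look up the matching component in the other.
void movementSystem(EntityManager& em, float dt) {
    for (auto& [entity, vel] : em.velocities.data) {
        if (PositionComponent* pos = em.positions.get(entity)) {
            pos->x += vel.x * dt;
            pos->y += vel.y * dt;
            pos->z += vel.z * dt;
        }
    }
}

This only ever touches entities that actually have both components, which is one common answer to question 2, and running systems one after another over their own component lists is the order question 3 hints at.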
    14. blesseddisciple

      Shooting a ball in java

Honestly, for a basic 2D game, there is no need for an engine. You just need some critical thinking and some programming skill, and you can easily hand-code your formula for a bullet trajectory. Basically, a bullet is fired at maximum force with minimal gravity effect. As the shot line (let's say coordinate x) increases, the fall line (say coordinate y) increases too. The trick is that with each increment, the x velocity slows down more sharply while the y velocity increases more sharply. There are many ways to code this; just get creative and think about what needs to happen at the beginning, the end, and each step of the bullet's life. The biggest factor is how far you want the bullet to travel at max distance; use that to figure out your algorithm for controlling velocity and bullet drop. Regarding 2D games, you didn't say whether it is an FPS, top-down, or side-scroller. But regardless, you can easily do almost anything you need by simply using the AWT library: drawn graphics, sprites, etc.
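To make that concrete, here is one possible per-frame update for a projectile, assuming a simple Euler step and screen coordinates where y grows downward; the constants are made-up tuning values, not anything from the post:

class Bullet {
    // Position and velocity, in pixels and pixels per second.
    double x, y;
    double vx, vy;

    static final double GRAVITY = 400.0; // downward acceleration in pixels/s^2 (made-up tuning value)
    static final double DRAG    = 0.3;   // fraction of horizontal speed lost per second (made-up tuning value)

    Bullet(double x, double y, double vx, double vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // Call once per frame with the elapsed time in seconds.
    void update(double dt) {
        x += vx * dt;
        y += vy * dt;
        vy += GRAVITY * dt;   // the "fall line" grows faster and faster
        vx -= vx * DRAG * dt; // the "shot line" speed bleeds off over time
    }
}

The maximum travel distance mentioned above then falls out of the initial vx and how aggressively you bleed it off.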
    15. So I have a decent amount of JavaScript experience now and decided I was gonna lower my head and start cranking out some 2D games, partly to learn, partly to have fun. After all, HTML5 canvas is such an easy and enticing medium, and I love the JavaScript implementation of it. But after literally struggling for a week to get basic game functionality working, I have had enough of the stupid little bugs that pop up with JavaScript. Don't get me wrong, I still love the language for scripting; I'm just not going to spend 20 minutes coding and 5 hours debugging just because the language is crap. I've decided to return to my previous endeavor, Java. I like Java a lot, and the only reason I haven't pursued more in the way of game development with it is that Java is limited to mobile or PC apps that may never see the light of day unless they're hosted on some obscure Java game hosting website populated with 2,000 half-developed games that no one will ever care about. BUT, still, I enjoy hand-coding, and while I know C#, I don't feel like using Visual Studio and I really don't wanna hand-code C# on .NET or whatever. I use Visual Studio for business apps (ASP.NET), but I don't wanna build a game with it. So, does anyone have any points to share about why moving to Java for game development is not smart? Besides the whole "Java is slow" thing. I mean things that might make it harder to make games in Java vs. other languages. Please share your thoughts.
    16. If so, I think the problem is with the contents of isSlotTaken.
    17. This is what I thought as well; I have done some tests and I also think this is how it works... but I copied the code from Stack Exchange, so I do not really understand why it works. I think there is just a problem in my logic not setting true after completing, so in some cases it resolves false when I think it should be true. No, not really. I have only been coding for about a week, learning from a book called "The C# Player's Guide". I am using the free Visual Studio, and if I press F5 and there is an error I can see some issues in the log thing. I do not know about breakpoints and stuff, though, or how to "watch" variables or whatever. I assume the book just hasn't gotten to that stuff yet. This looks like it also works... I did a test, but I do prefer the simple format of the other way, now that I know it actually functions as I thought it would. I've edited the OP title to include that I am, indeed, talking about C#.
    18. Yesterday
    19. Hey there, I've finished a new project and wanted to post it here, as I think it's the best thing I've ever done. Thanks so much for your time!
    20. babaliaris

      Textures, some work and some doesn't???

Forcing stbi_load to use 4 channels solved the problem, but why does the code below not work? What I'm doing is checking whether the division by 4 yields an integer. If it is not an integer, I use an alignment of 1; otherwise, an alignment of 4.

Texture::Texture(std::string path, bool trans, int unit)
{
    //Reverse the pixels.
    stbi_set_flip_vertically_on_load(1);

    //Try to load the image.
    unsigned char *data = stbi_load(path.c_str(), &m_width, &m_height, &m_channels, 0);

    //If the size of each row is not divisible by 4, use an alignment of 1.
    float check = (m_width * m_channels) / 4.0f;
    if (check != ceilf(check))
    {
        std::cout << "working" << std::endl;
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    }
    //If it is divisible by 4, use an alignment of 4.
    else
        glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

    //Image loaded successfully.
    if (data)
    {
        //Generate the texture and bind it.
        GLCall(glGenTextures(1, &m_id));
        GLCall(glActiveTexture(GL_TEXTURE0 + unit));
        GLCall(glBindTexture(GL_TEXTURE_2D, m_id));

        //Not a transparent texture.
        if (m_channels == 3)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0, GL_RGB, GL_UNSIGNED_BYTE, data));
        }
        //Transparent texture.
        else if (m_channels == 4)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data));
        }
        else
        {
            throw EngineError("UNSUPPORTED");
        }

        //Texture filters.
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));

        //Generate mipmaps.
        GLCall(glGenerateMipmap(GL_TEXTURE_2D));
    }
    //Loading failed.
    else
        throw EngineError("There was an error loading image: " + path);

    //Unbind the texture.
    GLCall(glBindTexture(GL_TEXTURE_2D, 0));

    //Free the image data.
    stbi_image_free(data);
}
    21. babaliaris

      Texture mapping acts weird on some images.

@Cararasu please take a look at this thread too. Let's have the conversation there as well. I just solved it by forcing 4 channels to be used, but I would like to ask you something else too. I'm going to post in that thread right now.
    22. Cararasu

      Texture mapping acts weird on some images.

Well, if you use 4 channels then the size of one row is always divisible by 4, so there can be no issue because of weird line sizes. An issue might appear if the data pointer returned by "stbi_load" is not divisible by 4. I do not know if stb guarantees a certain alignment, but I would be very surprised if you ever got a pointer that is not at least divisible by 4. So yeah, you should be safe when always using 4 channels. Additionally, I think most hardware does not store RGB images tightly; they will leave space for a fourth value because of (surprise, surprise) 4-byte alignment. That means you do not lose anything in terms of GPU memory by using RGBA over RGB. I am not 100% certain about this, so if there is anyone who has more in-depth knowledge, feel free to enlighten me 😉
    23. You should probably mention the programming language used somewhere in the post. It looks like it's C#, so I'll assume that. With the given example, I think draw will be true if every element of isSlotTaken is a bool with the value true. This sounds like a problem you should look into using a debugger -- are you familiar with debugging techniques? Breakpoints, etc.
    24. babaliaris

      Texture mapping acts weird on some images.

Thank you so much! That was the issue! Just out of curiosity: instead of glPixelStorei(GL_UNPACK_ALIGNMENT, 1), if I always force stb_image to use 4 channels (no matter how many channels the source file might have), because RGBA is 4 bytes, will this ever be problematic? In my case it's working now and I can see the image being rendered just fine.
    25. If you want to cycle through an array it's pretty easy; you can use for loops or foreach. For example, if you need to check that all values in the array are true, you could do something like this:

bool allValuesTrue = true;
for (int a = 0; a < array.Length; a++)
{
    if (array[a] == false)
    {
        allValuesTrue = false;
        break; // no point checking the rest, exit the loop
    }
}

The second something isn't true, there is zero point in checking the rest if you need them all to be true, and you can safely exit out. Also, when posting code questions, please indicate the language you're using.
    26. Cararasu

      Texture mapping acts weird on some images.

No, that's not OpenGL ES. I just looked it up, and it seems the 4-byte alignment rule also affects regular OpenGL, even though it is not mentioned directly in the description of glTexImage2D. The problem seems to be a common one; they even put it in the common mistakes section of the OpenGL wiki: https://www.khronos.org/opengl/wiki/Common_Mistakes#Texture_upload_and_pixel_reads To fix the issue, either make sure that each row is always aligned to 4 bytes, or call glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before glTexImage2D. The drawback of changing the alignment is that uploading the texture can be slower than it would be with 4-byte-aligned rows.
    27. Hi there... I am having an issue in C# with checking whether all the values in an array are true... I'm getting strange behaviour... so I wanted to ask to make sure this code is correct, or whether there is a deeper problem with my logic after this test happens...

// Check for a draw
bool draw = isSlotTaken.All(x => x);
if (draw)
{
    rungame = false;
    winner = "Draw";
}

Will that if statement resolve true if every single value in the bool array isSlotTaken is set to true? Thanks!
    28. babaliaris

      Texture mapping acts weird on some images.

glGetString(GL_VERSION) reports: 3.3.13541 Core Profile Context 24.20.13019.1008. How can I tell whether it is OpenGL ES? Does the above information say that I'm using regular OpenGL?
    29. Cararasu

      Texture mapping acts weird on some images.

Are you by any chance using OpenGL ES? In that case, take a look at the documentation of glTexImage2D, especially the part about row alignment: after every row, the pointer is rounded up to be divisible by 4. The images from the explosion are exactly 480 pixels wide. Every pixel has 3 color values, which means 480*3 = 1440 bytes, which is divisible by 4, so no adjustment. The image of the arrows is 115 pixels wide. This makes 115*3 = 345 bytes per line. OpenGL will then pad it by 3 bytes, effectively skipping 1 pixel. So the second line will look 1 pixel shifted, the third line 2 pixels shifted, and so on. I have already mentioned it, but I think you missed it: if you look at the rendered image you will see 93 pixels missing from the last line, which is the height of the image. If you load an image with 4 components, the size of each line is trivially divisible by 4, which means no adjustments. Curiously enough, this alignment of rows seems to be only part of the OpenGL ES standard and not normal OpenGL.
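To make the arithmetic above concrete, this is the row-size calculation OpenGL effectively performs with the default unpack alignment (just an illustration of the numbers in the post, with made-up variable names):

#include <cstdio>

int main()
{
    const int alignment = 4;            // default GL_UNPACK_ALIGNMENT
    const int channels  = 3;            // RGB
    const int widths[]  = { 480, 115 }; // explosion frames vs. the arrow image

    for (int width : widths)
    {
        int rowBytes  = width * channels;                                   // 1440 or 345
        int paddedRow = (rowBytes + alignment - 1) / alignment * alignment; // 1440 or 348
        std::printf("width %d: %d bytes per row, padded to %d (%d byte(s) of padding)\n",
                    width, rowBytes, paddedRow, paddedRow - rowBytes);
    }
    return 0;
}

Because OpenGL assumes 348-byte rows while the stb_image data is tightly packed at 345, each successive row it samples starts 3 bytes (one pixel) further into the actual data, which is exactly the growing shift described above.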
    30. InterlockedOr requires shader model 4.0; I don't know what you mean by a texture UAV. I did find a perfect solution for my specific case, though: I can just use additive blending among the shadows being drawn in the same batch, because they are all created by different lights, which produces the same result as a bitwise OR: 1 + 2 = 1 | 2 = 3. However, if the bit is already set, 2 + 2 ≠ 2 | 2 = 2. It will still need to sample the shadows from the other characters and check whether the bit is already set; if that's the case it should return 0 (add 0), otherwise return the light bit for the shadow.
    31. CSharpCoder

      HLSL - Additive blending

You need to set a BlendState for this. The following code adds the two colors together and calculates alpha as source * 1 + destination * 0, so the resulting alpha will of course be the source alpha.

BlendState additiveColorBlendState = new BlendState
{
    ColorBlendFunction = BlendFunction.Add,
    ColorSourceBlend = Blend.One,
    ColorDestinationBlend = Blend.One,
    AlphaSourceBlend = Blend.One,
    AlphaDestinationBlend = Blend.Zero,
    AlphaBlendFunction = BlendFunction.Add
};
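For context, a state like this would typically be applied before the draw calls it should affect; a small usage sketch assuming an XNA/MonoGame-style Game with a spriteBatch field (neither is shown in the original post):

// Inside the Game's Draw method, assuming spriteBatch and additiveColorBlendState
// were created earlier (e.g. in LoadContent):
protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.Black);

    spriteBatch.Begin(SpriteSortMode.Deferred, additiveColorBlendState);
    // ... Draw() calls whose colors should be summed ...
    spriteBatch.End();

    base.Draw(gameTime);
}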
    32. JWColeman

      Texture mapping acts weird on some images.

I'm really not certain, and as for the GIMP images, I'm pretty sure they are RGBA, though I could be wrong. Like I said, I'm really no pro when it comes to the inner workings of this stuff, and fortunately, it looks as if you've pretty much solved your problem. Maybe someone with more experience can chime in and thoroughly explain why it's behaving this way.
    33. babaliaris

      Texture mapping acts weird on some images.

Well, how do you explain the GIMP images then? They have 3 channels too, and they work without the need to force 4 channels. Also, in my code, if the file has only 3 channels I'm telling the driver through glTexImage2D to handle the data as RGB, not RGBA, and if it has 4, as RGBA.
    34. JWColeman

      Texture mapping acts weird on some images.

Well, likely, my best guess is that stbi_load is putting something other than a null value in the A channel when it's not present, and therefore your shader handles it normally.
    35. babaliaris

      Texture mapping acts weird on some images.

Yes, with the ones from Paint, which were causing the problem. These images have 3 channels (according to stb_image, when I don't force how many channels to be used).
    36. JWColeman

      Texture mapping acts weird on some images.

      It worked with the old image?
    37. babaliaris

      Texture mapping acts weird on some images.

I tried to force how many channels to use in stbi_load() and it worked! Now, from your experience, do you know why this might have fixed it? The files have 3 channels only. The code:

Texture::Texture(std::string path, int unit)
{
    //Try to load the image, forcing 4 channels.
    unsigned char *data = stbi_load(path.c_str(), &m_width, &m_height, &m_channels, 4);
    m_channels = 4;
    std::cout << m_channels << std::endl;

    //Image loaded successfully.
    if (data)
    {
        //Generate the texture and bind it.
        GLCall(glGenTextures(1, &m_id));
        GLCall(glActiveTexture(GL_TEXTURE0 + unit));
        GLCall(glBindTexture(GL_TEXTURE_2D, m_id));

        //Not a transparent texture.
        if (m_channels == 3)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0, GL_RGB, GL_UNSIGNED_BYTE, data));
        }
        //Transparent texture.
        else if (m_channels == 4)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data));
        }
        //This image is not supported.
        else
        {
            std::string err = "The Image: " + path;
            err += " is using " + std::to_string(m_channels);
            err += " channels, which are not supported.";
            throw VampEngine::EngineError(err);
        }

        //Texture filters.
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));

        //Generate mipmaps.
        GLCall(glGenerateMipmap(GL_TEXTURE_2D));
    }
    //Loading failed.
    else
        throw VampEngine::EngineError("There was an error loading image (maybe the image format is not supported): " + path);

    //Unbind the texture.
    GLCall(glBindTexture(GL_TEXTURE_2D, 0));

    //Free the image data.
    stbi_image_free(data);
}
    38. I'm what you might consider a "casual prosumer" when it comes to commenting. I comment important stuff, but only if the code isn't written in a self-explanatory way, which is why I've also adopted a really simple and descriptive naming scheme and done away with any and all notation systems. That being said, there are occasions where I want to either document a whole system of code or provide a summarized walk-through on a per-function basis. This is where I find myself in a frustrating spot, since none of the solutions on the market seem to be quite what I want. So I figured I'd list the things I want to achieve and what I want to avoid, in hopes that I'm either not familiar with something or perhaps simply not configuring stuff properly. I'm willing to pay for a good solution. The dream wish list of things I want and need:
- full documentation generation, a la Doxygen
- a non-verbose (lightweight) and non-monolithic style
- non-XML style markup (e.g. the way Natural Docs does it, not Doxygen)
- no block comments in documentation (I use block comments extensively to manage code flow during development)
- partial documentation (I really don't want to provide an explanation for each and every argument and return type)
- a concise format with a clear layout, so no \param and \return shenanigans automatically filled in for me
- no duplication of obvious information (e.g. the function name) in the comments
- inline documentation
- no explicit flow direction (in/out/inout) in documentation, but rather taken directly from code - I already provide this information!
- proper macro expansion
I've tried Atomineer and it doesn't work for me at all. So far the Doxygen style in general is pure bloat in my eyes, since it becomes bothersome to maintain as soon as you make something as simple as a name change. Allow me to demonstrate by example. Here's what a typical function in my code might look like:

_BASEMETHOD ECBool OnInitialize(
    IN MODIFY ResourceType& object,
    IN const char* type,
    OPTIONAL IN ISignalable* signalable = nullptr,
    OPTIONAL IN uint32 flags = 0) const
{
    ...
}

_BASEMETHOD expands to 'virtual'. Atomineer doesn't handle this too well, since it is adamant about placing the documentation below that line unless I take care to actually generate it on the word _BASEMETHOD itself. Here's the default "trite" Atomineer generates:

/// Executes the initialize action
///
/// \author yomama
/// \date 12-Dec-18
///
/// \tparam ResourceType Type of the resource type.
/// \param [in,out] {IN MODIFY ResourceType&} object The object.
/// \param {IN const char*} type The type.
/// \param [in,out] {OPTIONAL IN ISignalable*} signalable (Optional) If non-null, the signalable.
/// \param {OPTIONAL IN uint32} custHandlerFlags The customer handler flags.
///
/// \return {ECBool} An ECBool.

This is close to being the least useful way to say what the function actually does. None of the auto-generated stuff makes sense, because it's already obvious from the names. In addition, data flow direction is assumed, not extrapolated from markup that already exists in the code (notice the in/out of signalable, while certain conditions might force me to accept a non-const pointer which is nevertheless never written to). The return type is obvious. Even the general description is obvious to the point of being insulting to the reader. Of course this is all meant to be manually edited. However, the problem is that: 1) on the one hand, writing this stuff from scratch using this style of markup is time-consuming and annoyingly verbose;
2) auto-generating the template and editing it is also time-consuming, because again, it's way too verbose. Here's what an ideal way of commenting the above function looks like to me:

/// Fill \p object with data and notify \p signalable once the procedure is complete. Runs asynchronously.
_BASEMETHOD ECBool OnInitialize(
    IN MODIFY ResourceType& object,
    IN const char* type,
    OPTIONAL IN ISignalable* signalable = nullptr,
    /// Type-specific flags. See documentation of related resource type for possible values.
    OPTIONAL IN uint32 flags = 0) const
{
    ...
}

That's it. This should be enough to generate feature-complete documentation when the docs are finally built, AND it's easy to read inline while writing code. A major hurdle is that while I actually kind of like the Natural Docs style, to the best of my knowledge it's only able to generate documentation for things that have actually been manually documented. Facepalm. So no automatic full documentation of classes, inheritance diagrams, etc. This seemingly forces me into using Doxygen, which is much more feature-complete but suffers from the above-mentioned stylistic bloat, and for some reason cannot handle relatively simple macro expansions in imo-not-so-complicated cases. I simplified the following from a real world example, but this includes auto-generated class implementations, e.g.:

BEGIN_DEFAULT_HANDLER(foo)
    _BASEMETHOD const char* bar() const _OVERRIDE { return "yomama"; }
END_DEFAULT_HANDLER(foo)

which might expand into something like:

class foo : public crtp_base<foo> {
    base_interface* GetInterfaceClass() const _OVERRIDE {
        _STATIC foo_interface iface;
        return &iface;
    }

    _BASEMETHOD const char* bar() const _OVERRIDE { return "yomama"; }
};

extern "C" _DLLEXPORT base_class* _fooFactory() {
    return static_cast<base_class*>(new foo);
}

Doxygen doesn't even recognize foo as a class. The bottom line is it seems to me I shouldn't be asking for too much here. I'd really like the clear coding style I've adopted to pay off in more than just the code. What's your approach? Any suggestions? Ideas or alternative options to explore?
    39. Cararasu

      Texture mapping acts weird on some images.

It looks to me like you are not filling the whole memory of the image on the GPU. If you look closely at the rendered image you will notice that some pixels are missing on the last pixel line. The amount seems to be about the same as the height of your image, which would mean the width is one pixel too wide. Your image loading code seems alright at first glance, so that is strange. I do not think the problem is that you are loading an RGBA image and treating it as RGB; that would create a different effect, as the image is fully opaque. In that case, every pixel in the black area would be, I think, either red, green or blue.
    40. I'm learning how to integrate a scripting language into my C++ game engine, specifically Mozilla's JavaScript engine, SpiderMonkey, but I'm still very confused. First I tried to follow their How to build SpiderMonkey document, but it indicates I need the right build tools for my platform; in my case that's Windows 10, with Visual Studio 2018 Community as my IDE. I go to that "build tools" documentation for Windows, but it then says Building Firefox for Windows. I'm guessing the same tools used to build Firefox are used for SpiderMonkey, but I'm not sure what part of what I'm installing is overkill; I just want to #include "jsapi.h" in my project. I realize it's not as easy as including a bunch of header files; I just don't know where to start or how to do this properly. From what I see in the documentation, using the API is very straightforward, but building is very confusing for me. Would anybody know of a more straightforward, step-by-step guide? I can't find one anywhere.
    41. I would start with the idiomatic, iterator-based approach and simply add a convenience method for containers and other "iterable" types:

template <typename InputIt>
std::string join(InputIt begin, InputIt end)
{
    std::string result;
    /* does the thing */
    return result;
}

template <typename C>
std::string join(const C& c)
{
    using std::begin;
    using std::end;
    return join(begin(c), end(c));
}
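For what it's worth, one possible body for the iterator overload, assuming a comma-space separator and string-like elements (the original deliberately leaves the body out):

template <typename InputIt>
std::string join(InputIt begin, InputIt end)
{
    std::string result;
    for (InputIt it = begin; it != end; ++it)
    {
        if (it != begin)
            result += ", "; // assumed separator
        result += *it;      // assumes the element type appends to std::string
    }
    return result;
}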
    42. babaliaris

      Texture mapping acts weird on some images.

I searched, and they say you can't debug shaders natively. There is also Nvidia Nsight for Visual Studio, but I have an AMD card so I can't use that debugger. I will look at gDEBugger and come back later to post the results. What do you mean by that? Can you tell whether I'm doing it in the code below? What is it you're describing, and where might I be doing it?

Texture::Texture(std::string path, int unit)
{
    //Try to load the image.
    unsigned char *data = stbi_load(path.c_str(), &m_width, &m_height, &m_channels, 0);

    //Image loaded successfully.
    if (data)
    {
        //Generate the texture and bind it.
        GLCall(glGenTextures(1, &m_id));
        GLCall(glActiveTexture(GL_TEXTURE0 + unit));
        GLCall(glBindTexture(GL_TEXTURE_2D, m_id));

        //Not a transparent texture.
        if (m_channels == 3)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0, GL_RGB, GL_UNSIGNED_BYTE, data));
        }
        //Transparent texture.
        else if (m_channels == 4)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data));
        }
        //This image is not supported.
        else
        {
            std::string err = "The Image: " + path;
            err += " is using " + std::to_string(m_channels);
            err += " channels, which are not supported.";
            throw VampEngine::EngineError(err);
        }

        //Texture filters.
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));

        //Generate mipmaps.
        GLCall(glGenerateMipmap(GL_TEXTURE_2D));
    }
    //Loading failed.
    else
        throw VampEngine::EngineError("There was an error loading image (maybe the image format is not supported): " + path);

    //Unbind the texture.
    GLCall(glBindTexture(GL_TEXTURE_2D, 0));

    //Free the image data.
    stbi_image_free(data);
}
    43. JWColeman

      Texture mapping acts weird on some images.

      Also, it may be that if you're pushing an RGBA through to your device but only sending it data for an RGB, it could be that each frame your RGB is getting shifted by sizeof(A)... I'm really no expert on the rendering pipeline but these are oddities that I'm aware can occur when data is not pairing up just right.
    44. We are pleased to announce the release of Matali Physics 4.4. The latest version introduces comprehensive support for Android 9.0 Pie, iOS 12.x and macOS Mojave (version 10.14.x). It also introduces the Matali Render 3.4 add-on with normal mapping and parallax mapping based on the distance from the observer, as well as other improvements and fixes.
What is Matali Physics? Matali Physics is an advanced, multi-platform, high-performance 3D physics engine intended for games, virtual reality and physics-based simulations. Matali Physics and its add-ons form a physics environment which provides complex physical simulation and physics-based modeling of objects both real and imagined.
Main benefits of using Matali Physics:
- A stable, high-performance solution supplied together with a rich set of add-ons for all major mobile and desktop platforms (both 32- and 64-bit)
- Advanced samples ready to use in your own games
- New features on request
- Dedicated technical support
- Regular updates and fixes
You can find out more information on www.mataliphysics.com View full story
    46. G'day. So I have been doing a lot of work in the console as I try to learn the basics of C#. I got sick of going all over the place for ASCII symbols to use as "graphics" in my console apps, so I thought I would simply list them all here. In addition, I found this cool "ASCII Font Generator": Ascii Font Generator (link). Finally, some other cool links I found useful: http://www.asciiworld.com/ https://www.asciiart.eu/ https://manytools.org/hacker-tools/convert-images-to-ascii-art/ https://asciiart.website/ If you know of any other cool ones, let me know!
    47. JWColeman

      Texture mapping acts weird on some images.

Yep. Now, how to use it is a whole other discussion, I'm sure. OpenGL has a device debug layer as well; it's possible it could be throwing some warnings at you because of this.
    48. babaliaris

      Texture mapping acts weird on some images.

I don't think I have a graphics debugger. Can I somehow use Visual Studio for that?
    49. JWColeman

      Texture mapping acts weird on some images.

It sounds like your shader is the problem then; a null value in an alpha channel could cause some unexpected behavior in your shader. Have you tried stepping through the graphics debugger and checking the alpha value when, coming from your Paint file, the alpha isn't there? The quick solution is to just not use any PNG files that don't have an alpha channel.
    50. JWColeman

      Texture mapping acts weird on some images.

      So, the file from gimp worked but not from paint?
    51. babaliaris

      Texture mapping acts weird on some images.

It has 3 channels. Also, I tried to make this image in GIMP and saved it as a PNG, and it works just fine.
    52. JWColeman

      Texture mapping acts weird on some images.

How many channels does your image have? Could there be a problem where you are reading in a channel that isn't there, or applying a value to a channel, that is causing the distortion? IIRC, Paint does not support an alpha channel.