Showing results for tags 'Textures'.
Found 57 results

  1. I was wondering if anyone knows of any tools to help design procedural textures. More specifically, I need something that will output the actual procedure rather than just the texture. It could output HLSL or some pseudocode that I can port to HLSL. The important thing is that I need the algorithm, not just the texture, so I can put it into a pixel shader myself. I posted the question on the Allegorithmic forum, but someone answered that while Substance Designer uses procedures internally, it doesn't support code output, so I guess that one is out.
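
For readers wondering what "the procedure rather than the texture" looks like in practice, here is a minimal, self-contained sketch of the kind of building block most procedural texture tools compose internally: hashed value noise plus a few fBm octaves, written as plain C-style math so it ports almost line-for-line to an HLSL pixel shader. The function names and constants are illustrative only, not output from any particular tool.

#include <cmath>
#include <cstdio>

// Hash a 2D lattice point to a pseudo-random value in [0,1].
// sin/frac exist in HLSL too, so this translates directly.
static float Hash(float x, float y)
{
    float n = std::sin(x * 127.1f + y * 311.7f) * 43758.5453f;
    return n - std::floor(n);                 // frac(n)
}

static float Lerp(float a, float b, float t) { return a + (b - a) * t; }

// Bilinearly interpolated value noise with smoothstep weights.
static float ValueNoise(float x, float y)
{
    float ix = std::floor(x), iy = std::floor(y);
    float fx = x - ix, fy = y - iy;
    fx = fx * fx * (3.0f - 2.0f * fx);
    fy = fy * fy * (3.0f - 2.0f * fy);
    float a = Hash(ix,        iy);
    float b = Hash(ix + 1.0f, iy);
    float c = Hash(ix,        iy + 1.0f);
    float d = Hash(ix + 1.0f, iy + 1.0f);
    return Lerp(Lerp(a, b, fx), Lerp(c, d, fx), fy);
}

// A few octaves of noise ("fBm"), the basic ingredient of most
// procedural patterns (clouds, marble, grunge masks, ...).
static float Fbm(float x, float y)
{
    float sum = 0.0f, amp = 0.5f, freq = 1.0f;
    for (int i = 0; i < 5; ++i)
    {
        sum  += amp * ValueNoise(x * freq, y * freq);
        amp  *= 0.5f;
        freq *= 2.0f;
    }
    return sum;
}

int main()
{
    // Evaluate the "texture" at a UV coordinate, exactly as a pixel shader would.
    float u = 0.37f, v = 0.81f;
    std::printf("value at (%.2f, %.2f) = %.3f\n", u, v, Fbm(u * 8.0f, v * 8.0f));
}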
  2. I'm trying to figure out how to design the vegetation/detail system for my procedural chunked-LOD-based planet renderer. While I've found a lot of papers talking about how to homogeneously scatter things over a planar or spherical surface, I couldn't find much info about how the locations of objects are actually encoded. There seems to be a common approach that involves some sort of texture mapping, where different layers for vegetation/rocks/trees etc. define what is placed and where, but I can't figure out how this is actually rendered. I guess that for billboards these textures could be sampled in the vertex shader and a geometry shader could then draw the texture onto a generated quad? What about nearby trees or rocks that need a 3D model instead? Is this handled from the CPU? Is there a specific solution that works better with a chunked-LOD approach? Thanks in advance!
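
There is no single standard answer to the question above, but one common pattern is a CPU-side scatter pass per terrain chunk: a painted or generated density texture is sampled, and a deterministic hash decides whether each cell spawns an instance, so the same chunk always produces the same placement whenever the chunked-LOD system re-creates it. Nearby instances feed an instanced-mesh draw, distant ones a billboard batch. A rough sketch, with all names hypothetical:

#include <cstdint>
#include <cstdio>
#include <vector>

struct Instance { float x, y, z; float scale; };

// Hypothetical chunk-local data: a small density map (0..255) painted or
// generated per terrain chunk, plus a height lookup (stubbed out here).
struct Chunk
{
    int densitySize;                          // e.g. 64x64 texels per chunk
    std::vector<std::uint8_t> density;        // 1 byte per texel
    float worldSize;                          // chunk edge length in world units
    float HeightAt(float x, float z) const { (void)x; (void)z; return 0.0f; }
};

// Deterministic per-cell hash so the same chunk always scatters the same way.
static float Hash01(int x, int y)
{
    std::uint32_t h = (std::uint32_t)x * 374761393u + (std::uint32_t)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h & 0xFFFF) / 65535.0f;
}

// CPU-side scatter: one candidate per density texel; the texel value is the
// probability that something is planted there. Near the camera these
// transforms would go to instanced 3D meshes, farther away to billboards.
static std::vector<Instance> ScatterChunk(const Chunk& c)
{
    std::vector<Instance> out;
    const float step = c.worldSize / c.densitySize;
    for (int y = 0; y < c.densitySize; ++y)
        for (int x = 0; x < c.densitySize; ++x)
        {
            float p = c.density[y * c.densitySize + x] / 255.0f;
            if (Hash01(x, y) > p) continue;               // cell stays empty
            float wx = (x + Hash01(x * 3, y)) * step;     // jitter inside the cell
            float wz = (y + Hash01(x, y * 7)) * step;
            out.push_back({ wx, c.HeightAt(wx, wz), wz, 0.8f + 0.4f * Hash01(x, y + 1) });
        }
    return out;
}

int main()
{
    Chunk c;
    c.densitySize = 8;
    c.density.assign(8 * 8, 128);   // ~50% coverage everywhere
    c.worldSize = 32.0f;
    std::printf("placed %zu instances\n", ScatterChunk(c).size());
}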
  3. How can I get the scaling of a texture when it is drawn on screen? In other words: how can I find out how many texture elements (texels) a pixel on the screen covers? For example, if a texture is 100x100 pixels in size and it only takes up 20x20 pixels on the monitor, then I want to calculate 5.0 as the value. I don't need anything complex since it's a 2D scene with an orthographic camera setup. I'm trying to do manual texture sampling in my fragment shader. It's a Cg program inside a Unity project, so if there is a built-in way to get or calculate this, let me know. I feel like there are two ways: calculate it using viewport and camera information, or calculate it using world-to-screen-space transformations. Is there a better way? Which one should I implement, and how?
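
For the "viewport and camera information" route mentioned above, the ratio can be computed on the CPU and handed to the shader as a uniform. A sketch of the arithmetic (names are illustrative, assuming a Unity-style orthographic camera whose size is half the visible height in world units):

#include <cstdio>

// Texels of the source texture covered by one screen pixel, for a 2D
// orthographic camera.
static float TexelsPerPixel(float texHeight,         // texture height in texels
                            float spriteWorldHeight, // quad height in world units
                            float orthoSize,         // camera half-height in world units
                            float screenHeightPx)    // viewport height in pixels
{
    float worldUnitsPerPixel = (2.0f * orthoSize) / screenHeightPx;
    float texelsPerWorldUnit = texHeight / spriteWorldHeight;
    return texelsPerWorldUnit * worldUnitsPerPixel;
}

int main()
{
    // Example from the question: a 100x100 texture drawn into 20x20 pixels.
    // With orthoSize = 5 and a 1000 px tall viewport, 1 world unit = 100 px,
    // so a 0.2-unit quad is 20 px tall -> 100 texels / 20 px = 5 texels per pixel.
    std::printf("%.1f\n", TexelsPerPixel(100.0f, 0.2f, 5.0f, 1000.0f));
}

Alternatively, inside the fragment shader itself, screen-space derivatives (ddx/ddy or fwidth of the texture coordinates, multiplied by the texture size) give the same texels-per-pixel figure per fragment without any CPU-side bookkeeping.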
  4. Let's say you have a random sprite sheet like this: How can you sample from it? I want to create a sprite system where you give it a sprite sheet texture and it automatically samples from it and creates the different textures. I really don't have a clue how to do this. The only thing I can do now is manually try different texture coordinates for each figure I see on the sheet (somehow guessing the texture coordinates for each one), check whether I'm getting the correct part of the sprite sheet, and, if not, adjust the coordinates a little to the left, right, top or bottom until I get the part of the texture I want. This is cumbersome and does not let me build an automatic system. If only I knew the texture coordinates for each figure, this would be easy. So am I missing something? Is just downloading images from Google not enough? Do I need to find a special sprite sheet format, or something where the creator already includes the texture coordinates? Thank you.
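
If the sheet happens to be a uniform grid, the coordinates can simply be computed from the sprite's row and column; irregular, tightly packed sheets normally ship with a companion atlas description (for example the JSON or XML that tools like TexturePacker emit) listing each frame's rectangle, which is what an automatic system would parse. A minimal sketch for the uniform-grid case (names illustrative):

#include <cstdio>

// UV rectangle (0..1) of one cell in a uniformly gridded sprite sheet.
struct UvRect { float u0, v0, u1, v1; };

static UvRect SpriteCellUv(int column, int row,   // which cell you want
                           int columns, int rows) // grid dimensions of the sheet
{
    float cellW = 1.0f / columns;
    float cellH = 1.0f / rows;
    UvRect r;
    r.u0 = column * cellW;
    r.v0 = row * cellH;
    r.u1 = r.u0 + cellW;
    r.v1 = r.v0 + cellH;
    return r;
}

int main()
{
    // Third sprite on the second row of an 8x4 sheet.
    UvRect r = SpriteCellUv(2, 1, 8, 4);
    std::printf("u: %.3f..%.3f  v: %.3f..%.3f\n", r.u0, r.u1, r.v0, r.v1);
}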
  5. Hello again. Recently I was trying to apply 6 different textures to a cube and I noticed that some textures would not apply correctly, but if I swap the texture image for another one it works just fine. I can't really understand what's going on. I will also attach the image files. So does this have anything to do with the code, or is it just the image's fault? This is a high-quality 2048x2048 texture, brick1.jpg, which does the following: And this is another texture, 512x512 container.jpg, which gets applied correctly with the exact same texture coordinates as the previous one:

Vertex Shader

#version 330 core
layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aNormal;
layout(location = 2) in vec2 aTexCoord;

uniform mat4 model;
uniform mat4 view;
uniform mat4 proj;

out vec2 TexCoord;

void main()
{
    gl_Position = proj * view * model * vec4(aPos, 1.0);
    TexCoord = aTexCoord;
}

Fragment Shader

#version 330 core
out vec4 Color;

in vec2 TexCoord;

uniform sampler2D diffuse;

void main()
{
    Color = texture(diffuse, TexCoord);
}

Texture Loader

Texture::Texture(std::string path, bool trans, int unit)
{
    //Reverse the pixels.
    stbi_set_flip_vertically_on_load(1);

    //Try to load the image.
    unsigned char *data = stbi_load(path.c_str(), &m_width, &m_height, &m_channels, 0);

    //Image loaded successfully.
    if (data)
    {
        //Generate the texture and bind it.
        GLCall(glGenTextures(1, &m_id));
        GLCall(glActiveTexture(GL_TEXTURE0 + unit));
        GLCall(glBindTexture(GL_TEXTURE_2D, m_id));

        //Not a transparent texture.
        if (!trans)
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_width, m_height, 0, GL_RGB, GL_UNSIGNED_BYTE, data));
        }
        //Transparent texture.
        else
        {
            GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data));
        }

        //Generate mipmaps.
        GLCall(glGenerateMipmap(GL_TEXTURE_2D));

        //Texture filters.
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
        GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
    }
    //Loading failed.
    else
        throw EngineError("There was an error loading image: " + path);

    //Free the image data.
    stbi_image_free(data);
}

Texture::~Texture()
{
}

void Texture::Bind(int unit)
{
    GLCall(glActiveTexture(GL_TEXTURE0 + unit));
    GLCall(glBindTexture(GL_TEXTURE_2D, m_id));
}

Rendering Code:

Renderer::Renderer()
{
    float vertices[] = {
        // positions          // normals           // texture coords
        -0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 0.0f,
         0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 0.0f,
         0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 1.0f,
         0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 1.0f,
        -0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 1.0f,
        -0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 0.0f,

        -0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 0.0f,
         0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 0.0f,
         0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 1.0f,
         0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 1.0f,
        -0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 1.0f,
        -0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 0.0f,

        -0.5f,  0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
        -0.5f,  0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 1.0f,
        -0.5f, -0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
        -0.5f, -0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
        -0.5f, -0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 0.0f,
        -0.5f,  0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 0.0f,

         0.5f,  0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
         0.5f,  0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 1.0f,
         0.5f, -0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
         0.5f, -0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
         0.5f, -0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 0.0f,
         0.5f,  0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 0.0f,

        -0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 1.0f,
         0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 1.0f,
         0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 0.0f,
         0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 0.0f,
        -0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 0.0f,
        -0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 1.0f,

        -0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 1.0f,
         0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 1.0f,
         0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 0.0f,
         0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 0.0f,
        -0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 0.0f,
        -0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 1.0f
    };

    //Create the Vertex Array.
    m_vao = new Vao();

    //Create the Vertex Buffer.
    m_vbo = new Vbo(vertices, sizeof(vertices));

    //Create the attributes.
    m_attributes = new VertexAttributes();
    m_attributes->Push(3);
    m_attributes->Push(3);
    m_attributes->Push(2);
    m_attributes->Commit(m_vbo);
}

Renderer::~Renderer()
{
    delete m_vao;
    delete m_vbo;
    delete m_attributes;
}

void Renderer::DrawArrays(Cube *cube)
{
    //Render the cube.
    cube->Render();

    unsigned int tex = 0;
    for (unsigned int i = 0; i < 36; i += 6)
    {
        if (tex < cube->m_textures.size())
            cube->m_textures[tex]->Bind();

        GLCall(glDrawArrays(GL_TRIANGLES, i, 6));
        tex++;
    }
}
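
One thing worth ruling out in cases like the one above, where one image uploads fine and another comes out wrong, is a mismatch between the data stb_image actually returned and the format/row alignment the upload assumes: OpenGL expects 4-byte-aligned rows by default, and stb_image can return 1, 3 or 4 channels depending on the file, so hard-coding GL_RGB/GL_RGBA from a bool flag can disagree with the data. A hedged variation of the loading path, written as a standalone helper rather than the poster's Texture class (the header names are assumptions about the project setup):

#include <stdexcept>
#include <string>
#include <GL/glew.h>      // or whichever GL loader the project already uses
#include "stb_image.h"

// Hypothetical helper, not the poster's API: load an image with stb_image and
// upload it with the format that matches what the file actually contains.
GLuint LoadTexture2D(const std::string &path)
{
    int width = 0, height = 0, channels = 0;
    stbi_set_flip_vertically_on_load(1);
    unsigned char *data = stbi_load(path.c_str(), &width, &height, &channels, 0);
    if (!data)
        throw std::runtime_error("There was an error loading image: " + path);

    // Pick the format from the real channel count instead of a bool flag.
    GLenum format = GL_RGB;
    if (channels == 1) format = GL_RED;
    else if (channels == 4) format = GL_RGBA;

    GLuint id = 0;
    glGenTextures(1, &id);
    glBindTexture(GL_TEXTURE_2D, id);

    // Default unpack alignment is 4; tightly packed RGB rows are not always
    // a multiple of 4 bytes, which shows up as skewed or garbled textures.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    glTexImage2D(GL_TEXTURE_2D, 0, format, width, height, 0,
                 format, GL_UNSIGNED_BYTE, data);
    glGenerateMipmap(GL_TEXTURE_2D);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    stbi_image_free(data);
    return id;
}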
  6. I'm currently learning how to import models. I created some code that works well following this tutorial, which uses only one sampler2D in the fragment shader, and the model loads just fine with all its textures. The thing is, what happens when a mesh has more than one texture? The tutorial says to define, inside the fragment shader, N diffuse and N specular samplers named texture_diffuseN and texture_specularN and to set them via code, where N is 1, 2, 3, ..., max_samplers. I understand that, but how do you use them inside the shader? In the tutorial the shader is:

#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D texture_diffuse1;

void main()
{
    FragColor = texture(texture_diffuse1, TexCoords);
}

which works perfectly for the test model the tutorial provides. Now let's say you have the general shader:

#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D texture_diffuse1;
uniform sampler2D texture_diffuse2;
uniform sampler2D texture_diffuse3;
uniform sampler2D texture_diffuse4;
uniform sampler2D texture_diffuse5;
uniform sampler2D texture_diffuse6;
uniform sampler2D texture_diffuse7;
uniform sampler2D texture_specular1;
uniform sampler2D texture_specular2;
uniform sampler2D texture_specular3;
uniform sampler2D texture_specular4;
uniform sampler2D texture_specular5;
uniform sampler2D texture_specular6;
uniform sampler2D texture_specular7;

void main()
{
    //How am I going to decide here which diffuse texture to output?
    FragColor = texture(texture_diffuse1, TexCoords);
}

Can you explain this to me with a cube example? Let's say I have a cube, which is one mesh, and I want to apply a different texture to each face (6 in total).

#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D texture_diffuse1;
uniform sampler2D texture_diffuse2;
uniform sampler2D texture_diffuse3;
uniform sampler2D texture_diffuse4;
uniform sampler2D texture_diffuse5;
uniform sampler2D texture_diffuse6;

void main()
{
    //How am I going to output the correct texture for each face?
    FragColor = texture(texture_diffuse1, TexCoords);
}

I know that the texture coordinates will place the texture on the correct face, but how do I know which sampler to use each time the fragment shader is called? I hope you understand why I'm frustrated. Thank you
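
For what it's worth, the usual answer to the cube case above is not to choose between samplers inside the fragment shader at all: a mesh is typically drawn with one material, and a cube with six different images becomes six small draw calls (or six meshes), rebinding the texture in between, much like the renderer in post #5 already does. A minimal sketch (names illustrative):

#include <vector>
#include <GL/glew.h>   // whichever GL loader the project uses

// Draw a 36-vertex cube whose vertex buffer stores the faces consecutively,
// rebinding a different texture before each face. The fragment shader then
// needs only a single sampler2D.
void DrawCubeWithSixTextures(GLuint vao, const std::vector<GLuint> &faceTextures)
{
    glBindVertexArray(vao);
    for (int face = 0; face < 6; ++face)
    {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, faceTextures[face]);
        glDrawArrays(GL_TRIANGLES, face * 6, 6);   // 6 vertices per face
    }
}

As far as I understand the tutorial's texture_diffuseN convention, it exists for materials that genuinely layer several maps over the same surface and combine them in main(), not for picking a different image per face.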
  7. Today starts our new sprint 30. During the last sprint, we got a lot of work done. 😊 We can't publish that many pics these days, since we don't want to spoil the new video scenes. But I'll tell you something. 😄 We finished modeling all objects for the kitchen, and also for Clearwater's home office and main hall. We offer you a lil' glimpse 😉: We need more special textures for some of these objects; it's a work item for the current sprint. We designed furniture and had fun. We finished texturing some houses from Clearwater's world outside. We have to continue there, anyway. We finished modeling all animations relevant for the new video as listed in the script from the last sprint. We made a first-scene animation where he realizes something is weird. We modeled a look-out-of-the-window animation, a looking-around animation, a sneak animation, and so on. We also modeled several facial animations for the above-mentioned body animations. Some new faces can look left, right, up and down, etc. Clearwater is therefore able to move his eyes and to blink. But we still need to test ALL of these newly created animations in the current sprint. Now we can throw all the necessary-for-the-new-video ingredients into a big bowl. It's our mission henceforth to make a tasteful soup out of it. How euphemistic! We want to start creating some video sequences, i.e. positioning the cameras in Clearwater's apartment, setting light, testing all animations, fixing all Bugs and issues, adapting Clearwater's appearance, and making everything look good. After having eaten all the Bugs, the video is ready for YouTube. Furthermore, we modeled a hot animation for Clearwater's full-body 4K pic. It's supposed to be a promotional pic that will appear on social media and on our new game blog as an appetizer. 😉 During the current sprint, we want to test and render the pic of him, just minutes before we publish it. This week, I swear, I'll continue writing the book BIZARRE Episode I! Charly Men's website charly-men.com gets a new design, too. Our software developer started writing code. I get my own new Content Management System called BeeMS. Don't ask me why it is called that. 😛 The new game blog for BIZARRE is included in the charly-men.com website as it is now. The Charly Men website gets expanded and will later include the Charly Men's BIZARRE game blog, all Charly Men's books, paintings, lyrics and songs. It will hopefully be published before the sun has used up all its helium. To sum up: We want to texture houses and objects in the apartment. We want to create some video sequences to test all animations and textures. We want to fix all issues and Bugs seen during the video sequence tests. We want to continue writing the book BIZARRE Episode I. We want to render a nice 4K pic of Clearwater and publish it. We want to continue programming the new Charly Men website as well as the new game blog. I have a great appetite for soup now!
  8. I know that there are vehicle and car blueprints all over the net for almost every car model that exists or has existed, but surprisingly few textures. To be precise, photo textures of the side view, front view and rear view of even common car models are hard to find. Even hand-drawn raster textures are rare. The best that can be found is on the Google Warehouse, by typing 'textured car' in the search box. That gives you a good number of textured cars and car textures. Google Warehouse is also excellent for finding textured buildings and building textures for any region. But where on earth are the vehicle textures?
  9. ShaderMap 4.1 is here, and it is a major update. It brings with it a lot of new features, changes, and bug fixes. Most exciting are PBR (Physically Based Rendering) map creation, PBR materials, and Multi Angle Light Scan nodes. Check out ShaderMap 4.1 here: https://shadermap.com I put a lot of work into this version, so I hope everyone enjoys the update.
  10. Hornok Tibor

    Lack of idea for background

    Hello! I'm new to game development and to gamedev.net. I hope somebody can point me in the right direction. I'm developing a vertical shoot 'em up and I want to make some non-traditional backgrounds. I want to make something like hyperspace, but more cartoonish and colorful, and of course top-down. I can't find any good examples, so if any of you know a way to find good inspiration for it, thanks!
  11. Cultusfit

    GIMP Alpha Channel packing

    Just grabbed GIMP to try to do some channel packing to give myself more optimized textures. I am able to add the alpha channel, but I just can't seem to bring a greyscale image in as my alpha. Is anyone familiar with this in GIMP and able to help me out? I would appreciate it. I know you can't draw directly on the alpha, you have to use a mask, so I assume I'm overlooking how to pull the image in as the mask?
  12. For two days I have been struggling with trying to draw an image onto a plane at an exact point. I have a UV point; however, because of how shaders work, or at least my understanding of them, the piece of code only affects the current pixel. So I can easily draw gradients, change the color of the pixel at that UV point, and do all kinds of effects. However, the only way I could think of to draw the image was to calculate the four corners from the center point, but after finding those points using four "if" branches I have no idea how to use them to draw the image. The shader is just a basic unlit (shadeless) vertex and fragment shader.
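
One way to approach the problem above without the four corner branches is to remap the incoming UV into the decal's local space and only sample the decal texture when the remapped coordinate lands inside [0,1]. The math below is plain C++ so it can be checked on the CPU, but it translates directly into the unlit fragment shader; all names are illustrative.

#include <cstdio>

struct Vec2 { float x, y; };

// Map a surface UV into the local UV space of a decal centred at `center`
// with half-extents `halfSize` (all in 0..1 texture space). If the result is
// inside [0,1]^2 the fragment should sample the decal, otherwise it keeps the
// base colour.
static bool DecalUv(Vec2 uv, Vec2 center, Vec2 halfSize, Vec2 *outDecalUv)
{
    outDecalUv->x = (uv.x - (center.x - halfSize.x)) / (2.0f * halfSize.x);
    outDecalUv->y = (uv.y - (center.y - halfSize.y)) / (2.0f * halfSize.y);
    return outDecalUv->x >= 0.0f && outDecalUv->x <= 1.0f &&
           outDecalUv->y >= 0.0f && outDecalUv->y <= 1.0f;
}

int main()
{
    Vec2 decalUv;
    bool inside = DecalUv({0.52f, 0.48f}, {0.5f, 0.5f}, {0.1f, 0.1f}, &decalUv);
    std::printf("inside=%d decalUV=(%.2f, %.2f)\n", inside, decalUv.x, decalUv.y);
}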
  13. I'm looking for someone I can collaborate with in Unity or UE4; honestly, any game engine is fine. My style of art is in the low-to-mid-poly range, third-person or top-down, dungeon / World of Warcraft-esque style. Let me know if anyone is looking for group project practice. Let's make a small, simple game to start. Follow up for contact info. Here are a few examples.
  14. Sprint no. 27 is here, and we skipped sprint 26 on gamedev.net in June, as the #WORDSproductions team members sent me a lot of vacation requests over the last few weeks (and I was on holiday, too 😉), so we could not complete as many items from the sprints as we initially wanted, but we opened up a lot of nice issues either way. Take a look! First, the concept art of the next game scene is finished: the scene we want to recreate after Clearwater's apartment scene has been completed. It looks nice. It's a scene from the book BIZARRE Episode I. Furthermore, we continued texturing the houses of the world outside Charly's apartment window. We haven't finished it yet, as it has already become a little game environment, with many houses, streets, street lamps, fences, and so on. We use a nice texturing software package and began painting the bricks or the colored cement onto the houses' exterior walls. Bit by bit, one by one. We use textures we photographed in our neighborhood or found by accident anywhere in the world. We continued programming Clearwater's shooting behavior. We implemented some basic shooting mechanics in the player controller using C++. The player is now able to pull his gun, which triggers the correct state transition, causing the associated animation to play. The logic to control the shooting animation blueprint and aim offset, letting Charly aim at the target he is shooting at, is now in place. We improved main character Charly Clearwater's appearance again by adapting vertices, textures and light. But, obviously, we won't stop improving his appearance and behavior until the finished game is published one day, and after that we'll permanently upload updates including some nice main-character improvements. 😄 Next, we started modelling Charly Clearwater's facial animations. As I said, for the colored-woman video we published on Halloween last year, the facial animations could not be imported into UE4 because of a mean Bug. But now I ate the Bug, and the animations can be imported without any errors. We need some facial expressions for the planned video, but also for some 4K pics of Clearwater we want to render later on. He's now able to smile. To be continued… I continued writing the book BIZARRE Episode I. It flows fluently. No Bugs, no thinking barriers. I'll keep writing. Our plans for the current sprint 27 are: We'll continue painting walls and modelling houses, trees, etc. We'll continue programming Clearwater's shooting animation and behavior. We need to adjust his pull-the-gun-and-shoot animation. We need to make him look hot and unaffected while killing. We'll continue modelling some relevant facial expressions, import them correctly into UE4 (hopefully), test them, and adapt them until they look flawless. We want to shoot a status report video in which I talk about our successes of the last year and our targets for the next year. In May 2017 the BIZARRE game project was born, and one year later I pass the year in review. It's planned to include English subtitles, and we also plan to publish this video in July on YouTube. I'll post it here, too. Yeah, some props must be done, too. Stuff for the kitchen, some books, a couch. Peanuts. Did I miss something? I think that's enough. CU
  15. Dark Fantasy Environment and Props Hi GameDevs, I am currently working on a model pack for the Unity Asset Store. I am, however, not satisfied with the overall look of these game-ready models. There will be many architectural and prop objects, from gates and windows to lamps, furniture, decoration, etc. The style is meant to be a gloomy, mystical, dark-fantasy-inspired look, similar to a dark elf or vampire castle interior. For modeling I use Blender, for texturing Substance Designer and PS. Please take a look at my work and help me figure out how I should improve the textures. I use the Smoothness/Metallic workflow and would like to add Ambient Occlusion separately (screen space). I also use emissive maps where needed (lamps). If you see obvious flaws in contours, colors, etc., please note those too. In these images I threw the objects into Unity; no light setup or compositing was done (I still need to learn those for presenting my stuff). Link to my Sketchfab (here you can see the crystal lamps in 3D). Thank you for your attention!
  16. I'm very new to OpenGL and graphics programming, and there is probably a simple answer to this. I've made a function that defines a grid and used it to make the "ground" of my game. I added the textures, which have a diffuse, specular, and normal map. When viewed closely the textures look fine. However, from the side they look very pixelated. I would appreciate it if someone could point me in the right direction on how to handle this problem; since I'm a beginner, I'm not quite sure what's causing it. In case it helps, the texture params I'm using are GL_LINEAR_MIPMAP_LINEAR for the min filter and GL_LINEAR for the mag filter, and I'm using OpenGL to generate the mipmaps (glGenerateMipmap). Thanks again
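
Textures that look fine head-on but smear or turn pixelated at glancing angles are usually a texture-filtering question; with trilinear mipmapping already enabled as described above, the standard next step is anisotropic filtering. A minimal sketch using the EXT_texture_filter_anisotropic enums (the same tokens exist without the EXT suffix once OpenGL 4.6 is available); the loader header is an assumption about the project setup:

#include <GL/glew.h>   // assuming GLEW or a similar loader exposes the extension enums

// Enable anisotropic filtering on an already-created, mipmapped texture.
// GL_TEXTURE_MAX_ANISOTROPY_EXT comes from EXT_texture_filter_anisotropic.
void EnableAnisotropy(GLuint texture)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Ask the driver how much anisotropy it supports and use the maximum.
    float maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
}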
  17. Hello everyone, I'm looking for some advice, since I have some issues with the texture for my mouse pointer and I'm not sure where to start looking. I have checked everything that I know of, and now I need advice on what to look for in my code when I try to fix it. I have a planet that is rendered, I have a UI that is rendered, and I also have a mouse pointer that is rendered. First the planet is rendered, then the UI, and then the mouse pointer last. When the planet is done rendering, I turn off the Z-buffer and enable alpha blending while I render the UI and the mouse pointer. In the mouse pointer's pixel shader I look for black pixels, and if a pixel is black I blend it away. But what seems to happen is that it also blends parts of the texture that aren't supposed to be blended. I'm going to provide some screenshots of the effect. In the first image you can see that the mouse pointer changes color to a whiter one when it is in front of the planet; the correct color is the one displayed when it's not in front of the planet. The second thing I find weird is that the mouse pointer is behind the UI text even though it is rendered after it. I also tried switching them around and it makes no difference. Also, the UI doesn't have the same issue when it is above the planet; its color is displayed as it should be. Here is the pixel shader code, if that helps anyone get a better grip on the issue:

float4 color;
color = shaderTexture.Sample(sampleType, input.tex);

if (color.b == 0.0f && color.r == 0.0f && color.g == 0.0f)
{
    color.a = 0.0f;
}
else
{
    color.a = 1.0f;
}

return color;

The UI uses almost the same code, but it only checks the r channel of the color; I'm using all 3 channels for the mouse pointer because its colors might be a bit more off. The idea is that if the pixel is black it should be blended away. And it does work, but somehow it also does something to the parts that shouldn't be blended. Right now I'm leaning towards there being something wrong in the pixel shader, since I can set all pixels to white and it behaves as it should and gives me a white box. Any pointers on what kind of issue I'm looking at here, and what to search for to find a solution, will be appreciated a lot. Best Regards and Thanks in Advance, Toastmastern
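
For reference, and without claiming this is the cause in the post above, cursor colors that wash out only in front of other geometry are often a blend-state or draw-order issue rather than a pixel-shader one. A conventional straight-alpha blend state in D3D11 looks roughly like the sketch below (the function name and surrounding setup are illustrative); with it bound while the UI and cursor are drawn, an alpha of 0 or 1 from the shader fully hides or shows the pixel regardless of what was rendered underneath.

#include <d3d11.h>

// Create a standard "straight alpha" blend state:
//   result = src.rgb * src.a + dest.rgb * (1 - src.a)
// Assumes `device` is an existing ID3D11Device*.
ID3D11BlendState* CreateAlphaBlendState(ID3D11Device* device)
{
    D3D11_BLEND_DESC desc = {};
    desc.RenderTarget[0].BlendEnable           = TRUE;
    desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_SRC_ALPHA;
    desc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ZERO;
    desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* state = nullptr;
    device->CreateBlendState(&desc, &state);
    return state;
}

// Bound before drawing the UI and the cursor, after the opaque planet:
//   float factor[4] = { 0, 0, 0, 0 };
//   context->OMSetBlendState(state, factor, 0xFFFFFFFF);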
  18. Hi all, I'm trying to generate MIP-maps of a 2D-array texture, but only for a limited number of array layers and MIP-levels. For instance, to generate only the first 3 MIP-maps of a single array layer of a large 2D array. After experimenting with glBlitFramebuffer to generate the MIP-maps manually, but still with some sort of hardware acceleration, I ended up with glTextureView, which already works with the limited number of array layers (I can also verify the result in RenderDoc). However, glGenerateMipmap (or glGenerateTextureMipmap) always generates the entire MIP-chain for the specified array layer. Thus, the <numlevels> parameter of glTextureView seems to be ignored in the MIP-map generation process. I also tried glTexParameteri(..., GL_TEXTURE_MAX_LEVEL, 3), but this has the same result. Can anyone explain how to solve this? Here is an example of how I do it:

void GenerateSubMips(
    GLuint texID,
    GLenum texTarget,
    GLenum internalFormat,
    GLuint baseMipLevel,
    GLuint numMipLevels,
    GLuint baseArrayLayer,
    GLuint numArrayLayers)
{
    GLuint texViewID = 0;
    glGenTextures(1, &texViewID);
    glTextureView(
        texViewID, texTarget, texID, internalFormat,
        baseMipLevel, numMipLevels, baseArrayLayer, numArrayLayers
    );
    glGenerateTextureMipmap(texViewID);
    glDeleteTextures(1, &texViewID);
}

GenerateSubMips(
    myTex, GL_TEXTURE_2D_ARRAY, GL_RGBA8,
    0, 3, // only the first 3 MIP-maps
    4, 1  // only one array layer, with index 4
);

Thanks and kind regards, Lukas
  19. Louis Brady

    Week One: Basic Geometry

    Week one is done. To start off, I'd like to mention that I have no timeline for this project. This project's purpose is to get me more familiar with game development and with importing Blender OBJs into Unity, and then eventually to teach myself basic coding for a 3D platformer. Over time, I will teach myself how to rig a character, add animation in Unity, add cut scenes, and develop GUIs and a smooth user interface. The list is endless as well as standard. Also, with this project I will feel out the process of developing (and finishing) a game, which will then allow me to decide if game development is for me. I've always had the idea in the back of my head, so now is the time to discover the truth! Before I get into my update, I would like to add that I have been using Blender and Unity for a few years, but I never completed a game. I did, however, finish a game using GameMaker: https://www.newgrounds.com/portal/view/691738. This game will be different. Here is what I have accomplished this week: I was inspired by Ocarina of Time. I saw the buildings, the textures, and the low-poly characters and thought, "I could totally do something like this with Blender." Mario 64 was always a huge inspiration too, so I revisited some of the levels and noticed that most of the older N64 games used basic geometry and textures to create great games. I'm always hesitant, though, because all the other indie developers are doing unique systems and great art, things that I could not compete with. I thought, "Enough! If I'm ever going to do this, I'm going to have to do it for myself before I can please the masses." My mindset is locked in place and I'm determined to make a 64-esque platformer. I started by creating a prop set for myself. I skipped the concept art and the idea boards and jumped right into designing elements that I wanted in the game. I created cubes and put textures on them. I created ramps like I would imagine Mario 64 would do. I created stylized trees, bridges, fences, rocks, grass, and railings. Then I put all those objects into Unity and played with the coloring of the textures and the lighting. I wanted to see if my vision would survive the transfer. It did. I like how the objects look in Unity. After I finished transferring my objects, I was going to design a level with the pieces I created, but I couldn't: I needed to know how to scale my level. Which brings me to the end of my week. I just started constructing a main character. I want the player to be a small dragon. I'm in the process of figuring out the best design for him as well as how I think he should be colored. I'm texture painting him because unwrapping him seems too complicated at the moment. That's where I am now! Next Sunday I will be back with another update 😜
  20. Hi, can you guys please help out with this blade? I want it to look more like a flame blade (I will change the palette after coloring it right). I'm a complete beginner who wants to make a huge content mod for Terraria, but as you can see I'm very inexperienced at spriting.
  21. Kjell Andersson

    Caustics Generator 5.0 Available

    Spectral rendering enters the computer graphics world. After a period without many feature improvements, Dual Heights Software has now finally released a new major version of its water texture generator software, Caustics Generator. The new version includes full-spectrum color rendering, allowing the simulation of spectral refraction, which produces prismatic coloring effects and brings caustics simulation to a new level. The spectral functionality is available in the Pro version on Windows, Mac and Linux. Read more about the Caustics Generator at https://www.dualheights.se/caustics/
  23. Hi, I'm implementing a simple 3D engine based on DirectX 11. I'm trying to render a skybox with a cubemap on it, and to do so I'm using the DDS Texture Loader from the DirectXTex library. I use texassemble to generate the cubemap (a texture array of 6 textures) into a DDS file that I load at runtime. I generated a cube "dome" and sample the texture using the position vector of the vertex as the sample coordinates (so far so good), but I always get the same face of the cubemap mapped onto the sky. As I look around I always see the same face (and it wobbles a bit if I move the camera). My code:

//Texture.cpp:
Texture::Texture(const wchar_t *textureFilePath, const std::string &textureType) : mType(textureType)
{
    //CreateDDSTextureFromFile(Game::GetInstance()->GetDevice(), Game::GetInstance()->GetDeviceContext(), textureFilePath, &mResource, &mShaderResourceView);
    CreateDDSTextureFromFileEx(Game::GetInstance()->GetDevice(), Game::GetInstance()->GetDeviceContext(),
        textureFilePath, 0, D3D11_USAGE_DEFAULT, D3D11_BIND_SHADER_RESOURCE, 0,
        D3D11_RESOURCE_MISC_TEXTURECUBE, false, &mResource, &mShaderResourceView);
}

// SkyBox.cpp:
void SkyBox::Draw()
{
    // set cube map
    ID3D11ShaderResourceView *resource = mTexture.GetResource();
    Game::GetInstance()->GetDeviceContext()->PSSetShaderResources(0, 1, &resource);

    // set primitive topology
    Game::GetInstance()->GetDeviceContext()->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    mMesh.Bind();
    mMesh.Draw();
}

// Vertex Shader:
cbuffer Transform : register(b0)
{
    float4x4 viewProjectionMatrix;
};

float4 main(inout float3 pos : POSITION) : SV_POSITION
{
    return mul(float4(pos, 1.0f), viewProjectionMatrix);
}

// Pixel Shader:
SamplerState cubeSampler;
TextureCube cubeMap;

float4 main(in float3 pos : POSITION) : SV_TARGET
{
    float4 color = cubeMap.Sample(cubeSampler, pos.xyz);
    return color;
}

I tried both functions from the DDS loader, but I keep getting the same result. All the results I found on the web are about the old SDK toolkits, but I'm using the new DirectXTex lib.
  24. Looking For Talent

Looking For:
- 3D Modelers (Natural & Hard Surface)
- 3D Animators
- Texture Artists
- Game Designers

Payment: Royalty

My Experience: Programmer & Manager & Game Designer. My recent project (so you can see that I'm serious and actually finish and release games): `http://store.steampowered.com/app/787820/Rickos_Island/`

Project: A voxel open-world game. It will be released on Steam in Early Access.

Mostly completed/completed key features:
- A procedural & infinite world to explore.
- Over 12 biomes to discover, with new creatures, blocks, and resources to find.
- A voxel block building system with 150+ blocks.
- 30+ creatures to tame, train, ride, and breed (0 now, but the mechanics are ready; they just need models and animations).
- Character progression system: train your character, level up, increase skills and more.
- 6 different game modes: Survival, Freedom, Hardcore, Freecam, Creative, Adventure.
- The world noise gets crazier the farther you get from the start.
- Action System: let's be honest, it's like redstone, but with more features and wireless.
- Fully automate everything using the Action System, from a crop farm to an auto-killing creature machine.
- Plant, farm and grow using an advanced farming system.

Not completed key features:
- Forge custom tools and weapons using an advanced system.
- Fight mini-bosses and bosses in order to progress in the game and unlock new items.
- Find and master dungeons to get XP and higher-stat weapons, tools, and armor.
- Magic: get essences from nature and make potions, spells and more.

Length of Project: Hoping for 2-4 months until the game is ready to release in Early Access, plus a few more months during the Early Access period. If you can only help for a limited time, that's fine, but please mention it when you contact me!

Contact: Add me on Discord: Tbjbu2#8639
  25. Good evening, I'm new to the world of texturing and I have some questions about creating realistic non-procedural materials. I created a texture for a wooden bridge and I wondered what the best way to make it realistic was. First I created a square texture in Substance Designer (SD_Wood). Once applied to the model in Blender it is repeated several times, and I do not really like the result (Blender_Tilable). I then imported the texture into Photoshop to customize it a little (4096 x 1024) (PS_Custom). Now the result is more pleasant (Blender_Custom). My questions are: 1. Is this the best solution? 2. Are there other ways to get this kind of result? 3. If this is the best solution, how can I regenerate the normal map, height map, AO, roughness, etc.? 4. From a theoretical point of view, is a non-square texture whose dimensions are still powers of two not recommended, or can it be used in video game development? Thanks in advance for any clarifications. Have a nice day!