Silverpath Online Beta release and the sound implementation

Mert Oguz


Yesterday the beta version launched, and the very first music and sounds have finally been added. Some mechanics have changed as well; test it out and leave feedback, I will be waiting for it. Currently the game does not contain anything regarding purchasing.

 

 




6 Comments


Recommended Comments

Rutin

I don't see a link to the Beta, or your site in your post. Is this an open beta?

Mert Oguz

13 minutes ago, Rutin said:

I don't see a link to the Beta, or your site in your post. Is this an open beta?

Yes, simply search for Silverpath Online on the Google Play Store and you are in. (:

Rutin


I found it on Google Play: https://play.google.com/store/apps/details?id=com.ogzzmert.online.game&hl=en

I would highly suggest plugging your game link into your blog posts. It's better for exposure! :) Otherwise people may just pass your blog post by; I almost did. ;)

EDIT: I see you have links in your prior posts, but this is the first one of yours I've come across, and my first time seeing your game. :) Still a good idea for new viewers.


1 minute ago, Awoken said:

What platform is this for?

It appears to be only for Android.


I need help with a quest. NPC Zaveb asks about the mixed potion, and I do not understand what I should do.


  • Similar Content

    • By jb-dev
      This is what loading screens will look like. I still have no idea whether or not I could show things like tips or anything similar...
    • By Tanzan
      Hello all,
      I just finished my first Android game and published it on Google Play...
      I know it's not the next Red Dead Redemption 2, but it would be nice to get some comments/feedback on it: is it worth going on with a release 2.0, or should I move on to the next game? (Red Dead Redemption 3 )
      Anyway, thanks for your reading time, and I hope for some nice reviews!
      https://play.google.com/store/apps/details?id=com.gamlex.android.games.typomania
      Regards,
       
      Tanzan
       
       
    • By lawnjelly
      After spending many hours painstakingly attempting to model creatures entirely by hand, I finally discovered (a couple of years ago) the skin modifier in Blender. It is a fantastic, quick way to build organic creatures and shapes, especially for the artistically challenged like myself, and it also makes rigging a breeze. I thought I would write a quick description for those unfamiliar with it.
      If you want ultimate control, particularly for a low poly creature, there is no substitute for manually creating polys. However, this can be very time consuming and tedious. If you are instead in a position where you are willing to trade off speed of creation against 'perfect rendering efficiency', or are making medium/high poly models, or models for later sculpting, then one of the options available is the skin modifier.
      Using the skin modifier, instead of modelling the skin by hand you place the joints (as vertices) of a creature to build a kind of skeleton, and allow the skin modifier to automagically generate a skin around this skeleton.
       
      Process
      Typically I start off by creating a plane, then go into edit mode, and merge the vertices to 1 in the centre. Next set up the modifier stack to create the skin. At the top of the stack goes a mirror modifier, because most animals are symmetrical bilaterally. Next goes the skin modifier, which creates a simple box-like skin around the skeleton. Finally add a subsurface modifier to smooth the skin, and make it more organic.
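      (If you prefer to script this setup, here is a minimal sketch using Blender's Python API, bpy; it assumes a fresh scene, and the modifier names are just my own labels.)

      import bpy

      # Start from a plane and merge its vertices down to a single vertex,
      # as described above.
      bpy.ops.mesh.primitive_plane_add()
      obj = bpy.context.active_object
      bpy.ops.object.mode_set(mode='EDIT')
      bpy.ops.mesh.select_all(action='SELECT')
      bpy.ops.mesh.merge(type='CENTER')
      bpy.ops.object.mode_set(mode='OBJECT')

      # Modifier stack, top to bottom: mirror (bilateral symmetry),
      # skin (generates the box-like skin around the skeleton),
      # subdivision surface (smooths the skin into something organic).
      obj.modifiers.new(name="Mirror", type='MIRROR')
      obj.modifiers.new(name="Skin", type='SKIN')
      obj.modifiers.new(name="Subsurf", type='SUBSURF')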
      Once the modifier stack is ready you can begin modelling. In the case of this bird, I started with a top-down view. Select the start vertex (there should now be a 'blob' around the single merged vertex), and create the skeleton by pressing 'e' to extrude and place a new vertex. I did this to place several vertices to create a backbone for the bird. You can then create wings and legs by picking one of the vertices in the backbone and extruding to the side.
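      (The same skeleton-building can be done in script form as well. A sketch with made-up joint coordinates, run in edit mode on the single-vertex mesh from the setup above:)

      import bpy
      import bmesh

      obj = bpy.context.active_object
      bm = bmesh.from_edit_mesh(obj.data)
      bm.verts.ensure_lookup_table()

      # Build a backbone as a chain of vertices connected by edges; the skin
      # modifier will skin whatever skeleton these edges form. Wings and legs
      # would branch off a backbone vertex in the same way.
      spine = [(0.0, y, 0.0) for y in (0.2, 0.4, 0.6, 0.8)]   # hypothetical joints
      prev = bm.verts[0]
      for co in spine:
          v = bm.verts.new(co)
          bm.edges.new((prev, v))
          prev = v

      bmesh.update_edit_mesh(obj.data)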
      If you follow this process you can form a rough top-down skeleton. It doesn't have to be exact, because it is easy to adjust; that is one of the beauties of the skin modifier. I find it useful to google pictures of the skeleton of the animal for reference. Next, look at side views and adjust the up-down position of the vertices (joints). The legs needed to go downwards, and the head slightly up. Once I am happy with the basics of the structure, I start to fill it out. You do this by selecting a vertex, pressing 'ctrl-a', then dragging with the mouse. You can make the skin thicker or thinner at each vertex.
      This can quickly give you a reasonable shape. You can further refine the shape by pressing 'ctrl-a' then limiting to either the x or y axis by pressing 'x' or 'y' before dragging. I used this to give a broad flat tail and wings.  
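      (Under the hood, the thickness you adjust with 'ctrl-a' is stored as a two-component radius on each skin vertex, so it can also be set from Python. A small sketch; the vertex indices here are hypothetical.)

      import bpy

      obj = bpy.context.active_object          # object carrying the skin modifier
      skin_layer = obj.data.skin_vertices[0]   # per-vertex skin data layer

      # A broad, flat tail: wide on one axis, thin on the other.
      for i in (10, 11, 12):                   # hypothetical tail vertex indices
          skin_layer.data[i].radius = (0.4, 0.05)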
      Conclusion
      Pretty soon you can build quite a good model. You can tweak a few things in the skin modifier; in particular, setting a root vertex (e.g. the pelvis) can make later animation easier.

       
      The skin modifier also makes rigging easy. Once you are happy with your skeleton, make a copy of the whole thing (so you don't lose the original), then choose 'create armature' from the skin modifier. This will create an armature and link it to the mesh, ready for posing and animating!
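      (Scripted, that last step is a single operator call. A sketch, assuming the modifier kept the default name "Skin":)

      import bpy

      # Duplicate first so the original skeleton mesh is preserved.
      bpy.ops.object.duplicate()

      # Generate an armature from the skin skeleton and bind it to the mesh.
      bpy.ops.object.skin_armature_create(modifier="Skin")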
      I also typically choose smooth shading in the skin modifier, then manually add hard edges in mesh edit mode (ctrl-e, hard edge, used in combination with the edge-split modifier). I use the same mode to select seams for UV mapping. Note that once I finish the skin modifier version I usually have to do a little manual tweaking of the polys, because there are some things the modifier is not good at.
      Anyway, this has been a brief introduction to the method; I would encourage trying it and following some YouTube tutorials.
       
      After some decimating and very rough texturing (~640 tris)

       
    • By _Nyu
      Hello,
      I'm trying to make a PBR Vulkan renderer, and I wanted to implement spherical harmonics for the irradiance part (and maybe PRT in the future, but that's another story).
      The evaluation on the shader side seems okay (it looks good if I hardcode the SH directly in the shader), but when I try to generate the SH from a .hdr map, the output is grayscale only.
      I've been trying to debug this for three days now, and I just have no clue why all my colour coefficients are gray.
      Here is the generation code:
       
      SH2 ProjectOntoSH9(const glm::vec3& dir)
      {
          SH2 sh;

          // Band 0
          sh.coef0.x = 0.282095f;

          // Band 1
          sh.coef1.x = 0.488603f * dir.y;
          sh.coef2.x = 0.488603f * dir.z;
          sh.coef3.x = 0.488603f * dir.x;

          // Band 2
          sh.coef4.x = 1.092548f * dir.x * dir.y;
          sh.coef5.x = 1.092548f * dir.y * dir.z;
          sh.coef6.x = 0.315392f * (3.0f * dir.z * dir.z - 1.0f);
          sh.coef7.x = 1.092548f * dir.x * dir.z;
          sh.coef8.x = 0.546274f * (dir.x * dir.x - dir.y * dir.y);

          return sh;
      }

      SH2 ProjectOntoSH9Color(const glm::vec3& dir, const glm::vec3& color)
      {
          SH2 sh = ProjectOntoSH9(dir);
          SH2 shColor;
          shColor.coef0 = color * sh.coef0.x;
          shColor.coef1 = color * sh.coef1.x;
          shColor.coef2 = color * sh.coef2.x;
          shColor.coef3 = color * sh.coef3.x;
          shColor.coef4 = color * sh.coef4.x;
          shColor.coef5 = color * sh.coef5.x;
          shColor.coef6 = color * sh.coef6.x;
          shColor.coef7 = color * sh.coef7.x;
          shColor.coef8 = color * sh.coef8.x;
          return shColor;
      }

      void SHprojectHDRImage(const float* pixels, glm::ivec3 size, SH2& out)
      {
          double pixel_area = (2.0f * M_PI / size.x) * (M_PI / size.y);
          glm::vec3 color;
          float weightSum = 0.0f;
          for (unsigned int t = 0; t < size.y; t++)
          {
              float theta = M_PI * (t + 0.5f) / size.y;
              float weight = pixel_area * sin(theta);
              for (unsigned int p = 0; p < size.x; p++)
              {
                  float phi = 2.0 * M_PI * (p + 0.5) / size.x;
                  color = glm::make_vec3(&pixels[t * size.x + p]);
                  glm::vec3 dir(sin(phi) * cos(theta), sin(phi) * sin(theta), cos(theta));
                  out += ProjectOntoSH9Color(dir, color) * weight;
                  weightSum += weight;
              }
          }
          out.print();
          out *= (4.0f * M_PI) / weightSum;
      }
      Outside of the SHprojectHDRImage function, this is pretty much the code from MJP, which you can check here:
      https://github.com/TheRealMJP/LowResRendering/blob/2f5742f04ab869fef5783a7c6837c38aefe008c3/SampleFramework11/v1.01/Graphics/SH.cpp
      I'm not doing anything fancy in terms of math or code, but this is my first time working with spherical harmonics, so I feel like I've forgotten something important.
      Basically, for every pixel of my equirectangular .hdr map I generate a direction, fetch the colour, and project it onto the SH basis.
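      (In other words, I believe I'm computing the discrete version of the standard SH projection integral below; the weight w_t is what my code calls pixel_area * sin(theta), and the final 4*pi / sum(w) scale matches the out *= (4.0f * M_PI) / weightSum line.)

      $$c_i \;=\; \int_{\Omega} L(\omega)\, Y_i(\omega)\, \mathrm{d}\omega \;\approx\; \frac{4\pi}{\sum_{t,p} w_t} \sum_{t=0}^{H-1} \sum_{p=0}^{W-1} L(t, p)\, Y_i(\omega_{t,p})\, w_t, \qquad w_t = \frac{2\pi}{W} \cdot \frac{\pi}{H} \, \sin\theta_t$$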
      But strangely I end up with an SH that looks like this:
      coef0: 1.42326 1.42326 1.42326
      coef1: -0.0727784 -0.0727848 -0.0727895
      coef2: -0.154357 -0.154357 -0.154356
      coef3: 0.0538229 0.0537928 0.0537615
      coef4: -0.0914876 -0.0914385 -0.0913899
      coef5: 0.0482638 0.0482385 0.0482151
      coef6: 0.0531449 0.0531443 0.0531443
      coef7: -0.134459 -0.134402 -0.134345
      coef8: -0.413949 -0.413989 -0.414021
      This is with the HDR map "Ditch River" from this web page: http://www.hdrlabs.com/sibl/archive.html
      But I also get grayscale output on the six other .hdr maps I tried from HDRI Haven; it's just a different gray each time.
      If anyone has any clue, it would be really welcome.
    • By congard
      Hello!
      I tried to implement Morgan McGuire's method, but my attempts failed. He describes the method here: Screen Space Ray Tracing. My code and a screenshot are below.
      SSLR fragment shader:
      #version 330 core

      uniform sampler2D normalMap; // in view space
      uniform sampler2D depthMap; // in view space
      uniform sampler2D colorMap;
      uniform sampler2D reflectionStrengthMap;
      uniform mat4 projection;
      uniform mat4 inv_projection;

      in vec2 texCoord;

      layout (location = 0) out vec4 fragColor;

      vec3 calcViewPosition(in vec2 texCoord) {
          // Combine UV & depth into XY & Z (NDC)
          vec3 rawPosition = vec3(texCoord, texture(depthMap, texCoord).r);

          // Convert from (0, 1) range to (-1, 1)
          vec4 ScreenSpacePosition = vec4(rawPosition * 2 - 1, 1);

          // Undo Perspective transformation to bring into view space
          vec4 ViewPosition = inv_projection * ScreenSpacePosition;

          ViewPosition.y *= -1;

          // Perform perspective divide and return
          return ViewPosition.xyz / ViewPosition.w;
      }

      // By Morgan McGuire and Michael Mara at Williams College 2014
      // Released as open source under the BSD 2-Clause License
      // http://opensource.org/licenses/BSD-2-Clause
      #define point2 vec2
      #define point3 vec3

      float distanceSquared(vec2 a, vec2 b) { a -= b; return dot(a, a); }

      // Returns true if the ray hit something
      bool traceScreenSpaceRay(
          // Camera-space ray origin, which must be within the view volume
          point3 csOrig,
          // Unit length camera-space ray direction
          vec3 csDir,
          // A projection matrix that maps to pixel coordinates (not [-1, +1]
          // normalized device coordinates)
          mat4x4 proj,
          // The camera-space Z buffer (all negative values)
          sampler2D csZBuffer,
          // Dimensions of csZBuffer
          vec2 csZBufferSize,
          // Camera space thickness to ascribe to each pixel in the depth buffer
          float zThickness,
          // (Negative number)
          float nearPlaneZ,
          // Step in horizontal or vertical pixels between samples. This is a float
          // because integer math is slow on GPUs, but should be set to an integer >= 1
          float stride,
          // Number between 0 and 1 for how far to bump the ray in stride units
          // to conceal banding artifacts
          float jitter,
          // Maximum number of iterations. Higher gives better images but may be slow
          const float maxSteps,
          // Maximum camera-space distance to trace before returning a miss
          float maxDistance,
          // Pixel coordinates of the first intersection with the scene
          out point2 hitPixel,
          // Camera space location of the ray hit
          out point3 hitPoint) {

          // Clip to the near plane
          float rayLength = ((csOrig.z + csDir.z * maxDistance) > nearPlaneZ) ?
              (nearPlaneZ - csOrig.z) / csDir.z : maxDistance;
          point3 csEndPoint = csOrig + csDir * rayLength;

          // Project into homogeneous clip space
          vec4 H0 = proj * vec4(csOrig, 1.0);
          vec4 H1 = proj * vec4(csEndPoint, 1.0);
          float k0 = 1.0 / H0.w, k1 = 1.0 / H1.w;

          // The interpolated homogeneous version of the camera-space points
          point3 Q0 = csOrig * k0, Q1 = csEndPoint * k1;

          // Screen-space endpoints
          point2 P0 = H0.xy * k0, P1 = H1.xy * k1;

          // If the line is degenerate, make it cover at least one pixel
          // to avoid handling zero-pixel extent as a special case later
          P1 += vec2((distanceSquared(P0, P1) < 0.0001) ? 0.01 : 0.0);
          vec2 delta = P1 - P0;

          // Permute so that the primary iteration is in x to collapse
          // all quadrant-specific DDA cases later
          bool permute = false;
          if (abs(delta.x) < abs(delta.y)) {
              // This is a more-vertical line
              permute = true; delta = delta.yx; P0 = P0.yx; P1 = P1.yx;
          }

          float stepDir = sign(delta.x);
          float invdx = stepDir / delta.x;

          // Track the derivatives of Q and k
          vec3  dQ = (Q1 - Q0) * invdx;
          float dk = (k1 - k0) * invdx;
          vec2  dP = vec2(stepDir, delta.y * invdx);

          // Scale derivatives by the desired pixel stride and then
          // offset the starting values by the jitter fraction
          dP *= stride; dQ *= stride; dk *= stride;
          P0 += dP * jitter; Q0 += dQ * jitter; k0 += dk * jitter;

          // Slide P from P0 to P1, (now-homogeneous) Q from Q0 to Q1, k from k0 to k1
          point3 Q = Q0;

          // Adjust end condition for iteration direction
          float end = P1.x * stepDir;

          float k = k0, stepCount = 0.0, prevZMaxEstimate = csOrig.z;
          float rayZMin = prevZMaxEstimate, rayZMax = prevZMaxEstimate;
          float sceneZMax = rayZMax + 100;
          for (point2 P = P0;
               ((P.x * stepDir) <= end) && (stepCount < maxSteps) &&
               ((rayZMax < sceneZMax - zThickness) || (rayZMin > sceneZMax)) &&
               (sceneZMax != 0);
               P += dP, Q.z += dQ.z, k += dk, ++stepCount) {

              rayZMin = prevZMaxEstimate;
              rayZMax = (dQ.z * 0.5 + Q.z) / (dk * 0.5 + k);
              prevZMaxEstimate = rayZMax;
              if (rayZMin > rayZMax) {
                  float t = rayZMin; rayZMin = rayZMax; rayZMax = t;
              }

              hitPixel = permute ? P.yx : P;
              // You may need hitPixel.y = csZBufferSize.y - hitPixel.y; here if your vertical axis
              // is different than ours in screen space
              sceneZMax = texelFetch(csZBuffer, ivec2(hitPixel), 0).r;
          }

          // Advance Q based on the number of steps
          Q.xy += dQ.xy * stepCount;
          hitPoint = Q * (1.0 / k);
          return (rayZMax >= sceneZMax - zThickness) && (rayZMin < sceneZMax);
      }

      void main() {
          vec3 normal = texture(normalMap, texCoord).xyz * 2.0 - 1.0;
          vec3 viewPos = calcViewPosition(texCoord);

          // Reflection vector
          vec3 reflected = normalize(reflect(normalize(viewPos), normalize(normal)));

          vec2 hitPixel;
          vec3 hitPoint;

          bool tssr = traceScreenSpaceRay(
              viewPos,
              reflected,
              projection,
              depthMap,
              vec2(1366, 768),
              0.0, // zThickness
              -1.0, // nearPlaneZ
              1.0, // stride
              0.0, // jitter
              32, // maxSteps
              32, // maxDistance
              hitPixel,
              hitPoint
          );

          //fragColor = texture(colorMap, hitPixel);

          if (tssr) fragColor = mix(texture(colorMap, texCoord), texture(colorMap, hitPixel), texture(reflectionStrengthMap, texCoord).r);
          else fragColor = texture(colorMap, texCoord);
      }

      Screenshot:

      I create a projection matrix like this:
      glm::perspective(glm::radians(90.0f), (float) WIN_W / (float) WIN_H, 1.0f, 32.0f)
      This is what happens if I display the image like this:
      fragColor = texture(colorMap, hitPixel)
      colorMap: 
      normalMap: 
      depthMap:
      What am I doing wrong? Perhaps I misunderstand the values of csOrig, csDir and zThickness, so I would be glad if you could help me understand what these variables are.