    1. Past hour
    2. https://discord.gg/DZPrnG Click to join now and join in with everyone's development, share snippets, or even share memes. See you soon!
    9. This is what my statement is about: you can put anything into a unity file, but this also means building your whole architecture every time you change just a single line of code. If you can modularize your project, setting up a Log System, Resource Manager, or whatever once and never touching it again, why would you want to put that into your unity file too? Instead, compile those parts once and use a clever tool like Visual Studio that knows when to recompile. There's no need to process things that are never touched. @lawnjelly's suggestion of using development-time DLLs/SOs and release-time static libraries is a good addition.
    10. Today
    11. Hi. As you know, there are many styles of games that need different multiplayer architectures and connection models. For example, the architecture for an XO or chess game is different from Clash Royale, Clash Royale is different from Clash of Clans, and those are different from Call of Duty. As I'm working on some new ideas, I'm looking for standards or best practices. The question is: are there any best practices, or do I just have to do R&D for my games? If there are best practices for turn-based architecture, turn organization, and so on, can you give me references?
    12. Zakwayda

      Capsule-Capsule Detection

      Too late to edit my previous post, but that code seems to test for whether the capsules are just touching, whereas I'm guessing you want to know if they're intersecting at all. For that, you only need to check that the distance is < or <= twice the radius (or you can do a squared-distance check and save some math). It looks like the source you're using as reference already addresses numerical problems. If there's some other issue you're trying to address via an epsilon, maybe you could clarify what it is.
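      For what it's worth, here's a minimal sketch of that squared-distance version (illustrative only; it assumes you already have the two closest points on the capsule axes as numpy arrays, and a shared radius like the S_RADIUS constant in your code):

import numpy as np

def capsules_overlap(closest_a, closest_b, radius):
    # closest_a / closest_b are the closest points on the two capsule axes
    # (the segment-to-segment result). Comparing squared distances avoids the sqrt.
    d = np.asarray(closest_a) - np.asarray(closest_b)
    return float(np.dot(d, d)) <= (2.0 * radius) ** 2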
    13. I'm positive my code is correct, with the exception of my implementation of multiplayer functionality with the tracker bots. I'm following this course closely. I can't say I understand the way controllers work in C++ just yet, but I do understand that the code which controls my character is located within Character.cpp and its header. I appreciate the help a lot. So I have to learn how to make a controller class, then? Would you be able to tell me why this is the case? Why do controllers in C++ have to be separated from the character? I will check these videos out, thank you, much appreciated.
    14. Matze Hu

      Translation needed?

      push
    15. Zakwayda

      Capsule-Capsule Detection

      Irrespective of whether the rest of the code is correct (it may or may not be), I'm not sure how this is supposed to work:

    diff = abs( close_d - (2*S_RADIUS) )
    if(diff <= epsilon):
        return True
    else:
        return False

      If the distance is less than twice the radius, then 'close_d - (2*S_RADIUS)' will be some negative value, and 'abs( close_d - (2*S_RADIUS) )' will be some positive value that's likely > epsilon, therefore the function will (incorrectly) return False. Am I missing something there? (I may be.)
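      For comparison, a minimal sketch of how the end of that function would read with a plain distance comparison (same variable names as the posted code; an epsilon is only needed as an extra tolerance on top of the radius check):

    # close_d: closest distance between the two segment axes
    # S_RADIUS: the (shared) capsule radius
    return close_d <= 2 * S_RADIUS   # overlapping or just touching
    # or, with a small tolerance for floating-point error:
    # return close_d <= 2 * S_RADIUS + epsilon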
    16. alimuzaffarkhan

      Looking For Constructive Criticism(My First Sprites)

      Constructive criticism... You do know that this is the Internet, right? :P Jokes apart... that looks good. Although I am not a graphics guy by a long shot.
    17. fleabay

      plane game

      I see your point, but it's not the same thing. I should have sent him a PM about the proper use of a blog, but I don't want to get flagged for using PMs for harassment and end up with another unwarranted warning.
    18. ToeBeans The Brave

      Looking For Constructive Criticism(My First Sprites)

      Thank you for the advice! I totally see what you mean about it seeming "glued" in some areas, and I'll work on fixing it. I found the videos that you provided to be particularly helpful and will be coming back to reference them.
    19. Wow! I wish my first sprites looked like this! Very good job. Some things to note:

        1. When you're doing an idle stance, look for parts that seem "glued" or stuck. It looks as if the sprite is doing shrugs. You should try to incorporate more motion if possible. Adding in some "sway" could breathe more life into the animation.
        2. The same goes for the running animation: the shoulders and upper body appear motionless in terms of vertical movement. I don't have too much time to find resources, but look at the end of this video. Notice the full-body movement?
    20. Hello everyone! I'm new to programming and pixel art, and I would greatly appreciate hearing some feedback on my Idle and Running animations. I'm looking for constructive criticism so that I can become better at making sprites in the future, so don't hold back!
    21. Prototype

      Github and Viruses

      That's not an open question, it's simply 'no'. It could, however, trigger a vulnerability in the software used to open it, as we've seen with the JPG decoder in GDI. That's why it's important to protect your software from buffer overruns and such. I think the major reason GitHub has this in place is to prevent liability claims; after all, you could theoretically download an infected executable with their service. You'd still need to run it yourself in some way, though. There is no magic in your file system that lets downloaded files do anything.
    22. phil67rpg

      plane game

      Well, let me explain my blog. I first have drawn two planes using sprites; they start on opposite sides of the screen. They can rotate left and right using either the left or right arrow keys, or the "a" or "d" keys. The ships can also move up and down using the up and down arrow keys or the "w" or "x" keys. Furthermore, the ships can shoot bullets using the space bar or the "s" key. However, when a ship gets hit by a bullet I want to draw an animated collision sprite, but it does not do this.
    23. jbadams

      plane game

      My irony sense is tingling.
    24. The tools update script is needed if you build Unreal Engine from source (GitHub), but not if you just use the download. How confident are you that your C++ code is correct, and that you're not using something like global or static variables that let data "bleed" between objects? How confident are you that your character has the correct controller assigned? (Characters need controllers, especially in networked games -- trying to drive a Character/Actor directly is bound to fail because you're going against how the engine wants to work.) Can you get a simple multiplayer project to work correctly using Blueprint? If so, start looking at what's different between that project and yours. Here are the YouTube videos I was talking about, I highly recommend them! There were some networking updates in 4.20 and onwards; here's another series of videos: Looking through these videos will likely help you get another view into how networking in Unreal is supposed to work in general, which will probably help you build up the skills necessary to debug whatever your current problem is. Another option is to start at the beginning: set breakpoints where your character is supposed to receive movement commands, set a breakpoint where the keyboard input generates the movement commands, and then step through it all in the debugger, making sure that the data goes to the right place at each step. At some point in this process, you will find that the command somehow gets ignored, dropped, changed, or countermanded, or maybe just not sent upstream, and that will be your problem.
    25. ThinkSmall98

      Capsule-Capsule Detection

      Hi, I used the 3D shortest distance between two line segments algorithm at this website: http://geomalgorithms.com/a07-_distance.html#dist3D_Segment_to_Segment This function in Python is checking if two capsules intersect. I checked the algorithm from the website and it seems to work. I tried implementing an epsilon to help with floating point error, but I don't think I did it correctly. Help would be much appreciated.

def check_intersection(particle1, particle2):
    decimal.getcontext().prec = 100
    epsilon = 2**-52            #implement epsilon
    small_num = 0.00000001      #number to check if they're closely parallel

    u = particle1.get_s()              #s1
    v = particle2.get_s()              #s2
    p0 = particle1.get_p1_position()   #P0
    q0 = particle2.get_p1_position()   #Q0
    w = np.array([p0[0]-q0[0], p0[1]-q0[1], p0[2]-q0[2]])  #distance from 2 particles from their p1's

    a = u[0]**2 + u[1]**2 + u[2]**2          #dot product of u*u. Always >=0
    b = u[0]*v[0] + u[1]*v[1] + u[2]*v[2]    #dot product of u*v.
    c = v[0]**2 + v[1]**2 + v[2]**2          #dot product of v*v. Always >=0
    d = u[0]*w[0] + u[1]*w[1] + u[2]*w[2]    #dot product of u*w
    e = v[0]*w[0] + v[1]*w[1] + v[2]*w[2]    #dot product of v*w
    D = (a*c) - b**2                         #always >=0

    #Set all to defaults
    sc = sN = sD = D    #sc = sN / sD, default sD = D >= 0
    tc = tN = tD = D    #tc = tN / tD, default tD = D >= 0

    if D**2 < small_num:    # checks if SCs are parallel
        sN = 0.0            # force using point P0 on segment S1
        sD = 1.0            # to prevent possible division by 0.0 later
        tN = e
        tD = c
    else:                   # get the closest points on the infinite lines
        sN = (b * e) - (c * d)
        tN = (a * e) - (b * d)
        if sN < 0.0:
            sN = 0.0
            tN = e
            tD = c
        elif sN > sD:       # sc > 1 => the s=1 edge is visible
            sN = sD
            tN = (e + b)
            tD = c

    if tN < 0.0:            # tc < 0 => the t=0 edge is visible
        tN = 0.0
        # recompute sc for this edge
        if -d < 0.0:
            sN = 0.0
        elif -d > a:
            sN = sD
        else:
            sN = -d
            sD = a
    elif tN > tD:           # tc > 1 => the t=1 edge is visible
        tN = tD
        # recompute sc for this edge
        if (-d + b) < 0.0:
            sN = 0.0
        elif (-d + b) > a:
            sN = sD
        else:
            sN = (-d + b)
            sD = a

    # division to get sc and tc
    if abs(sN) < small_num:
        sc = 0.0
    else:
        sc = sN / sD
    if abs(tN) < small_num:
        tc = 0.0
    else:
        tc = tN / tD

    # difference of 2 closest points
    dP = np.array([w[0] + (sc * u[0]) - (tc * v[0]),
                   w[1] + (sc * u[1]) - (tc * v[1]),
                   w[2] + (sc * u[2]) - (tc * v[2])])
    # dP = w + np.multiply(sc,u) - np.multiply(tc,v)  #S1(sc) - S2(tc)

    close_d = math.sqrt(dP[0]**2 + dP[1]**2 + dP[2]**2)  # closest distance b/w 2 lines

    # check if distance <= radius * 2, if so, INTERSECTION!
    diff = abs( close_d - (2*S_RADIUS) )
    if(diff <= epsilon):
        return True
    else:
        return False
    26. I re-installed the Unreal Engine and this fixed the issue. I don't know what a DLL is or how to use the tools updater script. I don't have a Controller set up in Unreal Engine; I'm still very much learning, but I feel comfortable with most of the terms and how to operate in Unreal Engine with C++. Tom said that there might be an override function that isn't calling its Super::MyFunction, which I could not find. He said that the character is automatically server implemented, but I am using different movement code in my Character.cpp than him. The thing I am trying to get my head around is that I am adding code to the AI "TrackerBot", but that is somehow affecting my Character, which seems odd to me. I am going to try some different things and see what works best. I am trying to keep my code from becoming messy. I also can't do damage to these tracker bots for some reason, but I think I can fix that. My character works fine on the client side of things. I appreciate the help hplus0603
    27. Here: https://www.moddb.com/company/honor/jobs/unpaid-3d-artist-wanted-for-released-sifi-rts-mod Let me know if you're interested.
    28. I have an application that renders to two targets:

struct PixelOut
{
    float4 Colour : SV_Target0;
    unsigned int Pick : SV_Target1;
};

The value I write to SV_Target1 is an ID that describes the geometry rendered. After rendering the frame, I read this texture back on the CPU for analysis. The above works fine when I have multisampling disabled, but when I enable it, I also have to set the second render target to use sampling too. That obviously doesn't work, as I need the ID to stay intact, not become some resolved value that produces a different number. Is there any way around this? I am using DirectX 11.
    29. Yesterday
    30. fleabay

      plane game

      Giving advice where none is asked for on a blog is bad form. You did the same thing to me on my blog.
    31. Yeah, Nagle doesn't impact anything with large packets. Instead, it's quite likely that you drop a packet now and then, and the TCP stream temporarily stalls. This is what TCP connections do. You need to add enough buffering / read-ahead that a few hundred milliseconds doesn't matter; you just run down to less data available in the buffer while that goes on. If the additional delay from a playout buffer is not acceptable for you, then you'll likely want to use UDP instead of TCP for streaming. That, of course, opens up other problems. You might want to look into a dedicated protocol on top of UDP, like RTP and RTCP.
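      To make the read-ahead idea concrete, here's a rough sketch of a playout buffer (illustrative only; the names, chunk size, and buffer depth are made up and would need tuning for a 2 Gb/s stream): a reader thread drains the socket as fast as data arrives, and playback only starts once a few hundred milliseconds of data are queued, so a short TCP stall just eats into that margin.

import socket
import threading
import queue
import time

CHUNK = 64 * 1024
PREFILL_CHUNKS = 64   # how much read-ahead to accumulate before starting playback

def reader(sock: socket.socket, buf: "queue.Queue[bytes]") -> None:
    # Drain the TCP stream as fast as data arrives, independently of playback.
    while True:
        data = sock.recv(CHUNK)
        if not data:
            break
        buf.put(data)

def play(sock: socket.socket, handle_chunk) -> None:
    buf: "queue.Queue[bytes]" = queue.Queue()
    threading.Thread(target=reader, args=(sock, buf), daemon=True).start()

    # Pre-fill the buffer so a 200-300 ms stall is absorbed by the queued data.
    while buf.qsize() < PREFILL_CHUNKS:
        time.sleep(0.01)

    while True:
        handle_chunk(buf.get())   # only blocks if a stall outlasts the buffered margin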
    32. "unable to create child process" can be one of many things. Does it work if you run as admin? Does it work if you install more RAM? Is there a DLL dependency that changed? Have you re-run the Unreal Engine tools updater script? When it comes to Unreal networking, you will typically not give authority to a client; instead the client will send input messages to the controller for the character actor on the server, and those would be repeated out to all other clients. There are some reasonable YouTube tutorials about this that I've liked in the past. (I don't know if they changed the network system in the last few versions, it's been a while since I used this.)
    33. Zakwayda

      OBB-OBB detected 'intersected' but It's not

      Ok, in that case, the code you posted is incomplete. It'll return the correct results for some configurations, but will return false positives for others. (As I mentioned earlier those other configurations don't necessarily come up often in practice, so such an implementation can appear to work reliably even though it's incomplete.) It's also worth noting that while your code is a general-purpose solution for arbitrary convex polytopes, the test for oriented boxes can be implemented more efficiently by exploiting the regularity of the shape (the code the OP posted is an example of this).
    34. 51mon

      3D smooth voxel terrain LOD on GPU

      Obviously I don't know if this is suitable for your setup. Anyhow, what I meant was just that SDFs are good for computing collisions. You know when you intersect with them, and you can even get the vector (distance, direction) to the closest point. So if you have an SDF in video memory you can do collisions efficiently and robustly on the GPU. It's also possible to read it back to the CPU. You should do that triple buffered, by the way, otherwise there's a high chance of stalling.
    35. _WeirdCat_

      OBB-OBB detected 'intersected' but It's not

      It's a 3D thing; I just made screenshots of one projection so one can see it, and this stays true for 3D too. These shapes are actually 3D objects. However, the OP's EPSILON is too small for these float comparisons; he should use 0.001 as the epsilon.
    36. Zakwayda

      OBB-OBB detected 'intersected' but It's not

      Yeah, I'm familiar with the algorithm (nice diagrams by the way!). And it works more or less as you describe for 2-d. The problem is that the OP is performing the test in 3-d, and in 3-d you have to test additional axes to cover all possible configurations. So although your test may be correct for 2-d, it doesn't (fully) address the specific problem under discussion.
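      For reference, here's a sketch of the full 3D separating-axis test for two OBBs (illustrative code, not the OP's): besides the three face normals of each box, the nine edge-edge cross products have to be tested, giving fifteen candidate axes in total.

import numpy as np

def obb_sat_intersect(center_a, axes_a, half_a, center_b, axes_b, half_b):
    # axes_* are 3x3 arrays whose rows are the box's local unit axes,
    # half_* are the half-extents along those axes.
    axes_a, axes_b = np.asarray(axes_a, float), np.asarray(axes_b, float)
    half_a, half_b = np.asarray(half_a, float), np.asarray(half_b, float)
    t = np.asarray(center_b, float) - np.asarray(center_a, float)

    # Candidate axes: 3 face normals of A, 3 of B, and the 9 edge-edge cross products.
    candidates = list(axes_a) + list(axes_b) + [
        np.cross(a, b) for a in axes_a for b in axes_b
    ]

    for axis in candidates:
        length = np.linalg.norm(axis)
        if length < 1e-6:
            continue          # near-parallel edges give a degenerate axis; skip it
        axis = axis / length
        # Projection radii of both boxes onto the axis, and the center distance along it.
        ra = np.sum(half_a * np.abs(axes_a @ axis))
        rb = np.sum(half_b * np.abs(axes_b @ axis))
        if abs(np.dot(t, axis)) > ra + rb:
            return False      # found a separating axis
    return True               # no separating axis found: the boxes overlap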
    37. _WeirdCat_

      OBB-OBB detected 'intersected' but It's not

      Found one bug:

    const float EPSILON = 1.0e-6f;

      It seems you can't compare with an epsilon smaller than 0.001 here. I'll try to explain the twisted logic behind this; I've studied it a few times, always with the same result. Consider that all faces point outside of the shape (test A against B): go through A's faces and check whether all of B's vertices lie in front of one of those faces; if yes, there's no intersection at all (return false). If the first test passes, then you test B against A to see whether all of A's vertices lie in front of any B face; otherwise they overlap.
    38. dyl_pickle

      Art Dump

    39. Green_Baron

      Github and Viruses

      *pullsheadoutofsand* I would assume that it is at least improbable, and that GitHub performs checks to avoid being obviously contagious, because a bad reputation might drive people off. But I doubt one can categorically exclude it. Can malware be transported and activated through text files or images? I only know rumors, no idea ... *sticksheadbackintosand* :-)
    40. I'm looking to find concept artists, 3D modelers, riggers, and animators to collaborate on a project in Unity. The game we would like to create is an RPG. We are not set on any one idea, as we would like this project to be a true reflection of the team. I'm creating this team for experience; however, a complete game is the goal. I only ask that participants be patient and willing to adapt to problems that arise in development. If you are an aspiring game artist and interested in collaborating with others, reply below or friend me on Discord @Zan#2630
    41. After loading a level, I now propose to add interaction with the mouse. This will be an opportunity to see two other patterns: the Observer pattern to handle mouse events, and the Game Loop pattern for synchronization between controls, updates, and display. This post is part of the AWT GUI Facade series.

Mouse Facade

To add management of the mouse to the facade, I propose to introduce a new Mouse interface that contains all the functions related to the mouse. Separating the mouse interface from the general interface of the facade has two advantages: it lightens the general interface, and it allows the simultaneous management of several mice. As far as methods are concerned, I chose the simplest API possible:

- isButtonPressed(): returns true if a button is pressed;
- getX() and getY(): return the (x, y) coordinates of the cursor in the window.

This API is also low-level, to match what most graphics libraries offer. In addition, it makes the complete state of the mouse available at any time, which is often necessary for video games. For the general GUIFacade interface, a new getMouse() method is added to return an implementation of the Mouse interface.

Mouse handling in AWT

To implement this interface, the AWT library included in the standard Java library is still used. It offers a high-level API that meets the needs of office applications. It is based on the Observer pattern, which allows an element to observe (or listen to) another element and to be notified when an event occurs. In the case of the mouse, these events are, for example, the press of a button or the movement of the cursor. This API is divided into several interfaces; for example, the MouseListener interface handles the events related to the buttons. It contains the methods implemented by the observer: for example, when a button is pressed, the mousePressed() method is called and the observer can then act accordingly. The Component class, which is the superclass of many graphic components in AWT, can be observed by anyone who requests it thanks to the addMouseListener() method. For the example of this article, it is the Canvas used to render the level of the game that is observed: each action of the mouse in its display area causes calls to the methods of the MouseListener interface.

Facade implementation

I propose that the implementation of the Mouse interface of the facade take the form of an AWTMouse class. The class implements the Mouse, MouseListener, and MouseMotionListener interfaces. The first methods provide information about the mouse (contained in its attributes):

public class AWTMouse implements Mouse, MouseListener, MouseMotionListener {
    private final boolean[] buttons;
    private int x;
    private int y;

    public AWTMouse() {
        buttons = new boolean[4];
    }

    @Override
    public boolean isButtonPressed(int button) {
        if (button >= buttons.length) {
            return false;
        }
        return buttons[button];
    }

    @Override
    public int getX() {
        return x;
    }

    @Override
    public int getY() {
        return y;
    }
    ...
The following ones respond to mouse events and update the mouse information:

    @Override
    public void mouseClicked(MouseEvent e) {
    }

    @Override
    public void mousePressed(MouseEvent e) {
        if (e.getButton() <= 3) {
            buttons[e.getButton()] = true;
        }
    }

    @Override
    public void mouseReleased(MouseEvent e) {
        if (e.getButton() <= 3) {
            buttons[e.getButton()] = false;
        }
    }

    @Override
    public void mouseEntered(MouseEvent e) {
    }

    @Override
    public void mouseExited(MouseEvent e) {
    }

    @Override
    public void mouseDragged(MouseEvent e) {
        x = e.getX();
        y = e.getY();
    }

    @Override
    public void mouseMoved(MouseEvent e) {
        x = e.getX();
        y = e.getY();
    }
}

Finally, the AWTWindow class contains the canvas as before, and the canvas is used to "listen" to the events of the mouse:

public void init(String title, int width, int height) {
    ...
    mouse = new AWTMouse();
    canvas.addMouseListener(mouse);
    canvas.addMouseMotionListener(mouse);
}

Game Loop

It only remains to exploit this new interface in an example. To do this, I propose to introduce the Game Loop pattern in its simplest version (without multi-threading considerations). It is based on a set of methods that can be grouped together in the same class:

- The init() method is called at startup to initialize the game and its data.
- The processInput() method is called at each iteration of the game to manage the controls (keyboard, mouse, ...). In short, its main role is to transform the user's "instructions" into more abstract "orders" or "commands" that the game engine knows how to interpret. In terms of time, these operations go at the rhythm of the user.
- The update() method applies changes to game data based on various sources, such as commands produced by user controls or operations that must be applied on each update. In terms of time, these operations go at the speed of the game engine.
- The render() method handles display. It will most often order or transfer data to the graphics card, the latter dealing with very low-level tasks at the pixel level. In terms of time, these operations go at the refresh rate of the screen (60 Hz by default).
- The run() method calls the previous ones and contains the actual game loop.

In this example, I don't implement all the elements required to meet all these principles. I propose here a basic form of this pattern, sufficient to begin to understand all these notions.
For the init() method, we build the two layers and initialize the window:

public void init(Level level) {
    this.level = level;
    backgroundLayer = gui.createLayer();
    backgroundLayer.setTileSize(level.getTileWidth(), level.getTileHeight());
    backgroundLayer.setTexture(level.getTilesetImage(0));
    backgroundLayer.setSpriteCount(level.getWidth() * level.getHeight());
    groundLayer = gui.createLayer();
    groundLayer.setTileSize(level.getTileWidth(), level.getTileHeight());
    groundLayer.setTexture(level.getTilesetImage(1));
    groundLayer.setSpriteCount(level.getWidth() * level.getHeight());
    gui.createWindow("Exemple de contrôle avec la souris",
        scale * level.getTileWidth() * level.getWidth(),
        scale * level.getTileHeight() * level.getHeight());
}

For the processInput() method, if the following conditions are met:

- the left button is pressed;
- there is a cell of the level under the cursor;
- the tile of this cell is in the second set of tiles (the one of the second layer),

then we put a tile with grass (this will erase all buildings in the map):

public void processInput() {
    Mouse mouse = gui.getMouse();
    if (mouse.isButtonPressed(MouseEvent.BUTTON1)) {
        int x = mouse.getX() / (scale * level.getTileWidth());
        int y = mouse.getY() / (scale * level.getTileHeight());
        if (x >= 0 && x < level.getWidth() && y >= 0 && y < level.getHeight()) {
            if (level.getTileset(x, y) == 1) {
                level.setTileset(x, y, 0);
                level.setTile(x, y, new Point(7, 0));
            }
        }
    }
}

For the update() method, the content of the level data is used to define the sprite textures. It is the same as in the previous post, except that this operation is repeated regularly:

public void update() {
    for (int y = 0; y < level.getHeight(); y++) {
        for (int x = 0; x < level.getWidth(); x++) {
            int index = x + y * level.getWidth();
            backgroundLayer.setSpriteLocation(index, new Rectangle(
                scale * x * level.getTileWidth(),
                scale * y * level.getTileHeight(),
                scale * level.getTileWidth(),
                scale * level.getTileHeight()));
            if (level.getTileset(x, y) == 0) {
                Rectangle tile = new Rectangle(level.getTile(x, y), new Dimension(1, 1));
                backgroundLayer.setSpriteTexture(index, tile);
            } else {
                backgroundLayer.setSpriteTexture(index, null);
            }
        }
    }
    for (int y = 0; y < level.getHeight(); y++) {
        for (int x = 0; x < level.getWidth(); x++) {
            int index = x + y * level.getWidth();
            groundLayer.setSpriteLocation(index, new Rectangle(
                scale * x * level.getTileWidth(),
                scale * (y - 1) * level.getTileHeight(),
                scale * level.getTileWidth(),
                scale * 2 * level.getTileHeight()));
            if (level.getTileset(x, y) == 1) {
                Rectangle tile = new Rectangle(level.getTile(x, y), new Dimension(1, 2));
                groundLayer.setSpriteTexture(index, tile);
            } else {
                groundLayer.setSpriteTexture(index, null);
            }
        }
    }
}

The render() method just draws the two layers:

public void render() {
    if (gui.beginPaint()) {
        gui.drawLayer(backgroundLayer);
        gui.drawLayer(groundLayer);
        gui.endPaint();
    }
}

Finally, the run() method contains the game loop that calls the other methods at most 60 times per second:

public void run() {
    int fps = 60;
    long nanoPerFrame = (long) (1000000000.0 / fps);
    long lastTime = 0;
    while (!gui.isClosingRequested()) {
        long nowTime = System.nanoTime();
        if ((nowTime - lastTime) < nanoPerFrame) {
            continue;
        }
        lastTime = nowTime;
        processInput();
        update();
        render();
        long elapsed = System.nanoTime() - lastTime;
        long milliSleep = (nanoPerFrame - elapsed) / 1000000;
        if (milliSleep > 0) {
            try {
                Thread.sleep(milliSleep);
            } catch (InterruptedException ex) {
                ex.printStackTrace();
            }
        }
    }
    gui.dispose();
}

This basic implementation of the Game Loop pattern does not respect all its principles.
There is no real notion of control, and it will be difficult to run each part at a different rate. In addition, in terms of design, there are also questionable choices, like the Level class serving at once as a level loader, storage for the level, and a form of buffer. In future posts, I will present everything that is needed to achieve more effective implementations.

The code of this post can be downloaded here: awtfacade06.zip

To compile: javac com/learngameprog/awtfacade06/Main.java
To run: java com.learngameprog.awtfacade06.Main

The post AWT GUI Facade (6): Mouse and Game Loop appeared first on Design Patterns and Video Games. View the full article
    42. I wouldn't think that Nagle would affect high bandwidth streaming at all. I was under the impression that you'd only notice its effects if sending small pieces of data infrequently.
    43. fleabay

      plane game

      You should make it clear what this blog is about. "I am making a game, here is my code" is a $h|t post, not a description.
    44. I am trying to stream a lot of video over a peer-to-peer connection using a TCP stream (via Winsock2) on Windows. The connection is 10 Gb/s copper, and I need to stream about 2 Gb/s. I am able to get the throughput, but occasionally there is a 200 or 300 ms stall. The stall occurs around send for the server and around recv for the client. Digging a bit more, I read about the Nagle algorithm and quick ACK being implemented by default. I have tried to change the configuration in many ways, but there always seems to occasionally be a non-"flat line" communication, so there are spikes of around 200 ms. Should I stream video over UDP? Are such spikes inherent to a TCP stream? I can smooth the spikes with a buffer, but I would pay with latency. Or is there a setup where the TCP ACK or algorithm won't stall at all?
    45. @Zakwayda Okay, got you! RDP is done already. So now I'll try to find the places in the Chaikin function. I will report back. Thank you! 😀

struct Vertex
{
    double x;
    double y;
    unsigned int p;
    unsigned int t;
    sf::Color c;

    // Constructor
    Vertex(double x, double y, unsigned int p, unsigned int t, sf::Color c)
        : x(x), y(y), p(p), t(t), c(c) {}
};

//---------------------------------------------------------------------------------------------
// Chaikin - Corner Cutting (Smoothing by corner cutting)
//---------------------------------------------------------------------------------------------
std::vector<Vertex> Smooth(std::vector<Vertex> &IN, int iterations)
{
    size_t VC = IN.size();
    if (VC < 2)
    {
        return IN;
    }

    sf::Vector2f startp = sf::Vector2f(IN[0].x, IN[0].y);
    sf::Vector2f Endpnt = sf::Vector2f(IN[VC - 1].x, IN[VC - 1].y);

    std::vector<Vertex> OUT = IN;

    for (int i = 0; i < iterations; i++)
    {
        size_t VCount = OUT.size();
        std::vector<Vertex> newCurve;
        newCurve.emplace_back(startp.x, startp.y, 0, 0, sf::Color::Black);

        for (int j = 0; j < VCount - 1; j++)
        {
            sf::Vector2f displacem = sf::Vector2f(OUT[j + 1].x - OUT[j].x, OUT[j + 1].y - OUT[j].y);
            sf::Vector2f newPoint1 = sf::Vector2f(OUT[j].x + (0.25f * displacem.x), OUT[j].y + (0.25f * displacem.y));
            sf::Vector2f newPoint2 = sf::Vector2f(OUT[j].x + (0.75f * displacem.x), OUT[j].y + (0.75f * displacem.y));
            newCurve.emplace_back(newPoint1.x, newPoint1.y, 0, 0, sf::Color::Black);
            newCurve.emplace_back(newPoint2.x, newPoint2.y, 0, 0, sf::Color::Black);
        }

        newCurve.emplace_back(Endpnt.x, Endpnt.y, 0, 0, sf::Color::Black);
        OUT = newCurve;
    }

    return OUT;
}
    46. Cacks

      OpenGL ViewPort Matrix

      I'm using gluProject and gluUnProject at the moment, but it would simplify my code if I could use the correct matrices.
    47. Green_Baron

      OpenGL ViewPort Matrix

      You're right, I was very superficial. Thanks for pointing it out. Yes, GLM has project and unproject functions for these cases. The GLM functions claim to be compatible with the old GLU functions ...
    48. Green_Baron

      New Here

      Welcome! Tips, always, but my shares stay in the depot 👹 Just kidding, I don't hold any shares.
    49. Zakwayda

      OpenGL ViewPort Matrix

      The viewport transform can be expressed in matrix form, but unless things have changed I don't think OpenGL exposes that to the user. I don't know if modern OpenGL provides a way to 'feed back' screen coordinates - I'll leave it to others to comment on that. But, you can do it yourself. Everything OpenGL does is just math, ultimately, and can be recreated on the CPU or wherever if needed. Getting positions into clip space is fairly straightforward, as you just need to transform by the same matrices you're submitting to OpenGL (or using in a vertex program or whatever). After that you need to get the coordinates into NDC space, and then transform to screen space. I've implemented all this in my own code before, although I don't have that code handy at the moment. I'll also mention that math libraries sometimes offer this functionality in the form of e.g. a 'project point' function. GLM, for example, might include this (I don't know if it does - it just seems like it might). Also, depending on what you need the screen space coordinates for, you might be able to accomplish the same thing via other means. For example, if you want to know whether a point is visible to the camera (not counting occlusion), that can be done via other means that don't involve transforming to screen space.
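      As a rough illustration of the 'do it yourself' route (a sketch with illustrative names, not an OpenGL API call; it assumes column-vector matrices and a GL-style clip space, and mirrors what gluProject does):

import numpy as np

def project_point(p_world, view, proj, viewport):
    # Transform a world-space point to window coordinates.
    # viewport = (x, y, width, height), as passed to glViewport.
    p = np.array([p_world[0], p_world[1], p_world[2], 1.0])
    clip = proj @ (view @ p)              # world -> view -> clip space
    if clip[3] == 0.0:
        return None                       # degenerate case
    ndc = clip[:3] / clip[3]              # perspective divide -> NDC, each in [-1, 1]
    x, y, w, h = viewport
    win_x = x + (ndc[0] * 0.5 + 0.5) * w
    win_y = y + (ndc[1] * 0.5 + 0.5) * h  # note: window-origin convention may differ
    win_z = ndc[2] * 0.5 + 0.5            # depth in [0, 1], like the default glDepthRange
    return win_x, win_y, win_z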
    50. Cacks

      OpenGL ViewPort Matrix

      @Green_Baron good point, there might not be a matrix which holds this info, the windowing api might do this behind the scenes
    51. The short answer is that every time the smoothing code interpolates between two points from the input curve, you should interpolate the other relevant parameters (time, pressure, etc.) as well using the same interpolation value. (If any of the algorithms you're using do something other than simple interpolation, things might be more complicated.) I think I suggested this earlier as well, but I'd suggest tackling each step of the smoothing process (RDP, corner-cutting, and resampling or whatever) separately. It might be useful to set up your code so that you can turn the different steps on and off, allowing you to, for example, only use RDP and skip the other steps. Then you can test the steps in isolation and see if what you're doing is working so far. Since preserving the ancillary values with RDP appears to be trivial, it seems corner-cutting is the next step to tackle. Again, you'll want to find the places in the algorithm where points are interpolated, and revise the code so that the other parameters are interpolated as well. If that's not clear, then I'd suggest posting your code for the corner-cutting algorithm so that someone can maybe point you in the right direction.
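      As a small illustration of 'use the same interpolation value' (a hedged sketch in Python with made-up field names; the same idea carries over to the C++ Vertex struct in this thread):

def lerp(a, b, t):
    return a + (b - a) * t

def cut_corner(v0, v1, t):
    # Produce one corner-cut point between two input vertices, carrying the
    # ancillary data (pressure, time) along with the same interpolation factor t.
    return {
        "x": lerp(v0["x"], v1["x"], t),
        "y": lerp(v0["y"], v1["y"], t),
        "pressure": lerp(v0["pressure"], v1["pressure"], t),
        "time": lerp(v0["time"], v1["time"], t),
    }

# In Chaikin corner cutting, the two new points per segment use t = 0.25 and t = 0.75,
# so pressure and time get interpolated with exactly those same factors.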
    52. Green_Baron

      OpenGL ViewPort Matrix

      Never heard of a viewport matrix. Anybody else, maybe? The model matrix converts from local to world space, the view matrix from world to view space, and the projection matrix from view to clip space. The transformation from clip space to screen coordinates is done by setting the viewport, which is a rectangle, not a matrix. The viewport is set automagically by the windowing API to the size of the window in windowed mode. Otherwise, or if you want a different area, you can set the viewport to a different size with glViewport(), and you can query the size with glGetIntegerv( GL_V... ).
    53. Our team at Darkstar Games is looking for some motivated developers to join our new TCG MMORPG called "Greater Powers". We are previewing KS for Q1 2020 and are set to create a unique and epic video game! Our team members work for corporate equity (corporate shares). Every team member who has shown active participation is granted stock options in the corporation, and department directors will be distributing cash bonuses to team members who contribute significantly to the project during development. Skill sets especially needed:

        -> Concept artist
        -> Rigger
        -> Animator
        -> C# programming
        -> Graphic design
        -> 3D modeling (especially for structures, creatures, and skyships)
        -> A good level of knowledge in Unity3D

If you're looking to join an up-and-coming original game company, send your portfolio to: flosambora123@gmail.com. Hope to hear from you soon! https://www.facebook.com/DarkstarGamesCorp/