

Ryan_001

Member Since 23 Apr 2003
Offline Last Active Today, 11:20 AM

Posts I've Made

In Topic: So... C++14 is done :O

Yesterday, 08:32 AM

A good example of a change is the addition of <cstdint>. I rarely if ever use int or short anymore; it's now all uint16_t or int64_t. I know what I'm getting and I know their limits. I can use them without tons of asserts against std::numeric_limits. Apart from endianness issues, which only really crop up when serializing data, they are portable.


And your code is thus non-portable to a platform with a different native word size. Cuts both ways.


Is this of any practical relevance? Do you know of any platforms where those types are not available? Any platform that matters? I mean, maybe there is a washing machine somewhere that won't be able to run my game, but I can live with that.


I agree.

TBH I think this has gone beyond a discussion and is now an argument, something I have no interest in participating in.

In Topic: So... C++14 is done :O

Yesterday, 07:37 AM

But that's the point. Something as simple as a vector shouldn't have different versions. It shouldn't be possible to write a significantly better version by using compiler-specific features. That undermines C++ as a portable language.

 
Pardon me? C++, a portable language? Name mangling, type-size differences, compiler/OS/CPU specifics, DLL hell? You *can* write portable C++ code, but there is certainly no standard solution, and I definitely wouldn't call C++ a "portable" language. Granted, providing a default STL implementation would make it a tad more portable, but that's apparently NOT what C++ aims at.


Well, I agree that it isn't at the moment. Look at any C++ tutorial, or any book by Stepanov, Sutter, Alexandrescu, or Stroustrup, and they all teach and push 'portable' programming practices. All the standard library types like size_t and vector::value_type are designed around the idea that the underlying types might change with a different compiler and/or compiler switch. I agree that it's not as portable as it should be, but it's taught, promoted, and used as if it is.

I'm not saying we need Java's level of portability. But simply copying values from point A to B shouldn't require a half-dozen #defines and template metafunction invocations just to be both portable and efficient.

A good example of a change is the addition of <cstdint>. I rarely if ever use int or short anymore; it's now all uint16_t or int64_t. I know what I'm getting and I know their limits. I can use them without tons of asserts against std::numeric_limits. Apart from endianness issues, which only really crop up when serializing data, they are portable.
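For illustration, here's a minimal sketch of what those types buy you, using only standard C++ (the values are fixed by the standard, not by the platform):

#include <cstdint>
#include <limits>
#include <iostream>

int main()
{
    // int's width varies by platform; the <cstdint> exact-width types do not.
    static_assert(std::numeric_limits<std::uint16_t>::digits == 16,
                  "uint16_t always has exactly 16 value bits");
    static_assert(std::numeric_limits<std::int64_t>::digits == 63,
                  "int64_t always has 63 value bits plus a sign bit");

    // The limits are known up front; no per-platform checks needed.
    std::cout << std::numeric_limits<std::uint16_t>::max() << '\n'; // always 65535
    std::cout << std::numeric_limits<std::int64_t>::max()  << '\n'; // always 2^63 - 1
}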

In Topic: So... C++14 is done :O

Yesterday, 07:09 AM

I'm pretty sure the reason the guys on the committee don't enforce and author a standard library implementation isn't because they aren't capable of writing it.
 
It would be silly if they said "This is the standard library code that MUST be supplied for this to be a standards-compliant C++ installation". For a start, that would stop individuals or groups from coming up with internal improvements to the standard library that don't affect the interface. It also means a compiler vendor would not be able to take advantage of their own compiler-specific features under the hood of the standard library, which they are free to do as long as the interface is presented and it all behaves "as if".
 
I'm not really sure what you think the advantages of what you are proposing are, but there is a raft of disadvantages.


But that's the point. Something as simple as a vector shouldn't have different versions. It shouldn't be possible to write a significantly better version by using compiler-specific features. That undermines C++ as a portable language.

Have you ever actually looked at these standard library implementations? They are a complete mess. They break nearly every rule of good programming practice: tons of #defines and macros everywhere, rarely if ever a comment. Some of the requirements in the standard are contradictory, with no mention of how they chose to work around these sticky areas. Containers are split across multiple files with pieces scattered between them. Or take a look at the Boost source. Those are some VERY smart people working on it, and they have spent a TON of time; and it's still filled with #define work-arounds and tons of template code to patch in, out, or around compiler 'features'. And all of this mess for just a few very basic containers and algorithms.

I wasn't trying to imply that the people on the C++ committee are stupid. Rather, I feel the complexity of C++ is working against its very core principles. Writing 'proper' (efficient, portable, safe) C++ code requires a massive investment of time and knowledge, so much so that even some of the best in the field have a difficult time.
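To make that concrete, here's a toy sketch (not any vendor's actual code) of the kind of compiler-conditional patching these implementations are full of; the interface stays fixed while the body swaps in compiler-specific intrinsics, which is exactly what the "as if" rule permits:

#include <cstddef>
#include <cstring>

// Same observable behavior either way; only the implementation differs.
inline void copy_bytes(void* dst, const void* src, std::size_t n)
{
#if defined(__GNUC__) || defined(__clang__)
    __builtin_memcpy(dst, src, n);   // intrinsic the optimizer understands directly
#else
    std::memcpy(dst, src, n);        // portable fallback
#endif
}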

In Topic: So... C++14 is done :O

21 August 2014 - 06:23 PM

In my humble opinion...
 
I wish that they'd clean up the standard library.  I understand that some of the standard library needs to be vendor-specific: threads, mutexes, files, memory management, OS specifics, etc. need to be vendor-supplied.  But things like iterators, containers, algorithms, etc. really should just be standard C++ code supplied by the committee.  Call the vendor-supplied section of the standard library the 'core library' or 'vendor library', then call the rest the 'extended library' or just the 'standard library', whatever...  That way new additions and ideas can be made to the library much faster and more cleanly.  Also, simply having a 'standard implementation' would really make learning the language a lot easier.
 
Now I know the counter-argument is that by specifying 'what' and not 'how', in theory library writers can give a better implementation.  My feeling is that if the committee can't write clean, efficient, proper C++ code, how the hell are we supposed to?  If the committee can't write a simple vector implementation, then they need to get back to the drawing board and figure out why they can't, rather than just pass the buck to the vendors.
 
I mean, the whole reason Boost in its current form exists (and I love Boost) is simply that it's the standard library C++ needs.

In Topic: Font Rendering

21 August 2014 - 08:58 AM

Ya, I guess it does; when I have some time I will.  Really quickly, the steps to use it would be as follows (there's a combined sketch after step 5)...

 

1) Create a Font object like:

TTF::Font font("fonts/Nexa-Light.otf");

The Font object has all the font data: bounding rects, kerning info, headers, code point to glyph index conversion, etc.

 

2) Create a triangulator:

TTF::TriangulatorI triangulator;

Currently there are three, and each triangulator produces different output.  TriangulatorI uses the GPU Gems 3 vector rendering technique (i.e. it requires a special pixel shader).  TriangulatorII, on the other hand, produces a standard mesh (no special pixel shader required) but requires you to supply a function specifying how to subdivide the curves.

 

3) Create a Mesh object to store the glyph:

TTF::MeshSmall mesh;

MeshSmall is sufficient for most fonts, but it's theoretically possible that some glyphs might be too large (have too many vertices) to be stored in a MeshSmall object, in which case you can use a MeshLarge object.  There's almost no difference from a user's standpoint; MeshLarge just uses less compact storage.

 

4) Triangulate the glyph:

font.GetGlyph(CodePoint(text[i]),triangulator,mesh);

In this case text is a std::string.  Now you can do whatever you want with the glyph (add it to a vertex buffer perhaps).
 
5) Optionally get the kerning:

TTF::vec2s t = font.GetKerning(CodePoint(text[i]),CodePoint(text[i+1]));

Kerning is the pair-specific adjustment to the spacing between one glyph and the next.
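Putting the five steps together, here's a rough end-to-end sketch using only the calls shown above (the header name and the vertex-buffer/pen-advance parts are placeholders, since those details aren't shown here):

#include <cstddef>
#include <string>
#include "TTF.h"   // placeholder: the library's actual header name isn't shown above

void BuildText(const std::string& text)
{
    TTF::Font font("fonts/Nexa-Light.otf");   // 1) load the font data
    TTF::TriangulatorI triangulator;          // 2) GPU Gems 3 style triangulator
    TTF::MeshSmall mesh;                      // 3) per-glyph mesh storage

    for (std::size_t i = 0; i < text.size(); ++i)
    {
        // 4) triangulate the glyph for this code point
        font.GetGlyph(CodePoint(text[i]), triangulator, mesh);
        // ... append mesh to a vertex buffer here ...

        // 5) optionally fetch the kerning to the next glyph
        if (i + 1 < text.size())
        {
            TTF::vec2s k = font.GetKerning(CodePoint(text[i]), CodePoint(text[i + 1]));
            // ... advance the pen position using k ...
            (void)k;   // silence unused-variable warning in this sketch
        }
    }
}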

 
