shared_ptr doesn't seem to work because that is reference counted.
Or of course simply a weak_ptr.
Posted by BitMaster on 08 July 2014 - 09:12 AM
Posted by BitMaster on 08 July 2014 - 02:19 AM
Posted by BitMaster on 08 July 2014 - 12:27 AM
Posted by BitMaster on 07 July 2014 - 02:08 PM
In general I found the C++11 Wikipedia article to be a helpful overview, combined with cppreference.com for more details.
Posted by BitMaster on 06 July 2014 - 07:55 AM
textureLibrary[filename.c_str()] = newTexture;
Posted by BitMaster on 30 June 2014 - 01:17 AM
In an ideal world, yes. In practice, you need to check fir's history. The surprising thing is not that he is getting downvoted without further comment; the surprising thing is that there are people left who are actually willing to engage with him in something resembling a constructive way. Whatever else he is, he is very good at burning bridges for absolutely no reason at all.
Don't get me wrong, I'm not defending or attacking him, I just try to exchange useful information with all of you.
I think that if you put a downvote on someone's post, you MUST tell them why they are wrong; at the end of the day we are programmers, we use facts.
Posted by BitMaster on 26 June 2014 - 02:48 AM
Posted by BitMaster on 25 June 2014 - 12:41 AM
Posted by BitMaster on 25 June 2014 - 12:12 AM
First, you cannot take the dot product of \(x\) and \(x^T\) (unless \(x \in \mathbb R\)). The dot product of two column vectors \(u, v\) is commonly defined as \(dot(u, v) := u^T \cdot v\). Trying \(dot(x^T, x)\) would once again be impossible, because you would have to multiply an element of \(\mathbb R^{n \times 1}\) with an element of \(\mathbb R^{n \times 1}\), which is not defined for \(n \neq 1\).
But that is not possible, since \(x^T \neq x\). What I find strange though: for example, the dot product of \(x\) and \(x^T\) is defined, thus their subtraction is defined, but then the subtraction of \(x\) and \(x^T\) can never be a zero vector.
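Spelling out the dimensions, under the column-vector convention used above, may make this clearer:

```latex
x \in \mathbb{R}^{n \times 1}, \qquad
x^T \in \mathbb{R}^{1 \times n}, \qquad
x^T x \in \mathbb{R}^{1 \times 1} \;\text{(the dot product)}, \qquad
x\,x^T \in \mathbb{R}^{n \times n} \;\text{(the outer product)}.
```

A product of \(x\) with \(x\) itself (an \(n \times 1\) matrix times an \(n \times 1\) matrix) is undefined for \(n \neq 1\), because the inner dimensions \(1\) and \(n\) do not match; and a difference \(x - x^T\) mixes shapes \(n \times 1\) and \(1 \times n\), so it is likewise undefined.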
You are thinking of the ring axioms, but those apply only to square matrices. You cannot even multiply general vectors together like that (as already said above), so talking about multiplicative neutral and multiplicative inverse elements for vectors does not make any sense.
To every vector (matrix) there exists an addition-neutral vector (matrix) Z
To every vector (matrix) there exists a multiplication-neutral vector (matrix) I such that V*I=V
To every vector (matrix) there exists a multiplication-inverse vector (matrix) J such that V*J=I
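A quick dimension count (a sketch) shows why the last "axiom" cannot hold for a non-trivial column vector:

```latex
V \in \mathbb{R}^{n \times 1} \;\Rightarrow\;
V J \text{ is defined only for } J \in \mathbb{R}^{1 \times m},
\text{ giving } V J \in \mathbb{R}^{n \times m}.
```

For \(V J\) to be an identity matrix we would need \(m = n\), but \(\operatorname{rank}(V J) \le 1\) while \(\operatorname{rank}(I_n) = n\), so \(V J = I_n\) is only possible when \(n = 1\).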
Posted by BitMaster on 24 June 2014 - 02:16 PM
Well, that was my argument to him in the original thread, but as he says now
I would cite now: "but then you would have to test a row and column vector for equality, which just does not work." I am comparing vectors of the same majoring (dimensions).
which has me a bit at a loss for words.
Posted by BitMaster on 24 June 2014 - 06:17 AM
a helpful, true algebraic statement:
P*V*M*p = p*M*V*P (because P*V*M = (M*V*P)^T)
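For reference, transposition reverses the order of a product, so the parenthetical claim only holds under an extra symmetry assumption:

```latex
(P V M)^T = M^T V^T P^T,
```

which equals \(M V P\) only if \(P\), \(V\) and \(M\) are each symmetric; in general the quoted identity does not hold.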
Posted by BitMaster on 18 June 2014 - 06:25 AM
Posted by BitMaster on 13 June 2014 - 05:00 AM
Posted by BitMaster on 12 June 2014 - 08:55 AM