
Generic dimension vector-matrix multiply?


Hi peeps/gurus,

I have a matrix with dimensions m, n and a vector with dimension n. I want to multiply the two together, giving (obviously) another vector as output. How do I do this in the general case? I'm using it to multiply neuron activations by a weight matrix for an artificial neural network, but for various reasons I can't get my head around exactly what's going on. Thanks for any help you can give.

int Width = Matrix.GetWidth (), Height = Matrix.GetHeight ();

assert ( Matrix.GetHeight () == size () );

for ( int i = 0; i < ; ++i )
{
    for ( int j = 0; j < ; ++j )
    {
    }
}

Well, for starters I'm going to have to make an assumption. Based on that assert statement you have, I think you're looking at a representation like this:


[1 1 1 1 1] * [1 0 0 0]
              [0 1 0 0]
              [0 0 1 0]
              [0 0 0 1]
              [0 0 0 1]

int Width = Matrix.GetWidth (), Height = Matrix.GetHeight ();

assert ( Matrix.GetHeight () == Vector.size () );

for ( int i = 0; i < Width; ++i )
{
    resultVector[i] = 0;
    for ( int j = 0; j < Height; ++j )
    {
        // Note: Matrix[i][j] represents column i, row j
        resultVector[i] += Matrix[i][j] * Vector[j];
    }
}



[edited by - jediknight219 on October 23, 2002 9:33:13 AM]
