Stride is used to pad the width of an image so that each row's width in memory aligns to a convenient boundary (often a power of two), which makes the pixels easier to reference with bit-math.
Sometimes stride means the 'true length' of a row; sometimes it means the 'added length'. For instance, if you have a texture with width 100, there's a good chance it's stored as a width-128 texture. The stride is either 128 or 28, depending on what API you're working with. When the image is rendered only the 100-pixel section gets drawn, but having the width stored as 128 means the hardware can use bit-math to select a row very quickly.
Edit - Ah, yeah. In some cases the stride is stored in byte length rather than pixel length as well. It should be documented somewhere in what you're working with. If the stride is in pixels, then index + stride + 1 is the pixel that's one row down and one column to the right of where you're at; if it's in bytes, you instead add the stride plus the size of one pixel to a byte pointer. In this case it's clearly a byte stride, since the code defines it as 'int stride = pixelsPerRow * sizeof(uint32_t);'.
As a term, whenever you see 'stride' you should realize that it's dealing with the 'real length' of a row in memory rather than the apparent (drawn) length.