It's likely the JIT will produce equivalent code for both cases, assuming your operator implementations aren't braindead. This is especially true if your vector is defined as a struct, since it will typically live on the stack and fall out of scope automatically, without ever involving the GC. So use the operator version: it's much more readable, which means less potential for bugs and easier maintenance, which in turn means more time for other, more relevant optimizations such as better algorithms or even more features. It's also recommended that vector types like these be immutable, especially if they are structs, since C#'s struct copy semantics can make mutable structs behave in unintuitive ways. Mutable objects are sometimes required for performance, but for primitives like vectors I doubt it's worth the hit in readability.
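As a sketch of what that looks like (an illustrative vector type, not any particular engine's API):

```csharp
// A minimal immutable vector struct with operator overloads.
// readonly struct guarantees immutability and lets the compiler
// skip defensive copies when the struct is passed around.
public readonly struct Vector3
{
    public readonly float X, Y, Z;

    public Vector3(float x, float y, float z) { X = x; Y = y; Z = z; }

    // Operators return new values instead of mutating in place.
    public static Vector3 operator +(Vector3 a, Vector3 b)
        => new Vector3(a.X + b.X, a.Y + b.Y, a.Z + b.Z);

    public static Vector3 operator *(Vector3 v, float s)
        => new Vector3(v.X * s, v.Y * s, v.Z * s);
}
```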
In any case, for tiny functions like this, timing is usually not the answer, since the results will depend heavily on context (read: your benchmarks will be meaningless). Instead, look at the generated IL. There are tools that disassemble compiled C# into IL, so you can see what both functions map to and which one needs fewer instructions, memory copies, and so on.
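For example, using `System.Numerics.Vector3` (which provides both spellings), these are the two shapes you would compare in a disassembler such as ildasm or ILSpy. Note that operator syntax compiles down to calls to the underlying static methods (`op_Addition`, `op_Multiply`) anyway, which is why the IL usually ends up equivalent:

```csharp
using System.Numerics;

static class IlComparison
{
    // Operator syntax: compiles to calls to Vector3.op_Addition / op_Multiply.
    static Vector3 ViaOperators(Vector3 a, Vector3 b) => (a + b) * 0.5f;

    // Explicit method syntax: calls the equivalent static methods directly.
    static Vector3 ViaMethods(Vector3 a, Vector3 b)
        => Vector3.Multiply(Vector3.Add(a, b), 0.5f);
}
```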
And honestly, I doubt such a micro-optimization on a function that's called maybe a handful of times per frame would even make a detectable blip on your profiler. Don't optimize unless there is a demonstrated problem: optimization is usually a performance/maintainability tradeoff, and you want to stay on the maintainable side as long as you can.
The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
- Pessimal Algorithms and Simplexity Analysis
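For reference, slowsort itself is short. Here is a C# sketch following the pseudocode from that paper:

```csharp
// Slowsort on a[i..j]: recursively sort both halves so that each half's
// maximum lands at its end, move the larger of the two to position j,
// then "surrender" and sort everything except that last element.
static void SlowSort(int[] a, int i, int j)
{
    if (i >= j) return;
    int m = (i + j) / 2;
    SlowSort(a, i, m);        // sort first half
    SlowSort(a, m + 1, j);    // sort second half
    if (a[m] > a[j])          // the larger of the two maxima belongs at j
        (a[m], a[j]) = (a[j], a[m]);
    SlowSort(a, i, j - 1);    // multiply: sort all but the last element
}

// Usage: SlowSort(arr, 0, arr.Length - 1);
```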