The question is plenty precise enough, assuming the OP's definition of "cryptographic hash function" is the accepted one.
As for the question, you need to look at the input/output properties of a cryptographic hash function, specifically the bit independence criterion. What does this criterion tell you about what different values of t will give you?
I'll give you a hint: it is very slightly better to try t = 0, 1, 2, ... until you find a match than to select t at random, because the latter runs the risk of selecting the same t twice, wasting a hash function invocation (for a uniformly random t drawn from a 64-bit space, the probability of a repeat on any given draw is on the order of 2^(-64); see the birthday paradox). Other than that, there is no difference between the two approaches. How come?
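Since the original problem statement isn't quoted here, the following is only a sketch of the two search strategies, under the assumption that the goal is to find t such that H(m || t) satisfies some fixed output property (here, arbitrarily, that the first k bits of SHA-256 are zero):

```python
import hashlib
import secrets

def matches(m: bytes, t: int, k: int) -> bool:
    # Assumed target property: the first k bits of SHA-256(m || t) are zero.
    digest = hashlib.sha256(m + t.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - k) == 0 if k > 0 else True

def find_t_sequential(m: bytes, k: int) -> int:
    # Try t = 0, 1, 2, ... in order: no t is ever tested twice.
    t = 0
    while not matches(m, t, k):
        t += 1
    return t

def find_t_random(m: bytes, k: int) -> int:
    # Draw t uniformly at random: a previously tested t may (very rarely)
    # come up again, wasting one hash invocation per repeat.
    while True:
        t = secrets.randbelow(2**64)
        if matches(m, t, k):
            return t
```

Either way, by the bit independence of the hash, each fresh t succeeds independently with the same probability, so the expected number of invocations is essentially identical; the sequential search only saves the negligible cost of repeats.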
Another hint: the value of m is irrelevant to your expected probability of success - can you see why?
The rest should be straightforward, but please don't hesitate to ask.
Note that it may be easier to approach this in the random oracle model, if you've studied it.
Another, even easier approach is to show that if H(X) is a secure n-bit hash function, then the "reduced" hash function H'(X), obtained by removing any subset of n - m bits from the output of H(X), is still a secure m-bit hash function. Then your problem boils down to a simple preimage argument.
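To make the reduction concrete, here is a minimal sketch, assuming H is SHA-256 (n = 256) and that H' keeps the first m bits of the output:

```python
import hashlib

def H(x: bytes) -> bytes:
    # Full n-bit hash; n = 256 for SHA-256.
    return hashlib.sha256(x).digest()

def H_reduced(x: bytes, m: int) -> int:
    # H'(x): keep only the first m bits of H(x).
    # If a preimage of H' could be found in much less than ~2^m work,
    # the same attack would find an input whose full hash matches any
    # target on its first m bits -- contradicting the security of H.
    return int.from_bytes(H(x), "big") >> (256 - m)
```

Finding t with H(m || t) matching a target on m output bits is then exactly a preimage search on H', with the expected ~2^m cost that implies.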
Edited by Bacterius, 20 February 2013 - 10:37 PM.
The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
- Pessimal Algorithms and Simplexity Analysis