Well, you could just reimplement Except using the standard approach (it will likely be fast enough; for 50 items every 10 frames, even the naive method should cope, even on mobile), but in theory it can be done in O(n log n) time with the following approach:

1. sort lists a and b [O(n log n)]

2. for every element of a, check via binary search whether it exists in b, and likewise for every element of b against a [n * O(log n) => O(n log n)]
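Here's a sketch of those two steps in Python (the original context is C#, where `List<T>.Sort` and `List<T>.BinarySearch` would play the roles of `sorted` and `bisect`; the function names are mine):

```python
from bisect import bisect_left

def contains_sorted(sorted_list, x):
    """Binary search: True if x occurs in sorted_list. O(log n)."""
    i = bisect_left(sorted_list, x)
    return i < len(sorted_list) and sorted_list[i] == x

def symmetric_except(a, b):
    """Elements that appear in exactly one of the two lists, O(n log n)."""
    sa, sb = sorted(a), sorted(b)                            # O(n log n)
    only_a = [x for x in sa if not contains_sorted(sb, x)]   # n * O(log n)
    only_b = [x for x in sb if not contains_sorted(sa, x)]   # n * O(log n)
    return only_a + only_b
```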

And, if you don't mind using some extra memory, it can in fact be done in O(n) time, by replacing the sort and binary search with a lookup into a hash table:

1. put every element of a and b in hash tables Ha, Hb [n * amortized O(1) => O(n)]

2. for every element of a, check if it exists in Hb, and likewise for every element of b against Ha [n * amortized O(1) => O(n)]

Astute readers will notice this is actually the naive algorithm, just implemented using a HashSet and its Contains method, because HashSets are awesome.
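The hash-based version, again sketched in Python, where the built-in `set` stands in for C#'s `HashSet<T>` and the `in` operator for its `Contains`:

```python
def symmetric_except_hashed(a, b):
    """Elements in exactly one list; O(n) expected time, O(n) extra memory."""
    ha, hb = set(a), set(b)                  # build hash tables, n * amortized O(1)
    only_a = [x for x in a if x not in hb]   # n * amortized O(1) lookups
    only_b = [x for x in b if x not in ha]
    return only_a + only_b
```

Note that this version iterates over a and b as given, so the result happens to preserve the original ordering for free.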

If the order of the resulting list needs to reflect the order of the two original lists, then in step 2 you must iterate over the original lists in their original order (in the sorting approach this means sorting a copy, since sorting in place destroys that order). But you didn't mention whether order matters, and if it doesn't, the whole thing becomes a straightforward symmetric set difference!
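In the order-insensitive case it collapses to a single set operation. In Python that's the `^` operator (symmetric difference); the C# counterpart would be `HashSet<T>.SymmetricExceptWith`:

```python
def except_unordered(a, b):
    """Order-insensitive version: the set of elements in exactly one input."""
    return set(a) ^ set(b)   # symmetric difference, O(n) expected time
```

One caveat: this returns a set, so duplicates within a single input list are collapsed, which may or may not match what you want.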

> The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

- *Pessimal Algorithms and Simplexity Analysis*