Ah. Encodings like that, such as Base64 or uuencoding, convert binary data into characters that can pass through plain-text newsgroups and other text-only channels. Each group of six bits is encoded as one ASCII character, which takes eight bits to store, so you get an instant 1/3 growth in size, plus a tiny bit of overhead for the header, footer, and padding at the end of the message.
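A quick sketch of that 1/3 growth, using Python's standard base64 module: every 3 input bytes become 4 output characters.

```python
import base64
import os

# 300 random bytes -> Base64 text: each 3-byte group becomes 4 ASCII
# characters, so the encoded form is exactly 4/3 the size (300 is a
# multiple of 3, so there's no padding here).
raw = os.urandom(300)
encoded = base64.b64encode(raw)
print(len(raw), len(encoded))  # 300 400
```

With input that isn't a multiple of 3 bytes, the output is padded with '=' to a multiple of 4 characters, which is the small end-of-message overhead mentioned above.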
Base 62 is the smallest encoding you can get if you restrict yourself to upper- and lower-case letters plus digits (26 + 26 + 10 = 62 characters).
I've never heard text-to-binary conversion called 'base62', probably because it's a new coinage and because base-N notation already has a defined meaning different from that. You're apparently going the other way: accepting only 62 ASCII values and turning each one into six bits, with two of the 64 possible six-bit values left over for your own nefarious purposes. :-)
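That direction is just a lookup table. A minimal sketch, assuming a digits-then-uppercase-then-lowercase alphabet (the ordering is my assumption; the post doesn't pin one down):

```python
import string

# Hypothetical base62 alphabet: digits, uppercase, lowercase.
ALPHABET = string.digits + string.ascii_uppercase + string.ascii_lowercase
assert len(ALPHABET) == 62

# Map each accepted ASCII character to a 6-bit value (0..61).
# Values 62 and 63 stay unused -- the "two extra values left over".
TO_BITS = {ch: i for i, ch in enumerate(ALPHABET)}

def chars_to_bits(text):
    """Turn accepted text into a bit string, six bits per character."""
    return "".join(format(TO_BITS[ch], "06b") for ch in text)

print(chars_to_bits("A1"))  # 'A' is value 10, '1' is value 1 -> '001010000001'
```

Note that packing each character into a fixed six bits like this wastes a fraction of a bit per character compared with true positional base-62 arithmetic, but it keeps the encoder and decoder trivial.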
But yes, if you're after size, the best bet is to use your own knowledge of the data to strip out every unnecessary bit. If the result is still large, a traditional compressor might work for you; it won't help much if the data is small or already has high entropy.
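You can see that entropy limit directly with a stock compressor. A quick sketch using Python's zlib (the sample data is made up for illustration):

```python
import os
import zlib

# Redundant data compresses well; high-entropy (random) data does not.
redundant = b"status=OK;" * 100   # 1000 bytes, highly repetitive
random_ish = os.urandom(1000)     # 1000 bytes of high entropy

print(len(zlib.compress(redundant)))   # tiny: the repetition collapses
print(len(zlib.compress(random_ish)))  # about 1000, sometimes a bit more
```

Random data typically comes out slightly *larger* than the input once you count the compressor's framing overhead, which is why compressing small or high-entropy payloads is usually a losing move.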