Compiled binary files have no reason to be in SOURCE control. Important ones, such as public packages of release or beta test versions, should be archived to Dropbox, network shared drives, or other types of "dumb" storage by build managers to let others download them as needed.
This is a common debate. I'm on the opposite side.
I'm a very strong believer in putting basically everything in version control.
My first test is the 'new machine install' test. I believe we should be able to take a freshly formatted machine, run a single bulk install script to put on Visual Studio or Maya, Python, Perl, and any other software needed for a programmer, modeler, animator, or tester configuration. Then I should be able to sync to version control and be done. This means all the third-party libraries and packages must be available through version control. If that doesn't work, I consider it a defect to be fixed.
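The 'new machine install' test can be sketched as a tiny bootstrap script. This is only an illustration of the shape of such a script; the installer names, the network share, and the `//depot/project/...` path are all assumptions, and the actual install and sync commands are left commented out so you can substitute your own:

```shell
#!/bin/sh
# Hypothetical one-shot bootstrap for a fresh machine.
# Installer names, INSTALL_SHARE, and the depot path are illustrative.
set -e

INSTALLERS="vs_community.exe maya_setup.exe python-3.12.exe"  # assumed names
DEPOT="//depot/project/..."                                   # assumed path

# Step 1: run every required installer silently.
for pkg in $INSTALLERS; do
    echo "installing $pkg"
    # "$INSTALL_SHARE/$pkg" /quiet    # actual silent install goes here
done

# Step 2: pull everything else -- third-party libraries included --
# straight from version control.
echo "syncing $DEPOT"
# p4 sync "$DEPOT"                    # or the equivalent for your VCS
```

If any tool or library the project needs is missing after step 2, that is the defect the test is designed to expose.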
My second test is the 'instant demo' test. I believe we should be able to take a freshly formatted machine, run a single bulk install script, and pull down any one of a large number of builds. These should include all the nightly builds (including all debug information) for at least one month, and every build ever released outside the building (including all debug information). The list of available builds can easily go back for many years on a long-term project.
In short, we should be able to turn any machine into a demo machine, a test machine, or a development machine in just the time it takes to transfer the files and run the minimal number of installers. We should be able to re-create the exact build environment for any build in the history of the project using nothing more than an OS install disk and a version control installer.
Over the years I have learned it is best if all of this is kept in a single location inside version control, rather than spread out across many network locations such as the Dropbox folders and network shares you describe. The same goes even more for data kept in an external database -- don't do that; instead, extract the data you need and store that copy of the data in version control. Since you can easily purge files from version control, there is almost no space difference. The need to re-create everything exactly as it existed years ago doesn't arise often, but it arises often enough to matter. Having EVERYTHING in version control, including binary files and installation files, is perhaps the most reliable way to meet that need.
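As a concrete example of the purge step, Perforce can permanently remove old revisions with `p4 obliterate`. The depot path below is made up, and the destructive commands are commented out because obliterate is irreversible (run without `-y`, it only previews what would be removed):

```shell
#!/bin/sh
# Hypothetical cleanup: reclaim space from nightly builds old enough
# to fall outside the retention window. The path is illustrative.
OLD_NIGHTLIES="//depot/builds/nightly/2015/..."

echo "previewing purge of $OLD_NIGHTLIES"
# p4 obliterate "$OLD_NIGHTLIES"      # preview: lists affected revisions
# p4 obliterate -y "$OLD_NIGHTLIES"   # destructive: actually removes them
```

The point is that keeping binaries in version control does not commit you to keeping them forever; retention is a policy you can enforce later.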
You do not know when you will need to pull a very old build, including all of its debug information. We had one instance on a long-running game where we needed to re-create some of the libraries as they had existed about eight years earlier. Our repository had well over a million submissions, but we were able to go back to that very old build, completely re-create the build environment of VS2002 and Windows 2000, install all the build tools and pipeline, and otherwise regenerate everything as needed. I'm confident we could not have done that if we didn't keep EVERYTHING in version control. That ability enabled the studio to take advantage of a multi-million dollar deal.
There are of course people who disagree with the policy of putting everything in version control. Their logic escapes me, and I usually attribute it to a lack of experience in actually NEEDING historical builds. For now it is enough to say this is a common point of contention around the Internet, and one people feel strongly about.