You only initialize your graphics device once per run of the game (right?), so the problem isn't going to be in how you set things up; that's pretty standard, boilerplate DirectX stuff. Your framerate drop could be coming from many sources:
repetitive buffer creation (recreating vertex/index buffers every frame instead of creating them once and reusing them)
repetitive rendertarget creation (same idea: render targets should be created once, not per frame)
inefficient draw-call architecture (are you submitting every tile to the batcher and letting it flush everything in one go, or are you issuing a separate draw call for every single sprite? see the sketch after this list)
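To make that last point concrete, here's a minimal, self-contained C++ sketch. It doesn't use any real graphics API; `TileBatcher`, `submit`, and `flush` are illustrative names standing in for whatever your batcher does, and the draw-call counter stands in for the real per-call cost (buffer update, state changes, the actual Draw). The point is where the flush happens: once per tile, or once per frame.

```cpp
#include <cstdio>
#include <vector>

// Illustrative stand-ins, not a real API.
struct Quad { float x, y, w, h; };

struct TileBatcher {
    std::vector<Quad> pending;
    int drawCalls = 0;

    void submit(const Quad& q) { pending.push_back(q); }

    // One flush = one draw call (in a real renderer: one buffer
    // update plus one Draw, which is the expensive part).
    void flush() {
        if (pending.empty()) return;
        ++drawCalls;
        pending.clear();
    }
};

int main() {
    const int tileCount = 10000;

    // Bad: flushing after every single tile -> 10,000 draw calls per frame.
    TileBatcher perTile;
    for (int i = 0; i < tileCount; ++i) {
        perTile.submit({ float(i % 100), float(i / 100), 1, 1 });
        perTile.flush();
    }

    // Better: submit all tiles, flush once -> 1 draw call per frame.
    TileBatcher batched;
    for (int i = 0; i < tileCount; ++i) {
        batched.submit({ float(i % 100), float(i / 100), 1, 1 });
    }
    batched.flush();

    std::printf("per-tile flush: %d draw calls, batched: %d draw call(s)\n",
                perTile.drawCalls, batched.drawCalls);
    return 0;
}
```

Same number of tiles on screen either way, but the second version pays the per-call overhead once instead of ten thousand times.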
Or it could be entirely outside your graphics code. What happens in the update portion of your game loop? The fact that increasing the number of tile objects dropped your framerate makes me wonder whether you have some insanely inefficient pathfinding algorithm (like recursively checking every tile/node against every other one), or collision tests that run against every object in the scene every frame, etc.
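A quick way to sanity-check this is to count how many pair tests your update code actually performs per frame. The sketch below is hedged C++ with made-up names (nothing from your project): it compares naive all-pairs testing, which grows quadratically with object count, against a simple uniform-grid broad phase that only tests objects sharing a cell. If your tile count went up and your pair count exploded, that's your framerate drop.

```cpp
#include <cstdio>
#include <cstdlib>
#include <unordered_map>
#include <vector>

// Illustrative only: point-sized objects and counters for pair tests.
struct Obj { float x, y; };

// Naive all-pairs: O(n^2) tests per frame. 1,000 objects -> ~500,000 tests.
long long allPairsTests(const std::vector<Obj>& objs) {
    long long tests = 0;
    for (size_t i = 0; i < objs.size(); ++i)
        for (size_t j = i + 1; j < objs.size(); ++j)
            ++tests; // real code would do the overlap check here
    return tests;
}

// Uniform-grid broad phase: bin objects into cells, only test within a cell.
long long gridTests(const std::vector<Obj>& objs, float cellSize) {
    std::unordered_map<long long, std::vector<int>> cells;
    for (int i = 0; i < (int)objs.size(); ++i) {
        long long cx = (long long)(objs[i].x / cellSize);
        long long cy = (long long)(objs[i].y / cellSize);
        cells[(cx << 32) ^ (cy & 0xffffffffLL)].push_back(i);
    }
    long long tests = 0;
    for (auto& kv : cells) {
        auto& bucket = kv.second;
        for (size_t i = 0; i < bucket.size(); ++i)
            for (size_t j = i + 1; j < bucket.size(); ++j)
                ++tests; // only nearby objects ever get tested
    }
    return tests;
}

int main() {
    std::vector<Obj> objs(1000);
    for (auto& o : objs) {
        o.x = float(std::rand() % 1000);
        o.y = float(std::rand() % 1000);
    }
    std::printf("all-pairs tests: %lld, grid tests: %lld\n",
                allPairsTests(objs), gridTests(objs, 32.0f));
    return 0;
}
```

The same idea applies to pathfinding: if every tile you add causes every other tile to be visited again, the cost per frame scales with the square of your tile count, which is exactly the pattern you're describing.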