1. Resize the window and stretch everything in it to fit the larger window.
2. Change the machine's resolution and don't stretch.
Option 1 is usually preferred in a Flash app (given that everything's vector and resizes with almost no performance hit), and it looks quite nice that way. The problem is that my app has a lot of bitmaps, and Flash has to stretch 'em on the fly. Performance dropped a little on my machine, but I imagine it would drop a lot on a lower-end machine, so I decided to change the monitor resolution DirectX-style and leave the game at its current size.
That, however, was a problem too. If I drop the machine resolution and center the game, I end up with a big border of empty space around it (given the aforementioned slightly-smaller-than-800x600 size I was using), and it looked really ugly after the resolution change. Making the client window 800x600 would fix that, but that would make the game unplayable on an 800x600 screen as anything but full-screen.
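Centering a smaller client area on the new desktop is just an offset calculation; the leftover offsets are exactly the ugly empty border. A minimal sketch in Python (the 784x568 client size here is a made-up stand-in for my slightly-smaller-than-800x600 window):

```python
def centered_rect(screen_w, screen_h, game_w, game_h):
    """Top-left corner that centers a game_w x game_h window on the screen.

    Any positive offset shows up as an empty border around the game.
    """
    x = (screen_w - game_w) // 2
    y = (screen_h - game_h) // 2
    return x, y

# A hypothetical 784x568 client area centered on an 800x600 screen leaves
# an 8-pixel border on each side and a 16-pixel border top and bottom.
print(centered_rect(800, 600, 784, 568))  # -> (8, 16)
```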
So I implemented two checkboxes in the setup. One is called "smaller screen", and it shrinks the client area to 640x480 so it's playable on an 800x600 screen. This does cause the aforementioned problem of bitmap stretching, but since you're stretching things smaller rather than bigger, the performance hit isn't as bad.
When the game runs for the first time, it checks the monitor's current resolution; if it's 800x600 or lower, "smaller screen" is selected by default. For everyone else it's really not necessary (unless you just want the game to be smaller).
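The default-selection rule boils down to a single comparison. Something like this sketch, where the helper name is hypothetical and I'm reading "800x600 or lower" as both dimensions at or below those values:

```python
def default_smaller_screen(desktop_w, desktop_h):
    """True if the "smaller screen" checkbox should start ticked.

    On an 800x600-or-smaller desktop, a full-size client window wouldn't
    fit in windowed mode, so the 640x480 shrunken mode is the safe default.
    """
    return desktop_w <= 800 and desktop_h <= 600

print(default_smaller_screen(800, 600))    # -> True
print(default_smaller_screen(1024, 768))   # -> False
```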
The other checkbox is called "full screen", and it kicks the monitor into 800x600, kills off the title bar, and centers the game. Thus you're full-screen without the performance hit of stretching.
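The mode switch itself goes through whatever display API the platform offers (ChangeDisplaySettings on Windows, for instance); the part worth sketching is choosing which mode to switch to: the smallest supported mode that still fits the game, so there's no border and no stretching. Roughly, in Python (`pick_fullscreen_mode` is a hypothetical helper, not anything from a real windowing library):

```python
def pick_fullscreen_mode(modes, game_w, game_h):
    """Smallest display mode that still fits a game_w x game_h client area.

    modes: iterable of (width, height) pairs reported by the display driver.
    Returns None if no supported mode is big enough.
    """
    fitting = [m for m in modes if m[0] >= game_w and m[1] >= game_h]
    return min(fitting, key=lambda m: m[0] * m[1]) if fitting else None

modes = [(640, 480), (800, 600), (1024, 768), (1280, 1024)]
print(pick_fullscreen_mode(modes, 800, 600))  # -> (800, 600)
```

Picking the smallest fitting mode is what makes this trick work: the game fills the screen pixel-for-pixel instead of being scaled by Flash.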
Such is the world of the discount rack: don't assume that your game's players will have as nice a screen as you do.