Hmm, is this it?
To a degree.
Now extrapolate this into fictional territory.
User submits a bug. You click a button and land inside their application state, then move forward and backward through execution to diagnose it.
Not inside a debugging session: this is a regular user, perhaps visiting a website or running an app. It doesn't really matter which; the concept of an "application" or "exe" is not all that important anymore. One gets a snapshot of the user's workflow and everything that affects it, be it the current process, a driver, the OS, or the hardware.
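The "move forward and backward" part rests on a known technique: record every input event, and any past state can be reconstructed by replaying the log up to that point. A minimal sketch, with all names illustrative:

```python
# Record-and-replay in miniature: if every input event is logged,
# application state at any step can be rebuilt by replaying the prefix.
class Recorder:
    """Records input events so state can be reconstructed at any step."""

    def __init__(self, initial_state):
        self.initial_state = dict(initial_state)
        self.events = []  # full history of (name, value) inputs

    def apply(self, event):
        self.events.append(event)

    def state_at(self, step):
        """Replay the first `step` events to reconstruct past state."""
        state = dict(self.initial_state)
        for name, value in self.events[:step]:
            state[name] = value
        return state

rec = Recorder({"button_visible": False})
rec.apply(("button_visible", True))   # something flipped the button on
rec.apply(("user_name", "alice"))

# "Move backward": state right after the first event.
assert rec.state_at(1) == {"button_visible": True}
# "Move forward": the latest state.
assert rec.state_at(2) == {"button_visible": True, "user_name": "alice"}
```

Real time-travel debuggers record far more than key/value events (syscalls, nondeterminism, thread scheduling), but the replay principle is the same.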
Historical debugging today is limited to capturing a small set of events inside a specially crafted debugging session. It works with a limited set of managed languages that can establish such a separation.
But back to the topic of direct inspection and modification of application (or game) state. Debugging crashes is still a pain. Rather than focusing on technical widgets, imagine usability only. When a user experiences an issue (not necessarily a crash), they may opt to send the application state directly. No more bug trackers, no filing issues or writing "steps to reproduce". They simply say "the button should not be here".
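What "send the application state directly" could look like, as a hedged sketch; `AppState` and `report_issue` are hypothetical names, not a real API:

```python
# Instead of asking the user for steps to reproduce, the app serializes
# its own state and bundles it with the user's one-line complaint.
import json

class AppState:
    def __init__(self):
        self.widgets = {"save_button": {"visible": True, "enabled": False}}
        self.route = "/settings"

    def snapshot(self):
        # A real tool would capture far more (heap, OS, drivers);
        # here we just dump the interesting part as JSON.
        return json.dumps({"widgets": self.widgets, "route": self.route})

def report_issue(comment, state):
    """Bundle the complaint with the full state snapshot."""
    return {"comment": comment, "snapshot": state.snapshot()}

issue = report_issue("the button should not be here", AppState())
assert "save_button" in issue["snapshot"]
```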
As a developer, you receive such an issue, then simply move back and forth through execution to pinpoint the error.
Next, there are diagnosis tools. Mark a logical item in such an application snapshot and run "find all references" on it to see how it got into that state (when was button.visible set to true?). This isn't manual reference chasing but automated analysis of program flow. Most of this type of programming is nothing but reference chasing, and today it is a manual task for the most part.
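A crude version of that "who set this, and when?" query can be approximated today by instrumenting a property to record the call site of every assignment. A minimal sketch, assuming the tool controls the class definition:

```python
# Turn "when was button.visible set to True?" into a lookup instead of a
# manual hunt: the setter records where each assignment came from.
import traceback

class Button:
    def __init__(self):
        self._visible = False
        self.visible_history = []  # (value, call site) per assignment

    @property
    def visible(self):
        return self._visible

    @visible.setter
    def visible(self, value):
        # The innermost two frames are [caller, this setter]; keep the caller.
        caller = traceback.extract_stack(limit=2)[0]
        self.visible_history.append((value, f"{caller.filename}:{caller.lineno}"))
        self._visible = value

def some_feature(btn):
    btn.visible = True  # this line shows up in the history

btn = Button()
some_feature(btn)
assert btn.visible is True
assert btn.visible_history[0][0] is True  # first assignment set it to True
```

The imagined tool would do this for every field of every object, after the fact, over a snapshot, which is exactly where the hardware cost explodes.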
A tool like this could be built, but only given sufficient hardware, and even then it would push everything to its limits, from storage to computational capacity.
We have small, limited tools that help with individual parts of the process, but not with the next step. Just as building a dictionary 20 years ago was a considerable technical challenge due to hardware constraints, it's trivial today (a set<String>).
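The dictionary point, spelled out in Python rather than the set<String> shorthand above (the word list is a stand-in):

```python
# What once required careful data structures and disk layout now fits
# in memory as a plain set with constant-time membership tests.
words = {"apple", "banana", "cherry"}  # stand-in for a full word list

def is_word(w):
    return w.lower() in words

assert is_word("Apple")
assert not is_word("xyzzy")
```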
Hence my mention of the unpopular CS topics: they hint either that such tools will never be possible, or that a particular step will become possible in a given number of years, once hardware advances to the point where the problems we're solving are smaller than what it can handle.
Instead of P vs. NP, it's closer to economics and exponential growth. For most human endeavors there is an upper limit. A dictionary has a finite size, and hardware outgrew it. Maybe we'll reach an upper limit on computer-assisted tasks as well.