Need Debugging Help

Started by JimboC
2 comments, last by JimboC 16 years, 7 months ago
I wrote an application for my job. It's just a small program (15K lines or so) using forms in C# 2003 Pro, and it stores its data in an MS SQL 2000 database on a server at work. Basically it's a performance improvement program that lets us track different metrics for different departments at our different facilities. For example, the Rehab Dept in Facility 1 tracks 6 things they're trying to improve over time, while the Rehab Dept in Facility 2 tracks 7 different things. Each department can pull up its metrics and input data, and then Administration can run reports to see how all the departments at all the facilities are doing.

Here's the problem. The program works 100% correctly on my PC (which is where I wrote and debugged the code). But if I run it from one of our terminal servers, it leaves out some indicators for some departments. Say it displays 5 of the 6 metrics for the Rehab Dept at Facility 1. If I run it from a different terminal server, it gives me 4 of the 6 metrics for the Rehab Dept at Facility 1. If I run it from a third terminal server, Rehab at Facility 1 is correct, but Rehab at Facility 2 is missing some metrics.

The results are consistent on whatever PC/server I run it from: it always runs correctly on my PC, and it always leaves out the same metrics on terminal server 1. So the errors are reproducible, but they differ from PC to PC. Since it runs correctly on my system, I'm thinking my code is good and it must be something like reading from libraries I'm not aware of that differ from PC to PC. The .NET Framework is the same version on every PC I've tested, so I don't think that's it.

Does that seem like a reasonable assumption to you guys? Can anyone offer any ideas on how to track this down? I'm a novice/hobbyist programmer, so my debugging skills are basic at best. I've got a full copy of Visual Studio 2003 Pro, but I'm not sure if there's anything in there that will help with this.
I think more information would be helpful.

- You say it runs differently on different servers. Are you connecting to them all from the same machine?

- Are you using the same user credentials no matter how you connect to the server?
-- What would Sweetness do?
"It runs fine on my machine" syndrome happens to every professional developer.

The first thing you need to nail down is what piece of software is causing the problem. Start with the SQL server. Connect to the server using the exact same credentials your application does from one of the problem machines. Verify that you can see all of the data you expect in your tables.
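If you want to take your app out of the equation entirely, a throwaway console program that uses the same connection string will do it. Something like this; the DSN, credentials, and table/column names below are placeholders, so swap in your real ones:

// Throwaway check: run from a problem machine with the same connection
// string the app uses. DSN, credentials, and schema names are placeholders.
using System;
using System.Data.Odbc;

class VerifyData
{
    static void Main()
    {
        string connStr = "DSN=MetricsDb;UID=appUser;PWD=appPassword;";

        using (OdbcConnection conn = new OdbcConnection(connStr))
        {
            conn.Open();
            OdbcCommand cmd = new OdbcCommand(
                "SELECT FacilityId, Department, MetricName FROM Metrics " +
                "ORDER BY FacilityId, Department", conn);

            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                int rows = 0;
                while (reader.Read())
                {
                    Console.WriteLine("{0} | {1} | {2}",
                        reader["FacilityId"],
                        reader["Department"],
                        reader["MetricName"]);
                    rows++;
                }
                // Compare this total against what your dev PC reports.
                Console.WriteLine("Total rows: {0}", rows);
            }
        }
    }
}

Run it on your dev PC and on a problem server and diff the output. If the row counts differ, the problem is somewhere between the machine and the database; if they match, the bug is in how your app processes the rows after it gets them.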

If everything looks fine, it's more likely that it's directly related to something your code is doing. Look into remote debugging with VS2003. It's a pretty slick feature that lets you run a debug version of your app on a problem machine, and debug it from your development machine. Set some breakpoints at key parts of the code to track down where data is getting lost.
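If remote debugging turns out to be a hassle to set up across terminal servers, plain old logging works too. A helper like this (the log path and example message are just placeholders) writes one file per machine so you can compare runs side by side:

// One log file per machine, so runs on different servers can be compared.
// The log path and the example message are placeholders.
using System;
using System.IO;

class DebugLog
{
    public static void Log(string message)
    {
        string path = "C:\\temp\\metrics-" + Environment.MachineName + ".log";
        using (StreamWriter w = File.AppendText(path))
        {
            w.WriteLine("{0}  {1}", DateTime.Now, message);
        }
    }

    static void Main()
    {
        // Example: call this wherever the app reads a metric row.
        Log("Loaded metric: Rehab / Facility 1 / FallRate");
    }
}

Call DebugLog.Log(...) wherever your app reads a metric row, then compare the files from a good machine and a bad one to see where the rows disappear.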

Knowing exactly where things are failing on these machines will go a long way toward explaining the problem. My gut feeling says it's something to do with permissions on the database, but you never know.

Thanks for the replies.

The program is basically just a client that can run from any PC on our network. Wherever it runs, it connects to the database through an ODBC connection. It doesn't seem to matter what Windows credentials I'm logged in with or which PC I use for the Remote Desktop connection: each PC/server does the same thing regardless of where I access it from or who I'm logged in as.

But I hadn't even thought of the database security. I'll go through that again and see if maybe that's it. Or maybe the ODBC drivers aren't quite right or something like that. Definitely seems like it's something PC related.
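For what it's worth, here's the kind of quick check I'm planning to run on each machine to see exactly which driver and server version the connection picks up (the DSN and credentials below are placeholders for the real ones):

// Prints the ODBC driver name and SQL Server version each machine
// actually connects with. DSN and credentials are placeholders.
using System;
using System.Data.Odbc;

class DriverCheck
{
    static void Main()
    {
        using (OdbcConnection conn = new OdbcConnection(
            "DSN=MetricsDb;UID=appUser;PWD=appPassword;"))
        {
            conn.Open();
            Console.WriteLine("Machine:        " + Environment.MachineName);
            Console.WriteLine("Driver:         " + conn.Driver);
            Console.WriteLine("Server version: " + conn.ServerVersion);
        }
    }
}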

