Archived

This topic is now archived and is closed to further replies.

liquiddark

Regular Adventures in Software Engineering


Gonna try this for a week or so, see what happens here. So, as some of you may know, the shop I work at is moving to .NET. Given that the other choice is VB6, this is a very good thing. I end up working at an intermediate to senior level most of the time purely by default, not through any overarching brilliance on my part. Regardless, that means I get to play a little. Over the past year or so, my antics have brought me into close contact with some modern methodologies - Extreme Programming & Agile methods especially, unit testing in particular.

The gold standard of test frameworks is xUnit, which in .NET is available via NUnit. I started using NUnit last year, but apparently VS.NET 2003 changes the configuration somehow, and the easy setup I seem to remember has disappeared. Maybe it never existed; I just don't know. Anyway, the installation isn't terrifically difficult, if you're not a 7-cent tool like myself.

Having said that, using this thing via VS.NET isn't perfect. First, NUnit only accepts dlls. This is great for most purposes, but it's a bit annoying when you want to test stuff that has no natural way to become decoupled from an .exe. You end up having to adopt other XP practices by default - refactoring, simplest-thing-first, and test-driven development all become very desirable once NUnit comes into the picture. Those can be pretty large overhead for small projects, so you end up adapting.

One possible adaptation, which I'm using for my current personal project (a wargame), is to discard automated unit testing and keep the rest of the practices in place as much as possible. This is fine for small-scale projects with low stakes, but it begins to fall apart pretty quickly when you're doing something of any consequence, especially if someone is going to pay money for the thing you're developing.

Another possibility is to write your own test framework.
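For anyone who hasn't seen the xUnit style in action: NUnit's C# syntax uses attributes rather than naming conventions, but the shape is the same across the whole xUnit family. Here's a minimal sketch in Python's unittest, with a made-up clamp function standing in for real production code:

```python
import unittest

# Hypothetical production code to exercise; any small pure function works.
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampTests(unittest.TestCase):
    # Each test_* method is one independent test case; the framework
    # finds and runs them all via reflection, just as NUnit does.
    def test_value_inside_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_value_below_range_is_raised(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_value_above_range_is_lowered(self):
        self.assertEqual(clamp(99, 0, 10), 10)

# Run the suite programmatically; a GUI runner like NUnit's would
# normally do this step for you.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The payoff described below - the code getting run, and run often - comes from the fact that re-running the whole suite is one keystroke.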
I did this at our shop, and came to the conclusion that while it's not perfect, the "reflection" available in COM, and particularly in VB, is suitable for simple test-runner scenarios. CallByName and the TypeLib dll, in particular, make explicit programming of a test runner for VB simple and enjoyable. In writing this thing, I learned a lot about test frameworks. In particular, I found that test frameworks tend to be the exception to their own rule: they're simple, and if you're doing a lot of testing, they get run so often that you're going to find pretty much all the bugs in them pronto. That seems to be a large part of the beauty of automated testing in general - the code gets run, and it gets run often.

Yet a third possibility is to not so much adapt as suffer through and start using the NUnit framework as-is. I'll be brief here, because I found a better source of info than I could provide (see note 1), listed at the bottom of this piece. As mentioned, there are some drawbacks to the test runner as it stands. Aside from having to adopt a lot of practices at once and create your components in separate dlls in order to use it effectively, I also found that the test runner is very slow, likely due to the slowness of .NET's reflection capabilities. The best way to set up a project is to use a Start Action (Project->Properties->Configuration->Debugging) of Program and launch the NUnit runner from there. Things only get slower if you don't use this configuration.

Now, having said all of the above, you can read a lot of stuff online that screams about how important it is to adopt these practices as early as possible. That, in particular, is not especially true for non-professional programming. It seems to be good practice to adopt XP completely once you've decided to "get serious" about a project, but until that time, you're just as well off prototyping and discarding until something works.
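The home-grown runner described above hinges on calling methods by name. VB's CallByName has a direct analogue in Python's getattr, so the core of such a runner can be sketched in a few lines (the fixture here is a made-up example, not the shop's actual code):

```python
# A fixture with one passing and one deliberately failing test.
class MapLoaderTests:  # hypothetical fixture, for illustration only
    def TestEmptyMapHasNoTiles(self):
        assert len([]) == 0

    def TestDeliberateFailure(self):
        assert 1 + 1 == 3  # broken on purpose, to show a failure report

def run_fixture(fixture):
    """Invoke every Test* method on the fixture via reflection -
    the same trick CallByName performs in VB."""
    results = {}
    for name in dir(fixture):
        if name.startswith("Test"):
            method = getattr(fixture, name)  # look the method up by name
            try:
                method()
                results[name] = "pass"
            except AssertionError:
                results[name] = "fail"
    return results

report = run_fixture(MapLoaderTests())
```

Because the runner finds tests by naming convention, adding a new test is just adding a method - which is exactly why these frameworks stay simple and get exercised constantly.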
Even when you're serious about it, you can usually work a little faster by establishing a small but complete set of functionality before starting to consider testing concerns. In my case, I have a simple map editor done, and I'll be doing a couple of other resource editors for the wargame before I start using automated testing. Once I adopt it, however, I will retrofit all of the reusable components of those editors with tests. Of course, as I mentioned above, I'm also following the other practices all along, which will help me with this aim long-term.

The only other point I can make in this regard is that testing, automated or not, takes a lot of time to understand. I'm predisposed to run my code after every change regardless, as if I were an NUnit robot sent back in time to change the future for one lucky binary, but if you're not constantly driven to check the state of your code's operations, you might consider investing time in your xUnit incarnation of choice.

Notes:
1. While I had actually never seen it before writing this, Adventures in C#: Using NUnit by Ron Jeffries looks like a pretty good intro to deeper usage of NUnit.

ld

[edit: finished hanging sentence]
[edited by - liquiddark on May 27, 2004 12:29:07 PM]

I might be wrong, but isn't NUnit open source? Wouldn't that allow you to remove the dll check? It shouldn't be that difficult, since in general all .NET binaries, dll or exe, are assemblies. And the change itself isn't that big either; I guess it's a simple test.

There could be more behind it, though.

quote:
Original post by Sijmen
I might be wrong, but isn't NUnit open source? Wouldn't that allow you to remove the dll check? It shouldn't be that difficult, since in general all .NET binaries, dll or exe, are assemblies. And the change itself isn't that big either; I guess it's a simple test.

I decided pretty early on that unless it suddenly became a large time waster, digging through a mountain of code to find out how to run an .exe properly is about the last use of my time I'm going to embark on. I would consider it actually anti-agile to have to do so. In my experience, the framework itself seems to push XP practices implicitly, so that's not the problem it could be, and I don't usually have much code I can't put into a dll. As you say, the problem is solvable. But there is no "simplest" way around it.

ld

Second Adventure: The Joys of Deployment

I appear to have managed to become something of a go-to guy at my place of work when it comes to deployment. While our Programming Manager (really more of a Lead Programmer, but whatever, it's only a title) is fully invested in developing the next-gen solution for deployment of our enterprise app, I'm stuck building & verifying custom install and config wizards for up-and-coming releases.

The really fun part is that we have a guy dedicated to InstallShield, and we end up beating our heads off each other a lot. That's partly because I have trust issues and partly because he has a tendency to keep to the "simplest thing" a little too often. Not that I blame him, really; I had my turn on the IS roundabout and it's about as not-fun as one could possibly hope for.

In the meantime, I'm also working on the package format for the wargame. At the moment we just use bmps for textures and text files for maps, autoloading the textures into a list a la UnrealEd's texture palette. But now I'm building the basic tile editor, so the texture palette on the main map editor is going to have to become a tile palette - and meanwhile, I still need my texture palette for the tile editor. A class hierarchy leaps to mind, and it turns out to take only an hour or so to do the separation and implement it for the tile editor. As mentioned above, I'm not one of those brilliant types who have a playable in 8 hours (the InstallShield guy, for the record, is, although I'd bristle to have him called more intelligent than me).
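The separation itself is straightforward. Here is a sketch of the shape (all names are hypothetical - the actual editors are VB, but the structure is language-neutral):

```python
# Shared list-and-selection behaviour lives in the base class; the two
# palettes only add what differs. Names are illustrative, not real code.
class Palette:
    def __init__(self):
        self.items = []
        self.selected = None

    def add(self, item):
        self.items.append(item)

    def select(self, index):
        """Make the item at the given index the current selection."""
        self.selected = self.items[index]

class TexturePalette(Palette):
    """Palette of raw bitmaps, as in the original map editor."""
    def add_bitmap(self, path):
        self.add(("texture", path))

class TilePalette(Palette):
    """Palette of tiles for the tile editor; each tile references a texture."""
    def add_tile(self, name, texture_path):
        self.add(("tile", name, texture_path))
```

The editors' palette controls can then work against Palette and not care which concrete kind they hold.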

I've got a boilerplate wizard done for my job, and I've branched and modified it to suit the needs of the new custom config wizard. I had a lot of trouble figuring out how to verify that a web service is working from VB6, but the solution I have in place right now works well enough that I'm willing to let its hackish flavor slide for the moment. In the long run, of course, that won't do. But my interpretation of You Aren't Gonna Need It is basically that I can let anything slide one to three times before it becomes necessary to take a broader perspective on the problem, and this is my first encounter with the issue.
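The VB6 hack itself isn't shown here, but the general shape of an "is the service up?" check is easy to sketch. In Python terms (the URL and the injectable opener parameter are my assumptions for testability, nothing from the original):

```python
import urllib.request
import urllib.error

def service_is_alive(url, opener=urllib.request.urlopen, timeout=5):
    """Return True if the endpoint answers HTTP 200, False on any error.
    The opener parameter exists so a fake can be injected in tests."""
    try:
        with opener(url, timeout=timeout) as response:
            return response.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

A config wizard would call this once per configured endpoint and refuse to proceed past the page until it returns True.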

One of the other interesting issues that comes out of the custom wizard is the need to be explicit with your team members. I'm notorious for saying things and later having to clarify them, and I'm really, really trying to get better on that score. The people I'm working with on the custom wizard, however, are giving me a taste of my own medicine. In the current case, I've been reasonably happy. The wizard itself is only responsible for a few config file entries, and that's been done for a long time. However, the IS guy needs to replace a file that gets installed with our main application, and that's turning out to be a problem, partly because of the way we number our builds and partly because of IS.

See, VB doesn't recognize all four of the standard build number components, only the major, minor, and build numbers. So say you're building an enterprise app with several years' worth of major, minor, internal, and patch releases, and you need to number your product. You have a number of choices, but none of them are great. In our case, we use letters to designate specific builds and place the build number in the Comments section of the version information. As it turns out, IS isn't exactly happy with this setup. According to our guy, "Always Replace" should always replace the dll regardless of the version - but I can tell you that right now it doesn't. The other consequences of such a setup might be severe anyway, especially if uninstalling our plugin uninstalls our plugin stub. I just don't know.
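To make the numbering problem concrete: suppose a build is identified as major.minor.build plus a letter for the specific build (the exact scheme here is a guessed example, not our real one). Sorting those correctly takes an actual parse, because comparing the strings gets multi-digit components wrong:

```python
import re

def parse_build(version):
    """Split a version like '3.2.117b' into a sortable tuple (3, 2, 117, 'b').
    The 'digits.digits.digits + optional letter' format is an assumption."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)([a-z]?)", version)
    if not m:
        raise ValueError(f"unrecognised version: {version!r}")
    major, minor, build, letter = m.groups()
    return (int(major), int(minor), int(build), letter)
```

Tuples compare element by element, so "3.2.117b" correctly sorts after "3.2.117a" and "3.2.20" after "3.2.9" - whereas a plain string compare would rank "3.2.9" higher.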

Beyond the client plugin skullduggery, there is also a server component which needs to be deployed, and it involves a lot more work. Now, if something I'm working on needs a deployment package done, I am extremely anal about it, and if I could just write the entire thing myself, I would. When you're working with someone else's component, however, their needs must be communicated. I actually expect the component's developers to do a lot of the work to get their deployment functioning as it should, because they're the only ones who can tell you whether things are set up the way they expect them to be.

The person doing the communicating in this case doesn't seem to share my attitude in this regard, and I've had a hell of a time getting them to tell me the who, what, where, when, and how of setting the product up. Now add to the pile the fact that every time I go back to them the needs of the product wizard have changed, and you may begin to share my frustration. One day we were using Bulk Inserts, the next we were using SQL scripts. Now the whole thing is on the back burner while we settle out the client side and the component developer refactors their component into two separate server components, each of which will need its own deployment. Take from this story what you will; I take it as a sign to find a new friggin job.

In comparison to the work side of things, my package format woes are a delight. The basic problem is this: how do you address your resources in such a way that you don't look and feel like a knob? My opinion on the matter has been shaped by two sources: a GD.net article on virtual file systems, and a Game Developer Magazine article on the resource pipeline. I want to emulate the latter and make it simple for my designer to include new content just by dropping files into a subdirectory, maybe.

I don't yet know, however, whether I want my packages to include both game resources, such as tile palettes, and binary resources, such as bitmaps. If I do, no biggie; I just tell him to drop new tile and bmp files into subdirectories pretty much like he's doing now. If I don't, however, I'm going to have to invent a couple of different package formats. I'll probably go down the former route for the moment, Once and Only Once be damned. But to me it points out one of those intersections of software engineering concerns that seems to knot things up nice and tight.
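The drop-a-file-in pattern itself is only a directory walk. A sketch of the loader side, assuming a hypothetical layout with tiles/ and textures/ subdirectories under a resources root (nothing here is the wargame's actual format):

```python
import os

def scan_resources(root):
    """Map 'subdir/name' keys to absolute paths for every file under root.
    Dropping a new bmp or tile file into a subdirectory is all it takes
    for the next scan to pick it up."""
    resources = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            full = os.path.join(dirpath, filename)
            key = os.path.relpath(full, root).replace(os.sep, "/")
            resources[key] = full
    return resources
```

Whether tiles and bitmaps live in one package or two then becomes a question of which roots you scan, rather than of the format itself.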

Anyway, beware deployment, for it can be a hairy beast.

ld

Peopleware: Relationship Engineering in Software Development

I've just had an awfully long conversation with one of my favourite coworkers about the things she has been concerned with day-to-day, and I thought it might be a good time to discuss the need for a solid social dimension to software development. Even limited research will tell you that a good percentage of your time is spent dealing with others, and hence not actually coding at all. In this kind of situation, it is paramount that you respect, and are respected by, your peers. That can become impossible if the situation grows sufficiently dissatisfying to a sufficiently large portion of the techies at your organization, so morale must also be maintained, often at the short-term expense of the business at hand, namely software development.

In my experience, these two things, morale and respect, require a lot of work all by themselves, and go a long way towards making a room productive. Someone who gets to work at Microsoft might have the benefit of an employee dedicated to fixing things, but normal people don't. That usually means that fixing things - regardless of the nature of the core problem - becomes everyone's business. In interpersonal terms, that means being professional, open, and honest, especially once you get "Senior" or "Lead" prepended to your title. Problem is, the folks you want in those positions are usually also the people who want to be in - and are most productive in - full-on technical roles, not "babysitting".

When you find yourself in that situation, it's usually best to bite the bullet and put the tech genius in charge of a group of juniors and intermediates. It's pretty easy to find writing on team size, so that much, at least, is taken care of. Making teams significantly smaller, by the by, can be unhelpful or even counterproductive: I've led teams of between 2 and 5 people, and the sweet spot is definitely on the large side of 4 per lead. In workflow terms, everyone needs to explore the problem space, and you need some intermediate resources in there to coach the juniors. The team lead should really only be called in when the need is dire - but that should never be a high wall. If the lead is behind locked doors, your problems are guaranteed to multiply without bound.

In contrast, I work at an entirely different kind of organization, where the people in charge tend to be communicators more than technical savants. This has its own problems, however. People whose core competency is communication appear to have less overall drive to excel technically, and hence a more business-ey flavor. They can be very hard to deal with on matters of technical importance, and more importantly, they tend to suck at evaluating how important it is to deal with a given issue at a given moment. This is why I had the aforementioned conversation in the first place - our team structure is currently suffering from some pretty severe deficits in terms of risk analysis.

There are no simple answers to these problems, so whenever your team consists of several people, you have to be very aware of how much communication is taking place and how things look from a techie perspective. Everyone on a team has to take responsibility, too; it's no good to expect one hero to save the day all the time. If things aren't clicking and you have ideas on ways to improve the situation, it's never a bad idea to try them out. If the communication channels in a dev group become blocked with a significant amount of crap, more time and effort will be lost trying to work with half an ear than would be spent clearing them in the first place. Most importantly, if the group can't get ideas across boundaries, the project, whatever it is, is going to fail.

ld

So, I think I'll leave the adventures there. I hope someone might have found something of use here; the exercise has been good from my perspective, but this is probably not the right place to be putting this type of thing up.

Thanks for not flaming,
ld
