I've run into tons of trouble with this exact problem. After spending hours trying to figure out why my "stuff" wasn't rendering, I've come up with a comprehensive checklist of things to verify.
If you've never rendered a model or primitive to the screen before using your current API, you want to establish a baseline by trying to do the most basic thing you can: render the simplest model/primitive you can. This is akin to writing your first "hello world" program for graphics. If you can do this, then the rest of graphics programming is simply a matter of adding on additional layers of complexity. The general debugging step then becomes a matter of adding on each subsequent layer of complexity and seeing which one breaks.
At its core, debugging is just a matter of isolating the problem, narrowing it down to as few possibilities as possible, and then examining each possibility in turn.
This is for the C# and XNA API, but you can generalize or translate these points to your own language and API.
Let's start with the comprehensive checklist for primitive rendering (triangle lists, triangle strips, line lists, line strips):
1. Base case: Can you render a triangle to the screen without doing anything fancy?
-Are you setting vertex positions for the three corners of the triangle? Are they different from each other? Do they form a triangle that should be visible in the current view?
-Are you actually calling the "DrawPrimitive()" method, or equivalent in your API?
-Are you using vertex colors which contrast against the background color?
-Are you correctly applying a shader? Is the shader the correct shader? Have all shader settings been set correctly before you call the draw call?
-Are you using a valid view and projection matrix which would actually let you view the triangle?
-Are you using a world matrix which transforms the triangle off screen? (You shouldn't even need a world matrix yet.)
-Are you using the right primitive type in your DrawPrimitives call? (triangle list vs triangle strip, etc)
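The base case above can be sketched in a few lines of XNA 4.0. This is a hedged sketch, not a drop-in implementation: it assumes it runs inside your Game class's `Draw` method and that `effect` is a `BasicEffect` created earlier (e.g. in `LoadContent`).

```csharp
// Three vertices with contrasting colors, wound clockwise so the default
// cull mode (CullCounterClockwise) doesn't discard the triangle.
var vertices = new VertexPositionColor[3];
vertices[0] = new VertexPositionColor(new Vector3( 0f,  1f, 0f), Color.Red);
vertices[1] = new VertexPositionColor(new Vector3( 1f, -1f, 0f), Color.Green);
vertices[2] = new VertexPositionColor(new Vector3(-1f, -1f, 0f), Color.Blue);

effect.World = Matrix.Identity;   // no world transform yet -- keep it simple
effect.View = Matrix.CreateLookAt(new Vector3(0, 0, 5), Vector3.Zero, Vector3.Up);
effect.Projection = Matrix.CreatePerspectiveFieldOfView(
    MathHelper.PiOver4, GraphicsDevice.Viewport.AspectRatio, 0.1f, 100f);
effect.VertexColorEnabled = true; // actually use the per-vertex colors

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply(); // commit shader state BEFORE the draw call
    GraphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleList, vertices, 0, 1);
}
```

If this triangle shows up, your device, camera, and effect plumbing are all sound, and you can start layering complexity on top.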
2. Indexed vertices: Are you using an index buffer to specify the vertex drawing order?
-Is the vertex drawing order compatible with your current cull mode? To find out, either toggle your cull mode or change your drawing order.
-Are you actually creating an index buffer? Are you copying an array of ints into your index buffer to fill it with data? Are the array values correct?
-If your index buffer is created, are you actually setting the graphics card's active index buffer to your index buffer?
-Are you using "DrawIndexedPrimitives()" or your API's equivalent draw call? Are you specifying the correct number of primitives to draw?
-Does the drawing order make sense with regard to the primitive type you're using? ie, the vertex order in a triangle strip is very different from a triangle list.
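Here's a sketch of the indexed-drawing checklist above, for a quad drawn as two triangles (XNA 4.0). It assumes `vertexBuffer` already holds a populated `VertexPositionColor[4]` and `effect` is a configured `BasicEffect`; the names are illustrative.

```csharp
// Six indices, two triangles, both wound clockwise for the default cull mode.
short[] indices = { 0, 1, 2, 2, 1, 3 };

var indexBuffer = new IndexBuffer(GraphicsDevice, IndexElementSize.SixteenBits,
                                  indices.Length, BufferUsage.WriteOnly);
indexBuffer.SetData(indices);             // copy the index array into the buffer

GraphicsDevice.SetVertexBuffer(vertexBuffer); // BOTH buffers must be active
GraphicsDevice.Indices = indexBuffer;

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    // 4 vertices, 2 primitives: the last argument counts triangles, not indices.
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 4, 0, 2);
}
```

A common mistake here is passing `indices.Length` as the primitive count; for a triangle list it should be `indices.Length / 3`.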
3. Vertex Data:
-Are you using a custom vertex declaration? If yes, skip to #4.
-Are you using a vertex buffer? If yes:
-You must use a vertex array of some sort, at some point, to populate the vertex buffer. Verify that you're getting an array of vertices in your code. Using your IDE debugger, verify that the vertex data is correct.
-Are you moving your vertex array data into a vertex buffer? Is the vertex buffer the correct size? Does the vertex buffer have the vertex data from your vertex array?
-On the graphics card, are you setting the active vertex buffer before drawing? Is there an associated index buffer?
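The vertex buffer steps above look like this in XNA 4.0. `BuildVertices()` is a hypothetical helper standing in for however you generate your vertex array.

```csharp
// Get the vertex array -- inspect it in the IDE debugger here, before it
// ever touches the GPU, to confirm the data itself is correct.
VertexPositionColor[] vertices = BuildVertices();

// The buffer is sized in vertices, not bytes; the type parameter tells XNA
// the stride of each element.
var vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionColor),
                                    vertices.Length, BufferUsage.WriteOnly);
vertexBuffer.SetData(vertices);              // copy the CPU array to the GPU

GraphicsDevice.SetVertexBuffer(vertexBuffer); // make it the active vertex buffer
```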
4. Custom Vertex Declarations: Are you using a custom vertex declaration?
Yes: Then you must be defining your vertex in a struct.
-Does your vertex declaration include position information? If not, how are you computing the vertex position in your shader?
-Does your vertex declaration include every field you want to use?
-Are you creating a Vertex Declaration correctly?
-Are your vertex elements defined in the same order as the struct fields? This is one of the few times declaration order really matters, because it specifies the order the fields appear in the struct's memory block.
-Are you correctly calculating the BYTE size of each variable in the vertex? Are you correctly calculating the field offset in bytes?
-Are you correctly specifying the vertex element usage?
-Are you correctly using the right usage index for the vertex element?
-Are you specifying the correct total byte size for your custom vertex declaration?
-Is your code correctly using the custom vertex data? ie, putting position information into a position variable.
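A custom vertex declaration that satisfies all of the checks above might look like this (XNA 4.0; the field set is just an example). Note how each offset is the running byte total of the fields before it: a `Vector3` is 12 bytes and a `Vector2` is 8, so the total stride is 32.

```csharp
public struct MyVertex : IVertexType
{
    public Vector3 Position;  // bytes 0-11
    public Vector3 Normal;    // bytes 12-23
    public Vector2 TexCoord;  // bytes 24-31

    // Elements MUST be listed in the same order as the struct fields,
    // with offsets matching the byte layout exactly.
    public static readonly VertexDeclaration Declaration = new VertexDeclaration(
        new VertexElement(0,  VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
        new VertexElement(12, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0),
        new VertexElement(24, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 0));

    VertexDeclaration IVertexType.VertexDeclaration
    {
        get { return Declaration; }
    }
}
```

If an offset is wrong by even a byte, every field after it is misaligned, which is exactly the "funky data" you'll see later in PIX.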
5. High Level Shader Language (HLSL): Are you using a shader other than "BasicEffect"?
-Are you actually loading the content for the shader and storing it in an "Effect" data structure?
-Are you correctly initializing the effect?
-Are you setting a "Current Technique" in your render call to one which exists in the shader?
-Does the technique which you use include a vertex shader and a pixel shader? Are they supported by your API and graphics card?
-Does the vertex shader require any global variables to be set? (ie, camera position, world matrices, textures, etc). Are they being set to valid data?
-Does the vertex shader output valid data which the pixel shader can use?
-Does the pixel shader actually output color information?
-Do your vertex shader's math and logic check out? (If you don't know or aren't sure, it's time to use a shader debugger.)
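On the C# side, wiring up a custom effect covers most of the checklist above. This is a sketch; "MyShader", the technique name, and the parameter names are illustrative and must match whatever is actually declared in your .fx file.

```csharp
// Load the compiled .fx through the content pipeline into an Effect.
Effect effect = Content.Load<Effect>("MyShader");

// The technique must exist in the shader, or this throws.
effect.CurrentTechnique = effect.Techniques["Textured"];

// Set every global the vertex/pixel shaders read -- unset parameters are a
// classic source of black or invisible geometry.
effect.Parameters["World"].SetValue(world);
effect.Parameters["View"].SetValue(view);
effect.Parameters["Projection"].SetValue(projection);

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply(); // uploads the parameter values to the card
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0,
                                         vertexCount, 0, primitiveCount);
}
```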
6. Shader debuggers:
I'm using Visual Studio 2010, so I can't use the built-in shader debugger from VS2012. I have to use external tools. Here are the ones I've tried and my thoughts on them:
NVidia FX Composer: It sucks. It is unstable and crashes frequently, has a high learning curve, and can't attach a shader debugger to an executable file (your game). You can't push custom vertex data into a shader and see how the shader handles it. This program is mostly useful for creating shaders for existing models.
ATI's GPU PerfStudio: It doesn't work with DirectX 9.0, so if you're using XNA, you're out of luck. Sorry, ATI doesn't care enough. It's also a bit confusing to set up and get running.
Microsoft PIX: It's a mediocre debugger, but the best one I've found. It's included in the DirectX SDK. The most useful feature is being able to attach to an EXE and capture a frame by pressing F12. You can then view every single method call used to draw that frame, along with the method parameters. This tool also lets you view every single resource (DX surfaces, vertex buffers, index buffers, rasterizer settings, etc) on the graphics card, along with that resource's data. This is the best way to see if your vertex data and index buffer data are legit. You can also debug an individual pixel, which lets you step through your shader code (HLSL or ASM) line by line and see what values the variables are actually being set to. It's an okay debugger, but it doesn't have any IntelliSense and doesn't let you mouse over a variable to see its value like the Visual Studio IDE debugger does. This is the debugger I currently use to debug my shaders. The debugging workflow is a bit cumbersome, since you have to rebuild your project, start a new experiment, take a snapshot, find the frame, find the data object you want to see, and step through the shader debugger to the variable you're interested in (~2 minutes). Here are a few "nice to know" notes on PIX:
-If you're looking at the contents of a vertex buffer:
-Each block is 32 bits, or 4 bytes in size. Keep this in mind if you're using a custom vertex declaration to pack data into a 4 byte block (such as with Color data).
-0xFF is displayed as a funky value: -1.#Q0
-Each 4-byte block is displayed in the order it appears in your custom vertex declaration. Each vertex data block is your vertex declaration size / 4. (ie, 36 bytes = 36 / 4 = 9 blocks per vertex)
-The total size of the buffer is the blocks per vertex multiplied by the number of vertices you have (ie, 9 * 3 = 27 4-byte blocks)
-Usage: If your vertex declaration byte offsets are off by a byte or more, you should expect to see funky data in the buffer.
-The vertex declaration shown in PIX should always match your custom vertex declaration struct.
-By selecting the actual draw call in the events list and then looking at the mesh, you can see the vertex information as it appears in the pre-vertex shader (object space), the post-vertex shader (world space), and Viewport (screen space). If the vertex data doesn't look right in any of these steps, you should know where to start debugging.
*Special note: If you're creating geometries on the graphics card within your shader, you won't see much of value in the pre-vertex shader.
-The debugger includes a shader assembly language debugger. It's nice to have but not very useful.
-The shader compiler will remove any code which doesn't contribute to the final output of a vertex. This is extra annoying when you're trying to assign values to a variable just so you can debug them.
7. Model rendering: The same principles from primitive rendering apply, except you also have to verify that you've correctly loaded the model data into memory and are calling the right method to render a model.
One handy tip which may help you: write down each step it takes to add and render a new model within your project (ie, your project's content creation pipeline & workflow). It's easy to accidentally skip a step as you're creating new assets and then waste time isolating the problem back to that missed step. An ounce of prevention is worth a pound of cure, right?