Crash when I try to convert string to char*

Started by
6 comments, last by Enigma 16 years, 3 months ago
I'm making a quiz game where I try to tie together everything I have learned about OpenGL. However, when I try to convert a string to a char* my game crashes, and the debugger takes me to this function in my classes:
[source lang="cpp"]
class Output
{
 public:
 int X,Y;
 string Text; 
};

class Font
{
 public:
 unsigned int fontListBase;
 
 bool Init();
 void Quit();
 void Render();
 unsigned int CreateBitmapFont(char* fontName, int fontSize);
 void ReleaseFont(unsigned int base);
 void RenderFont(int xPos, int yPos,unsigned int base, const char* str);
 
 Output* question;
  Output* answers[4];
  int CorrectAnswer;
  void SetCorrect(int i){ CorrectAnswer = i-1;}

};


//it works if I just type in manually what string Text holds, enclosed in quotes
void Font::Render()
{
RenderFont(WIDTH/2,500,fontListBase,Fonts->question->Text.c_str());
}

void Font::RenderFont(int xPos, int yPos, unsigned int base, const char *str)
{
	if ((base == 0) || (!str))
		return;

	glRasterPos2i(xPos, yPos);

	glPushAttrib(GL_LIST_BIT);
		glListBase(base - 32);
		glCallLists((int)strlen(str), GL_UNSIGNED_BYTE, str);
	glPopAttrib();
}




When I run it, it crashes with the following error: Unhandled exception at 0x00401e36 in Lesson3.exe: 0xC0000005: Access violation reading location 0x00000024, and the debugger takes me to the following function in "xstring":
[source lang="cpp"]
_Elem *__CLR_OR_THIS_CALL _Myptr()
		{	// determine current pointer to buffer for mutable string
		return (_BUF_SIZE <= _Myres ? _Bx._Ptr : _Bx._Buf);
		}

•°*”˜˜”*°•.˜”*°•..•°*”˜.•°*”˜˜”*°•..•°*”˜˜”*°•.˜”*°•.˜”*°•. Mads .•°*”˜.•°*”˜.•°*”˜˜”*°•.˜”*°•..•°*”˜.•°*”˜.•°*”˜ ˜”*°•.˜”*°•.˜”*°•..•°*”˜I am going to live forever... or die trying!
Font::Render()
{
    RenderFont(WIDTH/2, 500, fontListBase, Fonts->question->Text.c_str());
}


What is Fonts? I couldn't find it in your code.

Other than that, I would check that Fonts isn't NULL, and then that Fonts->question isn't NULL. Maybe try stepping through this function in the debugger and watching those values?
A few things I would look into:

glListBase(base - 32);
I'm not sure what glListBase does, but if fontListBase is anything less than 32 when you use it in Font::Render(), you are likely to crash. As an unsigned value it will wrap around at 0.

void SetCorrect(int i){ CorrectAnswer = i-1;}
I wouldn't trust this one either. It opens up the possibility of a buffer overrun/underrun.

It's hard to say without the rest of the code. I would like to see what value fontListBase starts out with and what values are fed to SetCorrect.
Well, SetCorrect() isn't being called yet anyway; it was just a function I made so I could easily make up my mind about what my class is supposed to do.

Well, here's the whole class and where I call its functions:

[source lang="cpp"]
#pragma once
#include <iostream>
#include <windows.h>		// Header File For Windows
#include <gl\gl.h>			// Header File For The OpenGL32 Library
#include <gl\glu.h>			// Header File For The GLu32 Library
#include <gl\glaux.h>		// Header File For The Glaux Library
#include <stdio.h>
#include <string.h>

using namespace std;

class Output
{
 public:
 int X,Y;
 string Text;
};

class Font
{
 public:
 unsigned int fontListBase;

 bool Init();
 void Quit();
 void Render();
 unsigned int CreateBitmapFont(char* fontName, int fontSize);
 void ReleaseFont(unsigned int base);
 void RenderFont(int xPos, int yPos, unsigned int base, const char* str);

 Output* question;
 Output* answers[4];
 int CorrectAnswer;
 void SetCorrect(int i){ CorrectAnswer = i-1; }
};

//end of class header, here starts the font class .cpp file
#include "FontClass.h"

extern HDC hDC;
extern int WIDTH;
extern int HEIGHT;
extern Font* Fonts;

bool Font::Init()
{
 fontListBase = CreateBitmapFont("Verdanna", 24);
 return true;
}

void Font::Quit()
{
 ReleaseFont(fontListBase);
}

void Font::Render()
{
 glLoadIdentity();
 //glColor3f(0.3f,0.6f,0.45f);
 if(Fonts != 0)
 {
  Fonts->question->Text = "LOOLllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllll";
  RenderFont(WIDTH/2,500,fontListBase,Fonts->question->Text.c_str());
 }
 RenderFont(0,500+24,fontListBase,"OpenGL FONT LOL LOL !11one");
 RenderFont(0,500+24*2,fontListBase,"OpenGL FONT LOL LOL !11one");
 RenderFont(0,500+24*3,fontListBase,"OpenGL FONT LOL LOL !11one");
 RenderFont(0,500+24*4,fontListBase,"OpenGL FONT LOL LOL !11one");
}

unsigned int Font::CreateBitmapFont(char *fontName, int fontSize)
{
 HFONT hFont;
 unsigned int base;
 base = glGenLists(96);
 if(stricmp(fontName,"Symbol")==0)
 {
  hFont = CreateFont(fontSize,0,0,0,FW_BOLD,FALSE,FALSE,FALSE,
                     SYMBOL_CHARSET,OUT_TT_PRECIS,CLIP_DEFAULT_PRECIS,
                     ANTIALIASED_QUALITY,FF_DONTCARE,fontName);
 }
 else
 {
  hFont = CreateFont(fontSize,0,0,0,FW_BOLD,FALSE,FALSE,FALSE,
                     ANSI_CHARSET,OUT_TT_PRECIS,CLIP_DEFAULT_PRECIS,
                     ANTIALIASED_QUALITY,FF_DONTCARE,fontName);
 }
 if(!hFont)
  return 0;

 SelectObject(hDC,hFont);
 wglUseFontBitmaps(hDC,32,96,base);
 return base;
}

void Font::ReleaseFont(unsigned int base)
{
 if(base != 0)
  glDeleteLists(base,96);
}

void Font::RenderFont(int xPos, int yPos, unsigned int base, const char *str)
{
	if ((base == 0) || (!str))
		return;

	glRasterPos2i(xPos, yPos);

	glPushAttrib(GL_LIST_BIT);
		glListBase(base - 32);
		glCallLists((int)strlen(str), GL_UNSIGNED_BYTE, str);
	glPopAttrib();
}


In my main function I call the following.

At initialization:
Font* Fonts = new Font();
Fonts->Init();

once every frame i call:
Fonts->Render();

and at shutdown:
Fonts->Quit();
delete(Fonts);
And where exactly do you initialise Fonts->question so that it actually points at an Output instance?

Some other comments:
// Don't just use #pragma once - it's non-portable
// When you need help with your code you're more likely to get it if
// you make things as easy as possible for people using other compilers
#pragma once
#if !defined(HEADER_NAME_H)
#define HEADER_NAME_H

// ironically the only header you actually need here is the one
// you didn't include
// for std::string
#include <string>
// for boost::noncopyable (see below)
#include <boost/utility.hpp>

// never, ever, ever use using in a header - there's no way for clients
// which include a header to un-using namespaces

class Output
{
 public:
	int X, Y;
	// we got rid of using namespace std, so we have to be explicit
	std::string Text;
};

class Font
	// this class allocates resources in its constructor and releases
	// them in its destructor (see changes below) therefore we either
	// need to provide a copy-constructor and copy-assignment operator
	// or else prevent copy-construction and copy-assignment
	// boost::noncopyable provides the latter
	: public boost::noncopyable
{
 public:
	// most of this has no business being public
	// do you really want any Tom, Dick or Harry setting your
	// fontListBase?

	// use constructors and destructors, not Init and Release functions
	// that way your objects are always valid
	// there are far better ways to do the initialisation rather than passing
	// loads of parameters to the constructor, but I've already spent a very
	// long time on this post!
	Font(std::string const & fontName,
	     int fontSize,
	     Output const & question,
	     Output const & answer1,
	     Output const & answer2,
	     Output const & answer3,
	     Output const & answer4,
	     int correctAnswer);
	~Font();
	void Render();
	// don't use raw char pointers unless you really have to
	void RenderFont(int xPos, int yPos, unsigned int base, std::string const & str);

 // hide implementation details
 private:
	// do you really need pointers here?
	Output question;
	Output answers[4];
	int CorrectAnswer;
	// move this to be with the other member variables
	unsigned int fontListBase;
	void SetCorrect(int i){ CorrectAnswer = i-1; }
};

#endif

//end of class header, here starts the font class .cpp file
#include "FontClass.h"

// you do actually need to include the windows and OpenGL headers here
#include <windows.h>		// Header File For Windows
#include <gl\gl.h>			// Header File For The OpenGL32 Library

// ugh, globals, reconsider - prefer to pass parameters and/or use
// member variables
extern HDC hDC;
extern int WIDTH;
extern int HEIGHT;
extern Font* Fonts;

Font::Font(std::string const & fontName,
           int fontSize,
           Output const & question,
           Output const & answer1,
           Output const & answer2,
           Output const & answer3,
           Output const & answer4,
           int correctAnswer)
	:
	question(question),
	CorrectAnswer(correctAnswer)
{
	// set the answers
	answers[0] = answer1;
	answers[1] = answer2;
	answers[2] = answer3;
	answers[3] = answer4;

	// moved CreateBitmapFont implementation here
	HFONT hFont;
	// we have a member variable, so we don't need a local
	fontListBase = glGenLists(96);
	// the joys of std::string
	if (fontName == "Symbol")
	{
		hFont = CreateFont(fontSize, 0, 0, 0, FW_BOLD, FALSE, FALSE, FALSE,
		                   SYMBOL_CHARSET, OUT_TT_PRECIS, CLIP_DEFAULT_PRECIS,
		                   ANTIALIASED_QUALITY, FF_DONTCARE, fontName.c_str());
	}
	else
	{
		hFont = CreateFont(fontSize, 0, 0, 0, FW_BOLD, FALSE, FALSE, FALSE,
		                   ANSI_CHARSET, OUT_TT_PRECIS, CLIP_DEFAULT_PRECIS,
		                   ANTIALIASED_QUALITY, FF_DONTCARE, fontName.c_str());
	}
	// always use braces
	if (!hFont)
	{
		// can't return from a constructor - clean up instead
		glDeleteLists(fontListBase, 96);
		fontListBase = 0;
	}
	// we couldn't return so we need to not do the following if !hFont
	else
	{
		SelectObject(hDC, hFont);
		wglUseFontBitmaps(hDC, 32, 96, fontListBase);
	}
}

Font::~Font()
{
	// moved ReleaseFont implementation here
	// use our member variable for the list base
	if (fontListBase != 0)
		glDeleteLists(fontListBase, 96);
}

void Font::Render()
{
	glLoadIdentity();
	//glColor3f(0.3f,0.6f,0.45f);
	if (Fonts != 0)
	{
		Fonts->question.Text = "LOOLllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllll";
		RenderFont(WIDTH/2, 500, fontListBase, Fonts->question.Text);
	}
	RenderFont(0, 500+24, fontListBase, "OpenGL FONT LOL LOL !11one");
	RenderFont(0, 500+24*2, fontListBase, "OpenGL FONT LOL LOL !11one");
	RenderFont(0, 500+24*3, fontListBase, "OpenGL FONT LOL LOL !11one");
	RenderFont(0, 500+24*4, fontListBase, "OpenGL FONT LOL LOL !11one");
}

void Font::RenderFont(int xPos, int yPos, unsigned int base, std::string const & str)
{
	if ((base == 0) || str.empty())
	{
		return;
	}

	glRasterPos2i(xPos, yPos);

	glPushAttrib(GL_LIST_BIT);
		glListBase(base - 32);
		glCallLists((int)str.length(), GL_UNSIGNED_BYTE, str.c_str());
	glPopAttrib();
}
Apologies for the long post without source boxes, but I wanted to use colour to emphasise things.

Σnigma

EDIT: re-formatted to avoid extra-wide post syndrome (×3)
WOW Enigma! Thanks a bunch! Your corrections and suggestions are very much appreciated. I am extremely happy you took the time to help me improve myself. I have rated you up.
Quote: Original post by Enigma
there are far better ways to do the initialisation rather than passing loads of parameters to the constructor


Could you please explain this?
I was simply thinking of things like:
  • Packing the answers into a boost::array or std::vector rather than passing them as individual parameters.

  • Packing the question and answers into a separate class or struct.

  • Decoupling the question and answers from the Font class entirely.

  • Ideally all of the above.
Σnigma

