Boost xtime & threading

Started by Deliverance
5 comments, last by Deliverance 14 years, 8 months ago
I'm new to Boost and I'm trying to implement a platform-independent function, usable in a multi-threaded environment, that returns the current time in seconds as a double. My first attempt is below, but it is not working as expected (if I replace all of this with GetTickCount() / 1000.0f, there are no issues):

boost::xtime xt;
boost::xtime_get(&xt, boost::TIME_UTC);
return xt.sec + double(xt.nsec) / 1000000000.0f;

Is there anything obviously wrong here?
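For comparison, a minimal self-contained sketch of the same idea built on std::chrono rather than boost::xtime (the helper name GetTimeSeconds is hypothetical, not from the thread); the key point is to keep the arithmetic in double all the way through, with no float literals:

```cpp
#include <chrono>

// Seconds since the Unix epoch as a double, analogous to the
// boost::xtime snippet above (hypothetical helper name).
double GetTimeSeconds() {
    using namespace std::chrono;
    auto since_epoch = system_clock::now().time_since_epoch();
    // duration<double> converts to fractional seconds in full
    // double precision; no float literals are involved.
    return duration<double>(since_epoch).count();
}
```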
Quote:Original post by Deliverance
but it is not working as expected


What do you expect?
Quote:Original post by Antheus
Quote:Original post by Deliverance
but it is not working as expected


What do you expect?


I expect to stop reading sound packets after 5 milliseconds have passed and resume later, so the video thread has some time to process and the sound does not starve the video thread. The relevant code (GetTime() uses the code posted above):

bool TheoraDecoder::read_packet(istream& is, ogg_sync_state* state, OggStream* stream, ogg_packet* packet, bool readingSound) {
	boost::recursive_mutex::scoped_lock lock(mutex);
	int ret = 0;
	int ado = 0;
	float startTime = GetTime();
	while ((ret = ogg_stream_packetout(&stream->mState, packet)) != 1)
	{
		ado++;
		if (readingSound && !preprocessing)
		{
			float timePassed = GetTime() - startTime;
			if (timePassed > soundUpdateTime)
			{
				return false;
			}
		}
		ogg_page page;
		if (!read_page(is, state, &page))
		{
			return false;
		}
		int serial = ogg_page_serialno(&page);
		assert(mStreams.find(serial) != mStreams.end());
		OggStream* pageStream = mStreams[serial];
		// Drop data for streams we're not interested in.
		if (pageStream->mActive) {
			ret = ogg_stream_pagein(&pageStream->mState, &page);
			assert(ret == 0);
		}
	}
	return true;
}
Quote:Original post by Deliverance
Quote:Original post by Antheus
Quote:Original post by Deliverance
but it is not working as expected


What do you expect?


I expect to stop reading sound packets after 5 milliseconds have passed and resume later, so the video thread has some time to process and the sound does not starve the video thread. The relevant code (GetTime() uses the code posted above):


And what actually happens?

It's hard to suggest a solution, before we even know what the problem is.

Quote:Original post by Antheus
Quote:Original post by Deliverance
Quote:Original post by Antheus
Quote:Original post by Deliverance
but it is not working as expected


What do you expect?


I expect to stop reading sound packets after 5 milliseconds have passed and resume later, so the video thread has some time to process and the sound does not starve the video thread. The relevant code (GetTime() uses the code posted above):


And what actually happens?

It's hard to suggest a solution, before we even know what the problem is.


From a certain point in time onward, timePassed keeps incrementing constantly. It is as if startTime were a static variable.
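One plausible explanation for this symptom (my reading, not confirmed anywhere in the thread): read_packet stores the timestamp in a float. In August 2009 the Unix time was roughly 1.25e9 seconds, and near that magnitude adjacent 32-bit float values are 128 seconds apart, so a float timestamp cannot resolve a 5 ms interval at all. A small check of that spacing:

```cpp
#include <cmath>

// Spacing between adjacent representable float values just above t
// (one unit in the last place, ULP).
float float_gap_at(float t) {
    return std::nextafterf(t, 2.0f * t) - t;
}
```

At t = 1250000000.0f the gap is 128.0f, so adding a 5 ms delta to such a float timestamp leaves it unchanged. GetTickCount() / 1000.0f works because seconds-since-boot is a much smaller number, where a float still has sub-second resolution.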
Well, for this particular case, boost::timer is a simpler choice:
boost::timer timer;
while ((ret = ogg_stream_packetout(&stream->mState, packet)) != 1)
{
	ado++;
	if (readingSound && !preprocessing)
	{
		if (timer.elapsed() > soundUpdateTime) return false;
	}
}
Quote:Original post by Antheus
Well, for this particular case, boost::timer is a simpler choice:
boost::timer timer;
while ((ret = ogg_stream_packetout(&stream->mState, packet)) != 1)
{
	ado++;
	if (readingSound && !preprocessing)
	{
		if (timer.elapsed() > soundUpdateTime) return false;
	}
}


Thanks, it works well now; I could not figure out why the other approach didn't. I see that internally boost::timer uses std::clock() (so it does not have good resolution).
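For later readers: std::clock() measures processor time in ticks of CLOCKS_PER_SEC, which is often coarse. A steady_clock-based replacement (a hypothetical ElapsedTimer sketched here as an alternative, not code from the thread) gives a monotonic, high-resolution elapsed() in the same spirit as boost::timer:

```cpp
#include <chrono>

// Elapsed-seconds helper in the spirit of boost::timer, built on
// std::chrono::steady_clock (monotonic, typically sub-microsecond).
class ElapsedTimer {
public:
    ElapsedTimer() : start_(std::chrono::steady_clock::now()) {}

    // Seconds elapsed since construction, as a double.
    double elapsed() const {
        return std::chrono::duration<double>(
            std::chrono::steady_clock::now() - start_).count();
    }

private:
    std::chrono::steady_clock::time_point start_;
};
```

Because steady_clock is monotonic, elapsed() can never go backwards even if the system clock is adjusted, which also sidesteps the precision trap above.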

[Edited by - Deliverance on August 19, 2009 9:35:35 AM]
