Zipster

Member Since 11 Mar 2000
Offline Last Active Dec 12 2013 05:22 PM
-----

Topics I've Started

Best practice for presenting enumerated display modes?

27 August 2013 - 07:17 PM

We're currently working on the video settings UI for our project, and we've run into a bit of a presentation issue regarding the enumerated display modes. In our past DX9 projects, we presented a list of modes to the user in the following format:

<width> x <height>, (<refresh rate> Hz[, Widescreen])

However, the issue in DX11 is that multiple display modes can be enumerated whose refresh rates have different numerators and denominators, yet produce the same integral value when divided, regardless of whether you truncate or round (e.g. 59940 / 1000 and 59950 / 1000). On top of that, multiple display modes can have identical widths, heights, and refresh rate ratios, yet have different scaling values (unspecified, centered, stretched).

My question is, what's the best practice for building a list of unique resolutions for presenting to the user? We'd like to keep it simple so that the user is only making a choice based on width, height, and integral refresh rate (numerator / denominator), however if multiple display modes have the same values, which one takes precedence? Why would I choose 59940 over 59950?
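One workable approach is to collapse the enumerated modes into unique (width, height, rounded Hz) keys, keeping the exact numerator/denominator of whichever mode wins so it can still be handed back to the swap chain. A sketch, with `ModeDesc` standing in for `DXGI_MODE_DESC` (self-contained stand-in, not the real DXGI type), and "prefer the highest exact ratio" as one defensible precedence rule, not *the* answer:

```cpp
#include <cmath>
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

// Minimal stand-in for DXGI_MODE_DESC; field names mirror the real struct,
// but this is a self-contained sketch, not the actual DXGI type.
struct ModeDesc {
    uint32_t Width;
    uint32_t Height;
    uint32_t RefreshNumerator;
    uint32_t RefreshDenominator;
};

// Collapse enumerated modes into unique (width, height, rounded Hz) entries.
// When several modes collapse to the same key, keep the one with the highest
// exact refresh ratio (e.g. 59950/1000 wins over 59940/1000). Scaling is
// ignored here; you would typically filter to one scaling value beforehand.
std::vector<ModeDesc> UniqueModes(const std::vector<ModeDesc>& modes) {
    std::map<std::tuple<uint32_t, uint32_t, uint32_t>, ModeDesc> best;
    for (const ModeDesc& m : modes) {
        if (m.RefreshDenominator == 0) continue;  // guard against bad data
        double hz = double(m.RefreshNumerator) / m.RefreshDenominator;
        auto key = std::make_tuple(m.Width, m.Height,
                                   uint32_t(std::lround(hz)));
        auto it = best.find(key);
        if (it == best.end()) {
            best[key] = m;
            continue;
        }
        const ModeDesc& cur = it->second;
        double curHz = double(cur.RefreshNumerator) / cur.RefreshDenominator;
        if (hz > curHz) it->second = m;  // prefer the faster exact ratio
    }
    std::vector<ModeDesc> out;
    for (const auto& kv : best) out.push_back(kv.second);
    return out;
}
```

The user then only ever sees "1920 x 1080, 60 Hz", while the stored `ModeDesc` retains the precise ratio needed for mode switching.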


Question regarding re-executing TBB tasks

26 October 2012 - 07:11 PM

I haven't been able to find much information on the net pertaining to a particular usage pattern of TBB tasks, wherein the task recycles itself so it can be executed again, in my case for some extended period of time:
#include <tbb/task.h>

class MyTask : public tbb::task
{
public:
	 virtual tbb::task* execute()
	 {
		  // ... do work ...
		  if (!isDone())
		  {
			   recycle_as_safe_continuation();
			   set_ref_count(1);

			   // Task starvation?
			   // return this;
		  }

		  return 0;
	 }

private:
	 bool isDone() const { ... }
};
My question is regarding the behavior of returning 0 versus this after recycling the task. The scheduling documentation states that returning a task bypasses the scheduler, and the thread will execute the returned task next. If all the active worker threads are executing such self-re-executing tasks, and they all return themselves as the next task, does this mean that the scheduler will starve any/all other tasks in the ready pool(s) for as long as the active tasks aren't done? Alternatively, if I return 0 instead, are other threads free to steal the task, or does the non-zero refcount prevent that?

[.net] [C#] Failing hard at DataGridView data binding

16 December 2010 - 02:59 PM

For the past few hours I've been trying to get data binding working properly with a DataGridView, without much success. Google turns up basic tutorials (such as the one I followed here), but nothing that addresses my problem. What ends up happening is that I can successfully add/remove/edit data in the grid view, but none of my changes are reflected back in the data source (which is a BindingList). I even tried changing the source to a DataTable, thinking I'd have more luck, but it still didn't update -- no rows were added. This leads me to believe there's something wrong with how I'm doing the binding, but I'm not very experienced with this sort of thing, so I don't know what I could possibly be missing. Here's the gist of the code I'm using.


public class MyBusinessObject : INotifyPropertyChanged
{
    private UInt64 a;
    private UInt64 b;
    private UInt64 c;

    public event PropertyChangedEventHandler PropertyChanged;

    public UInt64 A { get { return a; } set { a = value; this.NotifyPropertyChanged("A"); } }
    public UInt64 B { get { return b; } set { b = value; this.NotifyPropertyChanged("B"); } }
    public UInt64 C { get { return c; } set { c = value; this.NotifyPropertyChanged("C"); } }

    private void NotifyPropertyChanged(string name)
    {
        if (PropertyChanged != null)
        {
            PropertyChanged(this, new PropertyChangedEventArgs(name));
        }
    }
}

public class TheForm : Form
{
    private BindingList<MyBusinessObject> Data = new BindingList<MyBusinessObject>();
    private static BindingList<KeyValuePair<UInt64, string>> FooEnumData;
    private static BindingList<KeyValuePair<UInt64, string>> BarEnumData;

    private DataGridView dataGridView = new DataGridView();
    private DataGridViewComboBoxColumn comboColA = new DataGridViewComboBoxColumn();
    private DataGridViewTextBoxColumn textColB = new DataGridViewTextBoxColumn();
    private DataGridViewComboBoxColumn comboColC = new DataGridViewComboBoxColumn();

    static TheForm()
    {
        FooEnumData = PopulateFromSomewhereElse();
        BarEnumData = PopulateMoreFromSomewhereElse();
    }

    public TheForm()
    {
        InitializeComponent();

        dataGridView.DataSource = Data;
        dataGridView.AutoGenerateColumns = false;

        comboColA.HeaderText = "A";
        comboColA.Name = "comboColA";
        comboColA.DataSource = FooEnumData;
        comboColA.ValueMember = "Key";
        comboColA.DisplayMember = "Value";
        comboColA.DataPropertyName = "A";

        textColB.HeaderText = "B";
        textColB.Name = "textColB";
        textColB.DataPropertyName = "B";

        comboColC.HeaderText = "C";
        comboColC.Name = "comboColC";
        comboColC.DataSource = BarEnumData;
        comboColC.ValueMember = "Key";
        comboColC.DisplayMember = "Value";
        comboColC.DataPropertyName = "C";

        dataGridView.Columns.AddRange(new DataGridViewColumn[] { comboColA, textColB, comboColC });
    }
}

I have a sneaking suspicion there's something weird going on with the combo cells. They were throwing a bunch of cell value exceptions (since their default values weren't in their bound data source), but multiple locations on the internets assured me that I could hook and ignore the DataError event in that case. Perhaps they were mistaken?

Should I close my savings account?

29 August 2010 - 09:49 PM

I'm in a little bit of a financial conundrum at the moment trying to figure out whether or not I should close my savings account, and I was hoping someone could shed some light on the situation.

I currently have a checking account at one bank, and a savings account at another bank. Right now I'm only making 0.05% interest in my current savings account, and after talking to one of that bank's "rate specialists" at length the other evening I found out that by opening one of their special checking accounts and upgrading my current savings account, I could bump that up to 0.15% in the savings account and 0.2% in the checking account, given the balance I'd be able to open them at. While the checking rate is higher than the savings rate, the savings rate scales better with balance.

However, the issue is that I recently opened a second, high-yield checking account at a third bank, since I'd gotten fed up with my original checking bank: I was paying occasional ATM fees (no convenient locations given my daily routine), and it was a basic account with no interest anyway. This new checking account has 0.5% interest, no ATM fees, no balance maximums or minimums, free checks, etc. I'm not an expert in checking accounts, but it seems to me to be a pretty sweet deal.

So the question is, why would I even want to bother with my savings account right now, when the economy has pushed interest rates so low, especially since the money is less liquid and has more restrictions anyway? Why shouldn't I just move everything into the 0.5% checking account and earn the most interest with the fewest fees and restrictions? On the one hand it seems like an obvious decision, but on the other hand I feel like I'm missing something. Why does anyone have a savings account right now?

I know that it's difficult to give me personal financial advice without any more information (not that I'd expect anyone to anyway), but I'm hoping someone can at least help me understand what's going on with savings accounts these days.

[MSVC] Delay load hell

04 May 2010 - 10:57 AM

In the past, our company has used Bink for all our PC titles. However, we're now working on a game that isn't going to have any fancy in-game video, so we need to remove our Bink dependency from our graphics library. We're also working on a few other titles that require Bink and share common core libraries, so we can't just remove it entirely. The easiest solution is to delay-load the Bink DLL, which removes the dependency from my project while still allowing the other projects to use Bink, and doesn't require any major code changes or special integration instructions.

However, I can't for the life of me get the delay loading to work. After moving the Bink DLL to the delay load section in the linker settings, I get this:

LINK : warning LNK4199: /DELAYLOAD:binkw32.lib ignored; no imports found from binkw32.lib

Which is completely bogus, because a few lines later I get a slew of unresolved external errors related to Bink functions (I even confirmed, using dumpbin, that the functions imported by my application exist in the list of functions exported from the Bink DLL). If I keep the Bink DLL as a regular dependency, everything links properly, but not only do I still get the LNK4199 warning, Bink still isn't set to delay load (confirmed through Dependency Walker and dumpbin). I thought that maybe I had to use #pragma comment in some capacity, but no luck there either. I'm basically at my wit's end trying to find the right combination of linker settings. Neither Google nor MSDN turns up anything that works...
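One likely culprit: /DELAYLOAD expects the name of the DLL, not the import library, and the warning above shows a .lib name being passed to it, which matches the LNK4199 symptom. A sketch of the command-line equivalent, assuming the DLL is named binkw32.dll (output and object names below are hypothetical):

```shell
REM Command-line equivalent of the project settings:
REM   Linker > Input > Additional Dependencies:  binkw32.lib delayimp.lib
REM   Linker > Input > Delay Loaded DLLs:        binkw32.dll   (the DLL, not the .lib)
link /OUT:game.exe main.obj binkw32.lib delayimp.lib /DELAYLOAD:binkw32.dll
```

The import library still goes in the regular dependencies; delayimp.lib supplies the delay-load helper that resolves the imports at first call.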
