
Member Since 13 Mar 2000
Offline Last Active Sep 26 2011 05:23 PM

Posts I've Made

In Topic: So who is still around these days?

26 September 2011 - 05:01 PM

I'm not around any more, but some strange compulsion caused me to wander back here tonight. I first registered to participate in the MP3-beating compression thread. It was thanks to the GDNet lounge that I discovered that there were still creationists around and that a whole segment of society viewed Microsoft as the ultimate evil.

I was once briefly allowed to be a moderator here, but I think Dave came to view that as an error of judgement (on the part of the person who nominated me), mainly because I allowed friends of mine to use my account and they started deleting threads for the thrill of it.

I no longer harbour dreams of becoming a game developer, after realising that I did not actually enjoy programming so much when it was all I had to do and I couldn't direct the work myself.

I am moving from England to the Netherlands soon.

That is all.

In Topic: John Carmack on Ray Tracing

09 April 2008 - 01:30 PM

Original post by Vilem Otte
I think GPUs will be dead when quantum CPUs come - they're really parallelised and damn fast. But the best quantum CPU they built is a 7 qubit CPU (128 operations simultaneously - it's like a 128 core CPU!!!)

Quantum computers don't work that way! If you just create a superposition of all the possibilities, where most of the possibilities are failures, then when you measure the output it will, with very high probability, give you one of the wrong answers. You need an algorithm that can alter the quantum state so that the probability amplitude builds up on the right answers, giving a high probability of measuring one of them. Shor's algorithm for integer factoring is one of the useful examples: it gives a superpolynomial speedup, since the best known classical algorithms take sub-exponential time while Shor's runs in (bounded-error probabilistic) polynomial time. Another example is Grover's algorithm, which can search an unstructured set of n items - or invert a quantum-computable function on a domain of size n - in O(sqrt(n)) queries.
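
To make the amplitude-amplification point concrete, here is a small classical simulation of Grover's algorithm (a sketch in plain Python; the problem size and marked index are arbitrary choices of mine, not from the original post). Measuring the bare uniform superposition finds the marked item with probability only 1/N, while roughly (pi/4)*sqrt(N) Grover iterations drive that probability close to 1:

```python
import math

def grover_success_probability(n_items, marked, iterations):
    """Classically simulate Grover's algorithm over n_items basis states."""
    # Start in the uniform superposition: every amplitude is 1/sqrt(N).
    amps = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion operator: reflect every amplitude about the mean.
        mean = sum(amps) / n_items
        amps = [2.0 * mean - a for a in amps]
    # Probability of measuring the marked item.
    return amps[marked] ** 2

N = 1024
print(grover_success_probability(N, 0, 0))   # just measuring: probability 1/N
k = math.floor(math.pi / 4 * math.sqrt(N))   # ~ (pi/4) * sqrt(N) iterations
print(grover_success_probability(N, 0, k))   # close to 1
```

Note that the sign-flip plus reflection-about-the-mean is exactly the state-altering step the paragraph above describes: each iteration nudges probability toward the right answer.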

Here are some links:
Explanation of what quantum computers don't do
Explanation of Shor's algorithm with approving comment by Peter Shor.

Wikipedia also has a thing or two.

In Topic: Questions about managed languages

05 March 2008 - 10:55 PM

Original post by Raghar
Original post by Promit
Fullscreen mode is a graphics related matter. It won't have any effect on how your app is scheduled. Besides, it's fairly common to run games -- MMOs in particular -- in windowed mode these days.

Fullscreen is a signal to the OS. The OS could:
1. Kill all input hooks not initiated by the application.
2. Ignore refresh requests from all other applications (the application has the screen in exclusive mode).
3. Swap other applications out more aggressively when memory is needed, and schedule the fullscreen application's threads more often.

All three of these optimizations help, more or less.

I run a dual monitor system, and I won't be best pleased if running a game fullscreen on one monitor screws with how the applications on the other are running. It's annoying enough that the Source engine appears to have a bug that causes it to flip rapidly between resolutions when first started with the second monitor connected. It also takes away my mouse cursor. I am the sort of person who thinks that games should allow me to run them and play music chosen by me at the same time on the same machine.

On a subject related to the first topic, I don't think anyone has mentioned the concept of fail-fast by name. Briefly, there are certain kinds of errors which should never, ever occur at runtime. Reading outside an array is one of these; taking a vararg that isn't there is another (e.g. format string attacks). Thus, if one does happen, the proper response is to stop at once, because no good can come of continuing. Managed systems are good specifically because they have this property with respect to memory. O'Caml's disallowance of nulls is in some ways the ultimate fail-fast - the program fails before you even write it (although I think it lets you use non-exhaustive pattern matching, in which case you can get a runtime error if you don't provide a clause for None when matching an 'a option).
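
A minimal illustration of the fail-fast property (in Python, purely as an example of a managed runtime; the function name is mine): the out-of-bounds write is detected and raises immediately, instead of silently corrupting adjacent memory the way the equivalent unchecked C code could.

```python
def fail_fast_demo():
    buf = [0] * 4
    try:
        buf[10] = 1            # out-of-bounds write
        return "kept going"    # unreachable: a managed runtime never gets here
    except IndexError:
        # The runtime detects the bad access and stops it at once,
        # rather than letting execution continue on corrupted state.
        return "failed fast"

print(fail_fast_demo())  # failed fast
```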

In Topic: Tips on getting motivated to read textbooks...

05 March 2008 - 09:48 PM

Since you learned to program, you must have read books on programming. What is it specifically about these textbooks that you find hard?

Since I don't know, I'm going to throw out some general comments anyway. Mathematical subjects have a certain peculiar quality: they are hard to learn, but much easier to use once they've been learned and internalized. If you look at what you'll eventually be doing before you've absorbed the prerequisites, it'll seem like you'll never get there.

When you're reading over a proof, or a presentation of an algorithm, the only way to really get it is to read through the individual steps until you see how they combine, in sequence, to produce the right answer. You have to go much more slowly than when reading ordinary prose, because the density of ideas is much greater. I also find it usually helps to try a few concrete examples step by step. When you have to learn a proof or algorithm, don't memorize the individual steps; instead, reduce it to some very high-level, inadequate verbal description in your head, then try to reconstruct it yourself. If you fail to do so (which you will at first), go back and see what the 'trick' is (there is at least one non-obvious 'trick' to every interesting proof or algorithm) that allows you to rederive it.

When you have to remember a lot of definitions, I find it helps to paraphrase the definitions in different ways, and relate them to things you already know. This helps me remember, because if I have trouble retrieving a definition I have another choice to go by; it also helps me get a better understanding of when I can use the definition.

When you were learning programming, and you read the next section of something telling you about some programming construct, after you read it you probably thought of several ways you'd like to use it in your own projects. It is possible to do something analogous to this for mathematics as well. When something is introduced, think of how it combines with what you already know and what the consequences might be, or if there are any interesting questions it raises. One thing that thrilled me when I was in high school and I was introduced to the exponential function was that I realized that by using logarithms to change base I could now differentiate anything involving raising stuff to powers. Moreover, since I could already differentiate sums, products and quotients this now meant I could differentiate any function involving elementary arithmetical operations. Another time was when I was learning category theory and I was introduced to how you can define groups as categories with one object and all the arrows isomorphisms, and then I realized that the definition of a functor between two such categories then perfectly coincided with a group homomorphism. Both of these things are considered obvious and trivial to people who know them, but to me at the time it gave me the astounding impression that I was in some sense exploring somewhere only accessible through the mind, but nevertheless fully real and substantial.
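
For the curious, the change-of-base observation above, written out (a standard derivation, not part of the original post):

```latex
% Change of base: rewrite any power as an exponential with base e.
a^x = e^{x \ln a}
\quad\Longrightarrow\quad
\frac{d}{dx}\, a^x = e^{x \ln a}\,\ln a = a^x \ln a.

% More generally, for f(x) > 0, the same trick handles a variable base
% and a variable exponent at once:
\frac{d}{dx}\, f(x)^{g(x)}
  = f(x)^{g(x)} \left( g'(x)\,\ln f(x) + \frac{g(x)\, f'(x)}{f(x)} \right).
```

Combined with the sum, product and quotient rules, this covers every function built from the elementary arithmetic operations, which is the point made above.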

A thing I also think helps is that you should always do the exercises in textbooks, if only because they provide a kind of jumping off point for what I described above. I also find they really help cement ideas in my mind.

In Topic: Programming Interview Questions...

05 March 2008 - 08:50 PM

If no one posts a useful answer, could you post what you were asked once the interview has happened? That way you'd at least get the chance to aid other people in your predicament.