furby100

Members
  • Content count

    5409
  • Joined

  • Last visited

Community Reputation

102 Neutral

About furby100

  • Rank
    Contributor
  1. So who is still around these days?

    I'm not around any more, but some strange compulsion caused me to wander back here tonight. I first registered to participate in the MP3-beating compression thread. It was thanks to the GDNet lounge that I discovered that there were still creationists around, and that a whole segment of society viewed Microsoft as the ultimate evil. I was once briefly allowed to be a moderator here, but I think Dave came to view that as an error of judgement (on the part of the person who nominated me), mainly because I allowed friends of mine to use my account and they started deleting threads for the thrill of it. I no longer harbour dreams of becoming a game developer, having realised that I did not actually enjoy programming much when it was all I had to do and I could not direct the work myself. I am moving from England to the Netherlands soon. That is all.
  2. John Carmack on Ray Tracing

    Quote: Original post by Vilem Otte
    I think GPUs will be dead, when quantum CPUs come - they're really parallelised and damn fast. But the best quantum CPU they built is a 7-qubit CPU (128 operations simultaneously - it's like a 128-core CPU!!!)

    Quantum computers don't work that way! If you just create a superposition of all the possibilities, where most of the possibilities are failures, then when you measure the output, with very high probability you will get one of the wrong answers. You need an algorithm that alters the quantum state so that the probability amplitude builds up on one of the right answers, giving you a high probability of measuring that answer when you read out the qubits. Shor's algorithm for integer factoring is one of the useful examples: the best known classical algorithms are subexponential, while Shor's runs in polynomial time on a quantum computer. Another example is Grover's algorithm, which searches an unstructured space of n items in O(sqrt(n)) queries. Here are some links: Explanation of what quantum computers don't do; Explanation of Shor's algorithm, with an approving comment by Peter Shor. Wikipedia also has a thing or two.
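    The amplitude build-up described above can be sketched in plain Python (a classical toy state-vector simulation, not real hardware; the function name and the N = 8 example are mine):

```python
# Toy state-vector sketch of Grover's algorithm: N = 8 candidates, one
# marked answer. The state is a list of N amplitudes; measurement
# returns index i with probability amp[i] ** 2.
import math

def grover_probabilities(n_items, marked, iterations):
    # Start in the uniform superposition.
    amp = [1 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked amplitude.
        amp[marked] = -amp[marked]
        # Diffusion: reflect every amplitude about the mean. Together
        # these two reflections pile probability onto the marked item.
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]
    return [a * a for a in amp]

# The optimal number of iterations is about (pi / 4) * sqrt(N).
n = 8
iters = round(math.pi / 4 * math.sqrt(n))   # 2 iterations for N = 8
probs = grover_probabilities(n, marked=3, iterations=iters)
```

    With no iterations the marked item is measured with probability 1/8; after the two Grover iterations its probability is above 94%, which is exactly the "build up the probability on the right answer" point.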
  3. Questions about managed languages

    Quote: Original post by Raghar
    Quote: Original post by Promit
    Fullscreen mode is a graphics related matter. It won't have any effect on how your app is scheduled. Besides, it's fairly common to run games -- MMOs in particular -- in windowed mode these days.
    Fullscreen is a signal to the OS. The OS could: 1. Kill all input hooks not initiated by the application. 2. Ignore refresh requests from all other applications (the application has the screen in exclusive mode). 3. Swap other applications more aggressively when memory is needed, and schedule the fullscreen application's threads more often. All three of these optimizations help, more or less.

    I run a dual-monitor system, and I won't be best pleased if running a game fullscreen on one monitor screws with how the applications on the other are running. It's annoying enough that the Source engine appears to have a bug that causes it to flip rapidly between resolutions when first started while I have the second monitor connected. It also takes away my mouse cursor. I am the sort of person who thinks that games should allow me to run them and play music chosen by me at the same time on the same machine.

    On a subject related to the first topic, I don't think anyone mentioned the concept of fail-fast by name. Briefly, there are certain kinds of errors which should never, ever occur at runtime. Reading outside an array is one of these; taking a vararg that isn't there is another (e.g. format string attacks). If one does happen, the proper response is to stop what you're doing at once, because no good can come of continuing. Managed systems are good precisely because they have this property with respect to memory. OCaml's disallowance of nulls is in some ways the ultimate fail-fast - the program fails before you even run it (although I think it lets you use non-exhaustive pattern matching, in which case you can get a runtime error if you don't provide a clause for None when matching an 'a option).
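    A minimal Python sketch of the fail-fast idea, assuming a hypothetical find_user lookup: the out-of-range read stops at the faulty operation, and the None case has to be handled explicitly, much like providing a clause for None when matching on an 'a option:

```python
# Fail-fast: an error that should never occur stops the program at the
# faulty operation instead of silently corrupting state. find_user is
# a hypothetical lookup that returns None on a miss, in the spirit of
# OCaml's 'a option.
from typing import Optional

def checked_get(xs: list, i: int) -> int:
    # In C an out-of-range read can silently return adjacent memory;
    # a managed runtime raises IndexError here instead.
    return xs[i]

def find_user(users: dict, name: str) -> Optional[str]:
    return users.get(name)

users = {"ada": "admin"}

# The None case must be handled explicitly; forgetting it would fail
# fast with an AttributeError on role.upper() rather than propagate a
# bogus value through the rest of the program.
role = find_user(users, "bob")
label = "no such user" if role is None else role.upper()
```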
  4. Tips on getting motivated to read textbooks...

    Since you learned to program, you must have read books on programming. What is it specifically about these textbooks that you find hard? Since I don't know, I'm going to throw out some general comments anyway.

    Mathematical things have a peculiar quality: they are hard to learn, but much easier to use once they've been learned and internalised. If you look at what you'll eventually be doing before you've got the previous stuff, it'll seem like you'll never get there.

    When you're reading a proof, or a presentation of an algorithm, the only way to really get it is to work through the individual steps until you see how they combine to give the right answer. You have to go much slower than when reading ordinary prose, because the density of ideas is much greater. I also find it usually helps to try a few concrete examples step by step.

    When you have to learn a proof or algorithm, don't memorise the individual steps; reduce it to some very high-level, inadequate verbal description in your head, then try to reconstruct it yourself. If you fail (which you will at first), go back and find the 'trick' (there is at least one non-obvious 'trick' in every interesting proof or algorithm) that allows you to rederive it.

    When you have to remember a lot of definitions, I find it helps to paraphrase them in different ways and relate them to things you already know. This helps me remember, because if I have trouble retrieving a definition I have another route to it; it also gives me a better understanding of when I can use it.

    When you were learning programming, and you read a section introducing some programming construct, you probably thought of several ways you'd like to use it in your own projects. It is possible to do something analogous for mathematics. When something is introduced, think of how it combines with what you already know and what the consequences might be, or whether it raises any interesting questions.

    One thing that thrilled me in high school, when I was introduced to the exponential function, was realising that by using logarithms to change base I could now differentiate anything involving raising stuff to powers. Moreover, since I could already differentiate sums, products and quotients, this meant I could differentiate any function built from the elementary arithmetical operations. Another time was when I was learning category theory and was shown how you can define groups as categories with one object in which all the arrows are isomorphisms; I then realised that the definition of a functor between two such categories coincides perfectly with a group homomorphism. Both of these things are considered obvious and trivial by people who know them, but at the time they gave me the astounding impression that I was exploring somewhere accessible only through the mind, yet fully real and substantial.

    I also think you should always do the exercises in textbooks, if only because they provide a jumping-off point for what I described above. I find they really help cement ideas in my mind.
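    The change-of-base trick mentioned above, written out (my notation):

```latex
a^{x} = e^{x\ln a}
\quad\Longrightarrow\quad
\frac{d}{dx}\,a^{x} = (\ln a)\,e^{x\ln a} = a^{x}\ln a,
\qquad
\frac{d}{dx}\,f(x)^{g(x)}
  = f^{g}\left( g'\ln f + \frac{g f'}{f} \right)
  \quad (f > 0).
```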
  5. Programming Interview Questions...

    If no one posts a useful answer here, could you post what you were asked once the interview happens? That way you'd at least get the chance to help other people in your predicament.
  6. Handy code snippet

    Is the thing "In locus hic, omnes res dementes sunt" yours? If so, shouldn't it be "In loco hoc", since "in" takes the ablative for place where? My preferred rendition would be "Omnia hic insania fiunt" ("here all things become madness"), as a reference to Descartes' quip that "Omnia apud me mathematica fiunt" ("with me, everything turns mathematical").
  7. Hey... YOU'RE DUMB!

    Are you familiar with the fact that Socrates was the wisest man, because he knew that he knew nothing, while everybody else mistakenly thought they knew something? It comes from Plato's Apology.
  8. Programming without a degree

    Quote:Original post by stimarco Employers are ordinary people, just like you. Not everyone is a natural at HR: most just wing it and rely on their judgement, but it is just _so_ much easier to choose someone with "Ph.D (Oxon)" after their name over someone whose last educational establishment was "Catford Boys 6th Form Consortium". Small correction: Oxford calls its PhDs DPhils. This is despite the term DPhil being used elsewhere in contradistinction to PhD. The reason for this is probably historical.
  9. Nerd Sniping

    The way of solving it that I know about uses Fourier analysis. As I recall, though, the integral doesn't come out in elementary terms. The effective resistance between two opposite corners of one square is something involving pi, as I remember it.
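    For the record, the standard values for the infinite grid of 1-ohm resistors, as derived by the Fourier-analysis approach (quoted from memory, so worth double-checking against a reference such as Atkinson and van Steenwijk; R_{(m,n)} is the effective resistance between nodes m columns and n rows apart, and the knight's-move pair in the comic is (2,1)):

```latex
R_{(1,0)} = \tfrac{1}{2}\,\Omega, \qquad
R_{(1,1)} = \tfrac{2}{\pi}\,\Omega \approx 0.6366\,\Omega, \qquad
R_{(2,1)} = \left(\tfrac{4}{\pi} - \tfrac{1}{2}\right)\Omega \approx 0.7732\,\Omega.
```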
  10. Rotating a plane

    Rotate the normal. (If the plane is n . x = d and R is a rotation about the origin, the rotated plane is (Rn) . x = d, since (Rn) . (Rx) = n . x.)
  11. Linus Torvalds and C++

    Quote: Original post by Spoonbender
    If you want to argue that using goto's like this is a good idea, you'll need better arguments than "lots of people do it". It might be true, but it doesn't really change whether or not it's a good practice.

    That's only part of the argument (and considering usage is not entirely foolish: if nobody used it, that would be a strong suggestion that it was known to be bad). What's your suggested method of doing mandatory clean-up before return in a function in C? Remember that destructors and RAII are unavailable. I'll argue that using nested if statements is worse because it doesn't scale: if you have more than a few possible error exits, the code for the most common case ends up indented far to the right. If you instead make a function to handle clean-up rather than using a goto: 1. You add extra verbosity for no gain other than satisfying yourself that you don't use gotos. 2. Every time you add a new entity that needs to be cleaned up, you have to add an extra argument to the clean-up function, then find and update every call to it. 3. It's also probably avoided because people worry about pushing stuff on the stack unnecessarily; however, this doesn't matter with modern compilers, which is why I put it last.

    Of course, using goto because you don't know what a do-while loop is (which appears to be the source of a goto mentioned above) is silly. The compiler is there for a reason. It's important to remember that the purpose of programming languages is to save thinking and to save typing. (Everything that can be written in a high-level language can in principle be written directly in byte-codes, but not necessarily in your lifetime.)
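    The thread is about C, but the shape of the goto chain - acquire things in order, unwind them in reverse from any exit point - has a direct analogue in Python's contextlib.ExitStack; a sketch with made-up resource names:

```python
# The C goto-cleanup pattern keeps acquisitions flat and unwinds them
# in reverse order on any failure. contextlib.ExitStack gives the same
# shape: each registered callback plays the role of one "goto fail_N"
# label, and the callbacks run LIFO no matter where we bail out.
from contextlib import ExitStack

log = []

def acquire(name):
    log.append(f"acquire {name}")
    return name

def release(name):
    log.append(f"release {name}")

def use_resources(fail_on=None):
    with ExitStack() as stack:
        for name in ("file", "socket", "lock"):
            res = acquire(name)
            stack.callback(release, res)   # one cleanup step per resource
            if name == fail_on:
                raise RuntimeError(f"{name} setup failed")
        log.append("work done")
    # Every registered cleanup has run, in reverse order, by this point,
    # whether we got here normally or an exception escaped the block.

use_resources()
```

    Adding a fourth resource is one acquire plus one callback line; nothing else changes, which is exactly the scaling argument made above for goto over nested ifs or a clean-up function.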
  12. Quote: Original post by kyron21
    ExcessNeo, ah YES. That's what I used them for in my project last week: to set up an array size based on user input. Normally I use vectors and a loop for that... I'm guessing doing it that way saves some effort. I completely understand references; I've used them for quite some time... but pointers are just giving me a hard time. I'm going to experiment more with them and functions, I guess. Thank you.

    If you understand references, then a pointer is a kind of mutable reference. Using the pointer's name with no star reassigns what it points to, and using the star accesses the thing it points to (which is what a reference does without any star). That means you can change what a pointer points to whenever you want. This is useful if you want to build a linked list, for instance, or a graph or tree.
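    A sketch of the linked-list use case in Python, where rebinding a name plays the role of pointer reassignment and attribute access plays the role of dereferencing (the Node class and helpers are illustrative, not from any library):

```python
# A pointer-style walk over a singly linked list. Rebinding the name
# `p` (p = p.next) is the "no star" operation that changes what it
# points to; touching p.value or p.next is the "star" operation that
# accesses the thing pointed at.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def build(values):
    head = None
    for v in reversed(values):
        head = Node(v, head)   # repoint head at the new front node
    return head

def to_list(head):
    out, p = [], head
    while p is not None:
        out.append(p.value)    # dereference: read the pointed-at node
        p = p.next             # reassign: point at the next node
    return out

nums = build([1, 2, 3])
```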
  13. Math Usage

    The following list cannot be considered exhaustive:
      • Trigonometry, if you don't already know it.
      • Solving simultaneous equations in several unknowns, e.g. for finding intersections of planes, spheres, etc. for clipping.
      • How to use 3D vectors and 4D homogeneous vectors with matrices to do translations, rotations, etc.
      • How the world -> camera transformation works.
      • How the perspective transformation works.
      • Various lighting equations.
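    As a small illustration of the homogeneous-vector item, a Python sketch (hand-rolled 4x4 helpers, no particular library assumed) composing a rotation and a translation into one matrix:

```python
# Homogeneous 4-vectors let a single 4x4 matrix express rotation and
# translation together: here, rotate about z, then translate along z.
import math

def mat_vec(m, v):
    # Multiply a 4x4 matrix (list of rows) by a 4-vector.
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def mat_mul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
            for r in range(4)]

def rot_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def translate(tx, ty, tz):
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# Rotate the point (1, 0, 0) a quarter turn about z, then move it up 5.
m = mat_mul(translate(0, 0, 5), rot_z(math.pi / 2))
p = mat_vec(m, [1, 0, 0, 1])   # approximately (0, 1, 5)
```

    The same machinery, with a perspective matrix on the left, is how the world -> camera -> screen pipeline is usually built.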
  14. Quote: Original post by Anon Mike
    My next question would be, what's a good book for learning Haskell?

    I learned from Programming in Haskell by Graham Hutton. It's a slim volume of 184 pages, but monads and using laziness effectively are covered. It's a good idea to do the exercises, and at some points you need to get the code from the book's web site, as it occasionally glosses over what some of the example code is (though not nearly as much as you would expect if you're used to C++ or Java). Once you finish, you should also look through the Haskell 98 Report. Someone earlier asked if Haskell has an optimising compiler; rather than make a third post, I'll answer here. There is GHC, which compares favourably with the speed of Java 6 and GNU C++ (it's sometimes faster, and is usually only 2-4 times slower or so, although at a maximum it was 22 times slower than C++). Here is Python vs C++, for comparison.
  15. Quote: Original post by Daerax
    Epigram is another obscure one.

    Has anyone done anything useful in Epigram? It's about to change its type theory, and the version I looked at didn't have any ability to use the computer's built-in arithmetic (it did all arithmetic via the datatype for Peano arithmetic). However, the prohibition on partial functions, and the ability to write datatypes like "non-empty lists with an even number of elements", are quite fun. Another language that prohibits partial functions is Charity, which does so by allowing only primitive recursion and primitive co-recursion (but on arbitrary datatypes, not just numbers). It uses category theory, the two being categorical duals of each other, and has a separation between datatypes and codatatypes. However, it too is a research language, in that I don't think any useful programs have been produced in it (although it's relatively simple to do a lot of the standard algorithm examples, such as quicksort and the towers of Hanoi).

    Quote: I'd say to go with Haskell. It's the new Lisp (the supposedly mind stretching language whose proponents can't believe there are actually people who can't see its immediate superiority over current programming thinkings).

    I vote for Haskell too. And not just because I usually use lazy evaluation in real life.