
Inventing on Principle - amazing video!


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

  • You cannot reply to this topic
66 replies to this topic

#1 et1337   Members   -  Reputation: 1451

Posted 16 February 2012 - 08:26 AM

Saw this on /r/gamedev... you should at least skip through it briefly, it's definitely worth your time!




#2 swiftcoder   Senior Moderators   -  Reputation: 9760

Posted 16 February 2012 - 10:01 AM

That's pretty bloody impressive. His programming tools are amazing - I hope that sort of thing starts to filter into the mainstream.

And his philosophy is not bad either. Some great words to live by, right there.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#3 boolean   Members   -  Reputation: 1706

Posted 18 February 2012 - 09:59 PM

I'm only at the 12 minute mark and my mind is blown. When he started fiddling with "the future", I laughed out loud and said "bugger off" to my computer.

[Android] Stupid Human Castles - If Tetris had monsters with powers and were attacking human castles. "4/5 - frandroid.com"

Full version and Demo Version available on the Android app store.


#4 DenzelM   Members   -  Reputation: 295

Posted 19 February 2012 - 12:29 AM

At first I thought this was a joke, but wow, I am amazed just like everyone else. I couldn't help but laugh at his catchphrase, "but it's not good enough." He's right, those tools are far better than good enough!
Denzel Morris (@drdizzy) :: Software Engineer :: SkyTech Enterprises, Inc.
"When men are most sure and arrogant they are commonly most mistaken, giving views to passion without that proper deliberation which alone can secure them from the grossest absurdities." - David Hume

#5 irreversible   Crossbones+   -  Reputation: 1240

Posted 19 February 2012 - 09:08 AM

First off, I really like how this guy looks at life and I'm going to snag several ideas he mentioned to guide my own judgement in the future.

However, as far as the presentation itself goes, the only problem I have is the applicability of the concept in a real-world environment. The part about animation is brilliant, and I would expect the next generation of artist tools to employ techniques like the ones he presented. The part about adjusting code parameters in real time is really useful for a subset of people (in particular, anyone working in a scripted environment).

However, when it comes to actual non-scripted programming that involves a non-trivial code base, I don't really see a benefit (although the concept in itself has been around for a while in debug mode - at least in Visual Studio - which lets you adjust values at runtime, albeit without the temporal aspect).

In a broader sense, the problems I see are:

- any algorithm you write should be one that you understand first and foremost. True, catching erroneous termination conditions (or the lack thereof) is a really nice thing and can speed up debugging considerably; however, if you really need a complex simulator (which the data-driven concept he advocates is not, as it is severely limited by the fixed nature of the input), you will still be stuck with writing your own test cases and custom debugging tools. No one-size-fits-all solution is going to prove your code correct for you if you, the programmer, don't understand how it works inside your head first
- any algorithm simple enough for a generic debugger like this to handle is, in my view, at best suitable as a learning experience. Writing any more complex piece of code (or algorithm) will quickly reveal that either the feedback mechanism itself becomes too complex to grasp, or you will misjudge the code, either because your approach is "too artistic" (you rely too much on the test data and not the theory behind it) or because you simply do not grasp the broader picture the algorithm needs to fit into. An example would be any sort of recursion, which generally requires a very specific visual "feel" for the algorithm to get right. A tool like that will help you, yes, but only in the most trivial cases

In summary, there is and will always be a barrier between "the artist" and "the programmer". Bridging the gap between the two is a truly noble effort, but one I fear is flawed from the start. Simple rewording would help here: e.g. "expanding the artist's tools to visualize complex work more easily" or "giving the programmer a new perspective on their code". I do realize that I'm narrowing the grand idea behind his talk (which is saving ideas from destruction by the inability to realize them due to a lack of intuition), but he does try to apply the same principle to all walks of life. As for the programmer and the artist - the two will forever remain two different types of people with separate skillsets and purposes.

PS - thanks for sharing the video!

#6 swiftcoder   Senior Moderators   -  Reputation: 9760

Posted 19 February 2012 - 09:42 AM

In summary, there is and will always be a barrier between "the artist" and "the programmer". Bridging the gap between the two is a truly noble effort, but one I fear is flawed from the start... As for the programmer and the artist - the two will forever remain two different types of people with separate skillsets and purposes.

As someone whose job it is to teach programming to art students, I say: meh.

You know what my art students have trouble with? Text editors, file system paths, FTP uploads, and remembering syntax. Know what they are really good at? Visualising the execution of algorithms. Describing the various sorting algorithms to them is possibly easier than describing them to CS students, as long as you draw an example on the whiteboard.

There is a huge empty space to be filled by programming languages that are easier to read/write without complex textual syntax and memorisation, and by tools that help us visualise every component of our program. Over 50% of the human brain is dedicated to visual processing - anything that can be done to help move programming from a textual to a visual medium is going to be a net win in comprehension of code/algorithms and ease of spotting bugs.

Now personally, I don't think his algorithm visualisation in textual-columns was particularly compelling. But it's a step in the right direction, and someone needs to keep making those steps.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#7 ddn3   Members   -  Reputation: 1261

Posted 19 February 2012 - 11:00 PM

He has some good ideas, but after working with such systems I can't say they're really applicable to many problems, or that they actually accelerate development. The problem with immediate-feedback schemes is that they require a nearly complete solution before you can iterate on it the way he describes. That's nice, but programming is problem solving, and by the time you've reached that state, most of the big design decisions have already been made and implemented; you'd just be iterating over the minutiae. It might give you insight into the edge cases, but it won't give you fundamentally new or innovative algorithms or designs.

Also, most problem sets are not easily visualized. Maybe that's the bigger challenge. Writing a good visualizer would take a major effort, perhaps better spent with pen and paper, but who knows... maybe you can make it reusable for other circumstances.

-ddn

Edit: there are a few other good talks on that channel; this one is also worth watching. I think it's actually more pertinent and impactful.



#8 Hodgman   Moderators   -  Reputation: 28586

Posted 19 February 2012 - 11:34 PM

+1 thanks for sharing.

It's a fact of life that when programming, everyone makes obvious mistakes all the time, in simple algorithms and complex ones.
One method for dealing with this is pair programming -- with two people watching one screen, a lot more of these mistakes get caught immediately, as your partner points out "you forgot a divide by 255 there" or "don't you mean minus two?", etc...

That's not very practical most of the time though, so the method that I use regularly is to use the debugger to step through every line of code that I write, and watch the execution flow and all values via the 'watch' window. Just by watching the data being transformed by your code, you can catch the majority of these simple human errors.
So, any IDE that made this process easier on me -- such as being able to instantly "step into" a function in the IDE using dummy data -- would be an amazing help for writing bug-free code.

I'm not sure how the rest of you go about writing bug-free code, but personally, watching the code run like this is a vital step in ensuring its correctness. If I've not inspected each line via the debugger/watch, then I don't trust that there isn't a simple logic error waiting to manifest as a bug at some point in the future. It's like the traditional "desk checking" method, but without the possibility of my "internal computer" making the same logic error I did when writing the code originally.
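The "forgot a divide by 255" slip mentioned above is the classic case this kind of stepping catches. A minimal sketch of what that looks like (the function name is illustrative, not from the post): converting an 8-bit colour channel to a normalised float, where omitting the scale leaves a value 255x too large - immediately obvious in a watch window.

```cpp
#include <cassert>
#include <cstdint>

// Normalise an 8-bit colour channel to the [0, 1] range. The easy-to-make
// mistake is returning `channel` unscaled (or dividing by 256); stepping
// through with the watch window open makes the wrong magnitude obvious.
float normaliseChannel(std::uint8_t channel) {
    return channel / 255.0f;  // the easily-forgotten divide by 255
}
```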

#9 irreversible   Crossbones+   -  Reputation: 1240

Posted 20 February 2012 - 02:55 AM

+1 thanks for sharing.

It's a fact of life that when programming, everyone makes obvious mistakes all the time, in simple algorithms and complex ones.
One method for dealing with this is pair programming -- with two people watching one screen, a lot more of these mistakes get caught immediately, as your partner points out "you forgot a divide by 255 there" or "don't you mean minus two?", etc...

That's not very practical most of the time though, so the method that I use regularly is to use the debugger to step through every line of code that I write, and watch the execution flow and all values via the 'watch' window. Just by watching the data being transformed by your code, you can catch the majority of these simple human errors.
So, any IDE that made this process easier on me -- such as being able to instantly "step into" a function in the IDE using dummy data -- would be an amazing help for writing bug-free code.

I'm not sure how the rest of you go about writing bug-free code, but personally, watching the code run like this is a vital step in ensuring its correctness. If I've not inspected each line via the debugger/watch, then I don't trust that there isn't a simple logic error waiting to manifest as a bug at some point in the future. It's like the traditional "desk checking" method, but without the possibility of my "internal computer" making the same logic error I did when writing the code originally.


The problem I find with this is that this kind of debugging is thoroughly incomplete - by picking dummy data or a specific test case without generic testing, you essentially invite more mistakes, or you need to know to introduce them manually in the test data to catch most of the bugs, IMO. While you can indeed catch simple execution ("human") errors that way, this kind of debugging is always a last resort for me, as it tends to be too specific and has little to do with a real-world environment. In fact, it is doubly useless if there is more than one mistake in the code - which is often the case. As for me, I:

1) write the algorithm as best I can
2) spend 5-10 minutes on the initial problems caused by typos, missing early-outs etc. (I've generally become quite good at catching them and automatically employ certain failsafes to help locate the bug faster: for instance, I always use while(1) with specific termination conditions so I can quickly differentiate and test for infinite loops)
3) if the algorithm has any complexity, I will write very specific debug output that tells me the information I need in the format I need and, if needed, a custom test case that executes the algorithm using a configuration I can control far more easily than typing it in (for instance, by using the mouse). I've also spent some time implementing a real-time reflection technique for myself in C++ that lets me control floats and bools from the UI, which often helps FAR more than working with a static dataset (the fact that I had to do this in the first place is, of course, kind of blah)
4) only once I've located the general problem area do I pinpoint the specific error conditions (e.g. which iteration, which object), set a breakpoint and step through only that
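The while(1)-with-termination-conditions failsafe from step 2 can be sketched roughly like this (names and the iteration cap are illustrative, not from the post): instead of a bare open-ended loop, an explicit iteration guard turns a broken termination condition into a loud, detectable failure rather than a hang.

```cpp
#include <cstdio>

// Illustrative loop guard: cap the iterations of an open-ended loop so a
// broken termination condition fails loudly instead of hanging the program.
// kMaxIterations is a hypothetical limit chosen for this sketch.
bool converge(int start, int target) {
    const int kMaxIterations = 1000;
    int value = start;
    for (int i = 0; i < kMaxIterations; ++i) {  // stands in for while (1)
        if (value == target) return true;       // intended termination condition
        value += (value < target) ? 1 : -1;     // step toward the target
    }
    // Guard tripped: the loop should have terminated long before this point.
    std::fprintf(stderr, "converge: iteration cap hit - probable logic error\n");
    return false;
}
```

With a bare while(1), a target the step logic can never reach would spin forever; here it returns false and leaves a breadcrumb on stderr instead.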

Most of the time this catches the error almost immediately, at the cost of having to identify and write debug output for only the data I actually need. Got a 2k or 4k line algorithm you're debugging, with recursion and/or multiple nested loops running for tens or hundreds of iterations? Yeah, you're going to want to be specific about what you actually see (the approach in the video would help if it allowed you to pick specific variables or, better yet, form debug statements on specific lines - involving multiple variables and expressions that could be added/removed/manipulated at runtime - which the debugger would collate across the execution process and display in proper human-readable fashion).
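A crude compile-time version of those self-describing debug statements can be sketched with the preprocessor's stringification operator (the macro and helper names here are hypothetical, and the add/remove-at-runtime part the post asks for would still need real tool support): the expression's source text is captured next to its value, so every trace line documents itself.

```cpp
#include <sstream>
#include <string>

// Render an expression's source text alongside its evaluated value,
// e.g. "i * 2 = 6". Returning a string keeps the helper easy to test;
// a real tool would route this to a log or debugger pane.
template <typename T>
std::string traceExpr(const char* text, const T& value) {
    std::ostringstream out;
    out << text << " = " << value;
    return out.str();
}

// #expr stringifies the expression as written at the call site.
#define TRACE_EXPR(expr) traceExpr(#expr, (expr))
```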

For instance, I spent the past 3 weeks implementing and perfecting my CSG code. At the end of the day I learnt a few things my implementation cannot and is not supposed to handle (e.g. multiple object overlaps across multiple operations), but more importantly I spent 3 weeks on the code because it's far from trivial to debug and is prone to logic errors rather than coding errors (which make up only a small subset). A debugger like that would not have helped me one bit when dealing with a test case of 80 faces, 300 intermediate vertices, 20-way branching, quadruple-nested loops and two execution paths. In fact, IMO, it would've been distracting and misguiding.

In the end I needed far more illustrative debug output and a cold shower to figure out that polygon winding played a huge part in the merging process (a problem I deduced visually, and one that traced all the way back deep inside my triangulation code). For a situation like this, which I believe to be much closer to a real-world scenario where the algorithms are far more complex and far more specific, a data-driven debugger would have been useless unless it was really clever about what kind of feedback to give me. As such, I believe there is (at least for now) no substitute for the human programmer when it comes to implementing non-generic algorithms: the flow NEEDS to be visually present in the programmer's head, the programmer NEEDS to understand all execution paths and fringe cases, and the programmer needs to understand that the code will only be as good as he or she is, because no tool will ever fix the biggest problem that causes implementations to fail: logic errors.

#10 SteveDeFacto   Banned   -  Reputation: 109

Posted 20 February 2012 - 03:57 AM

Over 50% of the human brain is dedicated to visual processing


I know you were just speaking figuratively, but this is completely untrue. The cerebellum actually houses about 60% of the neurons in the human brain. I would say it's probably closer to 10% of the human brain that is dedicated to visual processing. Neural networks are very good at recognizing patterns no matter how abstract they may be. On the other hand, things that involve timing and memorization seem to require many more neurons. The reason visual tasks are so easy has less to do with the number of neurons dedicated to the task and more to do with the very way neurons work.

#11 way2lazy2care   Members   -  Reputation: 782

Posted 20 February 2012 - 08:28 AM

I think a large point he makes that's overshadowed by his demo is how little progress has been made toward making a language more in line with current UI developments than text editing. I think a more visual language could be pretty interesting. To my knowledge there aren't any (good) low-level visual programming languages yet?

I do think that his code-editing visualizer could break down in more complex cases. It's pretty simple to handle functions in the style of functional programming, but I feel like it would either be unusably slow or just broken in more complex cases where there's more dependency on external data.

#12 Earthwalker   Members   -  Reputation: 110

Posted 20 February 2012 - 11:53 AM

Wow. Just wow. Thanks for sharing!

#13 dpadam450   Members   -  Reputation: 861

Posted 20 February 2012 - 01:13 PM

I'm confused. The time thing was cool but wasn't he just using a scripting language?

"I can change variables and code at run-time without compiling."

#14 swiftcoder   Senior Moderators   -  Reputation: 9760

Posted 20 February 2012 - 01:56 PM

I'm confused. The time thing was cool but wasn't he just using a scripting language? "I can change variables and code at run-time without compiling."

You try doing that with a scripting language. Go on, fire up Python/C#/JavaScript, and try to do what he's showing there.

There's nothing inherent to the language in what he's demonstrating. In fact, you can do everything he's doing there with a compiled language too (e.g. look at Xcode's "Fix and Continue").

But what you do need is tool support. Tools that understand the semantics of your program, not merely the syntax. Tools that understand that you are building an animation, and thus can provide timing and prediction support like his does.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#15 ddn3   Members   -  Reputation: 1261

Posted 20 February 2012 - 02:43 PM

I doubt one could write bug-free code; bugs, in a roundabout way, are our limitations. A human being is only so smart - we can only hold so many variables in our heads and map out so many states. My coding method is: research, gather up examples of existing code, deconstruct the problem, construct a solution... but more importantly, I try to construct short solutions. All said and done, smaller code is easier to maintain and debug - and I'm not talking about removing all variable names and spaces, I mean small, concise algorithms.

-ddn

#16 way2lazy2care   Members   -  Reputation: 782

Posted 20 February 2012 - 03:20 PM

I doubt one could write bug-free code; bugs, in a roundabout way, are our limitations. A human being is only so smart - we can only hold so many variables in our heads and map out so many states. My coding method is: research, gather up examples of existing code, deconstruct the problem, construct a solution... but more importantly, I try to construct short solutions. All said and done, smaller code is easier to maintain and debug - and I'm not talking about removing all variable names and spaces, I mean small, concise algorithms.

-ddn


functional programming is your bff?

#17 Mussi   Crossbones+   -  Reputation: 1736

Posted 20 February 2012 - 03:54 PM

What an absolute boss. I love his principle; I hope to see a wide range of tools based on it go mainstream in the near future. Making tools based on this principle can be damn hard and/or time consuming though, and might not always be worth it.

#18 ddn3   Members   -  Reputation: 1261

Posted 20 February 2012 - 04:01 PM

functional programming is your bff?


No, I never got into functional programming; my brain doesn't work that way and it's actually harder to understand, for me anyway. I'm sure if you're mathematically inclined, functional programming would come naturally, but I'm not... Closest I get to FP is some stuff I do in Lua.

-ddn

#19 swiftcoder   Senior Moderators   -  Reputation: 9760

Posted 20 February 2012 - 04:08 PM

No, I never got into functional programming; my brain doesn't work that way and it's actually harder to understand, for me anyway. I'm sure if you're mathematically inclined, functional programming would come naturally, but I'm not... Closest I get to FP is some stuff I do in Lua.

Functional programming doesn't have to mean LISP. If you've ever used the STL algorithms, a foreach loop, or a lambda function, you are using functional programming techniques.

And the STL algorithms are a great example of just how much trouble can be saved by applying functional techniques to a problem, even in a language that is generally not all that functionally-inclined.
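As a small illustration of that point (a sketch by way of example, not code from the thread): a hand-rolled accumulation loop rewritten with an STL algorithm and a lambda states the intent - combine every element into a result - rather than the loop mechanics.

```cpp
#include <numeric>
#include <vector>

// Sum of squares via std::accumulate and a lambda: the functional style
// expresses "fold this combining operation over the sequence" directly,
// instead of spelling out index bookkeeping in a raw loop.
int sumOfSquares(const std::vector<int>& values) {
    return std::accumulate(values.begin(), values.end(), 0,
                           [](int acc, int v) { return acc + v * v; });
}
```

The same shape - an algorithm parameterised by a small function object - is what foreach loops and the rest of `<algorithm>` give you, even in a language as imperative as C++.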

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#20 tstrimple   Prime Members   -  Reputation: 1718

Posted 20 February 2012 - 09:46 PM


No, I never got into functional programming; my brain doesn't work that way and it's actually harder to understand, for me anyway. I'm sure if you're mathematically inclined, functional programming would come naturally, but I'm not... Closest I get to FP is some stuff I do in Lua.

Functional programming doesn't have to mean LISP. If you've ever used the STL algorithms, a foreach loop, or a lambda function, you are using functional programming techniques.


That's one of the things I love about JavaScript, and especially jQuery: it's teaching people functional programming techniques without them really knowing it.






