
Inventing on Principle - amazing video!



#41 ddn3   Members   -  Reputation: 1248

Posted 25 February 2012 - 02:15 PM

Such a free-form tool which encompasses the ideas of immediate feedback, reversible execution, built-in safety nets against memory overruns and infinite loops, in-execution memory modification, etc. Is it really needed? It's nice in theory, but do people think or develop in such a fashion? It's like a sandbox for programs for programmers, but every example he shows is an example of a pre-solved system, after which you then apply foresight into the problem space and expand upon the existing solution through selective parameterization and customized input and output.

That's nice, but it requires foresight into the solution space after the fact. I'm not sure how others come about their software solutions, but for me, program space is too large to just mess around with a bunch of parameters to come up with a solution (for humans anyway; a machine learning algorithm is another thing altogether). Even a trivial problem expands with recombination beyond human comprehension. You will still need insight to guide those random walks to produce anything eventful, imo.

I think UI is very important and is the limiting factor for most systems, but I don't think creating an open-ended sandbox for programs for programmers is the answer. I'm not even sure what the question was.

-ddn


#42 shurcool   Members   -  Reputation: 439

Posted 25 February 2012 - 09:15 PM

Such an incredible, excellent and inspiring talk; I'm at a loss for better words. Thanks so much for sharing! I saw this on Twitter but didn't get a chance to watch it because of its one-hour length. I wish I'd known that watching the first 7 minutes would be enough (to understand that you must see everything).

I really like his approach to life, and I'm attempting to do something similar.

I think a large point he makes that's overshadowed by his demo is how little progress has been made toward making a language that's more in line with current UI developments than text editing. I think a more visual language could be pretty interesting. To my knowledge there aren't any (good) low-level visual programming languages yet?

I do think that his code-editing visualizer could break down in more complex cases. It's pretty simple to handle functions written in a more functional style, but I feel like it would be either unusably slow or just break in more complex cases where there is more dependency on external data.


I'm actually working on one right now. I started a few weeks ago, so it's very early without much to show yet, but I'll see how it goes.

I'm doing it precisely because I believe there is potential in the area (after all, if you remove restrictions such as "code must be plain text", then you should be able to achieve at least as much or more, right?), and I didn't see any existing solutions. I'm making something I want to use myself, and if it's good, I'm sure others will want to as well.

And considering I do have all the latest tools, I don't see anything that has moved even an inch beyond the Turbo Pascal era of tooling. But apparently, VS 2011 will be gray instead of purple?

Don't forget the achievements.

To be fair, there are actually some very nice incremental improvements in VS 11. I'd say more so going from 2010 to 11 than 2008 to 2010. And I think VS needs to exist, because it fulfills a specific purpose for the time being, and cannot be replaced instantly. But I believe there should be work done on alternative approaches, and perhaps they can one day overshadow VS (if they're good enough).

Where is the time-travelling debugger though? We need to have that asap please.

#43 shurcool   Members   -  Reputation: 439

Posted 25 February 2012 - 11:40 PM

Actually, I just found a project that does something very similar to what I was about to build: Code Bubbles.

It seems it branched off into an experimental thing at Microsoft for Visual Studio named Debugger Canvas.

http://msdn.microsoft.com/en-us/devlabs/debuggercanvas

Check out the video at the bottom. Very cool!

#44 shurcool   Members   -  Reputation: 439

Posted 29 February 2012 - 11:39 AM

Where is the time-travelling debugger though? We need to have that asap please.

Hmm, is this it?


http://www.microsoft.com/visualstudio/11/en-us/products/compare

#45 Antheus   Members   -  Reputation: 2393

Posted 29 February 2012 - 12:10 PM

Hmm, is this it?


To a degree.

Now extrapolate this into fictional territory.

User submits a bug. You click a button and end up with their application state. You move forward and backward to diagnose that.

Not inside a debugging session, but a regular user, perhaps visiting a website or running an app. It doesn't really matter; the concept of an "application" or "exe" is not all that important anymore. One gets a snapshot of the user's workflow and all that affects it, be it current process, driver, OS, or hardware.


Debugging history today is limited to capturing a small set of events inside a specially crafted debugging session. It works with a limited set of managed languages that can establish such a separation.


But more on the topic of direct modification of application (or game) state: debugging crashes is still a pain. Rather than focusing on technical widgets, imagine usability only. When a user experiences an issue (not necessarily a crash), they may opt to send the application state directly. No more bug trackers, filling out issues, writing "steps to reproduce". They simply say "the button should not be here".

As a developer, you receive such an issue, then just move back or forth through execution to pinpoint the error.

Next, there are diagnosis tools. Mark a logical item in such an application snapshot, "find all references", to see how it got into that state (when was button.visible set to true). This isn't manual reference chasing, but automated analysis of program flow. Most of this type of programming is nothing but reference chasing, a very manual task for the most part.
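
As a rough sketch of that idea (hypothetical names, C++; a real tool would capture stack traces automatically rather than take a caller-supplied label): a "watched" value records every write, so "who set this and when" becomes a lookup instead of manual reference chasing.

#include <iostream>
#include <string>
#include <vector>

// Sketch: every write to a watched value is recorded, so a
// "find all writes" query is a lookup instead of manual chasing.
template <typename T>
class Watched {
public:
    explicit Watched(T initial) : value_(initial) {}

    // The writer label stands in for a real captured stack trace.
    void set(T v, const std::string& writer) {
        history_.push_back({writer, v});
        value_ = v;
    }
    T get() const { return value_; }

    // Answers "how did this get into its current state?"
    void dumpHistory() const {
        for (const auto& w : history_)
            std::cout << w.writer << " wrote " << w.value << "\n";
    }

private:
    struct Write { std::string writer; T value; };
    T value_;
    std::vector<Write> history_;
};

int main() {
    Watched<bool> buttonVisible(false);
    buttonVisible.set(true, "MenuScreen::open");
    buttonVisible.set(false, "MenuScreen::close");
    buttonVisible.set(true, "Tutorial::highlightButton"); // the culprit
    buttonVisible.dumpHistory();
}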


A tool like this could be built, but only given sufficient hardware, and even then it would be pushing everything, from storage to computational capacity.

We have small and limited tools that help with individual parts of the process, but not the next step. Twenty years ago, building a dictionary was a considerable technical challenge due to hardware constraints; today it's trivial (set<String>).
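
For concreteness, here is how trivial the dictionary has become (a minimal C++ sketch; the word-list path is an assumption, common on Unix systems):

#include <fstream>
#include <iostream>
#include <set>
#include <string>

// A task that once strained the hardware now fits in a few lines:
// load a word list into an ordered set and query it.
int main() {
    std::set<std::string> dictionary;
    std::ifstream words("/usr/share/dict/words"); // assumed path
    for (std::string w; std::getline(words, w); )
        dictionary.insert(w);
    std::cout << dictionary.size() << " words loaded\n";
    std::cout << (dictionary.count("zephyr") ? "found" : "missing") << "\n";
}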

Hence I mentioned the unpopular CS topics, which hint either that such tools will likely never be possible, or that a particular step will only become possible some years from now, when hardware advances to the point where the problems we're solving are smaller than what it can handle.

Instead of P vs. NP, it's closer to economics and exponential growth. For most human endeavors, there is an upper limit. A dictionary has finite size, and hardware outgrew it. Maybe we'll reach an upper limit on computer-assisted tasks as well.

#46 Hodgman   Moderators   -  Reputation: 27622

Posted 29 February 2012 - 11:16 PM

Now extrapolate this into fictional territory.

That's not fiction - that's modern debugging...

User submits a bug. You click a button and end up with their application state. You move forward and backward to diagnose that.
Not inside a debugging session, but a regular user, running an app. One gets a snapshot of the user's workflow and all that affects it, be it current process, driver, OS, or hardware.
When a user experiences an issue (not necessarily a crash), they may opt to send the application state directly. No more bug trackers, filling out issues, writing "steps to reproduce". They simply say "the button should not be here".

This has been mainstream for 10 years now...

Debugging history today is limited to capturing a small set of events inside a specially crafted debugging session. It works with a limited set of managed languages that can establish such a separation.

No it's not -- in a modern C/C++ game engine, from when the user starts the game to when they encounter a bug, every frame of their entire session can be reported and analysed (and imported into a "special debugging session" if required).

But more on the topic of direct modification of application (or game) state: debugging crashes is still a pain.
As a developer, you receive such an issue, then just move back or forth through execution to pinpoint the error.

We can do this easily these days. Modern game engines need to support NUMA architectures, which means the lowest levels operate on the "input->process->output" black-box concept. Inputs and outputs are buffers, and processes are scheduled so they only run at safe times.
At any time, you know exactly which buffers are potentially being read or written (global state is dead). Furthermore, before/after each process, you can store the state of any mutable buffers, allowing perfect rewind functionality.
When the user submits an issue, you can use the submitted replay to end up in the problematic state (deterministic replay of the user's session) and then rewind/replay to find the cause of the bug. This is real and easy to implement, not some far-fetched impossible idea.
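
To make that concrete, a minimal sketch (hypothetical names, drastically simplified; a real engine tracks many buffers and schedules processes in parallel) of how per-frame state snapshots plus a recorded input stream give rewind and deterministic replay:

#include <cstdint>
#include <iostream>
#include <vector>

// "input -> process -> output": state is a plain buffer, and stepping is
// a pure function of (state, input). No hidden global state means a
// recorded input stream replays a session deterministically.
struct GameState {
    int32_t playerX = 0;
};

struct FrameInput {
    int32_t moveX = 0; // e.g. decoded from the user's controller
};

GameState step(const GameState& s, const FrameInput& in) {
    GameState next = s;
    next.playerX += in.moveX; // fully deterministic transition
    return next;
}

int main() {
    // The inputs submitted with a bug report (the "replay file").
    std::vector<FrameInput> recorded = {{+1}, {+1}, {-3}, {+2}};

    std::vector<GameState> snapshots; // state *before* each frame
    GameState state;
    for (const FrameInput& in : recorded) {
        snapshots.push_back(state);
        state = step(state, in);
    }

    // "Rewind" is just indexing into the snapshots and re-stepping
    // forward to inspect the problematic transition.
    std::cout << "playerX before frame 2: " << snapshots[2].playerX << "\n";
    std::cout << "playerX at end: " << state.playerX << "\n";
}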

Next, there are diagnosis tools. Mark a logical item in such an application snapshot, "find all references", to see how it got into that state (when was button.visible set to true). This isn't manual reference chasing, but automated analysis of program flow. Most of this type of programming is nothing but reference chasing, a very manual task for the most part.

We've had tools to automate this for a long time. Both for general reference-chasing, and for change-detection.

A tool like this could be built, but only given sufficient hardware, and even then it would be pushing everything, from storage to computational capacity.
Hence I mentioned the unpopular CS topics, which hint either that such tools will likely never be possible, or that a particular step will only become possible some years from now, when hardware advances to the point where the problems we're solving are smaller than what it can handle.

Uh, no. Get with the times, grandpa. This shit is real.

#47 Antheus   Members   -  Reputation: 2393

Posted 01 March 2012 - 09:36 AM

in a modern C/C++ game engine


I'm not talking about game engines or C/C++.

I'm typing this in Firefox. How do I click a button to send you the current state of my application so you can stack-trace it? While you're at it, fix the DX bug which causes corruption when I resize Google Maps at over 1024 width (here, I opened the problem page in another window; look at the computer state to find out whether it's a driver or an application bug).

Because for me, just building FF takes 2 hours, and breaking to enter the debugger and load symbols at a single point takes 15 seconds.

What am I missing here?

We can do this easily these days.


OK. Why aren't you making billions with it and collecting a Turing Award for it? Because a tool like that would be revolutionary. Why doesn't anyone know about this? Why do you let others file patents if it's so trivial and a done deal?

Is it possible we're talking about something different?

#48 swiftcoder   Senior Moderators   -  Reputation: 9587

Posted 02 March 2012 - 02:59 AM

I'm typing this in Firefox. How do I click a button to send you the current state of my application so you can stack-trace it?

Worth pointing out that this is pretty trivial in any language with serialisable continuations. Plus, somebody has actually implemented it in Smalltalk.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#49 Antheus   Members   -  Reputation: 2393

Posted 02 March 2012 - 07:35 AM

Worth pointing out that this is pretty trivial in any language with serialisable continuations. Plus, somebody has actually implemented it in Smalltalk.

Apparently, Smalltalk is for grandpas, while the cool kids are doing this today with everything.

But it doesn't answer my original question - if this has been a solved problem for 10 years, where is the button to do that when I'm debugging Firefox and driver issues running on Windows?

But the thread has come full circle, back to my original claim: the design of OSes and other components solidified around proprietary blobs, making development of such tools impossible at any meaningful scale, which is why interest and research, academic or otherwise, died down.

Small steps are being made within sandboxes (CLR, JVM, JS engines), but these are still a far cry from stuff that was already done decades ago.


It's also why I feel presentations like this are important. Even if they don't show anything that is technically new, they remind people of ideas. Telling someone to go learn Smalltalk won't produce results. But showing a tool like that makes people ask why we don't have stuff like that already, which may cause them to discover not just that, but also other technologies and different approaches. And it sparks interest in more than just banging out some API calls with Vendor Tool 2012.

Node.js and NoSQL, the two most overhyped words recently, aren't new or even innovative. But they revived some old ideas in a practical form, available to developers who do not want to or cannot reason about the underlying scientific and engineering aspects of concurrency theory, yet use these tools to great effect for actual value in their work. What was once constrained by tooling no longer is. Dissemination of ideas and interest is much more important than siloed pinnacles of knowledge.

#50 swiftcoder   Senior Moderators   -  Reputation: 9587

Posted 02 March 2012 - 09:15 AM

All right, I have an article I think you'd enjoy: "Free your technical aesthetic from the 2010s: A rejection of the rejection of the 1970s".

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#51 kordova   Members   -  Reputation: 138

Posted 04 March 2012 - 10:54 AM


No, I never got into functional programming; my brain doesn't work that way, and it's actually harder to understand, for me anyway. I'm sure if you're mathematically inclined functional programming comes naturally, but I'm not. The closest I get to FP is some stuff I do in Lua.

Functional programming doesn't have to mean LISP. If you've ever used the STL algorithms, a foreach loop, or a lambda function, you are using functional programming techniques.

And the STL algorithms are a great example of just how much trouble can be saved by applying functional techniques to a problem, even in a language that is generally not all that functionally-inclined.

No one has used LISP since the 1980s. Lisp, or Common Lisp, is not functional.

#52 swiftcoder   Senior Moderators   -  Reputation: 9587

Posted 04 March 2012 - 12:53 PM

No one has used LISP since the 1980s.

We still teach Scheme, and there is still a large body of projects using various dialects of Lisp - close enough. The capitalisation (or lack thereof) is irrelevant.

Lisp, or Common Lisp, is not functional.

That was kind of my point. You can write in a 'functional style' in any language with rudimentary support for first-class functions. Pure functional languages themselves are a tad unpleasant to work with - nobody likes pedantry, after all.
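
For instance, a minimal C++ sketch (my own illustration, not from the thread): map, filter and fold with nothing but standard algorithms and lambdas, no hand-rolled loops.

#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> prices = {10, 25, 7, 42};

    // map: apply a 10% discount to every element (integer math)
    std::transform(prices.begin(), prices.end(), prices.begin(),
                   [](int p) { return p * 9 / 10; });

    // filter-ish: count the elements matching a predicate
    auto cheap = std::count_if(prices.begin(), prices.end(),
                               [](int p) { return p < 20; });

    // fold: reduce the sequence to a single total
    int total = std::accumulate(prices.begin(), prices.end(), 0);

    std::cout << cheap << " cheap items, total " << total << "\n";
}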

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#53 kordova   Members   -  Reputation: 138

Posted 06 March 2012 - 07:32 PM


A system like that exists. (insert obligatory comment about Lisp doing it before everyone).

It's a Turing machine, we get it. But quite frankly, whether in software or hardware, a Turing machine is still a Turing machine.

What does Turing-completeness have to do with anything? That we can accomplish this with any language/machine? Are we going to skip over the barrier to implementing something like this in most languages/OSs?

If that's all you have to say, I'd argue that you don't get it. LISP machines were (are) far from perfect, but there's something magical about being able to redefine/add/extend functionality to the editor you're using _in_ the editor you're using. Or your documentation browser, web browser, or just about any other application, not to mention significant portions of the OS. Outside of tinkering with an ancient LISP OS, you can still get a feel for it from modern products like LispWorks and Allegro. I use MATLAB and Visual Studio regularly, but some of the things I am able to do with LispWorks, despite a lot of clunkiness, can save a lot of time and allow me to bend the tool to me/the project rather than the opposite (or dealing with very clunky SDKs/APIs that still require recompilation, etc.). I once used a Smalltalk system, whose name escapes me, that encompassed similar ideas.

It's not necessarily dependent on Lisp (Common Lisp is far from my ideal Lisp), but there are not too many alternatives yet. More languages are becoming "alive", in the sense that the compiler exists at runtime and you can poke around just about anything while it is executing, but we're not quite there. Certainly, most languages still do not encourage the "dangerous", otherwise "unnecessary" functionality that compiler macros and vanilla macros provide. I'm amazed that we don't see more of these living systems in Ruby, Python and JavaScript. Demos like the one in the OP come up pretty frequently in smaller circles, but I'm still surprised we don't see more concerted efforts at "living" IDEs and the like.

Yeah, you _can_ do that in a lot of isolated contexts on Linux, OS X and Windows, but it is nowhere near as easy or as significant as it was when it was a genuine focus. Turing-completeness is entirely irrelevant to that line of discussion.


No one has used LISP since the 1980's.

We still teach Scheme, and there is still a large body of projects using various dialects of Lisp - close enough. The capitalisation (or lack thereof) is irrelevant.

I know; I use Common Lisp professionally. Even in the CL, Scheme and Clojure communities, LISP is understood to refer to the historical thing of the past, and Lisp is generally accepted to mean CL, unless otherwise stated.

#54 swiftcoder   Senior Moderators   -  Reputation: 9587

Posted 07 March 2012 - 08:56 AM

there's something magical about being able to redefine/add/extend functionality to the editor you're using _in_ the editor you're using.

And in the few areas where that turned out to be a reasonable end-user feature, I can still do it (e.g. Emacs).

I really don't get your and Antheus' fascination with being able to do this at every level of the OS - it just strikes me as a maintenance/stability nightmare. What exactly is wrong with having this sort of functionality implemented in user space (and entirely ignoring the underlying kernel/hardware layers)?

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#55 Antheus   Members   -  Reputation: 2393

Posted 07 March 2012 - 01:44 PM

I really don't get your and Antheus' fascination with being able to do this at every level of the OS - it just strikes me as a maintenance/stability nightmare. What exactly is wrong with having this sort of functionality implemented in user space (and entirely ignoring the underlying kernel/hardware layers)?


Same reason why I want the ability to put together my own PC from scratch. Or car. Or house. Or being able to take one of those apart.

Doesn't mean I'll do it, but I like to know ability like that exists.

In software, that's almost completely gone, even for developers.

#56 kordova   Members   -  Reputation: 138

Posted 07 March 2012 - 02:40 PM

And in the few areas that turned out to be a reasonable end-user feature, I can still do that (i.e. emacs).

I don't think this was some logical progression where the other ones were weeded out. It's more that, for various reasons (not all technical, though those were there as well), that avenue of exploration just ceased almost altogether. Emacs isn't really a great example, because without integration you end up with what it is now: an independent mini-OS sitting in your terminal. It does not play nicely with the window manager, nor is it consistent with it.


I really don't get your and Antheus' fascination with being able to do this at every level of the OS - it just strikes me as a maintenance/stability nightmare. What exactly is wrong with having this sort of functionality implemented in user space (and entirely ignoring the underlying kernel/hardware layers)?

Well, I'd also point it out as a security nightmare, though I'd estimate most machines are effectively single-user anyway. To answer your question, there's nothing necessarily wrong with what you are proposing, though when we mention the entire OS, we are probably including the UI, which may run in user space but is not actually modifiable from it. Depending on your profession, access to exactly how interrupts are handled, and the ability to redefine or hook into any and all system calls and the underlying hardware, might have non-academic benefits.

#57 Antheus   Members   -  Reputation: 2393

Posted 07 March 2012 - 02:54 PM

though when we mention the entire OS, we are probably including the UI, which may run in user space but is not actually modifiable from it


That is just an arbitrary division. Hiding the code that performs file write-through to disk doesn't affect security.

I'd also point it out as security nightmare


Security is a process. No tech makes anything secure by itself. For consumers, access to that might not matter. But most of this tech isn't available to developers. Even Google couldn't gain access to Flash, for example, and it would be hard to argue that they lack motivation, money or leverage.

With that said - kernels today are opaque, and buffer overflows are still an important vector of attack; they are discovered even in JS engines.

The only thing that holds is that security through obscurity (aka "let's hide this under layers of abstraction") doesn't solve the problem, any more than exposing these parts weakens it (aka open source).

If security matters, all of these parts get opened anyway; even proprietary software often needs to be disclosed for review.

#58 ddn3   Members   -  Reputation: 1248

Posted 07 March 2012 - 04:06 PM


I really don't get your and Antheus' fascination with being able to do this at every level of the OS - it just strikes me as a maintenance/stability nightmare. What exactly is wrong with having this sort of functionality implemented in user space (and entirely ignoring the underlying kernel/hardware layers)?


Same reason why I want ability to put together my own PC from scratch. Or car. Or house. Or being able to take one of those apart.

Doesn't mean I'll do it, but I like to know ability like that exists.

In software, that's almost completely gone, even for developers.


I don't think it's gone. You could get a commodity-level blank PC like the Raspberry Pi, or build one and flash its BIOS with your own custom BIOS to boot your own custom OS which allows you to run code in ring 0, but that's a lot of work for very dubious gain. Having the ability to "hack" the execution flow, permute data and code on the fly, and have execution safety all in one - that's a major technological challenge if you're not willing to run VM sandboxes everywhere. And even then it's all based on the premise that this is a "better" way to develop programs, which I don't even think it is. Truthfully, there is nothing stopping anyone from doing this if they wanted to.

Good Luck!

-ddn

#59 Antheus   Members   -  Reputation: 2393

Posted 07 March 2012 - 06:34 PM

I don't think it's gone. You could get a commodity-level blank PC like the Raspberry Pi, or build one and flash its BIOS with your own custom BIOS to boot your own custom OS which allows you to run code in ring 0, but that's a lot of work for very dubious gain.


Yes, for the fifth time in this thread: one needs to go to extreme measures to gain sufficient insight, because the most widely available platforms (aka the cheapest and most ubiquitous) no longer provide it.

It's also the same thing if you build your own engine that supports the discussed functionality - it will be built from scratch for that particular purpose. Instead of messing with the BIOS, one will be messing with integrating an interpreter or resource manager, but there are plenty of those available today - completely open, not as a binary DLL.

And even then it's all based on a premise that this is a "better" way to develop programs, which I don't even think it is..


The web is built on open stacks. It would be hard to argue that choosing a closed stack there would offer more benefits, and the numbers support that. The reason is simple: when your service goes down, waiting for the vendor to integrate their next fix into a service pack 3 months from now won't help.

It also raises an interesting proposition: while those companies clearly benefit from the open nature of the stack, the services they provide are not necessarily open.


Ironically, many VS developers swear by ReSharper, and the Eclipse community is all about plugins. Microsoft is going to great lengths to open their development stack while struggling to retain complete control. So apparently there is commercial interest, even though these tools are used predominantly in IT, which isn't known for technical prowess.

#60 kordova   Members   -  Reputation: 138

Posted 07 March 2012 - 11:50 PM

Security is a process. No tech makes anything secure by itself. For consumer, access to that might not matter. But most of this tech isn't available to developers.

I agree with this and your other statements. I meant this more in the context of swiftcoder's more conventional concerns about maintenance and stability. The notion of complete look-through, and the ability to recompile on the fly any component from the hardware drivers up, lends itself to security concerns on a multi-user system. LISP OSes avoided this discussion by being single-user systems.

I wouldn't personally actually be concerned about security in the domains where I'd find this useful.



