Tim Sweeney on programming languages

Started by
55 comments, last by maximAL 18 years ago
I would see it using mathematical set notation.

// where [n] <=> [0, n)
// note the "Indices:[0, n)", which means all the indices in the array fall under
// the [0, n) range. I could have written "Indices:[n]", but chose not to for show
set<n> Vertex[n] Transform (Vertex[n] Vertices, int[n] Indices:[0, n), Matrix m)
{
  if (!Vertices || !Indices) return NULL;
  if (!m) m = new Matrix;

  Vertex[n] Result = new Vertex[n];

  for (int i = 0; i < n; ++i)
  {
    // we have already (at compile time) verified that all the indices in Indices
    // lie in [0, n), so the below check is superfluous
    // if (Indices[i] < 0 || Indices[i] > Vertices.length)
    Result[i] = m * Vertices[ Indices[i] ];
  }
  return Result;
}


Edit: On second thought, that would make it terribly hard to parse. An easier way could be:

set< n | n >= 0 > Vertex:n Transform (Vertex:n Vertices, int:n Indices:n, Matrix m)
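For comparison, here is a minimal sketch of what that Transform has to degrade to in today's C++, with the "Indices:[0, n)" constraint demoted from a compile-time guarantee to a runtime assert. The Vertex and Matrix types here are toy stand-ins (Matrix is just a uniform scale), not anything real:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy stand-ins for the thread's Vertex and Matrix types.
struct Vertex { double x, y, z; };
struct Matrix { double scale; };  // toy: uniform scale only

Vertex operator*(const Matrix& m, const Vertex& v) {
    return { m.scale * v.x, m.scale * v.y, m.scale * v.z };
}

// Runtime approximation of Transform: the index-range constraint that a
// dependently typed signature would verify at compile time becomes an assert.
std::vector<Vertex> Transform(const std::vector<Vertex>& vertices,
                              const std::vector<std::size_t>& indices,
                              const Matrix& m) {
    std::vector<Vertex> result;
    result.reserve(indices.size());
    for (std::size_t i : indices) {
        assert(i < vertices.size());  // the check the type system should discharge
        result.push_back(m * vertices[i]);
    }
    return result;
}
```

The point of the set-notation syntax is precisely that this assert (and the cost of executing it per element) disappears, because no caller can construct an out-of-range index list in the first place.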
[ search: google ][ programming: msdn | boost | opengl ][ languages: nihongo ]
Quote:Original post by ApochPiQ
I'd love to see a language with the library support and rapid-development potential of the .Net family, but with the contract expressivity of something like Eiffel.
I wonder if, just maybe, Eiffel.NET would fit the bill [grin]
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Quote:Original post by Nathan Baum

This is called dimensional analysis.

Some math-related programming languages support it automatically, and there is research in other areas. Microsoft Research has looked into it. Zerksis D. Umrigar implemented it in C++, as did several others. The Emacs calculator comes with dimensional analysis.
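The C++ implementations mentioned above generally work by tracking unit exponents in template parameters, so dimension errors become type errors. A minimal sketch in that spirit (this is illustrative, not Umrigar's actual code; only metres and seconds are tracked):

```cpp
// Dimensions are carried as template exponents: Quantity<M, S> is metres^M * seconds^S.
template <int M, int S>
struct Quantity {
    double value;
};

// Addition only compiles when the dimensions match exactly.
template <int M, int S>
Quantity<M, S> operator+(Quantity<M, S> a, Quantity<M, S> b) {
    return { a.value + b.value };
}

// Multiplication adds exponents, division subtracts them.
template <int M1, int S1, int M2, int S2>
Quantity<M1 + M2, S1 + S2> operator*(Quantity<M1, S1> a, Quantity<M2, S2> b) {
    return { a.value * b.value };
}

template <int M1, int S1, int M2, int S2>
Quantity<M1 - M2, S1 - S2> operator/(Quantity<M1, S1> a, Quantity<M2, S2> b) {
    return { a.value / b.value };
}

using Metres  = Quantity<1, 0>;
using Seconds = Quantity<0, 1>;
using Speed   = Quantity<1, -1>;
```

With this, `Metres{10.0} / Seconds{2.0}` is a Speed, while `Metres{1.0} + Seconds{1.0}` simply fails to compile, which is the whole point: the dimension check costs nothing at runtime.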


Bah. If I read that ZDU link right, his library is non-standard C++.

Meh.


From what I gather, he wrote it before compilers had standardised on the C++ standard.
Quote:Original post by Anonymous Poster
Yeah, it would be nice if a "utopia programming language" existed.


/me dreams on...


It's called Ada.
Quote:Original post by Nathan Baum
This is called dimensional analysis.
...
However, I do see a flaw in your approach. Strictly, the dimensionality of an expression is distinct from its domain. You can measure metres with integers, but also with rationals.


Well, yeah. That was a bit of an oversimplification just to illustrate a point.

However, I think there are some cases where domain should be explicitly constrained by the data type itself (as in the case of percentages) with additional, non-conflicting restrictions provided by individual functions that operate on that type. To nick an example from Sweeney, we need a way to express stuff like: function Foo(array a, index i [Where 0 <= i < a.count])
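A rough sketch of the "domain constrained by the data type itself" half of that idea, using percentages as in the post. C++ has no `[Where ...]` clause, so the closest available approximation is a type that enforces its own range at construction (names here are illustrative, not from any library):

```cpp
#include <stdexcept>

// A value that cannot exist outside [0, 100]: the domain constraint lives
// in the type, so every function taking a Percentage inherits it for free.
class Percentage {
    double value_;
public:
    explicit Percentage(double v) : value_(v) {
        if (v < 0.0 || v > 100.0)
            throw std::out_of_range("percentage must lie in [0, 100]");
    }
    double value() const { return value_; }
};
```

The weakness, of course, is that the check runs at runtime on every construction; what Sweeney's `[Where 0 <= i < a.count]` asks for is the same constraint discharged statically, per call site.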

Ta for the proper terminology, by the way [smile]


Quote:Original post by Promit
Quote:Original post by ApochPiQ
I'd love to see a language with the library support and rapid-development potential of the .Net family, but with the contract expressivity of something like Eiffel.
I wonder if, just maybe, Eiffel.NET would fit the bill [grin]


I'll have to read that sometime when I'm less tired [grin]




Really, though, I think we need to quit trying to patch existing languages and technologies and start over from the beginning with a good philosophy in mind. Specifically, an emphasis on expressing abstract semantics will be important, and not just in the realm of data types - DBC, dimensional analysis, parallel processing, SIMD and the rest of the multi-data approach to parallelism... there are only so many hacks we can munge into C++.

Unfortunately, as many years of long, hard experience with languages have taught us, just having a brilliant language doesn't fix the problem. Languages like Eiffel and Lisp have their (often overly vocal) adherents, but don't take over the world. On the flip side, though, C# has taught us that, with the right hype machine and backing (i.e. probably Microsoft) and the right library (in C#'s case, .Net), a from-the-ground-up language certainly can make significant inroads and achieve widespread usage.

Because of this, my personal feeling is that we're not going to see The Next Killer Language come from a garage, an open source project, or even a small consortium of developers who are looking to move beyond C++. Frankly I think C++ is useful because it confers so much power and freedom on the programmer (contrast with Java, for instance) while retaining enough low-level semantic expressivity to be useful for low-level and performance-critical applications. The language has had massive staying power despite its glaring flaws simply because it lets us get stuff done in the way we find most effective (within obvious boundaries of course).

I think where we'll see the Next Big Thing is going to be in a language that is designed for maximum freedom of expressivity, not maximum adherence to "good principles." By allowing for flexibility, while still subtly and powerfully encouraging good design (similar to, say, the way C++ encourages resource management via RAII), we can shape a language that remains suitable for low-level development in the right hands, and yet remains safe and potent for high-level, intensely abstract applications - which is, of course, where productivity and reliability come from.
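For anyone unfamiliar with the RAII point above, the "subtle encouragement" is just that C++ ties resource lifetime to scope: the destructor runs no matter how the scope is exited, so correct cleanup is the path of least resistance. A minimal sketch (this is an illustrative hand-rolled handle, not a real library class):

```cpp
#include <cstdio>

// The file is closed whenever the handle leaves scope - normal return,
// early return, or exception - with no cleanup code at the call site.
class FileHandle {
    std::FILE* f_;
public:
    explicit FileHandle(const char* path, const char* mode)
        : f_(std::fopen(path, mode)) {}
    ~FileHandle() { if (f_) std::fclose(f_); }
    FileHandle(const FileHandle&) = delete;            // no double-close
    FileHandle& operator=(const FileHandle&) = delete;
    bool is_open() const { return f_ != nullptr; }
    std::FILE* get() const { return f_; }
};
```

The language never forces you to use this pattern - you can still leak raw FILE pointers all day - but the design of destructors makes the safe idiom the cheapest one to write, which is exactly the kind of encouragement I mean.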


OK, time to turn off the Brain Dump Switch [smile]

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Well I was looking at some stuff on the same issue and stumbled upon this: Mozart-OZ

I think that's the basic idea behind what most people want. A small but flexible kernel language that's extended through libraries to provide a programming suite that allows you to program in whatever paradigm you want, be it functional, imperative, constraint/logic, dataflow, etc... It also supports parallel processing and network processing very well. The only problem is the current implementation is a VM, but I'm sure it's possible to improve this with a blend of JIT and static compilation.

The eventual goal, as I see it, is that you take the compiler out of the compiler, relegating it to nothing more than a bootloader that starts execution of your program. That program has implicitly understood executable elements, which then provide a more general understanding of other elements. The program may then compose elements to generate a separate executable (static compilation), generate executable code in chunks and then run it (JIT compilation), execute pre-defined machine code as required (VM), or other possibilities. The whole idea is to boil programming down to the composition of a small set of implicitly understood elements, or "axioms". These compositions produce more complex elements, then libraries, then entire programs, etc...
I can see the fnords.
Quote:Original post by snk_kid
Quote:Original post by gregs
See my comment in this thread.


See my comments where I show gregs is just talking rubbish and completely out of context.


Grow up.
He misspells verifiable as verifyable twice.
Quote:Original post by CoffeeMug
Quote:Original post by Promit
Because god knows Tim Sweeney wouldn't know anything about designing scalable, robust, extensible, and stable software that can survive major changes in technology and requirements.

Does that in any way imply that he's a programming language design guru and cannot possibly make statements that he pulled out of his ass?


He posts over on Lambda the Ultimate, the programming language design weblog. Read some of his posts there if you wish to gauge how much he knows (that place is crazy, some of the features they put in their languages are mind blowing).
