So I've decided to design a programming language, mostly for fun and learning, before I start doing anything serious next year when I get into college. Right now I'm having trouble figuring out whether this type system would work - so I thought maybe you could help me out and share some ideas.
Basically, the language would be strongly and statically typed (types must be determinable at compile time), but types would be implicit by default.
Example:

    class Foo:
        def bar
        def someint as int

        def __init__():
            someint = 1
            # error: 'bar' is never assigned to, so its type can't be determined

        def __init__(x):
            bar = x  # bar has the same type as the parameter x, which only the compiler can determine
            someint = 0

        def __init__(x as int):
            bar = x  # here bar is an int
            someint = 1.0f  # error - someint is declared as int
Class members don't need to have their types specified until object construction. Once a constructor finishes, every member must have been assigned to and had its type determined.
The new object's type will be an instantiation of the partially defined class.
So if I construct Foo like this:

    a = new Foo(x)  # x has some type X

the typeof(a) would be Foo<bar=X>.
I could also use this notation to specify types explicitly, e.g.:

    a = new Foo<bar=int>(1.0f)

This would generate an error, since the bar member is explicitly typed as int but the constructor would assign it a float.
It's obvious that for Foo<bar=X> the constructor taking an int would not work and would generate a compile-time error, but this does not seem like a problem IMO (it would be like specialization in C++ templates).
My questions:
A) Would this type system work, and/or what problems do you see with this approach? Are there other languages featuring a similar type system?
B) Since this code would be statically compiled, the compiler would have to generate code for each type combination of a given class/function/method. This would probably produce large binaries, but what concerns me more is how it would affect runtime performance.
Is the CPU smart enough about instruction loading that it can cache code in parallel with program execution, or is the loading serial? E.g.:

    -------predict calls and load----------
    ----execute-----------------call-------

or

    execute -> call -> cache miss, load -> continue

Would this cause big performance penalties?
C) One obvious problem is that virtual members need fully defined types, but if I'm already generating a specialized function for each type combination, can I ignore the vtable and inheritance and treat classes as plain structures at the code-generation level?
E.g.:

    class X:
        ...
    class Y(X):
        ...

    def foo(object as X):
        ...

    y = new Y()
    foo(y as X)
When I call foo(y as X), the compiler 'knows' that y is actually of type Y.
Instead of generating code for one function 'foo' that takes an X object and calls methods through the vtable, the compiler generates a specialized instance of 'foo' which accepts and works with Y. Only the semantic analyser "pretends" that foo works with type X.
Classes, inheritance, etc. then become a feature of the semantic-analysis level; at code generation, X and Y have nothing to do with each other except that X's members are a subset of Y's members.
Anyway, I have this strange feeling that there are some problems with this approach that would prevent it from working, but I can't really tell what they are - so any comments/ideas/suggestions/corrections/further-reading links are welcome :)