
valderman

Member Since 02 Aug 2002
Offline Last Active Feb 27 2014 07:19 AM

#4879591 Starting to hate Google...

Posted by valderman on 02 November 2011 - 03:26 AM



Regarding concurrency:

An isolate is a unit of concurrency. It has its own memory and its own thread of control. Isolates communicate by message passing (10.14.4). No state is ever shared between isolates.

They're just taking the stupid idea of web workers, which was a useless feature, and throwing it into their language instead of implementing actual language-level locks and mutexes.

I can't agree with you there. The locks-and-mutexes threading model you're talking about is vastly inferior to Erlang-style concurrency (i.e. message passing, no shared mutable state).

Dart's Isolate is likely the result of the work they have done on coroutines in Google's Go - the two quite closely resemble one another, although isolates appear to be a little more general (they can spawn OS threads as well as coroutines).

Seems like a good idea in theory until you actually do it. Pretend you have a set of data you want to work on with multiple threads, like for multithreaded pathfinding or image processing. Suddenly you're forced to copy data (in web workers' case you can't copy more than 1 MB of data). It doesn't really matter, though, since the time wasted copying is so expensive that you've effectively negated any performance increase - and that's not even counting copying the data back. The alternative is to just pass in a reference and say "hey, you two threads work on half the data each and get back to me when you're done." No copying overhead. You'll notice that in that situation neither thread even had to touch the other's data at the same time, so no lock was needed.

Care to clarify why you believe references and message passing are mutually exclusive?

The only time it makes sense is when you're passing off a few values or a small array for heavy processing. Then again, you lock out the bigger uses of threads that way. Locks and mutexes allow everything, without restrictions.

Concurrency is an abstraction, not an optimization; judged as abstractions, the bronze-age primitives of locks and mutexes are quite obviously inferior to Erlang-style message passing.
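To make that concrete, here's a minimal sketch in Go (the natural reference point given the isolate/goroutine comparison above; the code and names here are mine, not from any post in this thread). Each worker receives a slice header - a reference into the shared buffer, not a copy - and signals completion purely by message passing over a channel. The halves are disjoint, so no locks are needed either:

package main

import "fmt"

// worker doubles every element of its half of the buffer, then signals
// completion over the channel. The slice is a reference into the caller's
// backing array; no element data is copied.
func worker(half []int, done chan<- struct{}) {
    for i := range half {
        half[i] *= 2
    }
    done <- struct{}{}
}

func main() {
    data := []int{0, 1, 2, 3, 4, 5, 6, 7}
    done := make(chan struct{})
    mid := len(data) / 2

    go worker(data[:mid], done) // each worker gets a disjoint half...
    go worker(data[mid:], done) // ...so no locking is required

    <-done // wait for both "I'm done" messages
    <-done
    fmt.Println(data) // prints [0 2 4 6 8 10 12 14]
}

The point is that the messages carry only a tiny completion token (or a reference), never the payload itself, so the copying overhead from the web-worker objection above never materializes.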


#4857878 "Scripting languages"

Posted by valderman on 05 September 2011 - 10:49 AM



Also, I wasn't aware that Galois, Jane Street Capital, Firaxis, Blizzard, Standard Chartered, Ericsson, AT&T, Bank of America, Bluespec, Barclays, Credit Suisse, Deutsche Bank, Qualcomm, Facebook, Google, Adobe, EA, Crytek, Relic, Cisco and MySQL, among others, don't exist.


What does this mean?

I disagree that Erlang, Haskell, OCaml and Lua aren't used in industry. All the companies I listed use one or more of those four languages. Of course, if your definition of existence goes along the lines of "no technology that makes up less than 10% of its respective field really exists," then I have to agree that they don't exist. But by that standard, video games and image manipulation software don't exist either.


#4857788 "Scripting languages"

Posted by valderman on 05 September 2011 - 06:01 AM

Lots of interesting opinions on the subject. Obviously, in the context of a particular game or application, any language is a "scripting language" if it's used to script said game/application, and I guess it could make sense to define a "scripting language" as a language the primary use of which is to provide that sort of functionality for applications. That would still disqualify both Python and PHP from "scripting language" status though.



So, what exactly is a scripting language?

Not C, C++, Java, C#.

This is how they are defined.

Erlang, Haskell, OCaml, Lua and such do not count, since they do not exist as far as the mainstream programming world is concerned.


The distinctions here also matter only for commodity programming. There is very little money involved (a fraction of a percent compared to the software industry in general), so the bikeshed principle applies.

Well, let's pretend for a second that proof-by-shitloads-of-money isn't available; I think that'll make a more interesting discussion.

Also, I wasn't aware that Galois, Jane Street Capital, Firaxis, Blizzard, Standard Chartered, Ericsson, AT&T, Bank of America, Bluespec, Barclays, Credit Suisse, Deutsche Bank, Qualcomm, Facebook, Google, Adobe, EA, Crytek, Relic, Cisco and MySQL, among others, don't exist.


#4857167 "Scripting languages"

Posted by valderman on 03 September 2011 - 09:08 AM

The local IT press has recently taken a break from evangelizing cloud computing and "noSQL" as the second coming of Christ to push their new favorite fad: "scripting languages." Allegedly, "scripting languages" are "easier to work with" than "other languages" and are generally awesome in a generic, not-quite-specified way. Of course, I got intrigued by these "scripting languages" - what are they, and why are they so awesome?

Evangelists of "scripting languages" tend to name Javascript, PHP and Python as examples of new, awesome scripting languages, relegating Java and the .NET languages to the status of old, boring and "hard to work with." So, what exactly is a scripting language?

At first glance, could it be an interpreted language? Apparently not, because compilers exist for both PHP and Python, and all three "scripting language" poster children also have implementations using JIT compilation. Meanwhile, both Java and the .NET languages are commonly executed through bytecode interpretation, and REPL interpreters are integral to working with F# or Haskell.

Perhaps it's dynamic typing? After all, dynamic typing is supposed to be totally awesome for increasing productivity. Turns out, C# supports dynamic types, and Erlang, which I doubt anyone would call a "scripting language," is completely dynamically typed. Weak typing doesn't seem to be it either, as Python employs strong typing.

So what I'm trying to say here is that the term "scripting language" annoys me because it seems to be synonymous with "a language I think is cool and modern." Can anyone else come up with a good, consistent definition of "scripting language?"


#4791660 Random source code from the web. Who gets the cred?

Posted by valderman on 29 March 2011 - 04:27 AM



Posting random code on the web should be the same as leaving random furniture at the curb.


Legally you must get permission. For furniture on the curb that means verifying with the owner that they are actually giving it away and weren't moving it to the curb in preparation for loading onto a truck.


That's not entirely accurate. Once it hits the curb it's either the property of the city or public domain depending on where you live. Different cities have different laws for it, but I don't think any of them in the US leave it as the property of the original owner.

So basically, stealing a bicycle is perfectly legal in the US if the owner isn't careful about where he parks it? Wow.


#4790915 Can I get Windows 7 to overcommit memory?

Posted by valderman on 27 March 2011 - 03:02 AM

I guess this isn't the answer that you're looking for, but why not buy a bigger hard drive? Even if you're a bit hard up for money, a 1 TB hard drive is quite inexpensive, maybe $100 AUD or less.

It's not that much of a practical problem really, it's more of an annoyance.


Unfortunately, no such luck in Windows 7. I get constant nag screens about the system being low on memory, and occasionally programs commit suicide because of failed allocations.

I've been running without a page file on many systems since XP times.

I've encountered an out-of-memory problem precisely once, when a version of Flash had a nasty memory leak that gobbled up gigabytes.

Unless there is something fishy going on, the explanation is quite simply that you don't have enough RAM.

I used to do that under XP too, and it worked just fine. However, everything I've read indicates that Windows requires all allocated memory pages to be backed by physical storage, so I sort of wonder how it worked at all. Perhaps programs were less prone to pre-allocate huge (relative to the memory sizes of the time) chunks of memory back then.

some programs like to lie

They don't lie, they don't know.

They aren't telling the truth - isn't that the definition of lying, regardless of the reason?

If you're a heavy user, you'll simply need to buy more RAM. And is this a 64-bit version of Windows?

A freshly started copy of Eclipse with one three-class project loaded, plus Firefox, is enough to cause the problem. I don't think they and W7 between them actually touch enough pages for memory to run out. Of course buying more RAM is a solution, but huge chunks of it are going to remain unused "just in case," which seems like a waste of perfectly good memory to me.



W7 requires that every allocated page of memory actually exists.

That's not true. VirtualAlloc can reserve a virtual address range without actually allocating physical RAM/pagefile space for it. The problem is simply that programmers WANT a huge pile of just-in-case RAM already there for them; it makes it faster to use when they actually decide to use it.

And physical storage gets allocated on first access to a page? That's the behavior I want to enforce for all allocations, but I guess Windows won't let me? I'm a bit sceptical of the actual performance hit incurred by overcommitting, as you'll only get a page fault the first time you hit a previously untouched page of memory; it seems more like a case of "my application is more important than the rest of the system" than an actual performance concern to me.
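For anyone following along, here's roughly what that reserve/commit split looks like - a sketch using the golang.org/x/sys/windows bindings for the Win32 calls named above (a Windows build and that third-party package are assumed; error handling is kept minimal). MEM_RESERVE claims address space without any RAM/pagefile backing; only the MEM_COMMIT region counts against the commit charge, and even committed pages are handed out physically on first fault:

//go:build windows

package main

import (
    "fmt"

    "golang.org/x/sys/windows"
)

func main() {
    // Reserve 1 GiB of address space: no RAM or pagefile is consumed,
    // and the pages cannot be touched yet.
    const gib = 1 << 30
    base, err := windows.VirtualAlloc(0, gib, windows.MEM_RESERVE, windows.PAGE_NOACCESS)
    if err != nil {
        panic(err)
    }

    // Commit only the first 64 KiB: this is what counts against the
    // system commit charge, even before any page is touched.
    if _, err := windows.VirtualAlloc(base, 64*1024, windows.MEM_COMMIT, windows.PAGE_READWRITE); err != nil {
        panic(err)
    }

    fmt.Printf("reserved 1 GiB at %#x, committed 64 KiB\n", base)
    windows.VirtualFree(base, 0, windows.MEM_RELEASE)
}

Which also suggests an answer to the nag screens: commit charge is accounted at commit time, not at first touch, so programs that commit huge just-in-case regions inflate the charge even if they never use the pages - and with no page file, there's nothing to absorb it.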


#4782221 Linux is a LIE

Posted by valderman on 05 March 2011 - 04:04 PM

getting Linux to do what you want is trivial if you have the right knowledge.




Wherein lies the crux. I know I wouldn't be ranting here if my first computer had happened to have Linux installed on it; I'd have the 'right knowledge'. Nonetheless, Windows 7 does generally 'just work'. My granny uses it. Linux has a far steeper learning curve. That's fine; it caters to people who actually like to know how their operating system works and want to tinker with it. I'm not one of them. I mean, I'm sure it's fascinating, and given infinite time, I'd love to know more about it, in the same way I'd like to know more about the inner workings of my car, for instance. But both are tools to me, as far as getting my job done is concerned; I expect my operating system to offer some level of abstraction for interfacing with my hardware; I'm not looking forward to hacking it together myself. And it's not what I'm getting paid for either.

Linux doesn't seem to offer as high-level an interface. That's great, but claims that it 'just works' are seriously overstated, in my limited experience. Windows 7 'just works'. Visual Studio does. While I'm sure Linux has changed in this direction, it's not in the same ballpark.

I highlighted the central concept of your post. It's not a matter of deeply specialized knowledge or extensive tinkering, but of what you're used to, and of the multitude of ways of working around or getting rid of annoyances that you accumulate by working in an environment. Windows is also hell to work with if you're used to something that works differently; just ask any smug Mac user about it. If your Windows machine BSODs, you know exactly what to do about it, and you probably know how to troubleshoot a broken Windows driver - when those things happen (and yes, they happen; all the time, in fact) it's my turn to whine about horribly broken, inferior operating systems. But I don't, because I realize that if I had spent the same time on W7 that I have on Debian, I would have been able to resolve the problem.


#4782185 Linux is a LIE

Posted by valderman on 05 March 2011 - 02:06 PM


Bah,

What do you want for nothing? Rubber Biscuit?

Wanting it to just work, Mac: $1500.
Getting it to work in a usable environment only to get spyware from a porn site, Windows: $300.
Pulling your head out of your ass and learning to install home-brew drivers, Linux: priceless.

Money can buy you ease, but freedom is worth more than pennies.


Yes, I will agree Linux has driver issues, but...
A) it's free
B) it's free
C) it's open, for you to fix it.
D) it's free


You learn to either work around your problems, fix them yourself, deal with it, or fork out $300 for the latest Windows.
That being said, Ubuntu sucks; get a real man's Linux like Arch or Slackware :)


I paid $140 for the OEM version of Win7 Pro when I built this computer. And magically it just worked. I don't spend all day looking at porn or downloading torrents, so malware hasn't been an issue. I paid $700 for a Mac mini and it worked even more easily than Win7. I can see that if you live at home, have a crappy part-time job, and lots of free time, then endless screwing around with Linux might seem 'fun'. I have a full-time job and a social life; paying money for something that works makes a lot more sense. And as was stated, "free" in the corporate world generally ends up costing a lot more than just buying something that works.

I'd like to point out that the "endless screwing around" that a lot of people associate with Linux simply isn't there if you know what you're doing and make sure to get supported hardware (which is quite frankly trivial nowadays). I've been using Debian extensively for eight years or so, and the only time I ever do any "screwing around" is when I get a new machine (or my boot drive dies and needs to be replaced, which happened last week). The rest of the time, "it just works." If something needs to be installed, upgraded or configured, it's literally a matter of seconds.
Meanwhile, on my Windows 7 machine, it takes quite a lot of "screwing around" just to upgrade Visual Studio or install Erlang, and with all of W7's moronic quirks, I'd say it's nowhere near "ready for the desktop." Not because that's indisputable fact, but because I'm used to, and enjoy, the way Debian works.

Boldly proclaiming that GNU/Linux (let's avoid further interjections!) is teh bestest because it's free of cost is stupid because, as someone previously said, that just isn't true unless your time is worthless. Claiming superiority for Windows and whining about how hard it is to get something working on GNU/Linux is similarly retarded because getting Linux to do what you want is trivial if you have the right knowledge.

From a developer perspective both have their pros and cons and, excepting situations where your tools are only available or significantly better on one platform or another, which one is "better" is just a matter of experience and personal taste.

tl;dr OS arguments are stupid.


#4782116 Linux is a LIE

Posted by valderman on 05 March 2011 - 11:18 AM

2/10, trolled enough to get slightly annoyed. Well chosen subject, but execution needs more work.


#4781718 Differences between Game Programming & Software Engineering

Posted by valderman on 04 March 2011 - 05:53 AM

For NEXUS' comment:

The flow of developing a game is also very different from the development of a common application. In an application the requirements of an implementation are often quite clear, so it is often (although not always) possible to just design the application in one pass and have an army of monkeys write fairly well-understood code. A game that gets designed in one pass and then implemented is almost inevitably bound to be a bad game; the process of making a game good is much more iterative and creative than for applications, and that is what makes the field so appealing to its professionals. The nature of a game mutates and evolves as it is developed.


This is simply not true. Or at least not fully true. "Often" is a strong exaggeration in the context of "in an application the requirements of an implementation are often quite clear."

Ever worked for a customer, or on a technology that's fresh, where you need to compete with other developers? I guess not...
An application idea will change a lot before the application is released. It's possible that a totally different application will be released, one that has almost nothing to do with the original idea. The market changes as fast as the games market does.

But maybe I'm wrong; I'm not a real programmer.

I agree that the comment quoted is rubbish; not only do specs change, but even the tardiest of enterprises have realized by now that the waterfall model is simply not a good way to design software. It sounds like the Nexus fellow doesn't know jack about either game development or other application development.


#4780564 reason i love my wife

Posted by valderman on 01 March 2011 - 09:25 AM

Which brings us back to this thread, where the OP's tone in the first post is along the same lines: he tried Windows, it was broken, and the implication was that those of us who do use Win7 - which he seems to think is cluttered etc. - are somehow 'wrong', which smacks of the classic Linux User Superiority that many people (probably on both sides of the fence) are pretty sick of.

OP basically says "my wife liked Windows before, but she didn't like W7 and I didn't either for reasons X, Y and Z, so now we use Ubuntu and that's awesome" - Linux User Superiority? Give me a break! Calling his post "pointless bashing" is just lame; if someone had said that Ubuntu is slow, clunky and has an inferior way of managing software, I think you'd have called that relevant criticism rather than "pointless bashing."

While many Linux users - especially new Ubuntu converts - tend to be quite vocal and extremely annoying about their OS of choice, claiming that there is no anti-Linux bias in the lounge is ridiculous.


#4779789 reason i love my wife

Posted by valderman on 27 February 2011 - 01:55 PM

With regard to the people calling MikeP's post harsh:
Boy it must suck to live in a world where honest commentary that was implicitly asked for was delivered. Does it make you sad to know that people probably hate you for no reason, and some probably hate you for a reason, and some of those people and reasons might actually be valid? Welcome to the internet.

I don't know what world you live in, but where I live, responding to someone sharing an anecdote and a couple of reasons why they prefer a certain operating system over another with "good thing your wife is as dumb as you are, or she'd leave you in a heartbeat" would usually earn you a broken nose, and with good reason.

But then again, the lounge is just 4chan plus "like"-buttons after all, so I guess being the Internet tough guy is the cool thing to do, eh?


#4779698 reason i love my wife

Posted by valderman on 27 February 2011 - 10:39 AM


Wow, when I woke up this morning I only expected like one or two posts - maybe something like "I like Windows" or "I like Ubuntu too" - not someone saying that my wife and I are dumbasses and that my wife is going to dump my ass for some stupid reason.

So SORRY for liking my wife's preference of operating systems. And SORRY you can't accept other people's choices, even if they are in this 90% overgeneralized range you speak of.


I would think you were half intelligent if you did a duel install and played your games on windows but if you honestly think using wine makes more sense than windows for video games then I'm going to have to jump on the bandwagon and tell you that your logic fails hard...

The next time you're going to insult someone's intelligence, I suggest you think hard about the difference between dual and duel unless you feel like embarrassing yourself further.

Your argument also depends on the OP wanting to run the very latest games; many old gems run like crap or not at all under Windows 7 or even Windows XP, whereas Wine handles them perfectly.

Funny you should mention Python

Disregarding the irony of a developer depending on Python 2.x complaining that distros ship old versions, why not simply declare Python 2.6 as a dependency and explicitly specify #!/usr/bin/python2.6 rather than #!/usr/bin/python as the first line of any executable script?

(Also, complaining about incompatible "broken" Python versions when your application is written in Ruby, a language with no official specification, seems... odd.)


#4779657 reason i love my wife

Posted by valderman on 27 February 2011 - 08:25 AM

I miss the good old days when the lounge consisted of political flamewars, MindWipe's drunk posts and references to the place that does not exist. I also miss being able to inflict horrible internet retribution on people who are being dicks without reason by downrating them.


#4770828 Compiling GLIBC on Linux. Oh golly gosh!

Posted by valderman on 07 February 2011 - 05:32 AM



He has 64-bit. But that's another good point... obviously I want my game to be playable on both 32- and 64-bit installations. What's the best way to deal with this? Is it possible to create one binary that will work on both, or do I need to build two separate binaries?

Create an LSB-compliant 32-bit binary and make sure your program depends on the proper 32-bit packages. Unless you're going to use 2+ gigs of RAM, there's no point in also creating a 64-bit binary.


Except for all those additional juicy registers, a better ABI, and an SSE2 minspec. No reason at all :)

Then it's better to just compile a second binary with -msse2. As demonkoryu says, more registers and fancier instructions do nothing for you if you hit RAM all the time. Add in the porting issues you get from using a low-level language, and maintaining two binaries suddenly seems a lot less attractive.



