[quote]
WebGL and JS are not modern 3D hardware. They are, in many ways, worse than the PS1.
[/quote]
Why is WebGL worse than the PS1? I have thrown a good bit at my GPU through WebGL and have had no problems, from multiple animations with ~60 bones each to large triangle counts. I could send a 1025×1025 terrain with no LOD to my GPU and it would chew through it without becoming my bottleneck. Why do you think WebGL is worse than the PS1's GPU?
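For scale, here is a minimal sketch of what a 1025×1025 terrain actually means for WebGL. The `buildGrid` helper is hypothetical, but the vertex/index counts fall straight out of the grid size, and the one real WebGL 1 caveat is genuine: that many vertices exceeds the 65,536 limit of 16-bit index buffers, so you need the `OES_element_index_uint` extension (or split meshes) to draw it in one call.

```javascript
// Sketch: vertex/index data for an n x n terrain grid (hypothetical helper).
function buildGrid(n) {
  // n*n vertices on the XZ plane; Y is left at 0 for the heightmap to fill in.
  const positions = new Float32Array(n * n * 3);
  for (let z = 0; z < n; z++) {
    for (let x = 0; x < n; x++) {
      const i = (z * n + x) * 3;
      positions[i] = x;
      positions[i + 1] = 0;
      positions[i + 2] = z;
    }
  }
  // Two triangles per grid cell, (n-1)^2 cells, 3 indices per triangle.
  const indices = new Uint32Array((n - 1) * (n - 1) * 6);
  let j = 0;
  for (let z = 0; z < n - 1; z++) {
    for (let x = 0; x < n - 1; x++) {
      const a = z * n + x, b = a + 1, c = a + n, d = c + 1;
      indices[j++] = a; indices[j++] = c; indices[j++] = b;
      indices[j++] = b; indices[j++] = c; indices[j++] = d;
    }
  }
  return { positions, indices };
}

const grid = buildGrid(1025);
// 1025*1025 = 1,050,625 vertices, past the 65,536 reachable with 16-bit
// indices, so WebGL 1 needs 32-bit indices via an extension:
//   gl.getExtension('OES_element_index_uint');
//   gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, grid.indices, gl.STATIC_DRAW);
```

The upload itself is a single `bufferData` call either way, which is why a mid-range GPU swallows it without WebGL being the limiting factor.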
[quote]
Occam's razor: why isn't anyone making anything with them? Because, beyond hype and single-effect demos, they are useless for real-world problems. And even if one were to put enough effort into them, the range they need to cover is incredible, from integrated graphics on Atom netbooks to a water-cooled, overclocked i7 driving multiple 6970s.
[/quote]
I agree here, but the engine would essentially be platform-independent (though not browser-independent), so couldn't one just pick a minimum hardware spec and accept that everyone below it is out of luck?
[quote]
In WebGL and JS, there is none of that. It's a virtual machine. If HTML5 local storage is ever implemented, you will have 5 MB of disk, some 200 MB of memory total, and serial IO driving everything: the GPU, loading, and any other external resources you have no control over.
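The 5 MB figure is roughly what browsers commonly allow per origin for localStorage, and since values are stored as UTF-16 strings, that works out to about 2.5 million characters of asset data. A minimal sketch of treating it as a best-effort cache; `tryPersist` is a hypothetical helper, not a standard API, and the storage object is passed in so the same code works against `window.localStorage` in a browser:

```javascript
// Sketch: best-effort caching into a quota-limited key/value store.
// Browsers throw a DOMException (commonly named QuotaExceededError)
// from setItem when the origin's quota is exhausted.
function tryPersist(storage, key, value) {
  try {
    storage.setItem(key, value);
    return true; // cached; next load can skip the network
  } catch (err) {
    return false; // over quota; fall back to refetching from the server
  }
}

// In a browser: tryPersist(window.localStorage, 'level1-mesh', serializedMesh);
```

The point is that the cache is advisory: the engine has to be able to refetch anything, because 5 MB fills up after a handful of assets.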
[/quote]
Not being able to access the hard drive definitely hurts, but the PS1 and N64 didn't have built-in hard drives either. They could stream from disc or cartridge, of course, but content can to some extent be streamed from a server these days as well. Also, my engine is already using 400 MB of DRAM, so where does a 200 MB cap come from?
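Streaming from a server is just consuming the response body chunk by chunk instead of waiting for the whole file. A minimal sketch of the pattern: in a real engine the stream would come from `fetch(url).then(r => r.body)`; here a locally constructed `ReadableStream` stands in for the network response so the example is self-contained, and `consumeStream` is a hypothetical helper.

```javascript
// Sketch: progressive loading. Chunks are handed to the engine as they
// arrive, so decoding can overlap the download, just as a console engine
// overlaps decoding with disc reads.
async function consumeStream(stream, onChunk) {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
    onChunk(value); // e.g. feed the mesh/texture decoder incrementally
  }
  return total; // bytes received
}

// Stand-in for fetch(url).body: two chunks simulate packets off the wire.
const encoder = new TextEncoder();
const fakeResponse = new ReadableStream({
  start(controller) {
    controller.enqueue(encoder.encode('mesh'));
    controller.enqueue(encoder.encode('data'));
    controller.close();
  },
});
consumeStream(fakeResponse, (chunk) => {
  // decode chunk here
});
```

The same loop works unchanged on a real `fetch` response in any browser that supports the Streams API.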
I know that anything modern is out of the picture, but I think there is still a market for browser games. Zynga is proof of that: they turned silly 2D games into a successful business (last I heard, they were headed for a $1 billion IPO). I'm sure a lot of us wouldn't mind paying a dollar to play Ocarina of Time or other great nostalgic games.
Do you believe that even PS1-quality games are not possible with JS/WebGL?