Javascript is a horrible language and an even worse environment. The semi-functional style is a redeeming quality, but there is no real concurrency (no threads, just a single event loop), so it's not very useful for demanding work.
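To make the concurrency complaint concrete, here's a minimal sketch of the single event loop at work: a timer scheduled for "right now" cannot fire while any synchronous code is running, because there is only one thread to run on.

```javascript
// Everything shares one thread: a 0 ms timer cannot fire until the
// synchronous busy-loop below has finished.
let fired = false;
setTimeout(() => { fired = true; }, 0);

const start = Date.now();
while (Date.now() - start < 50) { /* hog the only thread */ }

// 50 ms have passed, yet the callback still has not run.
const firedDuringLoop = fired;
```

The callback only runs after the current call stack empties, which is exactly why long computations freeze the whole page.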
The standard implementation is shit. You're interpreting from scratch every time a page loads, and on top of that you're downloading all of the code from the server first. That leads to unsafe hacks like GETting code and evaling it incrementally.
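Here is a minimal sketch (not a recommendation) of that "GET and eval" hack. In a real page the source would arrive over XMLHttpRequest; the network responses are simulated here with strings so the pattern is runnable. The hazard is the same either way: whatever text the server happens to return executes with full page privileges, with no integrity check.

```javascript
// Indirect eval ((0, eval)) runs the code in global scope, which these
// loaders rely on so that later chunks can see earlier ones.
function evalChunk(sourceText) {
  // No sandbox, no signature, no way to audit what actually ran.
  return (0, eval)(sourceText);
}

// "Incremental" loading: each chunk is evaled as it arrives.
evalChunk("var counter = 0;");                         // chunk 1 defines state
var result = evalChunk("counter += 41; counter + 1;"); // chunk 2 mutates and reads it
```

Every chunk mutates shared global state, so the program's correctness depends on the order and completeness of network responses.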
There have been efforts to fix this: Flash, Java applets, and Silverlight, which for now remain the tools of choice for demanding web applications. But lately there has been a big push to put Javascript and "HTML5" everywhere.
Ideally we need a standard bytecode that every browser implements with consistent behavior. The browser becomes a VM. Then we can write as many compilers as we want: with Lisp, C, Python, C++, or [insert language] front ends, we can easily port existing codebases to the web.
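To make the "browser as VM" idea concrete, here is a toy stack-machine interpreter (every opcode name is invented for illustration). The point is that the browser would ship one small, precisely specified engine like this, and any language that can emit its bytecode runs on it.

```javascript
// Toy bytecode VM: a stack machine with four invented opcodes.
const OP = { PUSH: 0, ADD: 1, MUL: 2, HALT: 3 };

function run(bytecode) {
  const stack = [];
  let pc = 0; // program counter
  while (pc < bytecode.length) {
    switch (bytecode[pc++]) {
      case OP.PUSH: stack.push(bytecode[pc++]); break;
      case OP.ADD:  stack.push(stack.pop() + stack.pop()); break;
      case OP.MUL:  stack.push(stack.pop() * stack.pop()); break;
      case OP.HALT: return stack.pop();
    }
  }
  throw new Error("fell off end of program");
}

// (2 + 3) * 4 -- the kind of output a compiler for any source
// language could emit for this target.
const program = [OP.PUSH, 2, OP.PUSH, 3, OP.ADD, OP.PUSH, 4, OP.MUL, OP.HALT];
const vmResult = run(program);
```

A spec this small can pin down every behavior exactly, which is what makes "consistent across browsers" plausible in a way that a sprawling language like Javascript never managed.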
On top of this, the handling of legacy HTML/JS/etc (old content built on shitty ideas) would be defined by the presenter: the interpreters and renderers for those formats would themselves be shipped as bytecode. Generic low-level APIs for graphics, sound, and hardware interfaces probably already exist in the real world for use in such a system; if not, the theory to build them certainly does. And since the standard would have to define those primitives precisely, divergent behavior between implementations could be made impossible by design.
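As a purely hypothetical sketch of what one such low-level primitive might look like (every name here is invented), consider a framebuffer with a single pixel-write operation. Primitives this small are easy to specify exactly, down to edge cases, which is what leaves implementations no room to diverge.

```javascript
// Hypothetical graphics primitive: a tiny framebuffer with one operation.
const framebuffer = {
  width: 4,
  height: 4,
  pixels: new Uint8Array(4 * 4), // one byte per pixel, for simplicity
  setPixel(x, y, value) {
    // A real standard would spell out exactly what out-of-range
    // coordinates do; here they are silently clipped.
    if (x < 0 || y < 0 || x >= this.width || y >= this.height) return;
    this.pixels[y * this.width + x] = value;
  },
};

framebuffer.setPixel(1, 2, 255);  // in range: pixel is written
framebuffer.setPixel(9, 9, 255);  // out of range: clipped, by definition
const litPixels = framebuffer.pixels.filter((v) => v !== 0).length;
```

An HTML renderer or a Javascript interpreter compiled to the standard bytecode would ultimately bottom out in calls like these.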