Are there any web browsers out there with full support for JavaScript that can render HTML normally? I wouldn't need anything like Flash or Java applets, but full support for fancy JavaScript would be nice for getting through fancy web sites and such. If one doesn't exist, I think I might try to make one, and shoot for 5-15MB of RAM usage.
yeah, I've got time on my hands, so I'm down. Thanks! I've actually been using edbrowse, which is interesting. It is basically using ed to browse the internet.
uzbl is interesting: it's simple and extensible, but unfortunately the code quality is rather low. It's being developed by some skiddies and has multiple, pretty devastatingly exploitable vulnerabilities.
>>1
5-15MB of RAM usage on what? If it's 5-15MB with a 10MB page that would be very impressive. If it's 5-15MB with a 1KB page, that's not great.
>>5
uzbl uses WebKit, which means around the same amount of bloat as Chrome and Safari, except with a different interface.
You could try adding JS to NetSurf.
I'll admit I've been writing a web browser myself too, in C. I've gotten not much more than the HTML tokenizer done, because it's a time-killing thing. If only there were 64KB web browser competitions...
5-15MB of RAM usage on what? If it's 5-15MB with a 10MB page that would be very impressive. If it's 5-15MB with a 1KB page, that's not great.
Hmm, that's a good point; I hadn't thought about the sizes of the pages, or the images and all that. I know I could do better than Firefox though. A multithreaded wget with HTTPS, an HTML parser, a simple renderer for positioned text and images, a graphical interface, and a JavaScript interpreter can't be that intense.
You could try adding JS to NetSurf.
I will check it out. I know there are quite a few browsers out there that work decently well but have no support for JavaScript. I would aim for full support of HTML4, or whatever. I know it is massive, but once the engine is together, it would just be filling in the blanks. It might take a year of on-and-off work on weekends, but it would get done eventually. I might even go for HTML5 if I could leverage external libraries and keep overhead down, but that would be optional.
I'll admit I've been writing a web browser myself too, in C. I've gotten not much more than the HTML tokenizer done, because it's a time-killing thing. If only there were 64KB web browser competitions...
Let's start one on /prog/! Grand prize for the one that can view YouTube from a TI-83!
It has automatic memory management, which means it MUST use some sort of garbage collection. You can still keep memory usage sane by implementing an exponential collection strategy: start off with a small limit like 1MB. Once memory usage hits 1MB, run the GC. You should have free space left over. If you don't, then double the limit to 2MB and allocate. Keep going until 4MB gets filled up, then run the GC again. If memory usage drops below 1/4 of the current limit after any GC run, then shrink (release memory back to the OS).
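Something like this, roughly (a sketch only; heap_used(), heap_bump(), heap_release() and gc_run() are made-up hooks into whatever collector the engine actually uses):

#include <stddef.h>

size_t heap_used(void);               /* bytes currently allocated */
void  *heap_bump(size_t n);           /* raw allocation from the heap */
void   heap_release(size_t limit);    /* return memory above limit to the OS */
void   gc_run(void);                  /* trace and free garbage */

static size_t gc_limit = 1 << 20;     /* start with a 1MB limit */

void *gc_alloc(size_t n)
{
    if (heap_used() + n > gc_limit) {
        gc_run();                                 /* hit the limit: collect */
        if (heap_used() + n > gc_limit)
            gc_limit *= 2;                        /* nothing freed: double */
        else if (heap_used() < gc_limit / 4 && gc_limit > (1 << 20)) {
            gc_limit /= 2;                        /* mostly empty: shrink */
            heap_release(gc_limit);
        }
    }
    return heap_bump(n);
}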
That would be convenient for a copying collector. It might be better to use a generational collector, and when you garbage collect and find that you no longer have enough memory for the live set, allocate a new generation.
There is also the mark-and-sweep style, which can be nice since you don't need to copy memory around. You do need to sweep through the entire heap to free the unreachable objects though, rather than just touching the live set. Here is Lua's implementation:
Why not? JPEGs and PNGs aren't that big, and seem to decode fast. If space for images was really a concern, one could use lower resolutions for the rendered images, but I don't think it would be a problem. And JavaScript isn't that complex. And the DOM is just a tree structure.
Name: Anonymous 2011-11-23 3:41
>>16
You know what, I actually just did some quick calculations on how much memory would be required to represent these pages in memory, and it seems that 5-15MB per page is quite achievable.
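(For a rough sense of scale, with made-up but plausible numbers: a page with ~2,000 DOM nodes at ~48 bytes of node structure plus ~50 bytes of text/attribute data each is only about 200KB of tree, and even a few decoded 500x300 images at 500*300*4 = 600KB apiece keep the total in the low megabytes.)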
Name: Anonymous 2011-11-23 4:05
This is currently the 3rd Google result for "low resource web browser".
Cool, that's encouraging. I'd probably only support one active tab at a time. Or maybe tabs could be supported, but the unviewed pages would be written explicitly to disk in a format that is quick to read and preserves the state of the running JavaScript. Alternatively, if running on a machine with virtual memory and an OS that can page stuff out to disk, care could be taken so that a single web page and all the state of its executing JavaScript would be contained within a single virtual memory page, or the fewest pages possible. Then the browser could suspend the page's JavaScript execution and never touch those (memory) pages, allowing the OS to page the web page's data out to swap if needed. Running JavaScript and JavaScript timers would need to be suspended, which might break a few websites. But if they aren't designed to recover under such circumstances, then those websites are complete shit, although that doesn't change the fact that you may be forced to use them, so I don't know...
>>13
I was thinking of mark-compact: http://en.wikipedia.org/wiki/Mark-compact_algorithm
This allows constant-time allocations and doesn't waste half the memory doing so. Essentially the heap becomes a huge dynamic array that gets resized in amortized constant time.
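The allocation side of that is just a bump pointer over the compacted heap; a minimal sketch (heap_top/heap_limit are assumed to be set up when the heap is mapped):

#include <stddef.h>

/* After compaction all live objects sit at the bottom of the heap,
   so allocating is just advancing a pointer: constant time, no free
   lists, and no wasted semispace. */
static char *heap_top, *heap_limit;

void *heap_alloc(size_t n)
{
    n = (n + 7) & ~(size_t)7;     /* keep 8-byte alignment */
    if (heap_top + n > heap_limit)
        return NULL;              /* caller runs mark-compact and retries */
    void *p = heap_top;
    heap_top += n;
    return p;
}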
>>17
You mean this thread? It's around 20KB right now. 5MB for that would be huge: 250 bytes per byte. (I'm assuming you mean 5MB for the HTML DOM and associated data structures, not including the code.)
>>19 I'd probably only support one active tab at a time
What UI supports multiple active tabs?
The biggest source of complexity is probably the interaction of the JS with the DOM: since you want the DOM represented in some memory-efficient format, its nodes would not necessarily correspond to JS objects. JS can manipulate the DOM in many different ways, and you would have to handle memory allocation/deallocation appropriately (e.g. a node that was created only by JS, and not in the original page, would need to be freed once JS disposes of it). Writing a browser that just renders static HTML is almost trivial in comparison.
>>21
Just have a sufficiently large browser; expansion will be 0, according to Newton.
>>20
Play a youtube video. Open a new tab. Your youtube video will still be playing.
Name: Anonymous 2011-11-23 8:57
>>20 You mean this thread?
Pick any random bookmark and check the memory usage of that page.
I'm assuming you mean 5MB for the HTML DOM and associated data structures, not including the code.
That's what I meant. The document itself should only be 10k-400k, and the calculated form of that document shouldn't be more than 15MB in most cases.
Well, that GC technique sure is nifty. Thanks for the link.
Yes, Internet Explorer uses reference counting for DOM objects, I believe. I think I'll try to leverage as much open source software as possible, but if none meet the low memory usage requirements, I'll take one and start replacing parts of it.
>>27
HTML 3.2 only? That's not too useful; at least 4.01 + CSS2.1 is common now. Expansion is the delta (i.e. the memory taken by the loaded document) divided by the size of the document itself. As of that posting, this thread was 21411 bytes of HTML, which gives an expansion of ~23. Not bad.
Here's Dillo 3.0, and this page is now ~28KB: 3668K before loading, 4780K after, a delta of 1112K, for an expansion of ~39.1.
(This browser is a single 1.2MB file. Unfortunately it has tons of statically linked dependencies, due to being a Windows port instead of a native Win32 program; going native could make it much smaller.)
Name: Anonymous 2011-11-24 5:53
Dillo
More like Dildo amirite
Name: F R O Z E N C U D D E R !anusl5eyYc 2011-11-24 5:58
>>40
Because when you think that way, no amount of memory and CPU speed is ever going to be enough? Because people want to be able to open 1000 tabs at once?
HTTP is just a protocol; a compact and efficient implementation is possible. Caching could just be a hashmap from "domain-name/path/to/file" to the structure that was spawned from the fetched data (see the sketch at the end of this post). Elements not recently used could be saved to a file, or destroyed, to save space.
HTML is easy to parse. You can always ignore tags that you don't understand (although hopefully enough of the main ones are supported).
CSS isn't that difficult to parse either. It looks like the entire language is outlined below:
Once the HTML renderer can support the styles in CSS, it wouldn't be difficult to tie all that together.
Forms are just text boxes. All of these things may take programming effort, but there is no reason for them to consume lots of memory.
The amount of memory used will be proportional to the size of the web page; there is no getting around that. So if the running JavaScript tries to allocate an array of length 6000000, then I'm kind of screwed. Although memory resources used by scripts should be capped anyway, to prevent malicious pages from crashing your browser.
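The cache table could be as dumb as this (a sketch; all the names are invented, and struct document stands in for whatever the fetched data gets parsed into):

#include <stdlib.h>
#include <string.h>

struct document;                  /* parsed form of the fetched data */

struct cache_entry {
    char *key;                    /* "domain-name/path/to/file" */
    struct document *doc;         /* NULL if evicted to disk */
    long last_used;               /* for deciding what to evict */
    struct cache_entry *next;     /* chaining on hash collision */
};

#define CACHE_BUCKETS 1024
static struct cache_entry *cache[CACHE_BUCKETS];

static unsigned long hash_key(const char *s)    /* djb2-style hash */
{
    unsigned long h = 5381;
    while (*s)
        h = h * 33 + (unsigned char)*s++;
    return h % CACHE_BUCKETS;
}

struct document *cache_lookup(const char *key)
{
    for (struct cache_entry *e = cache[hash_key(key)]; e; e = e->next)
        if (strcmp(e->key, key) == 0)
            return e->doc;
    return NULL;
}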
Not all devices have a gig of RAM. Even if it's cheap, it might not fit.
Name: Anonymous 2011-11-24 16:30
>>46
Here's a hint: there are fewer than 256 CSS properties, HTML elements, and HTML attributes. The in-memory representation of the DOM might even be smaller than the document, with the right implementation choices.
But spend too much effort on "compressing" the DOM and it might be impractically slow.
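For illustration, one guess at what such a packed representation could look like (indices instead of pointers, one byte per element ID; all names invented):

#include <stdint.h>

/* With fewer than 256 element names and CSS properties, both fit in
   a byte, and 32-bit node indices replace 8-byte pointers: 16 bytes
   per node, with text living in a shared string pool. */
struct packed_node {
    uint8_t  element;        /* index into a static element-name table */
    uint8_t  nattrs;         /* number of attributes following this node */
    uint16_t flags;
    uint32_t first_child;    /* node index, 0 = none */
    uint32_t next_sibling;   /* node index, 0 = none */
    uint32_t text;           /* offset into the string pool, 0 = none */
};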
Name: Anonymous 2011-11-24 16:33
>>46 So if the running JavaScript tries to allocate an array of length 6000000, then I'm kind of screwed.
First, cap JS memory usage at some sane limit, such as 5MB per page or less. Second, use an infinite compression algorithm to sacrifice CPU time for memory.
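The cap is easy to enforce if every allocation the interpreter makes goes through one counting wrapper; a sketch (js_alloc is a made-up name, and a real version needs a matching js_free that credits the budget back):

#include <stdlib.h>

#define JS_BUDGET (5u << 20)      /* 5MB per page */
static size_t js_in_use;

void *js_alloc(size_t n)
{
    if (js_in_use + n > JS_BUDGET)
        return NULL;              /* interpreter raises its out-of-memory error */
    void *p = malloc(n);
    if (p)
        js_in_use += n;
    return p;
}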
Inserting elements at certain positions would involve reallocating an array, though, and could get slow if there are lots of sibling nodes. Alternatively:
struct dom_node {
    enum dom_type type;              // element, text, comment, etc.
    struct dom_node *parent;
    struct dom_node *first_child;
    struct dom_node *last_child;
    struct dom_node *left_sibling;
    struct dom_node *right_sibling;
    // Type-specific data located at the end of this struct.
};
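With the sibling pointers, insertion anywhere is O(1) no matter how many siblings there are; a sketch (assumes pos is non-NULL and already linked under its parent):

void dom_insert_before(struct dom_node *pos, struct dom_node *node)
{
    node->parent = pos->parent;
    node->right_sibling = pos;
    node->left_sibling = pos->left_sibling;
    if (pos->left_sibling)
        pos->left_sibling->right_sibling = node;
    else
        pos->parent->first_child = node;   /* pos was the first child */
    pos->left_sibling = node;              /* last_child never changes here */
}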
>>39
I'll use Windows as the platform to compare against Chrome, since that's the most "consistent" system (i.e. a relatively standard set of libraries is available).
GUI: Chrome's... chrome is rendered using their own custom UI library (just to be platform-independent). It wouldn't surprise me if several hundred KB of data were taken up just by the gradients everywhere. Making it native Win32 would shave a lot of unnecessary abstraction off.
HTTP client: Windows already has one: WinInet.
HTML5-compliant parsing algorithm: I have written a tokenizer in less than 12KB of binary. The entire parser won't be that much bigger, probably <64KB.
CSS parser: CSS is simpler than HTML in structure and thus easier to parse. Another thing <64KB.
Layout engine: here is where OOP concepts make sense: each DOM node is associated with a CSS box object that has positions and other properties. Each one has a virtual draw() method. All the layout engine has to do is assign the positions to the boxes appropriately (the tricky part is doing it dynamically, as nodes and CSS properties are changed by either the page loading or JS). The window message loop then triggers the appropriate draw() calls on the boxes visible in the current viewport on WM_PAINT, and they paint themselves on the window. Other events like mouse clicks etc. (e.g. change the cursor to a middle finger when it goes into the box for the 'a' element, and back to an arrow when it leaves) could also be handled by other virtual methods on the boxes. My (very loose) estimate: <256KB.
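In plain C the "virtual draw()" is just a function pointer per box; a sketch (Win32 types since the comparison is against Windows, and all the names are invented):

#include <windows.h>

struct dom_node;                             /* the DOM node the box belongs to */

struct css_box {
    RECT rect;                               /* computed position and size */
    struct dom_node *node;
    void (*draw)(struct css_box *, HDC);     /* "virtual" paint method */
    void (*on_click)(struct css_box *, int x, int y);
};

/* Called from the WM_PAINT handler: each visible box paints itself. */
static void paint_viewport(struct css_box **visible, int n, HDC dc)
{
    for (int i = 0; i < n; i++)
        visible[i]->draw(visible[i], dc);
}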
JS interpreter: This should be the largest. There's this: http://code.google.com/p/tiny-js/
which is a little over 128KB (the majority of it is C++ bloat.)
Also this, didn't try to compile it (too many files): http://code.google.com/p/quad-wheel/
My (again very loose) estimate: ~512KB.
That amounts to a browser with <1MB of binary.
Inserting elements at certain positions would involve reallocating an array, though, and could get slow if there are lots of sibling nodes.
Reads will be more frequent than writes, so I'd go with a dynamic array. If you use exponential reallocation, the growth cost of inserts and deletes is amortized constant time, though mid-array inserts still have to shift the later siblings. A gap buffer or even a rope ( http://en.wikipedia.org/wiki/Rope_(computer_science) ) might be worth thinking about too.
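The dynamic-array version with doubling is the standard amortized trick; a sketch of a child vector (note the memmove: appends are amortized O(1), but mid-array inserts still pay to shift the siblings):

#include <stdlib.h>
#include <string.h>

struct dom_node;

struct child_vec {
    struct dom_node **items;
    size_t len, cap;
};

int child_insert(struct child_vec *v, size_t at, struct dom_node *n)
{
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 8;    /* exponential growth */
        void *p = realloc(v->items, ncap * sizeof *v->items);
        if (!p)
            return -1;
        v->items = p;
        v->cap = ncap;
    }
    memmove(v->items + at + 1, v->items + at,
            (v->len - at) * sizeof *v->items);    /* shift later siblings */
    v->items[at] = n;
    v->len++;
    return 0;
}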
>>51 It wouldn't surprise me if several hundred KB of data were taken up just by the gradients everywhere.
So, one color for the gradient start and one for the gradient end, which comes out at about 8 bytes in ARGB? Yeah, hundreds of KB alright.
>>56
Heya. Nope. I've been working on two other projects though. I'll get to it eventually, once I finish the things at the top of the queue, or give up.
Name: Anonymous 2012-02-27 22:55
>>58
Do you mean at the top of the stack? Because this project is really old.
Name: Anonymous 2012-02-27 23:04
>low resource web browser
>javascript
the "java" is in its name for a reason, ``faggot''
Cool, yeah. It probably wouldn't be too hard to integrate an existing minimal open source browser with a good open source JavaScript implementation/library. It would just be a matter of finding the best candidate, if one is suitable.
I'll bump this thread when I actually start something.
You can use existing resources on the system: C libraries, graphics APIs (GDI, DirectX, SDL, etc.), network APIs (Winsock, *nix sockets), and other miscellaneous stuff, but NOT something that will parse and render HTML/CSS, if you're claiming to have written a web browser.
Name: Anonymous 2012-11-18 7:42
Maybe this should be a /prog/ject. Fund it! Or set up a github. Whatever works.
If a sacrifice was made to use V8 or some other pre-existing JavaScript engine, it wouldn't be too hard to write almost the entire browser in JS. Not that it would be necessary or efficient, but it might be neat.
>>85,86
The suckless guys were actually disappointed with surf because of its reliance on WebKit and GTK+. The author said he wouldn't maintain it and recommended NetSurf or something instead.
Then again, they updated it recently, so perhaps they had a change of heart.
who cares? i'm not a poorfag from le africa buy more CPU ROFL XDDDDDDD
LOL I JUST LITERALLY
PEED
MY
PANTS
JUST A LITTE THOUGH
I MEAN ITS A LITTLE SPOT NOT LIKE IT RUINED MY CHAIR R NYTHING LOL BUT FOR REAL EPIC LULZ *HIGH FIVES* XDDDDDDDDDDDDDD
U FRUSTRATED U FRUSTRATED BRO U SO MAD WHY ARE YOU SO MAAAAD I CAN POST ANYTHING I WANT THAT IS HOW IT SAYS IN THE RULES I DONT CARE ABOUT YOUR FAGGOTRY RULES Y SO MAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAD
WHATA FUCK MAN xD i just fall of my chair kuz i couldnt and i CANT stop laugh xDXDXDXDXDDDDDDDDDDDDXXXXXXXXXXXXXXXXXXXDDDDDDDDDDDDDDDDDDD OMGOSH DDDDDXXXXXXXXXXXXXXXXXXXXXXXDDDDDDDDDDDDDDDDDDDDDDDDDDDDDDDD DDDDDD LOOOOOOOOOLLLLL THIS IS A SHIT XDDDDDDDDDDDDDDDDDDDDXDDDDDDDDDDDDDDDDDDDDD A BIG ONE XDDDDDDDD A GRAT ONE XXXXXXDDDD CONGRATS MAN XD
>>78
The core of the complexity is the HTML/CSS rendering engine. HTTP and everything else is auxiliary, so you can use what's existing. Most OSes will have a network and graphics stack.
>>82
Dillo: Yes.
K-Meleon: Shell for Firefox.
w3m: Yes.
links/lynx/elinks: Yes, perhaps a bit too minimal.
Taking an existing rendering engine and giving it a new shell doesn't really change much. Rendering speed will be the same, and so will the rendering engine's memory usage.
>>101
Wow. That's a really nice comeback. I'm so overcome with your humor that I have lost control of my limbs, slipped out of my chair, and embedded a pen into my eye socket. Fortunately your humor has distracted me from the pain, and if given the choice, I would pay the price again to view your wonderful post.
I'll learn fucking braille and use a speech synthesizer if I have to. FUCK IT'S WORTH IT. I DON'T CARE IF MY EAR DRUMS ARE PUNCTURED AND MY FINGERS ARE SEVERED. THIS INFORMATION IS TOO FUCKING FUNNYYYYYYYYY
Are there any ray tracers out there with full support for voxels that can render refraction normally? I wouldn't need anything like volumetric shaders or a non-uniform refractive index, but full support for fancy refraction would be nice for getting through fancy images and such. If one doesn't exist, I think I might try to make one, and shoot for 5-15KB of RAM usage.
Are there any Java compilers out there with full support for bigints that can do whole program optimization? I wouldn't need anything like quantum computation or 128-bit floating point, but full support for whole program optimization would be nice for getting through fancy applets and such. If one doesn't exist, I think I might try to make one, and shoot for 5-15 bytes of RAM usage.
A little more than 2 years later, and it seems my estimates in >>51 were quite pessimistic. Seeing the multi-megabyte beasts of browsers today probably bloated my own thoughts on how much code we really need...
I now have an HTML5 parser + DOM tree construction in under 24KB of binary, and this is before any real attempt at optimising the code. The tokenizer is simpler than what's in the spec (it isn't a state machine) although it should accept the same input, and the tree construction follows the spec almost exactly, just ignoring all error detection (what's a parse error? That which by any other name would render just as well...). It's written in 32-bit C.
Entropy calculations suggest a lower bound for the parser and tree construction somewhere around 12KB. That may be achievable in Asm, and closer to 16-20KB just by factoring out the duplicated code in C (the 12KB tokenizer I mentioned above is a dumb state machine as per the spec, so if I rewrote this one in Asm it would likely turn out much smaller).
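For flavour, the general shape of a dumb tokenizer loop is tiny (this is nowhere near the spec's state machine, handles no attributes or entities, and is only meant to show how little code the skeleton takes):

#include <stdio.h>

void toy_tokenize(const char *p)
{
    while (*p) {
        if (*p == '<') {
            const char *start = ++p;
            while (*p && *p != '>')
                p++;
            printf("TAG:  %.*s\n", (int)(p - start), start);
            if (*p)
                p++;                       /* skip the '>' */
        } else {
            const char *start = p;
            while (*p && *p != '<')
                p++;
            printf("TEXT: %.*s\n", (int)(p - start), start);
        }
    }
}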
Now all we need is a CSS parser/box generator and renderer, and some miscellaneous UI and other bits, and WE'VE WRITTEN A FUCKING HTML5-COMPLIANT WEB BROWSER IN A <1MB EXECUTABLE!!!
Name: Anonymous 2014-01-01 6:25
But if it doesn't grind your computer to a halt how do you know it's working?
shoot for 5-15MB of RAM usage
Why do people think that less RAM usage is a good thing? You buy RAM so you can use it. If you're not using RAM, you're reading from the disk, and your program is slow as shit. Let's say you render the visual part of the page then clear all memory. Every time the user scrolls, you read and parse the page from disk then clear the memory again. Now you have a stuttering clusterfuck of shit software killing your disk so your sacred RAM doesn't get wasted by doing something useful. Gee, why not add arbitrary sleep(1)'s to your code so you don't waste CPU cycles. Now you're a real EXPERT PROGRAMMER!!
>>118
I know what he's talking about. You ps chrome and it uses hundreds of megs of RAM. That would be fine if it was doing something, but it's mostly just sitting there. I agree with >>1 that there might be a better balance.
Name: Anonymous 2014-01-01 22:17
I have this problem too. The JavaScript-enabled browser is the most ludicrous piece of software I run. Everything else is fine.
>>121
So called "lightweight" browsers based on webkit appear every now and then (uzbl, vimprobable, xombrero, etc), and all of them are useless. When it comes to just browsing, they are mostly fine, but they fail to block all those crappy ads, slow scripts, other shit. I tried them all, and switched back to firefox with several decent plugins. Besides, webkit is JUST FUCKING BLOATED.
Name: Anonymous 2014-01-02 0:24
>>122
I know, I've also tried them all, and I also switched back to Firefox with several decent plugins. The project I linked in >>121 does not depend on GTK, unlike uzbl, surf, jumanji, luakit and xombrero.
It doesn't make such a big difference, but might be incentive enough for me to build all the filtering scripts myself. uzbl was almost ok.
Name: Anonymous 2014-01-02 0:36
In many cases "lightweight" in the "suckless" or UNIX sense means "does not do anything useful".
Small is not beautiful. Useful and understandable is beautiful (which may also be small, but does not need to be).
I like Firefox.
Name: Anonymous 2014-01-02 1:01
>>124
I have been using Firefox from the beginning (it took me only a couple of minutes to download Phoenix back then on my dial-up modem, and it fit on a floppy disk).
I don't like it anymore. I want to switch before it's dead (which is coming sooner than you think).
>>113
Not yet... it's not even got a UI at the moment. Maybe once it passes Acid2.
>>114
1.2MB? HTML 3.2 only (NO CSS)? I don't want to be too optimistic here, but I'm going to say even that is bloated compared to my revised estimates. I could take what I have now (which is HTML5-compliant) and add a dumb renderer that doesn't care about CSS at all and just uses the defaults, and it'd probably still stay below 256KB. What I'm aiming at is 1MB for HTML5 + CSS2.1 (maybe some of CSS3, we'll see) + ECMAScript 5.1.
>>118
Use != waste. And your process is not the only process on the system; it shares that memory with all the others, so it makes sense to use as little as needed, otherwise you start hitting virtual memory, and that is when things slow down.
>>122
Look above you in the thread, there are dozens of these "browser shells" that just change the UI and nothing about the core rendering engine itself.
>>124
Why else do you think I'm aiming for HTML5 (other than the fact that its parsing algorithm is detailed in the spec)? This isn't like NetSurf or Dillo or any of the other original browsers out there - it's going to be far smaller than anything else, but at the same time more featured and compatible with more websites. It's designed to be the simplest thing that can possibly work. I don't really care about JS speed as long as it's "good enough" for most websites; besides, what's worse than all your other apps thrashing VM because your browser decided to eat all the RAM for its JS "optimisation"? Cut out all the "abstraction layer" cruft and FUCK "portability" (we're in a Windows world, I may do X11 in the future if I move there exclusively but they will NOT be the same codebase.) Don't have libraries upon libraries piled on top of what OS provides. Make the rendering path as direct as possible. No silly enterprise design patterns soup.
This is also going to be the web browser that puts YOU in control. Per-site/per-domain/per-path settings for security and privacy; a UI that doesn't treat users like idiots by hiding everything; total control over JS execution environment and rendered page contents (although I don't really want to turn it into a full interactive HTML editor); choose what plugins you want to run on which pages, and what they can do.
Ambitious? Definitely. Crazy? Insane? Maybe. But it's clear the only users browser vendors are listening to these days are the mindless idiots who want to be mollycoddled, and if us power users' complaints aren't going to be heard, we must do it ourselves. I realised this many years ago. Staying passive and complaining won't change anything. Failure or not, there is no other option.
When it gets to the point of passing Acid2, it will be released into the public domain. After all, this will become your browser, the people's browser, not mine. I may share further improvements to mine and I will encourage you to share yours, but unlike the forced-update mentality of other browser vendors, there will be no obligation. It will be yours to do with as you wish.
This young Lisp puppy knows a fellow EXPERT programmer when he sees one (don't treat the user like a retard, direct rendering path, public domain) and looks forward to seeing what you come up with :)
I like my Firefox, but it's sad that the choice is Firefox, IE, Chrome, Safari and Opera, plus various "hardcore" (i.e. unusable) UI skins (and they're all samey).
Nothing. You make nice speeches, Cudder, but fail to deliver.
Anonix, the anonymous Operating System, ended up being a trivial rewriting of cat and echo. Your NASA-grade decompiler is too dangerous to be released; you're giving up fame and money for a safer world, I presume. And your 64k HTML5-compliant browser is just you waving your dick in the air. As usual.
I think unfortunately there are a lot of talented programmers who have "project ADHD" and end up getting nothing done. Doesn't mean they shouldn't be trying.
I do want to add to my positive previous comments that I also think saying "as long as it's good enough" is a sign of a not very mature programmer but pobody's nerfect.
>>129 Anonix, the anonymous Operating System ended up being a trivial rewriting of cat and echo.
Oh G-d, I remember that. That's when Cudder was pretending to be three girls instead of one. I'm surprised he's still around and even more so that he's still putting up the pretense of having any skill as a programmer whatsoever.
Name: Anonymous 2014-01-02 2:50
Cudder is essentially Terry A. Davis with all the delusions and none of the talent.
Name: Anonymous 2014-01-02 3:10
>>134
I wouldn't go as far as calling him non-skilled. Nikita was probably right when he described him as "a kike writing viri for shady russian business"
Name: Anonymous 2014-01-02 6:59
>>136 non-skilled
writing viruses
Are you... an idiot?
>>128
Chrome and Safari are both WebKit, so the current set of mainstream browser rendering engines really reduces down to IE's Trident, Firefox's Gecko, and all the WebKit shells.
>>129
Anonix reached the point where anoncoreutils was 50% complete (all my *nix systems have the utilities that were completed), but what led to its demise was a decline in participation and motivation; it was a community project, and interest waned. This (currently) isn't a community project; it's something I've been working on intermittently for the past couple of years. More importantly, the motivation for this is strong: every time I'm irritated by the browsers I use, I work on it for a little while. Anoncoreutils came out of the observation that GNU coreutils is bloated, and it was a source of irritation, but far less than that of the browsers available today. There are far more users of browsers, including those who have no idea what a command line is.
The decompiler/analysis system is a source of income for me and quite a few others, so it wouldn't make sense to release that (yet). I am not aiming for 64k, I am aiming for 1MB. Didn't I say this before?
>>130
There is a lot of software I wish to rewrite, but the one that stands out to me, due to how much use it gets and how much annoyance it has caused me, is the browser. Unlike a bloated GNUtility that really ticks me off only when I do an ls on the filesize or stumble on the sourcecode, the browser stands out whenever I use it.
I'm not specifically aiming at "low resource", but it just happens to be the easiest way of doing it, as a side-effect of omitting all useless complexity. I'm not asking for community help or anything (unless OP is still here; ideas would be nice). Maybe in another 2 years' time something interesting will appear, maybe not. That's the way that it goes, and it's what nobody knows.
>>142
Not public domain. But that also shows the license isn't really all that important (to anyone besides RMS or Theo).
>>143
The benchmarks done on GNU coreutils vs Anoncoreutils have shown otherwise... the ACU utilities are all several times smaller than GNU's, but none of them are several times slower.
https://code.google.com/p/chromium/issues/detail?id=279464
"Since we have too many developers who would otherwise be doing nothing, let's reinvent scrollbars, and make them harder to use. When the users complain, ignore them until enough threaten to leave, then 'fix' the most obvious complaint and leave the others so we can have more work to do in the future."