
ls/gnu bloat what?

Name: Anonymous 2012-11-14 12:42

As an enterprise programmer for the last ten years, occupied with writing services in high-level languages with infrastructure-grade availability, I was shocked by the sheer "lack of regard" for anything in the ANSI C and POSIX APIs. I was faced with a system where supposedly everything is a file, but whose primary programming language has no concept of a file system, and where the OS-specific interfaces to the FS were ridiculously unmaintained, as if they had come straight from the Stone Age... Then, referencing man pages left a bad taste in my mouth, because I found multiple errors (one even claimed that "some systems does not have this", where "some systems" included the very system containing the man page)! I guess it's easier to lose face than to fix a man page by adding the relevant documentation, or better yet, to fix the interfaces. This made me doubt the whole endeavor. Nevertheless, after two days of agony I had an 'ls' clone (equivalent in usage to the ls from coreutils 8.13, built with the same optimization settings as my version).

Now, I decided to run a test, and what I found was that my implementation was on average 63% quicker. I cannot understand that, because my implementation is huge and far from a quick hack: internally it has multiple API modes for accessing the directory structure, singly linked lists, fixed arrays, a data reader with positioning, multiple sorting algorithms, and a frickin' AA-tree just for fun. Yet even at its slowest it still outperforms GNU ls.

I really can't explain it. What on earth is the coreutils team doing? What are the secrets of ls? How did they make it so slow?

Name: Anonymous 2012-11-14 13:42

>>2
And why stop reading? I detested what I saw, and I implemented what I'd normally implement as a serious programmer, and it still outperformed GNU ls; frankly, I wasn't even trying to, so I was genuinely surprised. So tell me: what is in it for me to do it not the right way*, but the "not giving a fuck" Unix or GNU way (while different by their own descriptions, they seem to be the same in principle)?

*As in: considering the problems naturally; caring about what the user might do, and on what kind of data, and what that means for the performance of the application (with low-level programming such as C, that opened many other questions that had to be solved, not a big deal, to get optimal performance or viable opportunities for further optimization); then considering how other programmers who might need to peek at files would prefer to access an API in their problem domain; and so on. All of this seems to be lost in Unix and GNU. But that's not really what I'm wondering about. I have just one question: how can 'ls' be so slow if you take the freedom to disregard everything except... I don't know, writing good programs? I don't know what motivates GNU people besides GNU Law principles...

Sorry for my English; I had to use a translator for this.
